Irving Wladawsky-Berger
A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

How Will AI Likely Impact How We Work, Live and Play? (December 5, 2016)

A few weeks ago I discussed whether AI is finally reaching a tipping point, mostly based on a recently published report, - Artificial Intelligence and Life in 2030. The report was developed by a study panel of AI experts convened by the One Hundred Year Study of AI (AI100), an initiative launched at Stanford University in December, 2014 “to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live and play.” To better understand the future impact of AI on everyday lives, the panel focused the study on the likely influence of AI on a typical North American city by the year 2030.

The report is organized into three main sections. Section I, - What is Artificial Intelligence?, - describes how researchers and practitioners define AI, as well as the key research trends that will likely influence AI’s future. Section II looks into AI’s overall impact on various sectors of the economy, while Section III examines AI issues related to public policy.

My previous discussion was primarily focused on Section I. I’d now like to turn my attention to Section II, - AI by Domain. To help analyze where AI might be heading, the study panel narrowed its explorations to the eight domains most likely to be impacted by AI, among them:

Transportation. “Autonomous transportation will soon be commonplace and, as most people’s first experience with physically embodied AI systems, will strongly influence the public’s perception of AI.”

Home/Service Robots. “Over the next fifteen years, coincident advances in mechanical and AI technologies promise to increase the safe and reliable use and utility of home robots in a typical North American city.”

Healthcare. “AI-based applications could improve health outcomes and quality of life for millions of people in the coming years - but only if they gain the trust of doctors, nurses, and patients.”

Public safety and security. “One of the more successful uses of AI analytics is in detecting white collar crime, such as credit card fraud. Cybersecurity (including spam) is a widely shared concern.”

Employment and workplace. “AI will likely replace tasks rather than jobs in the near term, and will also create new kinds of jobs. But the new jobs that will emerge are harder to imagine in advance than the existing jobs that will likely be lost.”

Entertainment. “AI will increasingly enable entertainment that is more interactive, personalized, and engaging.”

For each of the eight domains, the panel examined the progress made in the past 15 years and anticipated potential developments over the next 15. Let me discuss their findings in a few of these domains.

For a while now, cars have been getting smarter. The auto industry has been adding Level 1 driver-assistance features to cars for over 40 years, and more advanced Level 2 features for the past two decades. While some of the features are aimed at user convenience, safety has been far and away the overriding objective.

The NHTSA estimates that there were over 35,000 US traffic deaths in 2015, with the year-over-year increase the largest in decades. The number of people injured in 2015 also went up, to over 2.4 million people. An estimated 94 percent of these crashes can be tied back to various kinds of human error. AI research on vehicle automation is very important because the stakes are so high. These technologies will significantly improve the overall safety of cars and help reduce our large numbers of traffic accidents, deaths and serious injuries.

When are we likely to see self-driven vehicles coursing along our streets and highways? “It is not yet clear how much better self-driving cars need to become to encourage broad acceptance,…” notes the AI100 report. “But if future self-driving cars are adopted with the predicted speed, and they exceed human-level performance in driving, other significant societal changes will follow.”

“Self-driving cars will eliminate one of the biggest causes of accidental death and injury in the United States, and lengthen people’s life expectancy. On average, a commuter in the US spends twenty-five minutes driving each way. With self-driving car technology, people will have more time to work or entertain themselves during their commutes. And the increased comfort and decreased cognitive load with self-driving cars and shared transportation may affect where people choose to live.”

Home/Service Robots

The Roomba home robot was first introduced in 2002 by iRobot. Since then, the company has sold over 15 million home robots around the world for a variety of applications including cleaning, scrubbing and mopping floors, and cleaning pools and gutters. Home robots are available from a number of companies for similar, narrowly defined applications.

“Early expectations that many new applications would be found for home robots have not materialized. Robot vacuum cleaners are restricted to localized flat areas, while real homes have lots of single steps, and often staircases; there has been very little research on robot mobility inside real homes. Hardware platforms remain challenging to build, and there are few applications that people want enough to buy. Perceptual algorithms for functions such as image labeling, and 3D object recognition, while common at AI conferences, are still only a few years into development as products.”

But the future looks a lot more promising. In a 2015 Foreign Affairs article, - The Robots Are Coming: How Technological Breakthroughs Will Transform Everyday Life, - MIT professor Daniela Rus wrote that “Robots have the potential to greatly improve the quality of our lives at home, at work, and at play. Customized robots working alongside people will create new jobs, improve the quality of existing jobs, and give people more time to focus on what they find interesting, important, and exciting… By working together, robots and humans can augment and complement each other’s skills.”

In a 2014 interview, Professor Rus said that in 10 to 15 years she expects robots to be as commonplace as smartphones, “with personal robots that can help with everything from doing search-and-rescue operations to folding the laundry.” Her MIT research group, the Distributed Robotics Lab, has built robots that can “tend a garden, bake cookies from scratch, cut a birthday cake, fly in swarms without human aid to perform surveillance functions, and dance with humans.”

“Still, there are significant gaps between where robots are today and the promise of a future era of pervasive robotics, when robots will be integrated into the fabric of daily life, becoming as common as computers and smartphones are today, performing many specialized tasks, and often operating side by side with humans,” added Rus. “Current research aims to improve the way robots are made, how they move themselves and manipulate objects, how they reason, how they perceive their environments, and how they cooperate with one another and with humans.”

It’s ironic that after years of frustration with AI’s missed promises, experts worry that now that its mighty power is upon us, we still don’t know how to properly deploy it. AI advances might well lead to widespread economic dislocation. The concerns surrounding AI’s long term impact on jobs may well be in a class by themselves. Like no other technology, AI forces us to explore the very boundaries between machines and humans.

“While AI technologies are likely to have a profound future impact on employment and workplace trends in a typical North American city, it is difficult to accurately assess current impacts, positive or negative…” says the AI100 report. “There are clear examples of industries in which digital technologies have had profound impacts, good and bad… Understanding these changes should provide insights into how AI will affect future labor demand, including the shift in skill demands…”

“To be successful, AI innovations will need to overcome understandable human fears of being marginalized. AI will likely replace tasks rather than jobs in the near term, and will also create new kinds of jobs. But the new jobs that will emerge are harder to imagine in advance than the existing jobs that will likely be lost.”

Overall, the AI100 report concludes that “Application design and policy decisions made in the near term are likely to have long-lasting influences on the nature and directions of such developments, making it important for AI researchers, developers, social scientists, and policymakers to balance the imperative to innovate with mechanisms to ensure that AI’s economic and social benefits are broadly shared across society… the technologies emerging from the field could profoundly transform society for the better in the coming decades.”

Digital Twin: Bringing the Physical and Digital Worlds Closer Together (November 28, 2016)

A few weeks ago I first learned about a relatively new concept - Digital Twin. A Digital Twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city. It’s a highly realistic, one-to-one digital model of each such specific physical entity.

Digital Twin helps bring the physical and digital worlds closer to each other. It’s intertwined with and complementary to the Internet of Things (IoT). The huge amounts of data now collected by IoT sensors on physical objects, personal devices and smart systems make it possible to represent their near real-time status in their Digital Twin alter-ego.
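
To make the idea concrete, here is a minimal sketch of how a digital twin might mirror an IoT-instrumented asset. The class and sensor names (JetEngineTwin, exhaust_gas_temp_c) are illustrative assumptions for this post, not any vendor’s actual API; a production twin would be backed by streaming pipelines and far richer models.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class JetEngineTwin:
    """Illustrative digital twin: a one-to-one digital record kept in sync
    with sensor readings streamed from a specific physical engine."""
    serial_number: str
    readings: dict = field(default_factory=dict)   # latest value per sensor
    history: list = field(default_factory=list)    # time-stamped readings

    def ingest(self, sensor: str, value: float) -> None:
        """Update the twin's state from one IoT sensor reading."""
        self.readings[sensor] = value
        self.history.append((datetime.now(timezone.utc), sensor, value))

    def needs_attention(self) -> bool:
        """Toy health rule: flag the asset when a reading crosses a threshold."""
        return self.readings.get("exhaust_gas_temp_c", 0.0) > 950.0

# Each physical engine gets its own twin, updated in near real time.
twin = JetEngineTwin(serial_number="ESN-12345")
twin.ingest("exhaust_gas_temp_c", 963.0)
print(twin.needs_attention())  # True: the twin surfaces a potential issue
```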

“The myriad possibilities that arise from the ability to monitor and control things in the physical world electronically have inspired a surge of innovation and enthusiasm,” said a 2015 McKinsey report on the Internet of Things. Experts estimate that the number of connected things or devices will reach 50 billion by 2020, growing to hundreds of billions in the decades ahead. The economic potential of the smart solutions this makes possible is enormous, possibly reaching several trillion dollars within a decade.

The McKinsey report also identified the application areas where IoT solutions will have the biggest impact.

GE is closely identified with Digital Twin, - not surprisingly given the central role played by the Industrial IoT in the company’s overall strategy. According to GE, the Industrial Internet enables companies “to use sensors, software, machine-to-machine learning and other technologies to gather and analyze data from physical objects or other large data streams and then use those analyses to manage operations and in some cases to offer new, value-added services.” It’s particularly valuable “in the context of industries where equipment itself or patient outcomes are at the heart of the business - where the ability to monitor equipment or monitor patient services can have significant economic impact and in some cases literally save lives.”

“A vast array of industrial machines - jet engines, power generators, pipelines, locomotives - increasingly are becoming connected through the Internet,” wrote Colin Parris, - GE’s VP of Software Research, - in a 2015 report. “With the amount of data generated by machine sensors rising exponentially, coupled with ever-more powerful Big Data analytics, the Industrial Internet has reached a critical tipping point. It requires industrial companies to adopt a digital mindset that embraces what the Industrial Internet can offer in new growth opportunities. Many are calling it the emergence of the data economy.”

The explosive growth of the consumer Internet over the past 15 years has created many innovative applications and business models and hundreds of billions of dollars in value. At its heart, the consumer Internet is based on connecting several billion people and extracting all kinds of insights from the huge amounts of data they generate. The Industrial Internet is similarly based on connecting tens of billions of IoT devices and analyzing the even bigger amounts of data they’re beginning to generate. GE estimates that new Industrial Internet applications will create at least $15 billion of new value for GE alone by 2020.

A key aspect of GE’s strategy is the creation of an individual digital profile or Digital Twin for each and every industrial machine the company makes. GE is thus transforming and expanding its business models, much as consumer Internet companies, - e.g., Amazon, Facebook, Google, - have done over the past decade. Parris lists some examples of how Digital Twin profiles can help reduce costs and improve quality:

“On our GE90 Engine, we have used flight data from digital twins of our engines to save tens of millions of dollars in unnecessary service overhauls per customer.”

“With our 6FA Turbine Combined Cycle Plant, we have used digital models of these plants to help achieve a >1 percent increase in efficiency that will be scalable across all plants like this. At this scale, a 1 percent increase represents billions of dollars in savings.”

In a more recent article, Parris further explained the differences in value creation between the consumer and the Industrial Internet. “While consumer data typically determines what a particular person or group of people want, industrial data looks for the things we don’t want – detecting problems before they happen, saving our customers millions, even billions of dollars.”

A fleet of aircraft, for example, generates gigantic amounts of data over a year. But out of all that data, GE experts are looking for any serious issue that could require the airline to take the plane out of circulation and bring it in for maintenance. GE estimates that there are roughly 30 such bad events. Finding each of those potential needles in the vast data haystacks requires knowing what you’re looking for and where to look. You need both deep physical domain knowledge as well as deep software and analytics expertise.

“For example an aircraft engine blade can experience what the aviation industry terms as spallation, in which materials begin to erode from a part. This can occur in areas like the Middle East where engines can encounter sandy conditions. As a company that has been in the jet engine business for decades with deep customer relationships, spallation is a condition we know and understand very well. In fact, we have built Digital Twins of jet engines that can model these phenomena and better predict how a blade will degrade over time so that we can advise the customer on when to bring it in for maintenance before a problem occurs. Hence, we can greatly reduce the possibility of an unplanned maintenance that can take a plane unexpectedly out of service. We can avoid the airline losing money and passengers experiencing delays.”
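
As a rough illustration of the kind of calculation a twin enables here, the sketch below fits a simple wear trend to periodic blade-condition measurements and estimates when it will cross a maintenance threshold. The numbers, the linear model and the threshold are all made-up assumptions; GE’s actual Digital Twins use far richer, physics-based models.

```python
import numpy as np

# Hypothetical blade-coating loss measurements (percent), taken every 100 flight hours.
hours = np.array([0, 100, 200, 300, 400, 500], dtype=float)
coating_loss = np.array([0.0, 0.9, 2.1, 2.8, 4.2, 5.1])

# Fit a simple linear degradation trend to the observed wear.
slope, intercept = np.polyfit(hours, coating_loss, deg=1)

# Assumed condition at which the blade should be brought in for service.
MAINTENANCE_THRESHOLD = 8.0
hours_to_threshold = (MAINTENANCE_THRESHOLD - intercept) / slope

print(f"Estimated wear rate: {slope:.3f} percent per flight hour")
print(f"Schedule maintenance at roughly {hours_to_threshold:.0f} flight hours")
```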

Such an open AI ecosystem “connects not only to our mobile devices and computers - and through them to our messages, contacts, finances, calendars and work files - but also to the thermostat in the bedroom, the scale in the bathroom, the bracelet on the wrist, even the car in the driveway. The interconnection of the Internet with the Internet of Things and your own personal data, all instantly available almost anywhere via spoken conversations with an AI, could unlock higher productivity and better health and happiness for millions of people within the next few years.”

“By pooling anonymized health data and providing personalized health advice to individuals, such systems should lead to substantial improvements in health and reductions in the costs of health care. Applications of AI to financial services could reduce unintentional errors, as well as intentional (fraudulent) ones - offering new layers of protection to an aging population.”

“The secret ingredient in this technology that has been largely lacking to date is context. Up to now, machines have been largely oblivious to the details of our work, our bodies, our lives… AI systems are gaining the ability to acquire and interpret contextual cues so that they can gain these skills… Although initially these AI assistants will not outperform the human variety, they will be useful - and roughly a thousand times less expensive.”

Digital Twin is based on some of the most powerful technology trends of the past several years, - the Internet of Things, the Industrial Internet, predictive modeling, Big Data and analytics, and artificial intelligence. Each is a major transformative technology in its own right. Together, as Digital Twin solutions, they promise to help us bring the physical and digital worlds even closer together.

The Evolution of the Firm (November 21, 2016)

A few weeks ago I discussed The Rise of the Global Superstar Company based on a recent special report on the subject by The Economist. The report noted that the decade-long trend toward increasingly concentrated global firms is somewhat surprising.

“The rise of the giants is a reversal of recent history… In the 1980s and 1990s management gurus pointed to the demise of size as big companies seemed to be giving way to a much more entrepreneurial economy. Giants such as AT&T were broken up and state-owned firms were privatised. High-tech companies emerged from nowhere. Peter Drucker, a veteran management thinker, announced that ‘the Fortune 500 [list of the biggest American companies] is over.’ That chimed with the ideas of Ronald Coase, an academic who had argued in ‘The Nature of the Firm’ (1937) that companies make sense only when they can provide the services concerned more cheaply than the market can.”

Professor Coase’s views on the firm changed quite a bit over the years. In 1937 he published The Nature of the Firm, a seminal article which, along with other major contributions, earned him the 1991 Nobel Prize in economics. In the article, Professor Coase provided a simple answer to the question: Why do firms exist? He explained that, in principle, a firm should be able to find the cheapest, most productive goods and services by contracting them out in an efficient, open marketplace. However, markets are not perfectly fluid. Transaction costs are incurred in obtaining goods and services outside the firm, such as searching for the right people, negotiating a contract, coordinating the work, managing intellectual property and so on. Thus, firms came into being to make it easier and less costly to get work done.
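
Coase’s argument can be stated schematically; the notation below is mine, not his, and is only meant to make the make-or-buy trade-off explicit.

```latex
% Schematic make-or-buy condition (illustrative notation, not Coase's own)
\[
  C_{\text{internal}} \;<\; P_{\text{market}} + C_{\text{transaction}}
  \quad\Longrightarrow\quad \text{organize the activity inside the firm,}
\]
% where C_internal is the cost of organizing the activity in-house,
% P_market is the market price of the good or service, and
% C_transaction covers search, negotiation and coordination costs.
```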

Through much of the 20th century, firms kept expanding and adding people within their base countries and around the world. All that growth led to ever larger, multi-layered hierarchical organizations. It also often led to a bureaucratic culture that made it difficult to embrace new ideas and compete against faster moving startups, especially once the digital technology revolution kicked into high gear in the latter part of the century.

Over the past 20 years, the Internet radically lowered the transaction costs of obtaining goods and services outside the firm. As a result, firms started to rely on business partners for many of the functions once done in-house. In addition, the intense pace of global competition forced companies to focus their energies on their true points of differentiation instead of squandering competitive advantage by dispersing focus and investment on capabilities easily available in the marketplace.

Nevertheless, not only are large companies alive and well, but as The Economist points out, “the most striking feature of business today is not the overturning of the established order. It is the entrenchment of a group of superstar companies at the heart of the global economy. Some of these are old firms, like GE, that have reinvented themselves. Some are emerging-market champions, like Samsung, which have seized the opportunities provided by globalisation. The elite of the elite are high-tech wizards—Google, Apple, Facebook and the rest—that have conjured up corporate empires from bits and bytes.”

Why is this happening? Do we need a different answer to the question Why do firms exist? than the one Professor Coase gave almost 80 years ago, - that firms make sense only if they can do things more cheaply than the market can? As The Economist points out, “Since firms continue to occupy a central place in the modern economy despite the enormous advances of the market in recent years, there must be other factors at work.”

Key among these other factors is the ability to address complex technical and management problems. This is particularly important because of the growing complexity of the products, services, systems and solutions companies are now developing. Complexity generally results in unanticipated consequences and increases the risks of something going seriously wrong.

As firms increasingly rely on supply chain partners, one of their major challenges is how best to manage their distributed operations across a global network of interconnected companies. This requires not just good IT systems and well defined, data-driven processes, but also good human communications and coordination. Firms that aspire to be effective ecosystem leaders must have good social skills and trust-based working relationships with their various supply chain partners, so they can better collaborate, innovate and deal with the unexpected problems that will surely arise now and then. Firms with strong internal management practices are in the best position to extend such practices to their external ecosystem partners.

Finally, is it time to reassess Coase’s transaction-cost theory of the firm? In fact, Professor Coase, - who passed away in 2013 at the age of 102, - addressed this question directly in a truly remarkable video presentation at a 2009 conference in his honor. Professor Coase, then 99 years old, first apologized for not being there in person, because he got very tired and was not feeling well. He then proceeded to talk with a sharpness of mind we would all wish on ourselves at 39, let alone at 99.

He wanted to clarify some concepts that he felt he had not quite gotten right in The Nature of the Firm, - which he now thought of as little more than an undergraduate essay, - and proceeded to explain the difference between markets and firms. Markets are artificial creations, not something that exists on its own. Markets appear when people decide to create them. They then negotiate with each other and work out the necessary contracts to make them come about. Sometimes markets work out, and sometimes they don’t.

“Firms are a little different,” Professor Coase then said. “Firms are usually based initially on the family, and they exist in that form, to a large extent today. But firms are not to be analyzed the way I did it in The Nature of the Firm, [which] talked about the firm as if it was an entity in economic theory. . . Firms are organizations in which the different parts of the firm have an interchange with other parts of the firm,… it is a sociological problem not an economic problem…”

“I discovered that there were friendships and antagonisms within firms,… one part of a firm would feel that another part is always going to mess up what they were doing. It operated in a very different way from the way it does in economic theory where you have a firm maximizing profits, knowing all the things that affected them and acting accordingly. In fact it’s very difficult to imagine a firm acting the way that is described in the textbooks.”

In other words, as The Economist put it: “Companies are not just a way of keeping transaction costs to a minimum. They are proof that when people are trying to solve common problems, they are wiser collectively than they are individually. Such collective wisdom can accumulate over time and be embodied in corporate traditions that cannot be bought in the market.”

Towards a Trusted Framework for Identity and Data Sharing (November 14, 2016)

I recently participated in a Treasury Identity Forum organized by the US Treasury Department in Washington, DC. The Forum focused “on the critical role of legal identity for financial inclusion, economic development, and anti-money laundering/counter financing of terrorism (AML/CFT) safeguards, and the development of new technology identification/authentication solutions to help achieve these goals.” It brought together stakeholders from governments, financial service companies, FinTech startups and technologists to better understand how emerging technologies and legal frameworks can help us develop the required digital identity systems.

I was a member of a panel on how government, business and research communities can collaborate in developing workable identity solutions. Let me summarize the points I made in my introductory remarks.

From time immemorial, our identity systems have been based on face-to-face interactions and on physical documents and processes. But, the transition to a digital economy requires radically different identity systems. As the economy and society move toward a world where interactions are primarily governed by digital data and transactions, our existing methods of managing identity and data security are proving inadequate. Large-scale fraud, identity theft and data breaches are becoming common, and a large fraction of the world’s population lacks the credentials needed to be part of the digital economy.

Whether physical or digital in nature, identity is a collection of information or attributes associated with a specific entity. Identities can be assigned to three main kinds of entities: individuals, institutions, and assets. For individuals, there are three main categories of attributes:

Inherent attributes are intrinsic to each specific individual, such as date of birth, weight, height, color of eyes, fingerprints, retinal scans and other biometrics.

Assigned attributes are attached to individuals, and reflect their relationships with different institutions. These include social security ID, passport number, driver’s license number, e-mail address, telephone numbers, and login IDs and passwords.

Accumulated attributes have been gathered over time, and can change and evolve throughout a person’s lifespan. These include education, job and residential histories, health records, friends and colleagues, pets, sports preferences, and organizational affiliations.

Attributes are used to determine the particular transactions in which the individual can rightfully participate. The attributes needed to certify your identity or permissions will vary with different kinds of transactions. For example, to buy alcohol, all that is needed is proof that the individual is over the legal drinking age. Approving a moderate financial transaction might require a relatively small number of attributes, but a large financial transaction like the purchase of a house will require many more. Getting a passport or TSA Global Entry involves a different set of attributes from financial transactions, and so on.
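
As a rough illustration of how attribute requirements vary by transaction, the sketch below checks only the attributes a given transaction requires. The transaction types and required attributes are made-up examples, not any actual identity standard.

```python
REQUIRED_ATTRIBUTES = {
    # Hypothetical mapping from transaction type to the attributes it requires.
    "buy_alcohol": {"date_of_birth"},
    "small_payment": {"account_id", "login_credential"},
    "mortgage": {"ssn", "employment_history", "credit_history", "date_of_birth"},
}

def can_proceed(transaction: str, presented: dict) -> bool:
    """Allow a transaction only if every attribute it requires has been presented."""
    return REQUIRED_ATTRIBUTES[transaction] <= presented.keys()

person = {"date_of_birth": "1990-05-01", "account_id": "A-991", "login_credential": "ok"}

print(can_proceed("buy_alcohol", person))  # True: only proof of age is needed
print(can_proceed("mortgage", person))     # False: far more attributes are required
```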

These data attributes are generally siloed within different private and public sector institutions, each using its data for its own purposes. But to reach a higher level of privacy and security, we need to establish trusted data ecosystems, which requires the interoperability and sharing of data across a variety of institutions. The more data sources a trusted ecosystem has access to, the higher the probability of detecting fraud and identity theft while reducing false positives. In addition, an ecosystem with a wide variety of data sources can help foster economic inclusiveness by certifying the identities and creditworthiness of poor people with no banking affiliation.

It’s not only highly unsafe, but also totally infeasible to gather all the needed attributes in a central data warehouse. Few institutions will let their critical data out of their premises. But, there are innovative ways to move forward, in particular the identity and data sharing framework being developed at MIT Connection Science, a recently established research initiative led by MIT Media Lab professor Sandy Pentland. The framework rests on several key building blocks:

Robust Digital Identity. “Identity, whether personal or organizational, is the key that unlocks all other data and data sharing functions. Digital Identity includes not only having unique and unforgeable credentials that work everywhere, but also the ability to access all the data linked to your identity and the ability to control the persona that you present in different situations… the work you, the health system you, the government you and many other permutations. Each of these pseudonym identities will have different data access associated with them, and be owned and controlled only by the core biological you.”

Universal Access. Universal access, like open data, is the kind of principle few would disagree with. However, to be effective, universal access requires a legal structure. “The U.S. Government can promote universal access by policies that provide for secure, citizen-controlled Personal Data Stores for all citizens in a manner analogous to current physical Post Office Boxes, and promote their use by making government benefits and interactions such as tax transfers and information inquiries conveniently available by mobile devices and web interfaces secured by the citizens’ digital identity.”

Distributed Internet Trust Authorities. “We have repeatedly seen that centralized system administration is the weakest link in cybersecurity, enabling both insiders and opponents to destroy our system security with a single exploit. The most practical solution to this problem is to have authority distributed among many trusted actors, so that compromise of one or even a few authorities does not destroy the system security consensus… Examples such as the blockchain that underlies most digital cryptocurrencies show that distributed ledgers can provide world-wide security even in very hostile environments.”

Distributed safe computation. “Our critical systems will suffer increasing rates of damage and compromise unless we move decisively toward pervasive use of data minimization, more encryption and distributed computation. Current firewall, event sharing, and attack detection approaches are simply not feasible as long-run solutions for cybersecurity, and we need to adopt an inherently more robust approach. The optimal technology for such an inherently safe data ecosystem is currently being built and tested [in] MIT’s Enigma project.”

“Enigma is a decentralized computation platform enabling different parties to jointly store and run computations on data while keeping the data completely private. Enigma enables a sustainable data ecology by supporting the requirements that data be always encrypted, with computation happening on encrypted data only, by allowing owners of the data to control access to their data precisely, absolutely, and auditably, and by reliably enabling payment to data owners for use of their data…”

“Since users in Enigma are owners of their data, we use the blockchain as a decentralized secure database that is not owned by any party. This also allows an owner to designate which services can access its data and under what conditions, and so parties can query the blockchain and ensure that it holds the appropriate permissions. In addition to being a secure and distributed public database, the blockchain is also used to facilitate payments from services to computing parties and owners, while enforcing correct permissions and verifying that queries execute correctly.”

Enigma will take considerable time to develop and deploy. But a much simpler and easy-to-deploy version called OPAL (OPen ALgorithms) will soon be ready for pilot testing in a few European countries. “The concept of OPAL is that instead of copying or sharing data, algorithms are sent to existing databases, executed behind existing firewalls, and only the encrypted results are shared. This minimizes opportunities to attack databases or divert data for unapproved use, but places restrictions on the ability of an ecosystem to collaborate on data when it is in an encrypted state. Note that OPAL may be combined with anonymization of identifying elements in order to reduce risk, and in the long run will evolve toward [Enigma’s] fully-encrypted, computation-friendly model.”
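
A minimal sketch of the OPAL pattern as described above: the algorithm travels to the data holder, runs behind its boundary, and only an aggregate answer leaves (encryption of the returned result is omitted here for brevity). The class and function names are illustrative assumptions, not the actual OPAL implementation.

```python
class DataHolder:
    """Stands in for an institution's database sitting behind its own firewall."""

    def __init__(self, records):
        self._records = records  # raw records never leave this object

    def run(self, algorithm):
        """Execute a vetted algorithm locally and release only its aggregate output."""
        return algorithm(self._records)

# A query an outside analyst might submit: a single aggregate, never raw rows.
def average_balance(records):
    return sum(r["balance"] for r in records) / len(records)

bank = DataHolder([
    {"name": "alice", "balance": 1200.0},
    {"name": "bob", "balance": 300.0},
])

print(bank.run(average_balance))  # 750.0 - the analyst never sees individual records
```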

I closed my remarks at the Treasury Identity Forum by discussing the roles of government and the private sector in advancing such initiatives. Governments have long provided us with a biological proof of identity, - i.e., our birth certificates. I expect that governments will continue to play the central role in establishing the digital equivalent of birth certificates, - core digital identities, - which will in turn be used to create the persona digital identities for each of our slices of life, e.g., work, financial, social, family, organizational affiliations, and government interactions.

Persona digital identities will be provided by various private sector identity ecosystems, each bringing together different kinds of partners, valuable data, and sophisticated technologies. These different ecosystems will compete with each other in the marketplace for our business, based on how much we trust that they will protect our personal data, privacy, security and identity.

The WEF Blueprint for Digital Identity argued that financial institutions are well positioned to drive the creation of such digital identity ecosystems because they already serve as intermediaries in many transactions, are generally trusted by consumers as safe repositories of information and assets, and their operations, - including the extensive use of customer data, - are already rigorously regulated.

Finally, as was the case with the Internet, government needs to play a leadership role in the creation of such highly complex identity ecosystems by supporting the required R&D, experimental testbeds, and legal frameworks.

The Rise of the Global Superstar Company (November 8, 2016)

The September 17 issue of The Economist included a special report on the rise of so-called superstar companies. “Disruption may be the buzzword in boardrooms, but the most striking feature of business today is not the overturning of the established order,” notes The Economist. “It is the entrenchment of a group of superstar companies at the heart of the global economy.”

This trend toward consolidation and growing size is evidenced by a few worldwide statistics: 10% of public companies generate 80% of all profits; firms with over $1 billion in annual revenues are responsible for 60% of total global revenues; and the rate of mergers and acquisitions is more than twice what it was in the 1990s. The trend is particularly prominent in the US: in the 20 years from 1994 to 2013, the share of GDP generated by the 100 biggest companies rose from 33% to 46%; the five largest banks now account for 45% of all banking assets, up from 25% in 2000; and new firm formation has been going down since the late 1970s, leading to an overall decline in young- and medium-aged companies over the years.

The current rise of large companies is somewhat unexpected. Following the Great Depression and WW2, the US welcomed the stability promised by corporate capitalism. Big, multinational companies dominated most industries, - from GM, Ford and Chrysler in cars to Esso/Exxon, Mobil and Texaco in oil and gas. It was an era characterized by bureaucratic corporate cultures, focused on organizational power and orderly prosperity.

This all started to change a few decades later with the advent of a more innovative, fast moving entrepreneurial economy. The 1980s saw the rise of young, high-tech companies, - e.g., Microsoft, Apple, Oracle, Sun Microsystems; telecommunications was deregulated; AT&T was broken up; and Silicon Valley became the global hub for innovation, emulated by regions around the world.

The nascent Internet, Web and e-commerce pushed all these trends into hyperdrive in the 1990s. The Internet made it much easier for companies to transact with each other around the world. Vertically integrated firms evolved into virtual enterprises, increasingly relying on supply chain partners for many of the functions once done in-house. Management experts noted that large firms were no longer necessary and would in fact be at a disadvantage in this emerging Internet era when competing against agile, innovative smaller companies.

But… it hasn’t quite worked out as expected. “Silicon Valley is a very different place from what it was in the 1990s. Back then it was seen as the breeding ground of a new kind of capitalism - open-ended and freewheeling - and a new kind of business organisation - small, nimble and fluid. Companies popped up to solve specific problems and then disappeared. Nomadic professionals hopped from one company to another, knowing that their value lay in their skills rather than their willingness to wear the company collar.”

“Today the valley has been thoroughly corporatised: a handful of winner-takes-most companies have taken over the world’s most vibrant innovation centre, while the region’s (admittedly numerous) startups compete to provide the big league with services or, if they are lucky, with their next acquisition.” Similar corporatization scenarios have played out around the world.

What happened? Why are large companies not only thriving but getting bigger? What has changed? The Economist argues that three major forces are responsible for this new era of concentration: technology, globalization, and regulation.

Technology. The industrial economy was driven by supply-side innovations. Companies leveraged technological advances and economies of scale to become bigger. Due to the massive fixed costs of physical assets, firms achieving higher volumes and production efficiencies had lower overall unit costs for their product, allowing them to reduce prices and further increase volumes.

In the digital economy, on the other hand, the driving force is demand-side economies of scale, generally achieved through platforms and network effects. The more products or services a platform offers, the more users it will attract, helping it then attract more offerings from ecosystem partners, which in turn brings in more users, - which then makes the platform even more valuable. Moreover, the larger the network, the more data is available to customize offerings to user preferences and better match supply and demand, further increasing the platform’s value.
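
One rough rule of thumb often used to quantify this kind of network effect (not something the report itself cites) is Metcalfe’s law: the value of a network grows with the number of possible connections among its users.

```latex
% Metcalfe's law, a common heuristic for demand-side economies of scale
\[
  V(n) \;\propto\; \binom{n}{2} \;=\; \frac{n(n-1)}{2} \;\approx\; \frac{n^{2}}{2},
\]
% so doubling the number of users roughly quadruples the number of
% possible user-to-user connections, and hence the network's potential value.
```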

“Most of the new tech firms are platforms that connect different groups of people and allow them to engage in mutually beneficial exchanges. Older tech companies too are putting increasing emphasis on the platform side of their business. Everyone wants to sit at the heart of a web of connected users and devices that are constantly opening up further opportunities for growth.”

“In some ways these tech giants look not so much like overgrown startups but more like traditional corporations. The open-plan offices and informal dress codes are still there, but their spirit is changing. They are investing more in traditional corporate functions such as sales and branding. This corporatisation is one reason for the companies’ success.”

Globalization. “An annual list of the world’s top multinationals produced by the United Nations Conference on Trade and Development (UNCTAD) shows that, judged by measures such as sales and employment, such companies have all become substantially bigger since the mid-1990s. They have also become more and more complex. UNCTAD points out that the top 100 multinationals have an average of 20 holding companies each, often domiciled in low-tax jurisdictions, and more than 500 affiliates, operating in more than 50 countries.”

Global companies have opened sales offices and R&D centers around the world. In addition to global supply chains for the production of physical goods, they’ve been developing global knowledge networks for a variety of services and R&D functions, taking advantage of the growing talent and lower costs now available around the world.

“Big companies have reaped enormous efficiencies by creating supply chains that stretch around the world and involve hundreds of partners, ranging from wholly owned subsidiaries to outside contractors… They are also forming ever more complicated alliances… America’s top 1,000 public companies now derive 40% of their revenue from alliances, compared with just 1% in 1980.”

Regulation. The growth in regulation adds to the complexity of doing business in the US and around the world, further playing into the hands of large companies. “Regulation inevitably imposes a disproportionate burden on smaller companies because compliance has a high fixed cost… Younger companies also suffer more from regulation because they have less experience of dealing with it.”

While it might appear that large companies now have all the advantages, it’s important to remember that such advantages don’t tend to last. Let’s remember the number of once high-flying companies that no longer exist - e.g., Sun Microsystems, DEC, Compaq, Blockbuster; or that are shadows of their former selves, - e.g., Kodak, BlackBerry, Motorola, Yahoo and Nokia. Most large legacy companies, - e.g., GE, Microsoft, IBM, - have had to fight hard to keep up and reinvent themselves. Even the tech aristocracy, - e.g., Google, Apple, Facebook, Amazon, - must constantly watch out for new technologies, market trends and global competitors that might push them off their elite perches, - as was the case with IBM in the mid-1980s and Microsoft in the 2000s, with the advent of client-server computing and smartphones respectively.

“The virtualisation of some sectors of the economy and the corporatisation of others are going hand in hand… Big companies have much to gain from contracting out their R&D to startups. They can make lots of different bets without involving their corporate bureaucracies. But startups also have a lot to gain by selling themselves to an established company that can provide stability, reliability and predictability, all of which can be hard to come by in the tech world.” In the end, big companies and startups will continue to co-exist in a kind of symbiotic relationship - each one doing what it does best.

Is Design Thinking the “New Liberal Arts”? (October 31, 2016)

Design thinking has become an increasingly popular topic of discussion over the past decade. It was featured in the September, 2015 issue of the Harvard Business Review with several articles on the subject. Design is no longer just for physical objects, e.g. cars, bridges, shoes, jewelry, smartphones. Design thinking is now being applied to abstract entities, - e.g. systems, services, information and organizations, - as well as to devise strategies, manage change and solve complex problems.

The application of design thinking beyond products isn’t new. Nobel laureate Herbert Simon discussed the concept in his 1969 classic The Sciences of the Artificial. IDEO, a firm best known for pioneering this expanded view of design, traces its roots back to 1978. The School of Design at London’s Royal College of Art has long been expanding the boundaries of industrial design. Stanford’s Institute of Design, - better known as the d.school, - was launched in 2004 as a graduate program that integrates business, the social sciences, the humanities and other disciplines into more traditional engineering and product design.

The d.school’s website nicely explains its design-thinking point of view: “Students and faculty in engineering, medicine, business, law, the humanities, sciences, and education find their way here to take on the world’s messy problems together. Human values are at the heart of our collaborative approach… Along the way, our students develop a process for producing creative solutions to even the most complex challenges they tackle… Our deliberate mash-up of industry, academia and the big world beyond campus is a key to our continuing evolution.”

I first learned about design thinking during a 2005 visit to the Olin College of Engineering. Olin College was started in 2002 as a small, new kind of engineering college on the outskirts of Boston. Its mission statement states that “Olin College prepares students to become exemplary engineering innovators who recognize needs, design solutions and engage in creative enterprises for the good of the world.”

Its curriculum emphasizes three key areas: a rigorous engineering education; entrepreneurship and entrepreneurial thinking; and the arts, which broadly encompass creativity, innovation and design. “It is hoped that design will move toward the center of the Olin College curriculum. One cannot design what one cannot imagine; therefore, enhancing creativity is an important precursor to effective design.”

A few years later, I learned about the application of design thinking to business from a 2010 NY Times article about Roger Martin, who at the time was the Dean of the Rotman School of Management at the University of Toronto. Martin had long been advocating “that students needed to learn how to think critically and creatively every bit as much as they needed to learn finance or accounting. More specifically, they needed to learn how to approach problems from many perspectives and to combine various approaches to find innovative solutions.”

“Learning how to think critically - how to imaginatively frame questions and consider multiple perspectives - has historically been associated with a liberal arts education, not a business school curriculum, so this change represents something of a tectonic shift for business school leaders.” Achieving this goal would require business schools to move into territory “more traditionally associated with the liberal arts: multidisciplinary approaches, an understanding of global and historical context and perspectives, a greater focus on leadership and social responsibility and, yes, learning how to think critically.”

It’s not surprising that a number of engineering and business schools have embraced design thinking over the past decade. Recent studies have shown that, given our complex business world, companies are increasingly searching for talented individuals who are strong in quantitative, analytical, technical and similar hard skills, as well as in strategic thinking, teamwork, communications and related soft competencies. Business and engineering schools do a pretty good job when it comes to teaching hard skills. But they have generally not done so well with the softer competencies companies are also looking for.

By now, it’s generally accepted that design thinking can be applied to just about all disciplines and professions. But a number of recent articles have been asking an intriguing question: given its broad applicability, has design thinking now become the new liberal arts? After pondering the question, I believe the answer is no - the same answer I would give if asked whether engineering is now the new science. Let me explain.

Is Design Thinking the New Liberal Arts in Education? is the title of a 2015 article by Olin College President Richard Miller and Professor of Design Benjamin Linder. “Design Thinking is frequently identified as an engaging process and methodical framework for approaching complex, multidisciplinary problems in ways that consistently result in solutions that are successful and often creative in unpredictable ways,…” they write in their opening sentence. It’s “a framework for thinking about complex, multidisciplinary problems that applies to just about anything. It is not confined to an art medium or to any technology.”

Successful design solutions are generally found at the intersection of three independent dimensions. The first is feasibility, because “nothing exists in the real world that isn’t consistent with what we know about the laws of nature.” Next comes viability, - a solution “must have a cost of production and maintenance that is competitive with alternatives.” The final dimension is desirability, that is, “the quality of being desired, embraced and accepted by the people who must use and implement the solution,” as well as the ability “to understand the context and the culture of the people who will be most affected by the solution.” Desirability is central to Design Thinking.

“The types of solutions that emerge from engineering thinking usually involve a new technology (e.g., the internet, electric car, software, etc.), while the types of solutions that emerge from business thinking usually involve a new financial model (e.g., the credit card, iTunes, etc.). But the types of solutions that emerge from Design Thinking (or desirability) involve more human motivation and psychology… where the driver is the fundamental need to tell your personal story to a group of friends you care about.”

“By the end of the semester I was fascinated enough to head to Palo Alto to immerse myself in the ways of the d.school,” said Miller in his Chronicle article. “What I discovered got me thinking about more than design thinking. A very important experiment in humanities education is going on… what’s happening in Palo Alto right now is really about the future of the liberal arts… Do disciplines, in order to evolve and advance, need some place in which to play and from which to be provoked?… Research-as-questioning is a much freer and more playful approach to discovery. It keeps us in closer contact with our natural disposition to curiosity and wonder.”

While he found much to like about the d.school’s methods, Miller eventually concluded that their action-oriented approach to problem solving did not pay proper attention to past knowledge. “A truly human-centered design, if it takes culture at all seriously, would have to take pastness seriously… If we think about what the liberal arts teach, we find that the study of the past achievements of humans, whether history, literature, philosophy, music, or art, provides us with a richly nuanced appreciation for the complexity of human existence… What the liberal arts, - or humanities, - give us are the experiences of those who have come before us to add to our own. These surrogate experiences help us to live well in the world.”

“So, is design thinking the new liberal arts?” asked Miller. “Not yet,” he answered. “We in the university, at many different organizational levels, may all need our own d.schools. But for them to really shape the future of university learning, they will have to do a better job of engaging with precisely what the university was designed to promote, and what design thinking, with its emphasis on innovation, has thus far completely ignored: the past.”

To help me appreciate Miller’s argument, I asked myself a related question, - can engineering be viewed as a kind of new science? - and looked to the relationship between engineering and science for guidance, as explained in the aforementioned Olin College article.

“Engineering is, by nature, about solving problems and designing new things. The difference between science and engineering is often described by the nature of the questions that are asked: scientists ask why as they attempt to understand the world, while engineers ask why not as they attempt to change it and create what has never been. The process of creating what has never been is the essence of the process of design in engineering.”

“Of course, this differentiation between science and engineering is rather over simplified. In order to solve problems and create new things, it is also necessary to understand the world, so science and engineering are symbiotic twins.” Engineering applies scientific knowledge to come up with practical solutions to real world problems.

The Liberal Arts and Design Thinking have an equally symbiotic relationship. Design Thinking is not a kind of New Liberal Arts. In the end, Design Thinking is all about leveraging the foundational knowledge of the Liberal Arts to come up with creative, multidisciplinary solutions to highly complex, real world problems.

Has AI (Finally) Reached a Tipping Point? (October 24, 2016)

After many years of promise and hype, AI seems to be finally reaching a tipping point of market acceptance. “Artificial intelligence is suddenly everywhere… it is proliferating like mad.” So starts a Vanity Fair article published around two years ago by author and radio host Kurt Andersen. And, this past June, a panel of global experts convened by the World Economic Forum (WEF) named Artificial Intelligence, - Open AI Ecosystems in particular, - as one of its Top Ten Emerging Technologies for 2016 because of its potential to fundamentally change the way markets, business and governments work.

AI is now being applied to activities that not long ago were viewed as the exclusive domain of humans. “We’re now accustomed to having conversations with computers: to refill a prescription, make a cable-TV-service appointment, cancel an airline reservation - or, when driving, to silently obey the instructions of the voice from the G.P.S.,” wrote Andersen. The WEF report noted that “over the past several years, several pieces of emerging technology have linked together in ways that make it easier to build far more powerful, human-like digital assistants.”

What will life be like in such an AI-based society? What impact is it likely to have on jobs, companies and industries? How might it change our everyday lives?

These questions were addressed in Artificial Intelligence and Life in 2030, a report that was recently published by Stanford University’s One Hundred Year Study of AI (AI100).AI100 was launched in December, 2014 “to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live and play.”The core activity of AI100 is to convene a Study Panel every five years to assess the then current state of the field, review AI’s progress in the years preceding the report, and explore the potential advances that lie ahead as well the technical and societal challenges and opportunities these advances might raise.

The first such Study Panel, launched a year ago, was comprised of AI experts from academia, corporate laboratories and industry as well as AI-savvy scholars in law, political science, policy, and economics.The study’s overriding theme was the likely impact of AI on a typical North American city by the year 2030. The panel examined key AI research trends, AI’s impact on various sectors of the economy, and major issues concerning AI public policy. The report’s Executive Summary succinctly summarized its key finding:

“Contrary to the more fantastic predictions for AI in the popular press, the Study Panel found no cause for concern that AI is an imminent threat to humankind. No machines with self-sustaining long-term goals and intent have been developed, nor are they likely to be developed in the near future. Instead, increasingly useful applications of AI, with potentially profound positive impacts on our society and economy are likely to emerge between now and 2030, the period this report considers. At the same time, many of these developments will spur disruptions in how human labor is augmented or replaced by AI, creating new challenges for the economy and society more broadly.”

The report’s first section addresses a very important question: How do researchers and practitioners define Artificial Intelligence?

From its inception about sixty years ago, there has never been a precise, universally accepted definition of AI. Rather, the field has been guided by a rough sense of direction, such as this one by Stanford professor Nils Nilsson in The Quest for Artificial Intelligence: “Artificial intelligence is that activity devoted to making machines intelligent, and intelligence is that quality that enables an entity to function appropriately and with foresight in its environment.”

Such a characterization of AI depends on what we mean by a machine functioning appropriately and with foresight. It spans a very wide spectrum, - as it should. Is a simple calculator intelligent because it does math much faster than the human brain? Where in the spectrum do we place thermostats, cruise control in cars, navigation applications that give us detailed directions, speech recognition, and chess- and Go-playing apps?

Over the past six decades, the frontier of what we’re willing to call AI has kept moving forward. AI suffers from what’s become known as the AI effect: AI is whatever hasn’t been done yet, and as soon as an AI problem is successfully solved, the problem is no longer considered part of AI. “The same pattern will continue in the future,” notes the report. “AI does not deliver a life-changing product as a bolt from the blue. Rather, AI technologies continue to get better in a continual, incremental way.”

One of the key ways of assessing progress in AI is to compare it to human intelligence. Any activity that computers are now able to perform that was once the exclusive domain of humans could be counted as an AI advance. And, one of the best ways of comparing AI to humans is to pit them against each other in a competitive game.

Chess was one of the earliest AI challenges. Many AI leaders were then convinced that it was just a matter of time before AI would consistently beat humans at chess. They were trying to do so by somehow programming the machines to play chess, even though to this day we don’t really understand how chess champions think, let alone how to translate their thought patterns into a set of instructions that would enable a machine to play expert chess. All these ambitious AI approaches met with disappointment and were abandoned in the 1980s, when, after years of unfulfilled promises, a so-called AI winter of reduced interest and funding set in that nearly killed the field.

AI was reborn in the 1990s. Instead of trying to program computers to act intelligently, the field embraced a statistical, brute force approach based on analyzing vast amounts of information with powerful computers and sophisticated algorithms. AI researchers discovered that such an information-based approach produced something akin to intelligence or knowledge. Moreover, unlike the earlier programming-based projects, the statistical approaches scaled very nicely. The more information you had, the more powerful the supercomputers, the more sophisticated the algorithms, the better the results.

Deep Blue, IBM’s chess playing supercomputer, demonstrated the power of such a statistical, brute force approach by defeating then reigning chess champion Garry Kasparov in a celebrated match in May, 1997. “Curiously, no sooner had AI caught up with its elusive target than Deep Blue was portrayed as a collection of brute force methods that wasn’t real intelligence… Was Deep Blue intelligent or not? Once again, the frontier had moved.” Now, the best chess programs consistently beat the strongest human players, and even smartphone-based apps play a strong game of chess.

As human-computer chess matches no longer attract much interest, the AI frontier has moved to games considerably more complex than chess. In 2011, Watson, - IBM’s question-answering system, - won the Jeopardy! Challenge against the two best human Jeopardy! players, demonstrating that computers could now extract meaning from the unstructured knowledge embodied in books, articles, newspapers, web sites, social media, and anything written in natural language. And earlier this year, Google’s AlphaGo claimed victory against Lee Sedol, - one of the world’s top Go players, - in a best-of-five match, winning four games and losing only one. In the game of Go, there are more possible board positions than there are particles in the universe. A Go-playing system cannot simply rely on computational brute force. AlphaGo relies instead on deep learning algorithms, modeled partly on the way the human brain works.
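As a back-of-the-envelope check on the size of Go’s search space (my own sketch, not a figure from the report), each of the board’s 19 x 19 = 361 points can be empty, black or white, so 3^361 is an upper bound on the number of board configurations; comparing it with the commonly cited rough estimate of 10^80 particles in the observable universe takes only a few lines of Python:

```python
# Rough upper bound on Go board configurations: each of the 361 points is
# empty, black or white.  Not every such configuration is a legal position,
# so this overcounts, but it is fine for an order-of-magnitude comparison.
go_upper_bound = 3 ** 361

# Commonly cited rough estimate of particles in the observable universe.
particles_in_universe = 10 ** 80

print(f"3^361 has {len(str(go_upper_bound))} digits")    # about 173 digits, i.e. ~10^172
print(go_upper_bound > particles_in_universe)            # True, by over 90 orders of magnitude
```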

Given the broad, changing scope of the field, what then is Artificial Intelligence? The AI100 Study Panel offers a circular, operational answer: AI is defined by what AI researchers do. The report then lists the key AI research trends, that is, the hot areas AI researchers are pursuing. These include:

Large-scale machine learning. Machine learning gives computers the ability to learn by ingesting huge amounts of data instead of being explicitly programmed. Machine learning has been propelled dramatically forward by the huge amounts of data we now have access to and by the computational and storage resources of cloud computing. “A major focus of current efforts is to scale existing algorithms to work with extremely large data sets.”
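As a hedged illustration of that scaling theme (my own toy sketch, not an example from the report), scikit-learn’s partial_fit interface lets a linear classifier be updated one mini-batch at a time, so a data set far larger than memory can be streamed past the model; the synthetic batches below stand in for data read from disk:

```python
# Minimal sketch of incremental ("out-of-core") learning: stream synthetic
# mini-batches and update a linear classifier with partial_fit.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()            # linear model trained by stochastic gradient descent
classes = np.array([0, 1])         # all class labels must be declared up front

for _ in range(200):               # pretend each iteration reads one batch from disk
    X = rng.normal(size=(1_000, 20))
    y = (X[:, 0] + 0.3 * rng.normal(size=1_000) > 0).astype(int)
    model.partial_fit(X, y, classes=classes)

print("accuracy on the last batch:", model.score(X, y))
```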

Deep learning. Deep learning takes machine learning to the next level, using deep graphs with multiple processing layers, which enable advanced visual applications like object recognition and video labeling, as well as significantly improved audio, speech and natural language processing.
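To make the “multiple processing layers” idea concrete, here is a small, hedged PyTorch sketch (my own toy example, not the report’s): a few stacked linear-plus-nonlinearity layers trained end-to-end on random data stand in for the much deeper networks used for vision and speech:

```python
# Toy deep network: several processing layers trained end-to-end with
# backpropagation on random data (a stand-in for images, audio, etc.).
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),   # layer 1
    nn.Linear(64, 64), nn.ReLU(),   # layer 2
    nn.Linear(64, 10),              # output layer: scores for 10 classes
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

X = torch.randn(512, 32)            # fake inputs
y = torch.randint(0, 10, (512,))    # fake labels

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()                 # gradients flow back through every layer
    optimizer.step()

print("final training loss:", loss.item())
```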

Reinforcement learning. “Whereas traditional machine learning has mostly focused on pattern mining, reinforcement learning shifts the focus to decision making, and is a technology that will help AI to advance more deeply into the realm of learning about and executing actions in the real world.”
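The decision-making flavor of reinforcement learning can be sketched with tabular Q-learning on a made-up five-state corridor (again my own illustration, not the report’s): the agent is only rewarded at the far end, yet it learns which action to take in every state:

```python
# Tabular Q-learning on a tiny corridor: states 0..4, actions 0=left, 1=right.
# Reaching state 4 pays +1; every other step pays 0.
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1      # learning rate, discount, exploration

for episode in range(500):
    s = 0
    while s != 4:
        a = rng.integers(n_actions) if rng.random() < epsilon else int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s_next == 4 else 0.0
        # Move Q[s, a] toward the reward plus the discounted best future value.
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

print("greedy action per state:", Q.argmax(axis=1))   # states 0-3 should say 1 ("go right")
```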

Robotics. “Current efforts consider how to train a robot to interact with the world around it in generalizable and predictable ways… Advances in reliable machine perception, including computer vision, force, and tactile perception, much of which will be driven by machine learning, will continue to be key enablers to advancing the capabilities of robotics.”

Computer Vision. “For the first time, computers are able to perform some (narrowly defined) visual classification tasks better than people. Much current research is focused on automatic image and video captioning.”

Natural Language Processing. Natural Language Processing “is quickly becoming a commodity for mainstream languages with large data sets… Research is now shifting towards developing refined and capable systems that are able to interact with people through dialog, not just react to stylized requests.”

“Over the next fifteen years, the Study Panel expects an increasing focus on developing systems that are human-aware, meaning that they specifically model, and are specifically designed for, the characteristics of the people with whom they are meant to interact. There is a lot of interest in trying to find new, creative ways to develop interactive and scalable ways to teach robots. Also, IoT-type systems - devices and the cloud - are becoming increasingly popular, as is thinking about social and economic dimensions of AI. In the coming years, new perception/object recognition capabilities and robotic platforms that are human-safe will grow, as will data-driven products and their markets.”

Reflections on the Development of Implementable, Winning Strategies (October 18, 2016)

I recently read an article by strategy consultant and writer Ken Favaro which nicely explained how to best think about strategy in today’s business environment. “Many business leaders subscribe to the classic definition of strategy as a set of actions designed to achieve an overall aim,” wrote Favaro in The Trouble with Putting Goals Ahead of Strategy, published in strategy+business in 2015. “In other words, they believe strategy starts with a goal. But for companies that have implemented winning strategies, that’s not how it typically happens.”

Goals and strategies serve very different purposes. Goals, visions and missions are important to paint an exciting picture of the future around which everyone can rally, as well as to help set the general direction of a company. But being too high level, goals by themselves don’t give you much guidance on how to get things done and what key decisions must be made and prioritized. “[G]oals tell you very little about the fundamental choices you should make around creating customer and company value. Such choices are the very essence of your strategy.”

Most winning strategies start with an idea for an innovative new product, service or business model, followed by a plan to bring the idea to market. Only then should come a big, bold goal, as a way “to crystalize an ambition, motivate the troops, and excite investors. Unfortunately, strategic planning in most companies gets this sequence exactly reversed - and when that happens, bad strategies result.”

I totally agree with Favaro, based on my personal experiences leading emerging technology initiatives at IBM, including parallel supercomputing, the Internet, and Linux. Let me share some of what I learned when working on IBM’s Internet strategy.

In the Fall of 1995, then Chairman and CEO Lou Gerstner made the decision to embrace the Internet across the whole company, and asked me to organize a cross-IBM group to define and lead this new strategy. Netscape, the best known Internet startup at the time, had gone public in August of 1995 with a very successful IPO. A lot was starting to happen around the Internet, but it was not clear where things were heading, and in particular what the implications would be for the world of business.

Our job was to figure out the business value around the Internet, what we should advise our clients to do and what new products and services we needed to develop. Equally important, we had to come up with an Internet business model for IBM that made sense and was financially sound. We worked through 1996 to figure out what our strategy should be, and towards the end of the year the picture began to emerge.

What made the Internet job different from just about any other I’d previously had is that there was no one technology or product you could work on in the labs that would make you a success in the marketplace. This time around, the strategy had to come from the marketplace itself, not the labs. And so it did. Watching what our customers were doing, it became clear that the Internet was definitely going to have a profound impact on just about all aspects of business.

The universal reach and connectivity of the Internet were enabling access to information and transactions of all sorts for anyone with a browser and an Internet connection. Any business, by integrating its existing databases and applications with a web front end, could now reach its customers, employees, suppliers and partners at any time of the day or night, no matter where they were. And they could start very simply by web-enabling specific applications and databases.

Thus was born what became our successful e-business strategy. As I think back, the aha moment came to us when we succinctly captured our strategy with the simple phrase e-business = Web + IT. Companies were now able to engage in their core activities in a much more productive and efficient way. But, unlike the prevailing hype, we also believed that the brand reputation, installed customer base and IT infrastructures that companies had built over the years would be even more valuable assets when combined with the new Web capabilities. All companies could benefit, - whether large or small, new or mature.

The e-business strategy was very well received in the marketplace. In the Fall of 1997 we launched a creative marketing campaign that successfully established the e-business brand by consistently telling our e-business stories over a variety of communication channels, including press interviews, conferences around the world, IT and financial analyst meetings, Web articles, TV and print ads, and lots of client engagements. These communication and marketing efforts helped to explain what it meant to do business in the Internet age and closely associated IBM with the Internet. They also helped us revitalize IBM’s brand.

“For nearly 50 years, strategy has been a business of promoting universal prescriptions based on what appears to explain the success of a few revered companies,” wrote Favaro. “At first glance, this practice makes perfect sense. Why not draw lessons from those who seem to have figured it all out?” However, such a practice has limited value. “By their nature, big strategy concepts are not particular to any one company. That’s problematic, because when they become wildly popular and widely adopted, no one gains advantage from them. In fact, the me-too pursuit of strategy concepts stymies their supposed benefits…”

“Great strategies answer five critical questions (the strategic five) in ways that are unique to your company: (1) What business or businesses should your company be in? (2) How should you add value to your businesses? (3) Who should be the target customers for your businesses? (4) What should be your value propositions to those target customers? (5) What capabilities should differentiate your ability to add value to your businesses and deliver their value propositions?”

By the end of 1995, for example, it was pretty clear that the Internet had evolved from a network primarily used by universities, research labs and geeks in general, to a major new technology for just about all aspects of IT and business. So for IBM or any other company to embrace the Internet was important as a high level strategic direction, but without lots of additional details it did not constitute any kind of implementable strategy. By the time a strategic concept becomes popular, - e.g., total quality management, business process reengineering, core competencies, co-opetition, time-to-market, user experience, - just about everyone has embraced it. For a company to extract competitive advantage from the concept it must define a much more detailed and differentiated strategy.

“Nevertheless, the business of strategy will continue to churn out the next big thing, because strategy concepts provide a modicum of comfort in an uncertain, complex world. But the most capable strategists are never swept up in the hype. They understand the limitations of such concepts and resist the allure generated by their popularity…”

“To exploit strategy concepts without allowing them to take over, consider each one that comes along to be an opportunity to challenge and improve the strategy you already have. If you don’t already have a strategy to which you are truly committed, you are particularly vulnerable to being captured by the latest strategy fashion. If you do, ask how a new concept can enhance it. But never let that concept become a shortcut: a way to skip the hard work of identifying the big idea that will power your company’s strategy; of formulating a unique, specific, and complete set of answers to the strategic five; and of owning your strategy through thick and thin.”

Blockchains: the Promise of More Frictionless, Trusted Economies (October 11, 2016)

Why do firms exist? Ronald Coase, - the eminent British economist and University of Chicago professor, - addressed this question in The Nature of the Firm, - a seminal paper published in 1937 which, along with other major achievements, earned him the 1991 Nobel Prize in economics.

Professor Coase explained that, in principle, a firm should be able to find the cheapest, most productive, highest quality goods and services by contracting them out in an efficient, open marketplace. However, markets are not perfectly fluid. Transaction costs are a kind of friction incurred in obtaining goods and services outside the firm, such as searching for the right supply chain partners, establishing a trusted relationship, negotiating a contract, coordinating the work, managing intellectual property and so on. Firms came into being to make it easier and less costly to get work done.

A recent IBM report, - Fast forward: Rethinking enterprises, ecosystems and economies with blockchains, - harks back to Coase’s paper to analyze the potential value of blockchains. The report notes that while transaction costs are lower within firms, “in recent years as enterprises have scaled, the added complexity of operations has grown exponentially while revenue growth has remained linear. The result? At a certain point, organizations are faced with diminishing returns. Blockchains have the potential to eradicate the cost of complexity and ultimately redefine the traditional boundaries of an organization.”

Over the past two decades, the Internet has significantly reduced a number of these business frictions, - lowering external transaction costs; giving rise to powerful, global ecosystems and platforms; and thus enabling companies to focus their energies and investments on their true points of differentiation. Companies have been carefully analyzing their key strengths and weaknesses process by process and function by function, so that for each, they can better decide whether to build, acquire, or rely on a supply chain partner.

At the same time, Internet threats are growing. Large-scale fraud, data breaches, and identity thefts are becoming more common, and companies are finding that cyber-attacks are costly to prevent and recover from.

Why is the Internet so vulnerable to these threats? Why wasn’t stronger security designed into the original Internet infrastructure? MIT research scientist and Internet pioneer David Clark addressed this question in a recent article about the early design choices that have led to today’s Internet.

The Internet is basically a general purpose data network that supports a remarkable variety of applications. Being general purpose was a major design choice, one that has enabled the Internet to become one of the most prolific innovation platforms the world has ever seen, if not the most prolific. But while the Internet has enabled many innovations, its extensive use has led to new, serious risks in business and economic activity. Foremost among them is security.

A major reason for the Internet’s ability to keep growing and adapting to widely different applications is that it’s stuck to its basic data-transport mission, i.e., just moving bits around. The Internet has no idea what the bits mean or what they’re trying to accomplish. That’s all the responsibility of the applications running on top of it.

Consequently, there’s no one overall owner responsible for security. Responsibility for security is divided among several actors, making it significantly harder to achieve. As Clark points out, “the design decisions that shaped the Internet as we know it likely did not optimize secure and trustworthy operation.” Hopefully, that’s what blockchains will now help us achieve.

As firms now rely on ecosystem partners for many of the functions once done in-house, one of their major organizational challenges is how to best manage their increasingly complex operations across a network of interconnected companies. Distributed operations can lead to increased risks, unanticipated consequences and new kinds of serious frictions.

“The long history of human progress has been a steady march against friction,” wrote the IBM report in its opening sentence. “From the introduction of money to replace barter and the gradual replacement of wax seals by digital signatures, we have seen steady progress facilitated by digital innovations. The internet primed friction for a free-fall. Since then, some frictions fell while others rose.”

Three types of frictions predominate today:

Information frictions: Participants in a transaction don’t have access to the same information; the required information is not easily accessible; and security and privacy risks keep rising, - e.g., hacking, cybercrime, identity theft.

Interaction frictions: Intermediaries are needed to help deal with growing scale and complexity; transactions take longer due to arcane global processes; and trusted marketplaces are lacking in many economies around the world.

Innovation frictions: These include legacy systems, bureaucratic processes and institutional inertia; restrictive regulations that stifle innovation and change; and growing uncertainties and threats that make it harder to move forward.

Blockchains promise to significantly reduce these frictions by bringing one of the oldest and most important concepts in financial transactions and other mission-critical applications, the ledger, to the Internet age. Ledgers constitute a permanent record of all the economic transactions an institution handles, whether it’s a bank managing deposits, loans and payments; a brokerage house keeping track of stocks and bonds; or a government office recording births and deaths, the ownership and sale of land and houses, or legal identity documents like passports and driver licenses.

As the IBM report notes, “Today, transactions are recorded in multiple ledgers. Each one captures at best a moment in time and reflects the information held by a single party: Bank X purchased or sold a mortgage, for example. They don’t record what happens next, what came before, or the role of others - partners, suppliers, consumers - in the transaction. Moreover, they’re prone to human error and vulnerable to tampering. By contrast, distributed ledgers can be shared and updated in near real-time across a group of participants.”

The report lists five major distributed ledger or blockchain attributes that will help reduce these frictions (a toy hash-chaining sketch follows the list):

Distributed and sustainable. “The ledger is shared, updated with every transaction and selectively replicated among participants in near real-time. Privacy is maintained via cryptographic techniques and/or data partitioning techniques to give participants selective visibility into the ledger; both transactions and the identity of transacting parties can be masked. Because it is not owned or controlled by any single organization, the blockchain platform’s continued existence isn’t dependent on any individual entity.”

Secure and indelible. “Cryptography authenticates and verifies transactions and allows participants to see only the parts of the ledger that are relevant to them. Once conditions are agreed to, participants can’t tamper with a record of the transaction. Errors can only be reversed with new transactions.”

Transparent and auditable. “Because participants in a transaction have access to the same records, they can validate transactions, and verify identities or ownership without the need for third-party intermediaries. Transactions are time-stamped and can be verified in near real-time.”

Consensus-based and transactional. “All relevant network participants must agree that a transaction is valid. This is achieved by using consensus algorithms. Blockchains establish the conditions under which a transaction or asset exchange can occur.”

Orchestrated and flexible. “Because business rules and smart contracts that execute based on one or more conditions can be built into the platform, blockchain business networks can evolve as they mature to support end-to-end business processes and a wide range of activities.”
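The “secure and indelible” and “transparent and auditable” attributes rest on a simple idea that can be sketched in a few lines (a toy illustration of hash chaining, not the IBM report’s or any product’s actual design): each entry carries the hash of the entry before it, so altering any past record breaks every later link and is immediately detectable by anyone holding a copy of the ledger:

```python
# Toy append-only ledger: each entry stores the hash of the previous entry,
# so tampering with any historical record invalidates the rest of the chain.
import hashlib
import json

def entry_hash(entry):
    # Hash a canonical JSON form of the entry (deterministic key order).
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(ledger, transaction):
    prev = ledger[-1] if ledger else None
    ledger.append({"prev_hash": entry_hash(prev) if prev else None, "tx": transaction})

def verify(ledger):
    # Every entry must reference the hash of the entry before it.
    return all(
        ledger[i]["prev_hash"] == entry_hash(ledger[i - 1])
        for i in range(1, len(ledger))
    )

ledger = []
append(ledger, {"from": "Bank X", "to": "Bank Y", "amount": 100})
append(ledger, {"from": "Bank Y", "to": "Bank Z", "amount": 40})
print(verify(ledger))                    # True: the chain is intact

ledger[0]["tx"]["amount"] = 1_000_000    # attempt to rewrite history
print(verify(ledger))                    # False: the tampering is detectable
```

A real distributed ledger adds the replication, consensus and smart-contract machinery described above, but its tamper-evidence comes from this same chaining of hashes.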

Most everyone agrees that the Internet has been transforming economies, societies and our personal lives. What’s the long term promise of blockchains?

“Over the years, businesses have overcome multiple sources of friction. Institutions and instruments of trust emerged to reduce risk in business transactions. Technology innovations helped overcome inefficiencies. Still, many business transactions remain inefficient, expensive and vulnerable. Blockchain technology… has the potential to obviate intractable inhibitors across industries.”

“As frictions fall, a new science of organization emerges, and the way we structure industries and enterprises will take novel shape. With transparency the norm, a robust foundation for trust can become the springboard for further ecosystem evolution. Participants and assets once shut out of markets can join in, unleashing an accelerated flow of capital and unprecedented opportunities to create wealth.”

Competing Against “Digital Invaders” (October 4, 2016)

Last year, IBM’s Institute for Business Value conducted a C-suite study aimed at identifying the key disruptive trends that will likely impact companies around the world over the next three to five years, as well as what their senior executives are doing to better prepare their organizations for the expected disruptions. The study surveyed over 5,200 CEOs, CFOs, CIOs, CMOs and other C-suite executives across 21 industries in over 70 countries, - most of them in face-to-face interviews. After analyzing the survey data, - including the use of Watson Analytics to extract inferences from open-ended responses, - IBM published its findings in Redefining Boundaries: Insights from the Global C-suite Study.

To better understand the traits of the most successful enterprises, the IBM study also asked CxOs to rank their companies based on their innovation reputation and their financial performance over the previous three years. After analyzing their responses, it identified 5% of companies as Torchbearers, the name given to those companies enjoying both a strong innovation reputation and an excellent financial track record. At the other end of the spectrum, 34% of enterprises were identified as Market Followers or laggards in both innovation and financial performance.

The report organized its findings and recommendations into three main areas: Prepare for digital invaders; Create a panoramic perspective; and Be first, be best, or be nowhere. Let me briefly discuss each.

Prepare for digital invaders

Industry convergence was found to be the overriding concern of most senior executives. As boundaries continue to erode, previously separate industries are being brought closer to each other. “A few years ago, CxOs could see the competition coming…” They could generally fend off competitive threats “by improving or expanding the range of products and services you offered, or getting to market more efficiently and imaginatively. Today, the competition’s often invisible until it’s too late.”

Competition can now come from different directions, including newly formed or adjacent industries and so-called digital invaders with totally different business models. Digital invaders can be giants with highly sophisticated platforms, - like Google, Amazon and Alibaba, - or focused, agile startups unencumbered by legacy infrastructures. These invaders typically target profitable areas of the value chain, often looking to seize control of the customer relationship and relegate incumbents to less profitable back-end support services.

This competitive shift is caused primarily by rapidly changing technologies and markets. CxOs recognize that greater efficiency alone is not sufficient to fend off digital invaders, - much bolder approaches are required. Most anticipate that they must transform the way they engage with customers, including more digital and personalized interactions. More than half agree that innovation increasingly comes through collaborations with external ecosystem partners.

While nearly two thirds of CxOs plan to enter new markets, they will largely do so by sticking to their areas of experience and expanding into new demographic segments and geographies. But, the best performing companies, the so-called Torchbearers, are bolder and more likely to enter both adjacent and totally new markets.

The report offers three recommendations to help fend off digital invaders: listen to the people closest to your customers and rely on them for all but the most important decisions; form new partnerships and share key resources with allies; and develop online platforms and forums where buyers and sellers can trade and share information.

Create a panoramic perspective

“The more nebulous your enemies and the faster the pace of change, the wider - and further - you need to look. Yet it’s extremely difficult to glimpse beyond the immediate future… It’s also, arguably, becoming even more unpredictable, as knowledge becomes increasingly specialized and fragmented.”

CxOs generally agree that cloud computing, mobile solutions and the Internet of Things (IoT) will be particularly important technologies over the next few years. Cognitive technologies hold great promise but are a bit further over the horizon. Overall, “it’s the confluence of different technologies that holds the greatest promise,” such as the combination of AI-based smart products and advances in bioengineering to improve healthcare, or new cloud-based business models combining mobility and data sharing.

There are also serious downsides. “In 2013, when we conducted our previous C-suite study, security concerns made just a blip on their radar screens. Today, the majority of CxOs, irrespective of role, think IT security is the top risk.” As more things are connected, security is rising to the top of the agenda. Another major issue, - that surprisingly few CxOs brought up, - is that technologies like cloud computing are leading to even more digital disruption. “It’s not just helping large enterprises become more efficient; it’s also opening the doors to the ankle-biters nipping at their heels.”

The study found that CxOs rely primarily on traditional techniques to help identify new trends, such as brainstorming and predictive analytics. So far, less than 15% use advanced cognitive computing methods. It also found that CxOs draw insights from fairly limited external resources. Around half rely on thought leaders, customer feedback, market research firms and competitive intelligence. But a smaller number look to companies in adjacent industries who might become potential competitors or to social media to better understand market trends.

To help create a panoramic perspective, the report recommends: using advanced analytics methods like cognitive computing to extract insights about future trends from the vast amounts of data now available; setting up a small forecasting team equipped with the right skills and technologies; and leveraging the contacts, skills and assets of ecosystem partners.

Be first, be best, or be nowhere

“CxOs see technology primarily as a means of adding value rather than subtracting costs.” “Develop better products/services” was brought up by over 80% of CxOs, followed closely by “Develop stronger customer relationships” and “Improve effectiveness of marketing and sales.” Many are experimenting with alternative business models, primarily open initiatives and platform models, which are particularly conducive to collaborating within and across industries.

As would be expected, the more aggressive Torchbearers lead in using open and platform business models to help them reach markets faster. “The speed at which technology evolves is accelerating… The CxOs running our Torchbearers are clearly aware of the implications. Almost all of them know coming to market second or third is a luxury they can’t afford. And thanks to their panoramic perspective, they’re more comfortable than Market Followers about taking the risks associated with being a pioneer… Torchbearers are focusing on satisfying the desires of their most discerning customers - those who demand the very best. And they’re seeking alliances to help them fulfill the expectations of such customers.”

What has enabled the Torchbearers in the study to forge ahead of the rest? The report concludes by summarizing their key attributes:

Scope: “Torchbearers are more forward-looking and bolder about exploring the opportunities in related industries… They also understand that they compete as part of a bigger ecosystem of interdependent entities, which greatly enhances their potential impact on the market.”

Scale: “Torchbearers are braver about investing in emerging technologies with high risks and returns, and more aware of the need to preserve their competitive advantage and scale their expertise.”

Speed: “Torchbearers are more agile, more willing to experiment and more confident about taking the lead. Once they’ve developed a new product, service or business model, they race for the finishing line, recognizing the pace at which technology is evolving and the importance of dominating the market before their competitors do.”

Why Are We Stuck in a World of Slow Economic Growth? Nobody Seems to Know for Sure (September 26, 2016)

Over the past few centuries, the natural sciences, - e.g., physics, chemistry, biology, - have developed a variety of principles and models. These have enabled them to analyze and predict the behavior of our highly complex physical systems under widely different conditions. But the situation is quite different in the social sciences, - e.g., economics, sociology, political science. It’s much more difficult to make accurate predictions in social systems, - whose key components are people, organizations and their intricate interactions, - because of their highly fluctuating behaviors.

I was reminded of this major distinction between physical and social systems by a recent issue of Foreign Affairs, which focused on How to Survive Slow Growth. “[G]rowth has ground to a halt almost everywhere, and economists, investors, and ordinary citizens are starting to confront a grim new reality: the world is stuck in the slow lane and nobody seems to know what to do about it,” notes its introductory article. Several prominent authors wrote about various aspects of the economic slowdown. But in the end, they didn’t arrive at consensus reasons for the slow growth or how long it will likely last, - years or decades.

In the lead article, - The Age of Secular Stagnation: What It Is and What to Do About It, - Harvard economics professor Larry Summers wrote: “As surprising as the recent financial crisis and recession were, the behavior of the world’s industrialized economies and financial markets during the recovery has been even more so.” Back in 2009, almost no one would have predicted that we would still be in a period of slow economic growth, that inflation would be around one percent and interest rates would hover around zero. But, “nearly seven years into the U.S. recovery, markets are not expecting normal conditions to return anytime soon.”

Secular stagnation, says Summers, is the reason behind this unusual situation. “The economies of the industrial world, in this view, suffer from an imbalance resulting from an increasing propensity to save and a decreasing propensity to invest. The result is that excessive saving acts as a drag on demand, reducing growth and inflation, and the imbalance between savings and investment pulls down real interest rates.”

The term secular stagnation was first coined by economist Alvin Hansen in 1938, who “thought a slowing of both population growth and technological progress would reduce opportunities for investment. Savings would then pile up unused, he reasoned, and growth would slump unless governments borrowed and spent to prop up demand.” Hansen’s forebodings were proved wrong by the post-war economic boom.

Is Summers right this time around, or will he be proved equally wrong? “Not all economists are sold on the secular stagnation hypothesis,” he noted in the Foreign Affairs article before presenting some of the alternative explanations.

Harvard economists Carmen Reinhart and Kenneth Rogoff argued that our prolonged slow recovery is due to the excessive debt incurred in the build-up to the 2007-2008 financial crisis. The subsequent deleveraging, as the debts were paid off, led to low levels of consumer spending and business investments. A related view comes from Nobel Prize economist and NY Times op-ed columnist Paul Krugman, who believes that we’re stuck in a liquidity trap caused by individuals and companies hoarding cash because they expect deflation and/or insufficient demand to justify investment. Given such weak private sector demand, even zero short-term interest rates have failed to stimulate the economy.

Another explanation featured in the issue focuses on demographics: “Between 1960 and 2005, the global labor force grew at an average of 1.8 percent per year, but since 2005, the rate has downshifted to just 1.1 percent, and it will likely slip further in the coming decades as fertility rates continue to decline in most parts of the world. The labor force is still growing rapidly in Nigeria, the Philippines, and a few other countries. But it is growing very slowly in the United States - at 0.5 percent per year over the past decade, compared with 1.7 percent from 1960 to 2005 - and is already shrinking in some countries, such as China and Germany. The implications for the world economy are clear: a one-percentage-point decline in the population growth rate will eventually reduce the economic growth rate by roughly a percentage point… Ultimately, then, the world should brace itself for slower growth and fewer economic standouts.”

Others, - most prominently Northwestern University economist Robert Gordon, - have argued that slow growth is the result of a fundamental decline in innovation and productivity. According to Gordon, the rapid growth and rising per-capita incomes we experienced from 1870 to 1970 were a unique episode in human history. Innovation is now stalled and there may well be little growth for the rest of this century.

Earlier this year Gordon published The Rise and Fall of American Growth, - which was reviewed in the Foreign Affairs issue by George Mason University economist Tyler Cowen. In Is Innovation Over? The Case Against Pessimism, Cowen writes that “predicting future productivity rates is always difficult; at any moment, new technologies could transform the U.S. economy, upending old forecasts. Even scholars as accomplished as Gordon have limited foresight… Ultimately, Gordon’s argument for why productivity won’t grow quickly in the future is simply that he can’t think of what might create those gains. Yet it seems obvious that no single individual, not even the most talented entrepreneur, can predict much of the future in this way.”

We’re in a Low-Growth World. How Did We Get Here?, a NY Times article by journalist and author Neil Irwin, offers a more succinct, easier-to-read take on the slow growth problem, but no overriding conclusions on why it’s happening or what to do about it. “Economic growth in advanced nations has been weaker for longer than it has been in the lifetime of most people on earth…” said Irwin. “It increasingly looks as if something fundamental is broken in the global growth machine - and that the usual menu of policies, like interest rate cuts and modest fiscal stimulus, aren’t up to the task of fixing it…”

“Weak productivity and fewer workers are hits to the supply side of the economy. But there is evidence that a shortage of demand is a major part of the problem, too… The distinction is important if there is to be any hope of solving the low-growth problem. If the issue is a shortage of demand, then some more stimulus should help. If it is entirely on the supply side, then government stimulus is not much use, and policy makers should focus on trying to make companies more innovative and coax people back into the work force. But what if it’s both?…”

“Economic history is full of unpredictable fits and starts. When Bill Clinton was elected in 1992, the internet, a defining feature of his presidency, was rarely mentioned, and Japan seemed to be emerging as the pre-eminent economic rival of the United States. In other words, there’s a lot we don’t know about the economic future. What we do know is that if something doesn’t change from the recent trend, the 21st century will be a gloomy one.”

Blockchain Can Reshape Financial Services… But it Will Take Significant Time and Investment (September 19, 2016)

“Distributed ledger technology (DLT), more commonly called blockchain, has captured the imaginations, and wallets, of the financial services ecosystem,” notes the WEF’s The future of financial infrastructure report, citing a few statistics as evidence: over 90 central banks are engaged in DLT discussions around the world; more than 24 countries have already launched blockchain-based initiatives; 80% of banks predict that they’ll launch blockchain projects by 2017; over 90 financial and technology companies have already joined blockchain consortia; more than 2,500 patents have been filed over the past 3 years; and $1.4 billion has been invested in blockchain-based startups over the same time span.

But, significant hurdles must be overcome before the advent of large-scale blockchain infrastructures. The WEF correctly warns that this will take time. Not only are there major technology and standards issues to be worked out, but the industry will have to collaborate with governments around the world to develop the appropriate legal frameworks and regulatory environments.

The future of financial infrastructure report is based on a year-long study involving industry leaders, regulators and other subject matter experts. While it’s quite bullish about the potential benefits of blockchain technologies, it also acknowledges the serious challenges involved in transforming the global financial infrastructure. This balanced view is reflected in its six key findings:

DLT has great potential to drive simplicity and efficiency through the establishment of new financial services infrastructure and processes;

DLT is not a panacea; instead it should be viewed as one of many technologies that will form the foundation of next-generation financial services infrastructure;

Applications of DLT will differ by use case, each leveraging the technology in different ways for a diverse range of benefits;

Digital Identity is a critical enabler to broaden applications to new verticals;

The most impactful DLT applications will require deep collaboration between incumbents, innovators and regulators, adding complexity and delaying implementation;

New financial services infrastructure built on DLT will redraw processes and call into question orthodoxies that are foundational to today’s business models.

Let me discuss a few of these findings.

DLT is not a panacea

A recent Gartner report noted that blockchain technologies are hitting the peak of the hype cycle, when the excitement and publicity about a potentially disruptive innovation often leads to a peak of inflated expectations, before falling into the trough of disillusionment when the fledgling technology fails to deliver.

“Distributed ledger technology is not a panacea; instead it should be viewed as one of many technologies that will form the foundation of next-generation financial services infrastructure,” wrote the WEF in a very timely message. “Over the last 50 years, technology innovation has been fundamental to financial services industry transformation. Today, multiple technologies poised to drive the next wave of financial services innovation are converging in maturity.” DLT is one such technology. Others include biometrics, cloud computing, cognitive computing, machine learning and predictive analytics.

A FinTech report published earlier this year by Citigroup noted that investments in financial technologies have increased by a factor of 10 over the past 5 years. The majority of these investments have been focused on mobile-based applications, - e.g., creating new consumer services, improving the user experience at the point of sale, - while continuing to rely on the existing, back-end financial infrastructures. Not surprisingly given their complexity, change comes much slower to global financial infrastructures.

Sometimes, the emergence of an innovative, disruptive technology can help propel change forward. The Internet proved to be such a catalyst in the transformation of global supply chain ecosystems. Over time, blockchain could well become the needed catalyst for the evolution of global financial infrastructures.

Applications of DLT will differ by use case

The majority of the 130-page report consists of a deep-dive analysis of how DLT might apply to several different use cases. These include global payments, insurance claims processing, syndicated loans, capital raising, and investment management compliance.

DLT brings a diverse range of benefits to these various applications, such as operational simplification based on real-time multi-party tracking; faster and more accurate regulatory compliance; near real-time settlement between financial institutions; and liquidity and capital improvements.

While each use case is different, the deep-dives revealed a number of characteristics that might help identify other high-potential applications of DLT to financial services:

A shared repository of information is used by multiple internal and external parties;

The report suggests that the evolution towards a DLT-based financial infrastructure will take significant time and investment, requiring changes to existing regulations and standards and the creation of new legal and liability frameworks.

Transforming this highly complex global ecosystem is very difficult. It requires the close collaboration of its various stakeholders, including existing financial institutions, FinTech startups, merchants of all sizes, government regulators in just about every country, and huge numbers of individuals around the world. All these stakeholders must somehow be incented to collaborate in developing and embracing new infrastructure innovations. Getting them to work together and pull in the same direction is a major undertaking, given their diverging, competing interests. Overcoming these challenges will add complexity and delay large-scale, multi-party DLT implementations.

Despite these major challenges, the WEF remains optimistic. “Our findings suggest this technology has the potential to live-up to the hype and reshape financial services, but requires careful collaboration with other emerging technologies, regulators, incumbents and additional stakeholders to be successful.”

“Blockchain will become [the] beating heart of the global financial system…” it added in a related media statement. “Blockchain could thus redraw the structure of financial institutions and the back-end of services as we know them today…” allowing consumers “to pay less for all kinds of financial activity, from international payments to the trading of stocks and bonds. It could also give regulators new capabilities, allowing them to stop regulatory violations before they start and to watch more effectively for warning signs of financial crises…”

“Similar to any technological innovation, blockchain comes with a set of risks that must be considered… These include errors in the design, malicious autonomous behaviour as a consequence of human decisions, and potential gaps in security across all inputs and outputs. Challenges such as these must be overcome if the economic and social benefits of blockchain are to be realized.”

STEM Literacy and Jobs (September 13, 2016)

In the early days of the Industrial Revolution, it’s been estimated that around 12% of the world's population was literate. Literacy rates increased throughout the 19th century, as people started moving from the countryside to towns and cities for the job opportunities opening up in the newly industrialized societies. Many of these new jobs, especially the higher paying ones, required the ability to read and write. With the rise of universal education in almost all countries around the world, literacy rates steadily increased in the 20th century, - from roughly 20% in 1900 to around 80% in 2000. In the 21st century, we not only have the challenge of eliminating illiteracy altogether, but we must now also focus on STEM (Science, Technology, Engineering, Math) literacy.

Most everyone would agree that there is a big difference between being proficient at reading and writing and being a playwright, a literary critic, a book editor or a journalist. The skill requirements are radically different. But, students don’t often appreciate the difference between achieving a modicum of STEM literacy and pursuing a STEM profession. Many avoid taking STEM courses because they have no intention of majoring in a STEM discipline. While everyone agrees that basic literacy is critical for just about any job, we don’t quite have the same level of appreciation that being STEM literate is increasingly important to qualify for a wide variety of jobs in our information-based knowledge economy.

Over the past few years, an increasing number of studies have argued that a good education for students majoring in STEM disciplines should include the so-called softer competencies more associated with the liberal arts, in addition to harder, more technical skills. For example, a 2006 report by the National Academy of Engineering on the need to reform engineering education noted that “New graduates were technically well prepared but lacked the professional skills for success in a competitive, innovative, global marketplace. Employers complained that new hires had poor communication and teamwork skills and did not appreciate the social and nontechnical influences on engineering solutions and quality processes.”

More recently, USC’s Annenberg School of Communications and Journalism conducted a study to better understand the key competencies companies were looking for, and whether their talent requirements were being adequately addressed by universities. Future leaders, the study found, must be strong in quantitative, technical and business skills. But to advance in their careers, they also need to be good strategic thinkers and must have strong social and communications skills. Graduates in STEM disciplines will find that complementing their specialized technical skills with broader and more diverse liberal arts competencies will significantly enhance their employability in the near term, and their capacity for lifelong learning over the course of their careers.

But, when it comes to STEM, the discussions have mostly focused on STEM jobs rather than STEM literacy, and in particular, on whether we have a STEM crisis or a STEM surplus, - a debate I recently wrote about. A number of articles have pointed out that, as is often the case with such complex questions, both sides are right. It all depends. STEM includes a variety of disciplines, degree levels and employment sectors. While some occupations do indeed have a shortage of qualified talent, others have a surplus.

“Every year U.S. schools grant more STEM degrees than there are available jobs,” wrote Robert Charette in his 2013 article The STEM Crisis is a Myth. It’s thus hard to make the case that there’s a general STEM labor shortage other than spot shortages for certain specialized skills. However, “there is indeed a shortage - a STEM knowledge shortage,” he later added.

“To fill that shortage, you don’t necessarily need a college or university degree in a STEM discipline, but you do need to learn those subjects, and learn them well, from childhood until you head off to college or get a job. Improving everyone’s STEM skills would clearly be good for the workforce and for people’s employment prospects, for public policy debates, and for everyday tasks like balancing checkbooks and calculating risks. And, of course, when science, math, and engineering are taught well, they engage students’ intellectual curiosity about the world and how it works.”

The need for widespread STEM literacy was nicely articulated by MIT Professor Richard Larson in a 2012 article, - STEM is for Everyone. “A person has STEM literacy,” explained Larson, “if she can understand the world around her in a logical way guided by the principles of scientific thought. A STEM-literate person can think for herself. She asks critical questions. She can form hypotheses and seek data to confirm or deny them. She sees the beauty and complexity in nature and seeks to understand. She sees the modern world that mankind has created and hopes to use her STEM-related skills and knowledge to improve it.”

If STEM literacy is so important for improving their employment prospects, why don’t more young people take STEM courses? In his article, Larson discusses a few of the misperceptions that shed light on this question:

Engineering is all about hardware, gadgets and circuits. Engineering is not just about techno-gadget creation, if it ever was. Bioengineering, for example, is a growing discipline that deals with the application of engineering to problems in medicine and biology, including the design and development of new diagnostic and therapeutic devices, synthetic biomaterials, artificial tissues and organs, and drug delivery systems. Systems engineering is an interdisciplinary field that applies engineering processes to the design and management of complex systems in a variety of areas including healthcare, education, energy and finance. And, design thinking is increasingly being applied beyond physical products to customer experiences, innovation, business strategy and complex problem solving.

The world already has too many scientists and engineers. The reality is that a relatively small percentage of STEM students go on to STEM careers. “Most STEM-literate students follow more regular non-technical careers but with a rich STEM knowledge that can give them a competitive advantage in this increasingly complex highly connected world. Becoming STEM literate can help any career path.”

“I do not plan to be an engineer or scientist, so STEM is not for me.” Unfortunately, many STEM professionals and educators contribute to this misperception. They don’t do a particularly good job explaining to young people, their parents and the world at large what we mean by STEM literacy. They have not adequately made the case why quantitative reasoning, familiarity with sophisticated machines, and dealing with complex systems, problems and decisions are important job skills in our fast changing, complex world.

These skills are also very important in our daily life, which is now full of numbers and statistics. For example, understanding the interest rates we pay for our credit cards, car loan or mortgage requires a fair degree of quantitative reasoning. If we don’t do such reasoning by ourselves, we are at the mercy of others doing it for us who may not always have our best interests in mind.
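
To make that point concrete, here is a minimal sketch, in Python and with hypothetical loan figures, of the kind of quantitative reasoning Larson has in mind: the standard fixed-rate amortization formula, which makes visible how much an interest rate actually costs over the life of a loan.

```python
# A minimal illustration with hypothetical numbers: the standard amortization
# formula for a fully amortizing fixed-rate loan.

def monthly_payment(principal, annual_rate, years):
    """Fixed monthly payment for a fully amortizing loan."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    if r == 0:
        return principal / n
    return principal * r / (1 - (1 + r) ** -n)

# Example: a $20,000 car loan over 5 years at 6% vs. the same loan at 12%.
for rate in (0.06, 0.12):
    payment = monthly_payment(20_000, rate, 5)
    total_interest = payment * 60 - 20_000
    print(f"rate {rate:.0%}: payment ${payment:,.2f}/mo, "
          f"total interest ${total_interest:,.2f}")
```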

“Many children born today are likely to live to be 100 and to have not just one distinct career but two or three by the time they retire at 80,” wrote Charette in his concluding paragraph.“Rather than spending our scarce resources on ending a mythical STEM shortage, we should figure out how to make all children literate in the sciences, technology, and the arts to give them the best foundation to pursue a career and then transition to new ones.”

Automation Technologies and the Future of Worktag:typepad.com,2003:post-6a00d8341f443c53ef01b7c88c9b93970b2016-09-06T06:43:59-04:002016-09-06T06:40:04-04:00Last year, McKinsey launched a multi-year study to explore the potential impact of automation technologies on jobs, organizations and the future of work. “Can we look forward to vast improvements in productivity, freedom from boring work, and improved quality of...IWB

Last year, McKinsey launched a multi-year study to explore the potential impact of automation technologies on jobs, organizations and the future of work.“Can we look forward to vast improvements in productivity, freedom from boring work, and improved quality of life?,” its initial article on the study asked, or “Should we fear threats to jobs, disruptions to organizations, and strains on the social fabric?”

Most jobs involve a number of different tasks or activities. Some of these activities are more amenable to automation than others. But just because some of the activities have been automated, does not imply that the whole job has disappeared. To the contrary, automating parts of a job will often increase the productivity and quality of workers by complementing their skills with machines and computers, as well as by enabling them to focus on those aspects of the job that most need their attention.

Given that few jobs or occupations will be entirely automated in the near-mid future, the study focused instead on the kinds of activities within jobs that are more likely to be automated, as well as how those jobs and business processes will then be redefined.It did so by analyzing the extensive data in O*NET, a web-based application sponsored by the US Department of Labor which includes the most comprehensive information on US occupations.

The study analyzed around 2,000 work activities across 800 different occupations, and grouped them into 18 different capabilities, - 3 of them social in nature; 10 cognitive; and 5 physical, - and then assessed the automation potential of each.It found that 45% of work activities could be automated using existing, state-of-the-art technologies.

In general, we assume that automation has been most successful when applied to routine tasks, that is, tasks or processes that follow precise, well understood procedures that can be well described by a set of rules.These include physical activities such as manufacturing and other forms of production, as well as information-based activities like accounting, record keeping, and many kinds of administrative tasks. As a result, mid-skill white-collar and blue-collar jobs, - where these activities predominate, - have experienced the biggest declines in employment and earnings.

But, AI, robotics and other advanced technologies are now challenging these automation assumptions. “It’s no longer the case that only routine, codifiable activities are candidates for automation and that activities requiring tacit knowledge or experience that is difficult to translate into task specifications are immune to automation.” Automation is now increasingly applied to activities requiring cognitive, physical, and social capabilities that not long ago were viewed as the exclusive domain of humans. “In many cases, automation technology can already match, or even exceed, the median level of human performance required.”

According to McKinsey, “fewer than 5 percent of occupations can be entirely automated using current technology.However, about 60 percent of occupations could have 30 percent or more of their constituent activities automated.In other words, automation is likely to change the vast majority of occupations - at least to some degree - which will necessitate significant job redefinition and a transformation of business processes…”

“As roles and processes get redefined, the economic benefits of automation will extend far beyond labor savings.Particularly in the highest-paid occupations, machines can augment human capabilities to a high degree, and amplify the value of expertise by increasing an individual’s work capacity and freeing the employee to focus on work of higher value.”

This past July, McKinsey published a second article on its automation study, which examined in more detail the technical feasibility of automating 7 different occupational activities.

A companion interactive website adds the ability to analyze the automation potential of over 800 occupations based on the study’s data sets. For example, the US manufacturing sector comprises 11.8 million full time employees, and has an overall automation potential of 59%, based on the automation potential and time spent in each of its 7 occupational activities; the finance and insurance sector comprises 5.5 million full time employees and has a 43% automation potential; and the administrative and government sector has 17.3 million full time employees and an automation potential of 38%.
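
Those sector figures are essentially time-weighted averages. Here is a minimal sketch of that arithmetic in Python; the activity categories, time shares and automation potentials below are illustrative placeholders rather than McKinsey’s actual data.

```python
# A sketch of the arithmetic behind the sector-level figures: a sector's
# overall automation potential is the time-weighted average of the automation
# potential of each activity category. All numbers below are illustrative
# placeholders, not McKinsey's data.

activities = [
    # (activity category, share of work time, automation potential)
    ("predictable physical work",   0.30, 0.80),
    ("data processing",             0.15, 0.70),
    ("data collection",             0.15, 0.65),
    ("unpredictable physical work", 0.15, 0.25),
    ("stakeholder interactions",    0.10, 0.20),
    ("applying expertise",          0.10, 0.18),
    ("managing others",             0.05, 0.09),
]

sector_potential = sum(share * potential for _, share, potential in activities)
print(f"overall automation potential: {sector_potential:.0%}")  # about 52% here
```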

The article reminds us that while technical feasibility is a necessary precondition for automation, it’s not a sufficient predictor.Other factors also come into play, including the costs of developing and deploying the automation technologies; the availability and costs of labor; additional benefits like higher quality and performance; and regulatory and social considerations.

The most automatable sector in the economy is accommodations and food service, with a 73% potential. Almost half the labor time of its 12 million employees is spent on predictable physical activities with an automation potential of 94%, which includes food preparation, cooking and serving; cleaning food preparation areas; and collecting dirty dishes. A variety of technologies are being tested, including self-service ordering, robotic servers, and even the automation of a number of cooking and food preparation tasks.

“But while the technical potential for automating them might be high, the business case must take into account both the benefits and the costs of automation, as well as the labor-supply dynamics…,” notes the article.“For some of these activities, current wage rates are among the lowest in the United States, reflecting both the skills required and the size of the available labor supply.Since restaurant employees who cook earn an average of about $10 an hour, a business case based solely on reducing labor costs may be unconvincing.”

Not surprisingly, healthcare and education are among the sectors with the lowest potential for automation since they both involve extensive human interactions, expertise and judgement.Overall, healthcare has a 36% automation potential.The potential is lower for health activities requiring medical expertise and direct contact with patients, and higher for activities like food preparation in hospitals and data collection.

At 27%, education has the lowest automation feasibility of all the sectors examined in the McKinsey study. “To be sure, digital technology is transforming the field, as can be seen from the myriad classes and learning vehicles available online. Yet the essence of teaching is deep expertise and complex interactions with other people,” - among the least automatable of tasks.

“Understanding the activities that are most susceptible to automation from a technical perspective could provide a unique opportunity to rethink how workers engage with their jobs and how digital labor platforms can better connect individuals, teams, and projects. It could also inspire top managers to think about how many of their own activities could be better and more efficiently executed by machines, freeing up executive time to focus on the core competencies that no robot or algorithm can replace - as yet.”

Digital Identity - the Key to Privacy and Security in the Digital Worldtag:typepad.com,2003:post-6a00d8341f443c53ef01b7c88b9855970b2016-08-29T08:12:18-04:002016-09-08T09:45:33-04:00From time immemorial, our identity systems have been based on face-to-face interactions and on physical documents and processes. But, the transition to a digital economy requires radically different identity systems. In a world that’s increasingly governed by digital transactions and...IWB

From time immemorial, our identity systems have been based on face-to-face interactions and on physical documents and processes. But, the transition to a digital economy requires radically different identity systems. In a world that’s increasingly governed by digital transactions and data, our existing methods for managing security and privacy are no longer adequate. Data breaches, identity theft and large-scale fraud are becoming more common. In addition, a significant portion of the world’s population lacks the necessary digital credentials to fully participate in the digital economy.

Last month, the World Economic Forum (WEF) published an excellent report, A Blueprint for Digital Identity. The report lays out a framework for the creation of digital identity systems, discusses the benefits that these systems would bring to their various stakeholders, and argues that financial institutions should lead their development. It also includes a primer on identity, which clearly explains what identity is all about.

Identity plays a major role in everyday life. Think about going to an office, getting on a plane, logging in to a website or making an online purchase. While identity is all around us, we generally don’t pay much attention to our identity credentials unless something goes wrong. But, it’s a highly complex and interesting subject, which the report helped me better understand. Let me summarize some of what I learned.

“Why is identity important?,” the primer starts out by asking. “In an increasingly borderless and digital world, privacy and security cannot be ensured through the construction of walls around sensitive information. Identity is the new frontier of privacy and security, where the very nature of entities is what allows them to complete some transactions but be denied from completing others. To understand the importance of identity and the criticality of strong identity protocols that protect against cyber-risk and suit the needs of transacting parties, it is essential to understand what identity is, and its role in enabling transactions.”

What is identity? Whether physical or digital in nature, identity is a collection of individual information or attributes that describe an entity and is used to determine the transactions in which the entity can rightfully participate. Identities can be assigned to three main kinds of entities: individuals, legal entities and assets.

The identity for each of these entities is based on all its individual attributes, which fall into three main categories:

Inherent - “Attributes that are intrinsic to an entity and are not defined by relationships to external entities.” Inherent attributes for individuals include age, height, date of birth, and fingerprints; for a legal entity they include business status, - e.g., C Corporation, S Corporation, LLC, - and industry, - e.g., retail, technology, media; and for an asset they include the nature of the asset and the asset’s issuer.

Accumulated - “Attributes that are gathered or developed over time.These attributes may change multiple times or evolve throughout an entity’s lifespan.”For individuals these include health records, job history, Facebook friends lists, and sports preferences.

Assigned - “Attributes that are attached to the entity, but are not related to its intrinsic nature.These attributes can change and generally are reflective of relationships that the entity holds with other bodies.”For individuals these include e-mail address, login IDs and passwords, telephone number, social security ID, and passport number.

These attributes enable entities to participate in transactions, by proving that they have the specific attributes required for that particular transaction. For example, to buy alcohol, individuals must prove that they’re over the legal drinking age; to vote, they must prove that they’re over the legal voting age, are citizens, and reside in that jurisdiction.
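
As a rough illustration of this attribute-based view, here is a small Python sketch, with hypothetical attributes and rules, in which an entity is simply a collection of attributes and a transaction is permitted only if the required attributes check out.

```python
# A minimal sketch of attribute-based authorization, following the report's
# framing: an entity is a bag of attributes, and a transaction is allowed only
# if the required attributes are satisfied. Names and rules are hypothetical.

from datetime import date

user = {
    # inherent attribute
    "date_of_birth": date(1990, 6, 1),
    # assigned attributes
    "citizenship": "US",
    "residence_jurisdiction": "NY",
}

def age(entity):
    born = entity["date_of_birth"]
    today = date.today()
    return today.year - born.year - ((today.month, today.day) < (born.month, born.day))

transactions = {
    "buy_alcohol": lambda e: age(e) >= 21,
    "vote_in_NY":  lambda e: age(e) >= 18
                   and e["citizenship"] == "US"
                   and e["residence_jurisdiction"] == "NY",
}

for name, allowed in transactions.items():
    print(name, "allowed" if allowed(user) else "denied")
```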

An identity system generally includes four key roles:

Users - “entities for which the system provides identity, for the purpose of allowing them to engage in transactions”

Identity providers - “entities that hold user attributes, attest to their veracity and complete identity transactions on behalf of users”

Relying parties - “entities that accept attestations from identity providers about user identity to allow users to access their services”

Governance body - entity that “provides oversight for the system and owns the operating standards and requirements”

Let’s illustrate how an identity system works using passports as an example. Users are the individuals asked to present their passports as proof of identity to enter a country or open a bank account; the identity provider is the government of the user’s country issuing the passport; the relying party is the entity that accepts the passport based on trusting the issuer and verifying that the passport is valid and the bearer is its true owner; and the governance body includes international agreements among passport agencies and passport standards agreed to by the International Civil Aviation Organization.
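
To make the four roles a bit more tangible, here is a toy Python sketch of that passport flow, with a keyed hash standing in for real passport security features; the issuer names and keys are made up, and a real system would of course be far more involved.

```python
# A toy version of the four roles in the passport example: the identity
# provider issues a signed attestation of a user's attributes, and the relying
# party accepts it only if it trusts the issuer and the signature verifies.
# The HMAC "signature" is a stand-in purely for illustration.

import hmac, hashlib, json

ISSUER_KEY = b"passport-office-secret"                       # held by the identity provider
TRUSTED_ISSUERS = {"Country A passport office": ISSUER_KEY}  # maintained under the governance body

def issue_attestation(issuer, attributes):
    payload = json.dumps({"issuer": issuer, "attributes": attributes}, sort_keys=True)
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def relying_party_accepts(attestation):
    issuer = json.loads(attestation["payload"])["issuer"]
    key = TRUSTED_ISSUERS.get(issuer)
    if key is None:
        return False                                         # issuer not trusted
    expected = hmac.new(key, attestation["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["signature"])

passport = issue_attestation("Country A passport office",
                             {"name": "A. Traveler", "nationality": "Country A"})
print(relying_party_accepts(passport))                       # True
```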

The report notes that “The fundamental concept, purpose and structure of identity systems have not changed over time, while methods and technology have made huge strides forward…A digital identity system has the same basic structure as a physical identity system, but attribute storage and exchange are entirely digital, removing reliance on physical documents and manual processes.”

Five key trends are driving the need for digital identity systems:

Increasing transaction volumes - “The number of identity-dependent transactions is growing through increased use of the digital channel”

Increasing speed of financial and reputational damage - “Bad actors in financial systems are increasingly sophisticated in the technology and tools that they use to conduct illicit activity, increasing their ability to quickly cause financial and reputational damage by exploiting weak identity systems”

In general, a digital identity system consists of multiple layers, each of which serves a different purpose. The WEF report identifies six distinct such layers:

Authentication - Mechanisms must be provided to link users to attributes to avoid inconsistent authentication

Attribute Exchange - Mechanisms must be provided for exchanging attributes between different parties without compromising security and privacy

Authorization - Proper rules and relationships must be applied to authorize what services users are entitled to access based on their attributes

Service Delivery - Users must be provided with efficient, effective, easy-to-use services

The report argues that “Financial Institutions [FIs] are well positioned to drive the creation of digital identity systems,” citing 3 major reasons for its conclusion:FIs already serve as intermediaries in many transactions; they’re generally trusted by consumers as safe repositories of information and assets; and their operations, - including the extensive use of customer data, - are rigorously regulated.

“FIs could derive substantial benefit from investing in the development of digital identity solutions,” the report further adds.These benefits fall into three main categories:

Blockchain - Once Again, the Technological Genie Has Been Unleashed from its Bottletag:typepad.com,2003:post-6a00d8341f443c53ef01bb09296fbe970d2016-08-22T18:21:18-04:002016-08-22T18:05:09-04:00Skeptics needing further reassurance that the Blockchain is truly reaching a tipping point can rest easier with the recent publication of Don Tapscott’s Blockchain Revolution, co-written with his son Alex. Don, whom I’ve long known, has a knack for identifying...IWB

The book is organized into three main sections. The first explains the blockchain from two complementary points of view: as a major next step in the evolution of the Internet; and as the architecture underpinning bitcoin, the best known and most widely held digital currency. The second and longest section describes how blockchain could potentially transform financial services, companies, government, the Internet of Things, and other key areas. The last section summarizes the major challenges that must be overcome as well as the governance required to fulfill the promise of blockchain.

“It appears that once again, the technological genie has been unleashed from its bottle,” write the authors in their opening paragraph.“Summoned by an unknown person or persons with unclear motives, at an uncertain time in history, the genie is now at our service for another kick of the can - to transform the economic power grid and the old order of human affairs for the better.If we will it.”

They remind us that “The first four decades of the Internet brought us e-mail, the World Wide Web, dot-coms, social media, the mobile Web, big data, cloud computing, and the early days of the Internet of Things.It has been great for reducing the costs of searching, collaborating, and exchanging information…Overall, the Internet has enabled many positive changes - for those with access to it - but it has serious limitations for business and economic activity.”

Foremost among these limitations are privacy, security and inclusion.“Doing business on the Internet requires a leap of faith” because the infrastructure lacks the necessary security.

Why wasn’t stronger security designed into the original Internet protocols?MIT research scientist and Internet pioneer David Clark addressed this question in a recent article about the early design choices that have led to today’s Internet.

Fundamentally, the Internet is a general purpose data network supporting a remarkable variety of applications. Being general purpose was a major design choice, best appreciated when considering the alternatives, such as the telephone network, which was designed specifically to carry telephone calls, or payment networks, - specifically designed to transfer money and settle financial transactions. This generality has enabled the Internet to become one of the most prolific innovation platforms the world has ever seen, if not the most prolific. But, it’s come at a price: it’s good enough, but not optimal, for any one application.

Over the years, the Internet has faced a number of serious challenges, including running out of IP addresses, and lacking the necessary bandwidth to handle the growing requirements for streaming high-quality video. So far, it’s risen to those challenges. A major reason for its adaptability is that it’s stuck to its basic data-transport mission, i.e., just moving bits around. The Internet has no idea what the bits mean or what they’re trying to accomplish. That’s all the responsibility of the applications running on top of it.

Consequently, there’s no one overall owner responsible for security, - arguably the biggest challenge currently facing the Internet.The responsibility for security is divided among several actors, making it significantly harder to achieve. As Clark points out, “the design decisions that shaped the Internet as we know it likely did not optimize secure and trustworthy operation.” Hopefully, that’s what the blockchain will now help us do.

The blockchain first came to light around 2008, when “a pseudonymous person or persons named Satoshi Nakamoto outlined a new protocol for a peer-to-peer electronic cash system using a cryptocurrency called bitcoin,…” wrote Don and Alex Tapscott.“This protocol established a set of rules - in the form of distributed computations - that ensured the integrity of the data exchanged among these billions of devices without going through a trusted third party.This seemingly subtle act set off a spark that has excited, terrified, or otherwise captured the imagination of the computing world and has spread like wildfire to business, governments, privacy advocates, social development activists, media theorists, and journalists, to name a few, everywhere…”

“Today thoughtful people everywhere are trying to understand the implications of a protocol that enables mere mortals to manufacture trust through clever code.This has never happened before - trusted transactions directly between two or more parties, authenticated by mass collaboration and powered by collective self-interests, rather than by large corporations motivated by profit.” The blockchain is essentially “the World Wide Ledger of value… - a distributed ledger representing a network consensus of every transaction that has ever occurred.”
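
A minimal sketch, in Python, of the “chain” behind that World Wide Ledger idea: each block commits to the hash of the previous block, so altering any historical transaction breaks every later link. Consensus, mining and the peer-to-peer network are deliberately left out; this only illustrates the tamper-evident ledger structure.

```python
# A minimal hash-linked ledger: each block stores the hash of the previous
# block, so tampering with history invalidates all subsequent links.

import hashlib, json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain):
    # every block must reference the hash of the block before it
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(ledger))                                  # True
ledger[0]["transactions"][0]["amount"] = 500           # tamper with history
print(verify(ledger))                                  # False
```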

The original blockchain vision, - as inspired by Satoshi Nakamoto, - was limited to creating bitcoin, a digital currency and payment system whose users could transact directly with each other with no need for a bank or government agency to certify the validity of the transactions.There was no broader goal of creating the next generation of the Internet or of fundamentally transforming how the economy works.But, as has been the case with the Internet and World Wide Web, the blockchain has now transcended its original objective.

In the course of researching their book, Don and Alex Tapscott talked to lots of people and read many publications to better understand the potential of blockchain in shaping the evolution of the digital economy.A number of themes emerged from their research which they distilled into seven blockchain design principles:

Networked Integrity - “Trust is intrinsic, not extrinsic.Integrity is encoded in every step of the process and distributed, not vested in any single member… For the first time ever, we have a platform that ensures trust in transactions and much recorded information no matter how the other party acts.”

Distributed Power - “The system distributes power across a peer-to-peer network with no single point of control. No single party can shut the system down.”

Value as Incentive - “The system aligns the incentives of all stakeholders… Now we have a platform where people and even things have proper financial incentives to collaborate effectively and create just about anything.”

Security - “Safety measures are embedded in the network with no single point of failure, and they provide not only confidentiality, but also authenticity and nonrepudiation to all activity… In the digital age, technological security is obviously the precondition to security of a person in society.”

Privacy - “People should control their own data.Period.People ought to have the right to decide what, when, how, and how much about their identities to share with anybody else.”

Inclusion - “The economy works best when it works for everyone.That means lowering the barriers to participation.It means creating platforms for distributed capitalism.”

“[T]hese seven principles can serve as a guide to designing the next generation of high-performance and innovative companies, organizations and institutions.If we design for integrity, power, value, privacy, security, rights and inclusion, then we will be redesigning our economy and social institutions to be worthy of trust.”

The Top Ten Emerging Technologies of 2016tag:typepad.com,2003:post-6a00d8341f443c53ef01bb091fde11970d2016-08-16T05:53:16-04:002016-07-28T08:07:15-04:00For the past several years, the World Economic Forum (WEF) has published an annual list of the Top Ten Emerging Technologies that would likely have the greatest impact on the world in the years to come. A few weeks ago,...IWB

“Horizon scanning for emerging technologies is crucial to staying abreast of developments that can radically transform our world, enabling timely expert analysis in preparation for these disruptors,” said Bernard Meyerson, chair of the WEF’s Meta-Council on Emerging Technologies, which compiled the list. “The global community needs to come together and agree on common principles if our society is to reap the benefits and hedge the risks of these technologies.”

The technologies on the list are not new.They’ve been worked on for many years.But their selection to the Top Ten List indicates that, in the opinion of the council members, each of these technologies has now reached a maturity and acceptance tipping point where its impact can be meaningfully felt.

Here are the ten technologies comprising the 2016 list, along with the reason cited by the WEF for their selection:

Nanosensors and the Internet of Nanothings — “With the Internet of Things expected to comprise 30 billion connected devices by 2020, one of the most exciting areas of focus today is now on nanosensors capable of circulating in the human body or being embedded in construction materials.”

The Blockchain — “With venture investment related to the online currency Bitcoin exceeding $1 billion in 2015 alone, the economic and social impact of blockchain’s potential to fundamentally change the way markets and governments work is only now emerging.”

2D Materials — “Plummeting production costs mean that 2D materials like graphene are emerging in a wide range of applications, from air and water filters to new generations of wearables and batteries.”

Autonomous Vehicles — “The potential of self-driving vehicles for saving lives, cutting pollution, boosting economies, and improving quality of life for the elderly and other segments of society has led to rapid deployment of key technology forerunners along the way to full autonomy.”

Organs-on-chips — “Miniature models of human organs could revolutionize medical research and drug discovery by allowing researchers to see biological mechanism behaviors in ways never before possible.”

Perovskite Solar Cells — “This new photovoltaic material offers three improvements over the classic silicon solar cell: it is easier to make, can be used virtually anywhere and, to date, keeps on generating power more efficiently.”

Open AI Ecosystem — “Shared advances in natural language processing and social awareness algorithms, coupled with an unprecedented availability of data, will soon allow smart digital assistants to help with a vast range of tasks, from keeping track of one’s finances and health to advising on wardrobe choice.”

Optogenetics — “Recent developments mean light can now be delivered deeper into brain tissue, something that could lead to better treatment for people with brain disorders.”

Systems Metabolic Engineering — “Advances in synthetic biology, systems biology, and evolutionary engineering mean that the list of building block chemicals that can be manufactured better and more cheaply by using plants rather than fossil fuels is growing every year.”

The full report includes a longer, one-page description of each of these technologies.Let me make some brief comments on those technologies I’m most familiar with: AI, IoT, Autonomous Vehicles and The Blockchain.

Artificial Intelligence

After many years of promise and hype, AI has finally been making great progress over the past several years. AI is now being applied to activities that not long ago were viewed as the exclusive domain of humans. And, as the WEF report notes in Open AI Ecosystem: From artificial to contextual intelligence, “over the past several years, several pieces of emerging technology have linked together in ways that make it easier to build far more powerful, human-like digital assistants… into an open AI ecosystem.”

“This ecosystem connects not only to our mobile devices and computers - and through them to our messages, contacts, finances, calendars and work files - but also to the thermostat in the bedroom, the scale in the bathroom, the bracelet on the wrist, even the car in the driveway… The secret ingredient in this technology that has been largely lacking to date is context. Up to now, machines have been largely oblivious to the details of our work, our bodies, our lives… AI systems are gaining the ability to acquire and interpret contextual cues so that they can gain these skills… Although initially these AI assistants will not outperform the human variety, they will be useful - and roughly a thousand times less expensive.”

The Internet of Things

The Internet of Things (IoT) has also been long in coming. Ubiquitous Computing dates back to the 1990s, when neither the necessary low-cost devices nor wireless networks were anywhere near ready. But in the last few years, IoT started to take off, with over 10 billion interconnected smart devices already out there, and the expectation that those numbers would rapidly expand to 10s of billions by 2025 and to 100s of billions in the decades ahead. “The myriad possibilities that arise from the ability to monitor and control things in the physical world electronically have inspired a surge of innovation and enthusiasm,” said a 2015 McKinsey report.

IoT-based solutions are being deployed in a number of areas, including smart cities, smart homes and smart healthcare. In Nanosensors and the Internet of Nanothings, the WEF takes IoT to a whole new level: “Scientists have started shrinking sensors from millimeters or microns in size to the nanometer scale, small enough to circulate within living bodies and to mix directly into construction materials. This is a crucial first step toward an Internet of Nano Things (IoNT) that could take medicine, energy efficiency, and many other sectors to a whole new dimension… When it arrives, the IoNT could provide much more detailed, inexpensive, and up-to-date pictures of our cities, homes, factories - even our bodies.”

Autonomous Vehicles

According to the National Highway Traffic Safety Administration (NHTSA), traffic accidents claim over 30,000 lives and lead to over 2 million injuries every year in the US alone.94 percent of those crashes can be tied back to human error. Safety is thus far and away the overriding objective of vehicle automation technologies. These technologies may also aim to develop fully autonomous vehicles over time. But in the meantime, they will significantly improve the overall safety of our cars and help reduce our large numbers of traffic accidents, deaths and serious injuries.

In Autonomous Vehicles: Self-driving cars coming sooner than expected, the WEF report writes that “The long-term impact of autonomous vehicles on society is hard to predict, but also hard to overstate… Self-driving systems may have bugs - the software that runs them is complicated - but they are free from the myriad distractions and risk-taking behaviors that are the most common causes of crashes today.In the near term, semi-autonomous safety systems that engage only to prevent accidents, but that otherwise leave the driver in charge, will also likely reduce the human cost of driving significantly… Like every technology, autonomous vehicles will involve drawbacks as well.But the many benefits of self-driving cars and trucks are so compelling that their widespread adoption is a question of when, not if.”

The Blockchain

Finally, let me comment on the blockchain, a technology that first came to light around 2008 as the architecture underpinning bitcoin. But, as with the Internet, the Web and other major technologies, the blockchain has now transcended its original objective. Over the years, blockchain has developed a following of its own as a distributed data base architecture with the ability to handle trust-less transactions where no parties need to know nor trust each other for transactions to complete. Blockchain holds the promise to revolutionize the finance industry and other aspects of the digital economy by bringing one of the most important and oldest concepts, the ledger, to the Internet age. In The Blockchain: A revolutionary decentralized trust system, the WEF report explains that:

“Blockchain… is a decentralized public ledger of transactions that no one person or company owns or controls. Instead, every user can access the entire blockchain, and every transfer of funds from one account to another is recorded in a secure and verifiable form by using mathematical techniques borrowed from cryptography… The technology doesn’t make theft impossible, just harder. But as an infrastructure that improves society’s public records repository and reinforces representative and participatory legal and governance systems, blockchain technology has the potential to enhance privacy, security and freedom of conveyance of data - which surely ranks up there with life, liberty and the pursuit of happiness.”

Is There a STEM Crisis or a STEM Surplus?tag:typepad.com,2003:post-6a00d8341f443c53ef01b7c8827ea9970b2016-08-09T06:00:52-04:002016-08-05T07:39:21-04:00Is the US facing a critical shortage of STEM skills? Do we have enough STEM workers to meet the demands of the labor market? Are enough young people choosing STEM careers so we can meet future demands? Such serious concerns...IWB

Is the US facing a critical shortage of STEM skills? Do we have enough STEM workers to meet the demands of the labor market? Are enough young people choosing STEM careers so we can meet future demands?

Such serious concerns have been expressed in a number of national studies over the past two decades.In 2005, for example, the National Innovation Initiative listed “Build the Base of Scientists and Engineers” as one of its top recommendations, noting that “unless the United States takes swift action, the demand for S&E talent will far outstrip supply.The number of jobs requiring technical training is growing at five times the rate of other occupations.”

Two years later, a major National Academies study, Rising Above the Gathering Storm, called for increasing America’s STEM talent pool by providing 25,000 new 4-year competitive undergraduate scholarships to US citizens enrolled in the physical sciences, the life sciences, engineering and math; and by funding 5,000 new graduate fellowships each year for US citizens pursuing graduate studies in areas of national need.

And in 2012, a report by the President’s Council of Advisors on Science and Technology wrote: “Economic projections point to a need for approximately 1 million more STEM professionals than the U.S. will produce at the current rate over the next decade if the country is to retain its historical preeminence in science and technology.To meet this goal, the United States will need to increase the number of students who receive undergraduate STEM degrees by about 34% annually over current rates.”

There are, however, different views. In September of 2013, IEEE Spectrum published an article with the provocative title The STEM Crisis is a Myth: Forget the dire predictions of a looming shortfall of scientists, technologists, engineers, and mathematicians, by Robert Charette. “Every year U.S. schools grant more STEM degrees than there are available jobs,” wrote Charette. “When you factor in H-1B visa holders, existing STEM degree holders, and the like, it’s hard to make a case that there’s a STEM labor shortage… Even as the Great Recession slowly recedes, STEM workers at every stage of the career pipeline, from freshly minted grads to mid- and late-career Ph.D.s, still struggle to find employment as many companies, including Boeing, IBM, and Symantec, continue to lay off thousands of STEM workers.”

A similar view was expressed by Michael Teitelbaum, - a Sloan Foundation Vice President at the time, - when he testified before the House Subcommittee on Technology and Innovation in November of 2007.In his testimony, Teitelbaum said:

“First, no one who has come to the question with an open mind has been able to find any objective data suggesting general shortages of scientists and engineers… I would add here that these findings of no general shortage are entirely consistent with isolated shortages of skilled people in narrow fields or in specific technologies that are quite new or growing explosively.”

“Second, there are substantially more scientists and engineers graduating from U.S. universities than can find attractive career openings in the U.S. workforce. Indeed science and engineering careers in the U.S. appear to be relatively unattractive - relative that is to alternative professional career paths available to students with strong capabilities in science and math…”

“[T]he postdoc population, which has grown very rapidly in U.S. universities and is recruited increasingly from abroad, looks more like a pool of low-cost research lab workers with limited career prospects than a high-quality training program for soon-to-be academic researchers. Indeed, if the truth be told - only a very small percentage of those in the current postdoc pool have any realistic prospects of gaining a regular faculty position.”

A NY Times article recently added that “The United States is producing more research scientists than academia can handle… The lure of a tenured job in academia is great - it means a secure, prestigious position directing a lab that does cutting-edge experiments, often carried out by underlings. Yet although many yearn for such jobs, fewer than half of those who earn science or engineering doctorates end up in the sort of academic positions that directly use what they were trained for.”

How do we reconcile these widely different views of the STEM labor market? An article published last year, STEM crisis or STEM surplus? Yes and yes - by Yi Xue, at the time an MIT graduate student, and MIT professor Richard Larson, - tried to make sense of this ongoing STEM debate. Their answer, as is often the case with such complex questions, is that both sides are right. It all depends. STEM includes a variety of disciplines, degree levels and employment sectors. While some occupations do indeed have a shortage of qualified talent, others have a surplus.

“The upshot is that there may not be a STEM crisis in all job categories, but instead just in select ones at certain degree levels and in certain locations…A job segment that traditionally has a shortage of workers may at some times have a surplus and vice versa. Thus, it is probably far more accurate to state that, within STEM job categories, there is a crisis or a surplus depending on the circumstances at the time the categories are investigated.”

The article examined the heterogeneous nature of STEM occupations on the basis of statistical data, current research papers, interviews with company recruiters across a range of industries, and anecdotal evidence from newspapers.It focused on graduates with postsecondary education within the STEM domain across the three main employment sectors: academia, government and the private sector.Their analysis yielded the following findings:

“The STEM labor market is heterogeneous.There are both shortages and surpluses of STEM workers, depending on the particular job market segment.”

“In the academic job market, there is no noticeable shortage in any discipline. In fact, there are signs of an oversupply of Ph.D.’s vying for tenure-track faculty positions in many disciplines (e.g., biomedical sciences, physical sciences).”

“In the government and government-related job sector, certain STEM disciplines have a shortage of positions at the Ph.D. level (e.g., materials science engineering, nuclear engineering) and in general (e.g., systems engineers, cybersecurity, and intelligence professionals) due to the U.S. citizenship requirement. In contrast, an oversupply of biomedical engineers is seen at the Ph.D. level, and there are transient shortages of electrical engineers and mechanical engineers at advanced-degree levels.”

“In the private sector, software developers, petroleum engineers, data scientists, and those in skilled trades are in high demand; there is an abundant supply of biomedical, chemistry, and physics Ph.D.’s; and transient shortages and surpluses of electrical engineers occur from time to time.”

“The geographic location of the position affects hiring ease or difficulty.”

So, is there a STEM crisis or a STEM surplus?“The answer is that both exist… As our society relies further on technology for economic development and prosperity, the vitality of the STEM workforce will continue to be a cause for concern.”

Is the Blockchain Now Reaching a Tipping Point?tag:typepad.com,2003:post-6a00d8341f443c53ef01b8d20a212d970c2016-08-02T05:33:51-04:002016-07-27T15:39:09-04:00A few weeks ago, the World Economic Forum (WEF) published its annual list of the Top Ten Emerging Technologies for 2016. The technologies on the list have been worked on for years. But their inclusion in the Top Ten List...IWB

A few weeks ago, the World Economic Forum (WEF) published its annual list of the Top Ten Emerging Technologies for 2016. The technologies on the list have been worked on for years. But their inclusion in the Top Ten List indicates that each has now reached a market acceptance tipping point where its impact can be meaningfully felt. The Blockchain is one of the technologies in this year’s list, selected by the WEF panel of global experts because of its emerging potential to fundamentally change the way markets and governments work.

What does it mean for an infrastructure technology like the blockchain to have reached such a tipping point? The WEF report compared the blockchain to the Internet, noting that “Like the Internet, the blockchain is an open, global infrastructure upon which other technologies and applications can be built. And like the Internet, it allows people to bypass traditional intermediaries in their dealings with each other, thereby lowering or even eliminating transaction costs.”

I agree with this comparison and find it useful to help us understand how blockchains might evolve over the years. So, I’d like to compare the state of the blockchain in 2016 to the state of the Internet 25 years ago or so.

In my opinion, the Internet reached its tipping point of market acceptance in the early to mid-1990s due to the confluence of three key factors:

A set of serious problems. In the early years of the IT industry, the proprietary products of different IT vendors did not work well with each other. Just sending an e-mail or exchanging information across systems based on different vendors’ networks and applications was quite a chore. By the late 1980s, the IT industry was facing a serious problem in integrating the expanding number of proprietary products from different vendors, a problem that threatened the continued growth of the industry.

Technology solutions to this problem had already emerged and were being deployed in the academic and research communities, e.g., Internet networks based on TCP/IP protocols; Internet e-mail applications based on SMTP, MIME, POP, and IMAP; and the World Wide Web based on an open set of standards, - HTML, HTTP, URLs, - and the easy-to-use, graphical web browsers.The use of open standards and open source software made it much easier to send an e-mail or share information across disparate systems and institutions.

The private sector, government and academia came together to jointly collaborate in leveraging the Internet, Web and other promising technologies to address IT’s serious integration problems.These various institutions now collaborated on developing the common Internet architecture, - and Internet-based applications like e-mail and the Web, - that they all were starting to deploy.

Let’s explore how these lessons might apply to the blockchain. The blockchain first came to light around 2008 as the architecture underpinning bitcoin. But, as with the Internet, the Web and other major technologies, the blockchain has now transcended its original objective. Over the years, blockchain has developed a following of its own as a distributed data base architecture with the ability to handle trust-less transactions where no parties need to know nor trust each other for transactions to complete. Blockchain holds the promise to revolutionize the finance industry and other aspects of the digital economy by bringing one of the most important and oldest concepts, the ledger, to the Internet age.

Ledgers constitute a permanent record of all the economic transactions an institution handles, whether it’s a bank managing deposits, loans and payments; a brokerage house keeping track of stocks and bonds; or a government office recording births and deaths, the ownership and sale of land and houses, or legal identity documents like passports and driver licenses.They’re one of the oldest and most important concepts in finance and other mission critical transaction applications.

Let’s take a look at the key factors now driving the blockchain to its market acceptance tipping point.

A set of serious problems. Foremost among the problems to be addressed is the need to bolster security in our increasingly digital economy and society. Over the past two decades, we’ve been moving to a world where information of all kinds is increasingly digital, and where many different kinds of online transactions are now taking place between people, institutions, and things. Our current methods for securely managing personal and proprietary information are proving inadequate as evidenced by the all-too-common data breaches, identity theft and large-scale fraud around the world. And in our increasingly digital world, applications must be robust enough to handle trust-less transactions, where no parties need to know or trust each other for transactions to complete.

“From buying products to running businesses to finding directions to communicating with the people we love, an online world has fundamentally reshaped our daily lives. But just as the continually evolving digital age presents boundless opportunities for our economy, our businesses, and our people, it also presents a new generation of threats that we must adapt to meet. Criminals, terrorists, and countries who wish to do us harm have all realized that attacking us online is often easier than attacking us in person. As more and more sensitive data is stored online, the consequences of those attacks grow more significant each year.”

Another problem is that a number of key financial and government legacy systems are rather inflexible and inefficient. Over the years, institutions have automated their original paper-based ledgers with sophisticated IT applications and data bases.But while most ledgers are now digital, their underlying structure has not changed. Each institution continues to own and manage its own ledger, synchronizing its records with those of other institutions as appropriate, - a cumbersome process that often takes days and involves a number of intermediaries.

In a recent NY Times article, tech reporter Quentin Hardy nicely explained the inefficiencies inherent in our current payment systems. “In a world where every business has its own books, payments tend to stop and start between different ledgers. An overseas transfer leaves the ledger of one business, then goes on another ledger at a domestic bank. It then might hit the ledger of a bank in the international transfer system. It travels to another bank in the foreign country, before ending up on the ledger of the company being paid. Each time it moves to a different ledger, the money has a different identity, taking up time and potentially causing confusion. For some companies, it is a nightmare that can’t end soon enough.”
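
A toy Python illustration of the ledger-hopping Hardy describes: the same payment is re-recorded, under a new local identifier, on each institution’s books, and those scattered entries are what later has to be reconciled. The institutions and identifiers are made up for the example.

```python
# Toy model of one payment hopping across several independent ledgers.
# Each institution books the transfer under its own local ID.

import itertools

_counter = itertools.count(1)

def record_hop(ledgers, institution, amount, previous_id=None):
    local_id = f"{institution}-{next(_counter):04d}"
    ledgers.setdefault(institution, []).append(
        {"id": local_id, "amount": amount, "came_from": previous_id})
    return local_id

ledgers = {}
hop = None
for institution in ["Exporter's bank", "Domestic clearing bank",
                    "International transfer system",
                    "Foreign correspondent bank", "Importer's bank"]:
    hop = record_hop(ledgers, institution, 10_000, hop)

# Five separate entries, five separate identities, for one economic transfer.
for institution, entries in ledgers.items():
    print(institution, entries)
```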

Emerging technology solutions.Blockchain innovations have drawn on advances from a number of disciplines, including cryptography for secure communication, storage and data access; mathematical models like game theory that enable cooperation and decision-making among non-trusted parties; and distributed computing methods that enable large numbers of distributed systems and institutions to coordinate their actions and interact with each other to achieve common goals.

Blockchain technologies have been explained in a number of ways. A report published earlier this year by Citigroup describes blockchain as “a distributed ledger database that uses a cryptographic network to provide a single source of truth. Blockchain allows untrusting parties with common interests to co-create a permanent, unchangeable, and transparent record of exchange and processing without relying on a central authority. In contrast to traditional payment model where a central clearing is required to transfer money between the sender and the recipient, Blockchain relies on a distributed ledger and consensus of the network of processors, i.e. a super majority is required by the servers for a transfer to take place.”

The report then compares blockchain technologies to the Internet. “If the Internet is a disruptive platform designed to facilitate the dissemination of information, then Blockchain technology is a disruptive platform designed to facilitate the exchange of value. Blockchain has a few clear advantages relative to the current system. First of all, it disintermediates the middle man. It enables direct transfer of digital assets without the need for an intermediary. Moreover, since no middle man is required, a Blockchain system has the likely benefit of fast and low cost settlement. Another promising innovation that leverages the Blockchain is smart contracts and tokenization. Smart contracts automate and execute pre-agreed conditions once they are met. And lastly, Blockchain provides irrefutable proof of existence, an important feature to maintain an audit trail that tracks the ownership of the valuable asset being transferred - this is crucial from a business and a regulatory perspective.”
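
As a rough sketch of the “super majority” idea in that description, here are a few lines of Python in which a transfer is committed only when more than two-thirds of the validating nodes approve it; the validators and the two-thirds threshold are illustrative, since real consensus protocols differ in their details.

```python
# Illustrative supermajority check: a transfer commits only if more than
# two-thirds of validators approve. The threshold and validators are made up.

from fractions import Fraction

def committed(votes, threshold=Fraction(2, 3)):
    """votes: mapping of validator -> True/False approval."""
    approvals = sum(1 for v in votes.values() if v)
    return Fraction(approvals, len(votes)) > threshold

votes = {"node1": True, "node2": True, "node3": True, "node4": False, "node5": True}
print(committed(votes))   # True: 4 of 5 validators approved, which exceeds 2/3
```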

The urgent need to collaborate. In my opinion, what has brought blockchain to its tipping point is the realization that these critical problems can only be addressed by close collaboration among companies, governments and research communities around the world. You could sense this consensus emerging over the past year by the growing number of newspaper and magazine articles, as well as government and business reports. As was the case with the Internet, the consensus to collaborate is absolutely critical for blockchain technologies and applications to move forward, - including common standards, open source implementations of key components, and marketplace experiments to see what works and what does not.

While a tipping point might have been reached, it’s too early to know the degree to which blockchain will become a major transformational innovation. But, we’re hopeful that the necessary ingredients are now in place to propel blockchain forward.

The Long Term Impact of AI on Jobs - Some Lessons from Historytag:typepad.com,2003:post-6a00d8341f443c53ef01b8d1fff836970c2016-07-26T06:03:17-04:002016-07-26T06:03:17-04:00The June 25 issue of The Economist includes a special report on artificial intelligence. AI has been making extraordinary progress in the past few years. It’s ironic that after years of frustration with AI’s missed promises, many now worry that...IWB

The June 25 issue of The Economist includes a special report on artificial intelligence. AI has been making extraordinary progress in the past few years. It’s ironic that after years of frustration with AI’s missed promises, many now worry that its mighty power is upon us while we still don’t know how to properly deploy it. Some fear that at some future time, a sentient, superintelligent general AI might pose an existential threat to humanity. But while being dismissive of such dire concerns, many experts worry that the real threat is that AI advances could lead to widespread economic dislocation.

People have long worried about the impact of technology on society, be it railroads, electricity, and cars in the Industrial Age, or the Internet, mobile devices and smart connected products now permeating just about all aspects of our lives. The Economist reminds us that these worries have been with us ever since the advent of industrialization two centuries ago. Eminent English economist David Ricardo first raised the machinery question in 1821, that is, the “opinion entertained by the labouring class, that the employment of machinery is frequently detrimental to their interests”.

Automation anxieties continued to resurface in the 20th century, right along with accelerating technology advances. In a 1930 essay, English economist John Maynard Keynes wrote about the onset of “a new disease” which he named technological unemployment, that is, “unemployment due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour.”But each time those fears arose in the past, technology innovations ended up creating more jobs than they destroyed, causing the majority of economists to confidently wave away the machinery question.

Automation fears have understandably accelerated in recent years, as our increasingly smart machines are now being applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans. The concerns surrounding AI’s long term impact may well be in a class by themselves. Like no other technology, AI forces us to explore the very boundaries between machines and humans.

What impact will AI have on jobs? Could our smart machines lead to mass unemployment? What will life be like in such an AI future? “After 200 years, the machinery question is back. It needs to be answered,” notes The Economist. What can we learn from history that will help us better respond to AI’s technological advances?

MIT economist David Autor explored the lessons from history in a 2015 paper, Why Are There Still So Many Jobs? The History and Future of Workplace Automation. Dramatic declines have occurred in a number of occupations over the past 100 years. The percentage of US workers employed in agriculture has declined from 41% in 1900 to 2% in 2000. Cars drastically reduced the demand for blacksmiths and stable hands, machines have replaced many manual jobs in construction and factories, and computers have been steadily displacing large numbers of record keeping and office positions.

Given the continuing automation of so much human work over the past couple of centuries, - why are there still so many jobs left?The answer isn’t very complicated, although frequently overlooked. As Autor succinctly puts it: “tasks that cannot be substituted by automation are generally complemented by it.” Automation does indeed substitute for labor. However, automation also complements labor, raising economic outputs in ways that often lead to higher demand for workers.

Most jobs involve a number of tasks or processes. Some of these tasks are more routine in nature, while others require judgement, social skills and other human capabilities. The more routine and rules-based the task, the more amenable it is to automation. But just because some of its tasks have been automated, does not imply that the whole job has disappeared. To the contrary, automating the more routine parts of a job will often increase the productivity and quality of workers, by complementing their skills with machines and computers, as well as enabling them to focus on those aspects of the job that most need their attention.

The Economist references the work of economist James Bessen, who in a recent Atlantic article, - The Automation Paradox, - argued that “what’s happening with automation is not so simple or obvious. It turns out that workers will have greater employment opportunities if their occupation undergoes some degree of computer automation. As long as they can learn to use the new tools, automation will be their friend.”

This was the case with the weaving machines that the Luddites famously opposed in the early days of the Industrial Revolution. The automation of tasks in the weaving processes prompted workers to focus on the things the machines could not do, causing output to grow explosively. “In America during the 19th century the amount of coarse cloth a single weaver could produce in an hour increased by a factor of 50, and the amount of labour required per yard of cloth fell by 98%. This made cloth cheaper and increased demand for it, which in turn created more jobs for weavers: their numbers quadrupled between 1830 and 1900. In other words, technology gradually changed the nature of the weaver’s job, and the skills required to do it, rather than replacing it altogether…”

The advent of automated teller machines (ATMs) in the 1970s is another, more recent example. By 2010, there were approximately 400,000 ATMs in the US. But not only were bank tellers not eliminated, their numbers actually rose modestly, from 500,000 in 1980 to 550,000 in 2010. Replacing some bank employees with ATMs made it cheaper to open new branches, while changing the tellers’ work mix away from routine tasks and towards tasks like sales and customer service that machines could not do.

“The same pattern can be seen in industry after industry after the introduction of computers, says Mr Bessen: rather than destroying jobs, automation redefines them, and in ways that reduce costs and boost demand. In a recent analysis of the American workforce between 1982 and 2012, he found that employment grew significantly faster in occupations (for example, graphic design) that made more use of computers, as automation sped up one aspect of a job, enabling workers to do the other parts better. The net effect was that more computer-intensive jobs within an industry displaced less computer-intensive ones. Computers thus reallocate rather than displace jobs, requiring workers to learn new skills… So far, the same seems to be true of fields where AI is being deployed.”

Since the 1980s, US job opportunities have sharply polarized. Mid-skill occupations involving routine manual (blue-collar) and cognitive (white-collar) tasks have been declining because they’re prone to automation and to outsourcing to lower-wage countries. At the same time, we’ve seen the steady growth of jobs involving non-routine, low-skill manual tasks, - e.g., food and cleaning services, personal care and health care aides, - and non-routine, high-skill cognitive tasks, - e.g., managerial, professional and technical occupations. A recent graph by the Federal Reserve Bank of St Louis starkly illustrates the job polarization that’s taken place over the past 30 years, and in particular, the increasingly dominant role of high-skill cognitive occupations.

“As with the introduction of computing into offices, AI will not so much replace workers directly as require them to gain new skills to complement it…” notes The Economist. But, “Even if job losses in the short term are likely to be more than offset by the creation of new jobs in the long term, the experience of the 19th century shows that the transition can be traumatic”. Industrialization led to major increases in productivity, income and living standards over the long run, but it took significantly longer than is often appreciated. “[D]ecades passed before this was fully reflected in higher wages. The rapid shift of growing populations from farms to urban factories contributed to unrest across Europe. Governments took a century to respond with new education and welfare systems.”

“This time the transition is likely to be faster, as technologies diffuse more quickly than they did 200 years ago. Income inequality is already growing, because high-skill workers benefit disproportionately when technology complements their jobs. This poses two challenges for employers and policymakers: how to help existing workers acquire new skills; and how to prepare future generations for a workplace stuffed full of AI.”

No one can really tell if technology will once more end up creating more jobs than it destroys, or if this time will be different and AI will end up replacing many jobs, including high-skill ones, while creating few new ones. But regardless, we cannot ignore the machinery question. Even if AI doesn’t lead to mass unemployment, technological advances are already disrupting labor markets and contributing to social unrest.

How should we respond? Companies and governments need to assist workers in acquiring new skills while helping them switch jobs as needed. This includes “making education and training flexible enough to teach new skills quickly and efficiently… a greater emphasis on lifelong learning and on-the-job training, and wider use of online learning and video-game-style simulation.”

It will also require updating our social policies, perhaps along the lines of Denmark’s flexicurity system, which aims to achieve both flexibility in labor markets and security for workers, letting firms “hire and fire easily, while supporting unemployed workers as they retrain and look for new jobs. Benefits, pensions and health care should follow individual workers, rather than being tied (as often today) to employers.”

“John Stuart Mill wrote in the 1840s that ‘there cannot be a more legitimate object of the legislator’s care’ than looking after those whose livelihoods are disrupted by technology.That was true in the era of the steam engine, and it remains true in the era of artificial intelligence.”

The Continuing Evolution of Service Sciencetag:typepad.com,2003:post-6a00d8341f443c53ef01bb09155c6f970d2016-07-19T07:34:21-04:002016-07-19T07:34:21-04:00Service Science emerged in the mid-2000s as an academic discipline aimed at applying technology and science to the service sector, - by far the largest sector of the US economy and of most economies around the world. Since its inception,...IWB

A few weeks ago I had an interesting conversation on the state of service science with analysts from an IT research organization who were preparing a report on the subject for their clients. Our discussion led me to reflect on the evolution of service science over the past several years. I think that we are hearing a bit less about it these days. But is that because we’ve become tired of the subject and moved on, or because the application of science and technology to services is now so well accepted that it’s no longer a topic of debate? I very much think it’s the latter.

The service sector is by far the largest in the global economy, comprising close to 65 percent of the world’s overall GDP, and between 70 and 80 percent in the more advanced economies. Most of the working population in such countries are employed in services in one way or another, - roughly two thirds of all jobs in Brazil, Japan and the European Union and around 80 percent in the UK and the US.

In July of 2009, the UK’s Royal Society released a report - Hidden Wealth: the contribution of science to service sector innovation. “Our main conclusion,…” said the report, “is that services are very likely to remain central to the new economy, not least because we are at or near a tipping point: innovations now underway seem likely to change dramatically the way we live and to generate many services (though few can be predicted in detail at present).”

According to the study, STEM (science, technology, engineering, math) is omnipresent in the service sector, but, unlike the case with the industrial sector, its impact is rarely recognized.

“Scientific and technological developments (many of which originated in fundamental blue skies research), have precipitated major transformations in services industries and public services, most notably through the advent of the internet and world-wide-web… However, the full extent of STEM’s current contribution is hidden from view - it is not easily visible to those outside the process and is consequently under-appreciated by the service sector, policymakers and the academic research community. This blind spot threatens to hinder the development of effective innovation policies and the development of new business models and practices…”

The study used Hidden Wealth in its title to make the point that even though services constitute such a large portion of GDP and of jobs around the world, their very nature remains vague, - a kind of hard-to-measure dark matter. Services are ubiquitous across many sectors of the economy, e.g., finance, healthcare, retail, creative industries, business support, education and transportation and logistics. However, they’re neither easily visible nor well understood.

It’s often easier to define the service sector by what it doesn’t include: it’s not agriculture or fishing, and it’s not manufacturing, construction or mining. Just about every other job is in services, including manual ones like janitors, gardeners, restaurant employees and health care aides, and white collar ones like sales and office workers, managers and professionals. Perhaps the one definition everyone can agree to is one attributed to The Economist: a service is “anything sold in trade that cannot be dropped on your foot.”

We’ve been applying science and technology to the agriculture and industrial sectors of the economy ever since the advent of the Industrial Revolution over two hundred years ago. But until recently it’s been difficult for universities, companies and policy makers to support the same kind of research and education programs in the service sector. This has all been changing. Services are now front and center in some of the most prominent areas in IT, such as analytics and data science, cloud computing, and design thinking. Let me say a few words about each.

Analytics and Data Science

“This is the first time in human history that we have the ability to see enough about ourselves that we can hope to actually build social systems that work qualitatively better than the systems we’ve always had,” said MIT Professor Alex “Sandy” Pentland in an online conversation, Reinventing Society in the Wake of Big Data.

“I believe that the power of Big Data is that it’s information about people’s behavior - it’s about customers, employees, and prospects for your new business… This Big Data comes from location data from your cell phone and transaction data about the things you buy with your credit card. It’s the little data breadcrumbs that you leave behind you as you move around in the world… and by analyzing this sort of data, scientists can tell an enormous amount about you. They can tell whether you are the sort of person who will pay back loans. They can tell you if you’re likely to get diabetes.”

Throughout history, scientific revolutions have been launched when new tools make possible new measurements and observations, e.g., the telescope, the microscope, spectrometers, DNA sequencers. Over the past few hundred years, we’ve significantly increased our understanding of the natural world around us by collecting large amounts of data and by developing disciplined ways to analyze and make sense of all that data.

Our new big data tools now have the potential to usher in an information-based scientific revolution, helping us extract insights from all the data we’re now collecting by applying tried-and-true scientific methods, that is, empirical and measurable evidence subject to testable explanations and predictions. We’ve long been applying scientific methods in the natural sciences and engineering. But given our newfound ability to gather valuable data on almost any area of interest, we can now bring our tried-and-true scientific methods to just about any domain of knowledge. This should enable us to better understand and make predictions in complex, people-centric, service-oriented systems like healthcare, business organizations, government agencies and cities.
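
As a small, concrete illustration of what “testable explanations and predictions” means in practice, here is a minimal sketch (my own, using synthetic data and the scikit-learn library, not anything from the Royal Society report): fit a predictive model on one sample of data, then test its predictions on data it has never seen.

```python
# A minimal sketch of the predict-then-test discipline on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Synthetic stand-in for behavioral data (e.g., service-usage features).
X, y = make_classification(n_samples=5000, n_features=20, n_informative=5, random_state=0)

# Hold out data the model never sees during fitting, so its predictions are testable.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# The "experiment": how well do the model's predictions hold up out of sample?
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```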

Cloud Computing

Cloud computing is essentially the Internet of Services. Data centers have now become the production plants of cloud-based services. Software and information are increasingly being delivered as industrial-scale online services, while the Internet and wireless networks connect more and more devices to such offerings.

Cloud computing requires well-engineered infrastructures, applications and services, a transformation that’s been pioneered by born-to-the-cloud companies like Amazon, Google, and Salesforce. The architectural standards and management disciplines of public cloud providers have been increasingly embraced by many other companies, so they too can efficiently deliver high quality services to their own customers, business partners and employees.

Design Thinking

High quality and competitive costs are the key objectives of good products. But services are all about people, as consumers and/or providers of the service. In addition to high quality and competitive costs, achieving a superior customer experience is now a top priority across all industries given the growth of services throughout the economy.

It’s much easier to appreciate the role of design when it comes to physical objects: cars, bridges, buildings, dresses, shoes, jewelry, smartphones, laptops, and so on. But, it’s considerably harder to appreciate its importance when it comes to more abstract entities like services, systems, information and organizations. Yet, they account for the bulk of the growing complexity in our daily lives.

Design thinking is ultimately about ensuring positive service experiences with the companies we do business with. Good design aims to make our interactions with complex institutions, - e.g., a business, a healthcare provider, a government function, - as appealing and intuitive as possible. Design-centric organizations are adamantly focused on their customers’ needs.

We may not be hearing as much about service science because, in a sense, the battle has been won. The technologies, methods and concepts once pioneered in service science are now well accepted in mainstream IT and academic disciplines. We still have much to do, but we no longer have to argue that science, engineering and design now play a prominent role in services all across the economy.

Vehicle Automation - a Very Important Priority for AI Researchtag:typepad.com,2003:post-6a00d8341f443c53ef01b8d2034900970c2016-07-12T07:13:47-04:002016-07-10T11:32:13-04:00In September, 2014, I attended an MIT conference that explored the major progress that’s taken place in artificial intelligence, robotics and related technologies over the past several years. Autonomous vehicles was one of the main areas of discussion. With most...IWB

In September, 2014, I attended an MIT conference that explored the major progress that’s taken place in artificial intelligence, robotics and related technologies over the past several years. Autonomous vehicles was one of the main areas of discussion. With most other topics, there was considerable consensus, but not so with self-driving cars. While some thought that fully autonomous vehicles will be all around us within a decade or so, others were not quite so sure, myself included, due to the many highly complex technical, economic and societal issues that must be worked out.

I was reminded of this meeting a few weeks ago when I read that a Florida man had been killed while driving a Model S Tesla in autopilot mode. The accident is still under investigation, but it appears that the Tesla struck a tractor-trailer truck that was making a left turn in front of its path. Neither the driver nor the Tesla’s autopilot noticed that the truck was suddenly crossing the car’s lane of traffic, perhaps because the white truck was hard to spot against a bright sky.

This accident has led to a renewed discussion of the current state-of-the-art of vehicle automation, the approaches being pursued by different companies, and the prospects for the near- and long-term future.

AI is being increasingly applied to activities that not long ago were viewed as the exclusive domain of humans. Most complex human activities are composed of several distinct tasks or steps, some of which are more routine in nature and thus more amenable to automation, while others require judgement, problem solving skills, and other hard-to-automate human capabilities. Automating the more routine parts of such complex activities enables us to focus on those tasks that most require our attention. Driving is no exception, which is why the NHTSA has defined a set of vehicle automation levels, including:

Combined Function Automation (Level 2): Driver can cede active control to at least two primary functions but is still responsible for monitoring and safe operations and expected to be available at all times, e.g., adaptive cruise control and lane departure warning systems.

Limited Self-Driving Automation (Level 3): Driver can cede full control under certain traffic and environmental conditions, but must be available at all times to take back control when needed.

Full Self-Driving Automation (Level 4): Human provides destination and/or navigation input but is not expected to be available to control the vehicle at any time during the trip. Vehicle is solely responsible for safe operations and could be traveling occupied or empty.

The auto industry has been adding level 1 features to its cars for over 40 years, and more sophisticated level 2 features for the past couple of decades. Generally, these are first introduced in higher end cars, but as their costs drop over time, the features become available in lower priced cars as well. While some of the features are aimed at user convenience, safety is far and away the overriding objective.

The NHTSA estimates that there were over 35,000 traffic deaths in 2015, an increase of 7.7 percent over 2014 fatalities. In addition, over 2.3 million people were injured in 2014. Some 94 percent of crashes can be tied back to human error. Over the past decade, close to 100 people have been killed in traffic accidents each day in the US alone.

These figures are stark reminders that, as NHTSA Administrator Mark Rosekind noted, “we need to focus our efforts on improving human behavior while promoting vehicle technology that not only protects people in crashes, but helps prevent crashes in the first place.” Vehicle automation, including autonomous vehicle AI research, is thus a very important priority, aimed at significantly reducing the 94 percent of crashes that involve human error.

Toyota, for example, recently established the Toyota Research Institute to pursue the development of what it calls guardian angel automation technologies, where the driver is always in control until a crash looms, at which point the automation system fully takes over to try to prevent the accident. Toyota is also pursuing the development of level 4 self-driving cars, but its guardian angel approach appears more promising for the near future. It’s clearly positioned as an advanced level 2 capability where the driver is always in charge unless an accident is imminent. Its key objective is saving human drivers from their all-too-frequent errors, - not replacing them.

While just about all car companies continue to seriously invest in advanced level 1 and 2 technologies, their approaches toward level 4 are quite different. Some, like Tesla, have taken a more incremental level 3 approach, which allows the driver to cede control of the car under certain conditions while emphasizing that drivers must remain alert, engaged and always ready to take over. But, as a number of recent articles have pointed out, the autopilot feature is so compelling that drivers soon get comfortable and are lulled into feeling that they can turn their attention away from the road. They can then get distracted with other activities and take their hands off the wheel, no matter how much they’ve been warned not to do so.

“Experiments conducted last year by Virginia Tech researchers and supported by the national safety administration found that it took drivers of Level 3 cars an average of 17 seconds to respond to takeover requests,” noted a recent NY Times article. “In that period, a vehicle going 65 m.p.h. would have traveled 1,621 feet - more than five football fields.”
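
The quoted distance is simply the speed multiplied by the response time; checking the arithmetic:

\[
65\ \text{mph} \times \frac{5280\ \text{ft/mile}}{3600\ \text{s/hour}} \approx 95.3\ \text{ft/s},
\qquad 95.3\ \text{ft/s} \times 17\ \text{s} \approx 1{,}621\ \text{ft}.
\]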

According to this recent LA Times article, Google chose a different path from Tesla based on its early experiences with its self-driving car prototypes. “Once behind the wheel of the modified Lexus SUVs, the drivers quickly started rummaging through their bags, fiddling with their phones and taking their hands off the wheel - all while traveling on a freeway at 60 mph.” In a panel earlier this year, Chris Urmson, - head of Google’s self-driving car program, - noted that “Within about five minutes, everybody thought the car worked well, and after that, they just trusted it to work. It got to the point where people were doing ridiculous things in the car.”

“After seeing how people misused its technology despite warnings to pay attention to the road, Google has opted to tinker with its algorithms until they are human-proof… focusing on fully autonomous vehicles - cars that drive on their own without any human intervention and, for now, operate only under the oversight of Google experts.”

But fully autonomous cars still face many challenges. For example, Gill Pratt, who was recruited from DARPA to lead the Toyota Research Institute, noted earlier this year that a fully autonomous vehicle must be able to handle highly unusual situations it has never encountered before, such as avoiding a mattress falling off a moving truck on a crowded highway. Handling such challenges will take time.

Given that fully autonomous level 4 vehicles are likely many years in the future, some car companies believe that the more incremental transition to level 3 is still a good intermediate step to take, especially if it’s done under highly specific circumstances aimed at reducing overall traffic accidents. Audi, for example, is planning to introduce level 3 features for some of its cars in the near future, limited to situations such as stop-and-go highway traffic at no more than 35 mph.

Finally, there are other innovations to be explored beyond the automation levels defined by the NHTSA. While autonomous machines find it very hard to operate in highly unpredictable environments, we’ve long been adapting and simplifying environments so we can benefit from what machines are good at. In highly selected environments, such as trains moving between airport terminals, vehicles can be fully automated and do not require a human operator.

Such approaches would enable us to co-design our vehicles along with the environment in which they will operate. We could imagine the development of cars and trucks that can only go into self-driving mode when they’re in specially instrumented traffic lanes limited to such vehicles. Once outside those lanes, the vehicles would revert to their more classic human-controlled mode. Such hybrid approaches to vehicle automation might be a reasonable intermediate step on our way to full vehicle automation.

The AI research going into vehicle automation is very important because the stakes are so high, - saving many, many thousands of lives around the world. Most of the technologies being developed will significantly improve the overall safety of our cars and help reduce our large numbers of traffic accidents, deaths and serious injuries. And perhaps at some point in the future we will see self-driven vehicles coursing along our streets and highways.

The Smart London Plantag:typepad.com,2003:post-6a00d8341f443c53ef01b8d1deeef5970c2016-07-04T09:35:57-04:002016-07-04T09:35:57-04:00Urbanization is one of the major forces shaping the 21st century - right up there with the digital technology revolution and globalization. Over half of the world’s population lives in urban areas, and as the 2014 UN World Urbanization Prospect...IWB

Urbanization is one of the major forces shaping the 21st century - right up there with the digital technology revolution and globalization. Over half of the world’s population lives in urban areas, and as the 2014 UN World Urbanization Prospect noted: “The continuing urbanization and overall growth of the world’s population is projected to add 2.5 billion people to the urban population by 2050,” with the proportion of the population living in urban areas increasing to 66 percent by 2050.

Just about every study that’s benchmarked the competitiveness of major urban areas ranks London, - along with New York, - as one of the world’s two top cities. But despite, - or perhaps because of, - their leadership positions, both cities face major challenges as they deal with economic growth and a growing population.

London has been much in the news lately. First came the election of Sadiq Khan last May, - the first Muslim mayor not only of London but of any major Western capital, - followed by the recent Brexit referendum, where London voted to Remain in the EU by an overwhelming 60% of its vote while the UK as a whole voted 52% to Leave.

At this point, it’s very hard to predict where Brexit is heading over the next few months, let alone what its long-term consequences might be. But, given its many strengths, I have little doubt that London will remain among the world’s very top cities. So let me put Brexit aside for the moment and discuss instead what London has been doing to address its major population and economic challenges.

In March of 2013, then London Mayor Boris Johnson appointed the Smart London Board to advise on “how London can put digital technology at the heart of making the capital an even better place to live, work and invest.” The Board is chaired by David Gann, - Professor and VP of Innovation at Imperial College, - and is composed of top experts from business, academia and NGOs.

In December of 2013, the Board released its Smart London Plan. The report articulated some of the growth challenges that London faces, including:

“Its population is set to grow by one million people over the next decade, and the city’s infrastructure is struggling to cope with the increasing demands placed on it.”

“Congestion on London’s roads cost the economy an estimated £2 billion, with Londoners spending 70 hours on average in traffic jams annually.”

“An ageing population is changing the capital’s health, social care and educational needs.”

“There is a pressing demand to create new jobs alongside skills so that Londoners can access the new opportunities that advances in technology bring.”

“A (projected) population of almost ten million by 2030 will increase stress not only on healthcare and transport, but the management of energy and utilities - such as water, electricity and heat, and the need to deal with growing waste and pollution.”

“Without new approaches, including new business models and new ways of investing in new approaches, London will not be able to grow whilst maintaining and improving its lead as the greatest city on earth, and as a liveable city, offering a good quality of life.” The 2013 report outlined seven specific objectives to be further developed over the next few years.

In March of 2016 the Smart London Board released its follow-on report, - The Future of Smart: Harnessing digital innovation to make London the best city in the world. The report evaluated how much progress had been made on the Smart London Plan in the intervening two years. Here’s a brief summary of the progress made in each of the seven objectives:

Increase the number of Londoners who use digital technologies. Smart London is experimenting with a number of digital tools, including the ability to engage online in policy development. “We’re also increasing Londoners’ digital skills… The number of Londoners using digital technologies to engage with City Hall is increasing. We want it to grow even faster in the future.”

Provide open access to London’s data. A number of initiatives have been started, especially Data for London, which is aimed at developing a dynamic, productive City Data Market. “We are building data products and platforms to show the tech sector what is possible and to test growth scenarios for London and its infrastructure.”

Leverage London’s research, technology and creative talent. Smart London has supported a number of smart, connected business initiatives through its Export and International Business Programs, Super Connected Cities scheme, connectivity toolkit, and connectivity rating scheme. “We want to improve our support for London’s SME tech community as the number of employees in the technology sector moves past 200,000.”

Establish a Smart London Innovation Network. The Smart London District and Infrastructure Innovation Networks are identifying and bringing together London’s tech talent to work on London’s challenges. “It aims to show how innovative uses of technology - for example, using the River Thames to heat homes, testing electric bikes and trialling state-of-the-art smart parking bays - can improve the lives of residents.”

Enable London to grow its physical infrastructure. “We’re promoting new and smarter heating, electricity, waste and water networks that use resources efficiently and do more with less investment. The Infrastructure Mapping Application and our work in speeding up London’s transition to a circular economy are other ways we’re helping the city to grow and adapt.”

Help the London Government to better serve the complex needs of its population. “The Smart London Borough Partnership is increasing data sharing between the boroughs. It identifies opportunities to roll solutions out at scale.”

Offer a Smarter London experience to all through digital technologies. Smart London is working hard to improve the lives of Londoners through digital technology. “We are demonstrating this through projects in sustainability and transport in Queen Elizabeth Olympic Park as part of the Smart London District Innovation Networks. London’s connectivity is improving, but we still need faster networks to capitalise on the digital talents of Londoners and businesses.”

In addition, the Smart London Board identified specific recommendations for Mayor Khan in three key areas: engaging citizens; enabling growth; and working with business:

Wide, inclusive digital engagement with Londoners and businesses. “Citizen engagement in the development of digital services should be part of the democratic process. Londoners and businesses should become digitally included and both the public and private sector need to scale-up proven efforts to overcome digital exclusion.”

Enabling good growth. This includes “harnessing data and digital technology to meet the growth challenges facing London’s infrastructure, environment, and transport systems.” In addition, the report recommends that “The Mayor should create an innovation investment programme that supports scalable smart environmental solutions and services as market opportunities for the technology and data sector.”

Working with business. “The Mayor should convene all parties and pilot innovative solutions to resolve real world challenges that affect London and Londoners. It should pave the way for solutions to scale-up from pilots to wide distribution by connecting with and supporting London’s expanding community of tech entrepreneurs… The Mayor should campaign to improve basic broadband networks and support alternative connectivity providers to prevent small companies from paying for expensive bespoke digital solutions.”

“We have succeeded in keeping London at the forefront in a data-driven world. We have funded and launched a range of new projects. We have also developed a wide community of participants. However, there is more to be done.”

“The twin challenges of economic growth and a growing population are putting a strain on London’s housing, healthcare and transport infrastructure. The environment remains a challenge, particularly air quality. Technology is changing apace… In these tough conditions, standing still is not an option.”

“We must invest more in London’s data infrastructure. Doing so will help the city be brilliantly placed to make the most of the Internet of Things. It must be better connected too. That means making super-fast broadband available to everyone in London and investing in digital skills.”

Virtual Work Interactions, Context, and Related Subjectstag:typepad.com,2003:post-6a00d8341f443c53ef01bb0916978c970d2016-06-27T09:08:26-04:002016-06-27T08:55:37-04:00I just read Tripping Down a Virtual Reality Rabbit Hole,- a very interesting NY Times column by technology journalist Farhad Manjoo. Manjoo recently spent a few weeks testing the Occulus Rift and the HTC Vive, two of the most powerful...IWB

“The whole point of virtual reality is to create a fantasy divorced from the physical world,” he wrote. “You’re escaping the dreary mortal coil for a completely simulated experience: There you are, climbing the side of a mountain, exploring a faraway museum, flying through space or getting in bed with someone way out of your league… There are some great games on these systems… There are also several useful experiences, like designing your Ikea kitchen in V.R.”

“But if you’re not a gamer and you’re not looking for a new kitchen, V.R. is, at this point, just too immersive for most media. A few minutes after donning my goggles, I came to regard my virtual surroundings as a kind of prison… I suspect that V.R. will be used by the masses one day, but not anytime soon. I’m not sure we’re ready to fit virtual reality into our lives, no matter how excited Silicon Valley is about it.”

Manjoo’s reaction brought to mind the work of Karen Sobel-Lojeski, who’s long been studying technology-mediated interactions. She’s on the faculty at Stony Brook University, as well as the founder of the workplace consulting firm Virtual Distance International. To be effective, she argues, our technology-based interactions must take place within the proper context, defined as “everything around us that helps us to understand who we are, where we are, and what our role is.” The way VR works, - at least for now, - is by cutting off all real-world context to better immerse us in its simulations, - an experience that, as Farhad Manjoo discovered, can be very unsettling.

In a 2015 Harvard Business Review article, Sobel-Lojeski wrote that “today’s workforce has more than enough tools to send information back and forth to people all over the world. But those tools - and the use of them - do not necessarily constitute collaboration. Collaboration implies more than just passing data back and forth in an attempt to develop what is often a non-descript deliverable that can be as forgettable as the interactions themselves. Genuine collaboration is achieved through ongoing meaningful exchanges between people who share a passion and respect for one another…”

“However, today’s keyboard-tapping workers have very little context around who their counterparts are, how they feel about things, or what they hope for - in other words, what motivates them. Without a panoramic perspective, it’s difficult to form a sense of common purpose. In fact, when a seemingly intelligent screen is the only frame in sight, people often default to decoding messages based on what they know, filling the contextual void using their own experience to color in the blank backgrounds behind their co-workers. But this can create distorted perceptions about other people’s values and beliefs, causing collaboration conundrums.”

She coined the term virtual distance to describe such a lack of shared context and common purpose among the members of a work team. Virtual distance is essentially a measure of what’s lost when human interactions are not properly enabled by machines. The greater the virtual distance, the more problems the team will experience.

People often think that physical distance is the main contributor to virtual distance. While physical distance is indeed a component, poor performance occurs just as frequently when work teams are co-located on the same floor but their members remain distant from each other. According to Sobel-Lojeski, three sets of factors can contribute to virtual distance:

Physical Distance: Including geographical and time-zone separation, and/or being part of different departments or organizations. While these all have an impact on the success of projects, they don’t on their own create virtual distance.

Operational Distance: This can lead to miscommunications due to a number of factors, including not enough face-to-face group meetings, and problems with the technology tools being used. These issues are the easiest to change, but will not necessarily have a long-term impact if other concerns persist.

Affinity Distance: Including the degree to which team members share cultural values and common communication styles, their attitudes toward work, how dependent they feel on each other for the success of the project, and how much of a working relationship they have. This is the most important factor in determining virtual distance.

To better understand what it takes to organize successful virtual work teams, Sobel-Lojeski and her colleagues have conducted surveys and interviews with hundreds of large enterprises, which uncovered high levels of virtual distance in companies around the world. Their data also showed that high virtual distance is accompanied by unintended, unwanted effects, including declining trust and goal clarity; less cooperation and innovative behaviors; and a decline by more than 50% in organizational commitment, satisfaction and overall project success.

“Virtual distance generates a shift in how people feel about themselves, other people, and the way in which they see themselves as part of, or separate from, the larger organizational landscape. In the absence of shared context, the connectivity paradox emerges: the more people are connected, the more isolated they can feel. And isolates among isolates do not collaborate, instead they simply comply with management edicts…”

“To restore true collaboration, leaders must continuously restore shared context. A simple example would be to make sure that all team members know what the local time is for each participant on a call. If it’s late for one member, the leader can acknowledge that whatever to-do list results from the call, they can start it in the morning. Believe it or not, this small thing - bringing the time of day into context and acting accordingly - can help a team member feel respected. It also shows other team members that the manager is compassionate, which makes everyone feel more at ease. Revealing shared context and making appropriate adjustments can have a profound impact on performance.”

Creating Context. Context is essentially “the foundation upon which we derive meaning from what other people say.” In the past, context was readily available because co-workers saw each other every day. “But today it’s not so simple. Work is commonly done in temporary projects where people come and go, and organizational affiliations change with each new project or merger or downsizing…”

“So one of the things that leaders need to do most is to help individuals and teams in the virtual workforce see the context that is otherwise invisible. They do this by understanding how to use technology to communicate effectively and by serving as a human anchor, or constant, to help everyone stay connected.”

Cultivating Community. “The word community is not one normally associated with corporate leadership. But today as organizations have become flatter and more matrixed, the ability to recruit people to work on projects or other assignments has become an important aspect of leadership. One way that effective leaders do this is by building diverse communities of people who have the skill and commitment to help, even though this may fall outside their prescribed organizational roles.”

Co-Activating New Leaders. “[L]eadership alone is not enough when it comes to large, networked organizations consisting of people who sit within the bounds of traditional organizational structures but who are also part of the new virtual workforce… Unlike models that espouse the leader as the singular transformative figure, today’s leaders co-opt others to make things happen - putting themselves aside at times, asserting their authority at other times, but recruiting others to lead at all times.”

“For leaders, the very first step in reducing virtual distance is to become aware that it’s strongly embedded everywhere screen-based interactions occur - between people sitting side-by-side with thumbs thumping while meeting for lunch or amongst team members scattered across the globe with only a glowing screen to keep them company,” writes Sobel-Lojeski in her HBR article.

“To address it, leaders need to develop techno-dexterity, which is the ability to act deliberately when communicating, understanding which message to deliver when and through which channel (face-to-face, phone, email, video, etc.)… By thinking about the who, what, when, where, and how of messaging and by including how much context the other person might need to fully understand your message, you will reduce virtual distance and improve performance.”

FinTech and Financial Inclusiontag:typepad.com,2003:post-6a00d8341f443c53ef01b7c86e4eb2970b2016-06-21T06:31:00-04:002016-06-21T06:19:24-04:00In early June I spent a few days in Mexico City. The main purpose of the trip was to participate and give a keynote at the 2016 Cumbre de Directores (Director’s Summit), a conference sponsored by Endeavor Mexico in collaboration...IWB

In early June I spent a few days in Mexico City. The main purpose of the trip was to participate in and give a keynote at the 2016 Cumbre de Directores (Director’s Summit), a conference sponsored by Endeavor Mexico in collaboration with the IPADE Business School. Endeavor is a global not-for-profit organization dedicated to long-term economic growth by supporting high-impact entrepreneurship in over 25 cities around the world, including Mexico. The Director’s Summit is Endeavor Mexico’s major annual meeting, bringing together over 400 entrepreneurs and senior executives.

My talk was focused on the changing nature of innovation in the digital economy. A major part of the talk included a discussion of innovations in the digital payments and financial services ecosystem. My keynote was followed by two separate FinTech panels, where different entrepreneurs discussed the importance of FinTech in Mexico as well as their individual companies’ efforts in the area. That same evening I attended and spoke at a reception sponsored by Angel Ventures Mexico, an early stage VC firm, where I also heard quite a bit about FinTech startups. My overall impression is that there’s considerable FinTech activity in Mexico, a large portion of which is aimed at financial inclusion.

Every year, the World Bank publishes the Global Findex database, a measure of financial inclusion around the world, including how individuals save, borrow, make payments and manage risks in over 140 countries. According to their latest report, - Global Findex 2014, - 62 percent of adults worldwide have an account at a bank, at another type of financial institution, or with a mobile money provider, - up from 51 percent in 2011. The 2014 report also noted that 2 billion adults remain without an account around the world, a 20 percent decrease from the 2.5 billion unbanked adults in 2011. Technology advances, particularly the rapid growth in mobile devices and digital financial services, are the major reasons for these dramatic improvements.

The Global Findex database includes data for each individual country. Their 2014 data for Mexico showed that about 39 percent of adults had accounts with financial institutions, - a significant improvement over the 27.4 percent of adults with accounts in 2011. But, Mexico still lags many of its peers in Latin America. Over half of all adults have financial accounts in Latin America as a whole, significantly above Mexico’s figure.

It’s thus not surprising that entrepreneurs see a business opportunity in providing financial services to the large fraction of the working population in Mexico’s informal economy who are not officially involved with financial institutions or monitored by government agencies. Informal labor markets represent a significant portion of the economies of developing countries like Mexico, providing jobs and incomes to the large portion of the working population outside the countries’ formal economy. But their members are mostly unbanked, lacking the financial histories and credit ratings that are generally required to establish a banking relationship.

The Opportunities of Digitizing Payments, - a report published in 2014 by the Gates Foundation, the Better than Cash Alliance, and the World Bank, - noted: “Studies show that broader access to and participation in the financial system can reduce income inequality, boost job creation, accelerate consumption, increase investments in human capital, and directly help poor people manage risk and absorb financial shocks.” The digitization of payments has a major impact on broad-based economic growth, financial inclusion, and women’s economic empowerment, helping to overcome the costs and physical barriers that have beset otherwise valuable financial inclusion efforts. “Digital platforms offer the opportunity to rapidly scale up access to financial services using mobile phones, retail point of sales, and other broadly available access points, when supported by an appropriate financial consumer protection framework.”

“Account ownership is a first step towards financial inclusion,” added the Global Findex report. “But what really matters is whether people actually use their account - and the data are promising. More than 65 percent of account users in developing countries report having used their accounts at least three times a month, to save, or to make or receive electronic payments directly from their account. Yet, 1.3 billion adults with an account in developing countries pay their trash, water, and electric bills in cash, and over half a billion adults with an account in developing countries pay school fees in cash.”

“Financial inclusion also allows people to manage risks by providing a safe place to save money for emergency and giving them access to credit when needed: 28 percent of adults - 1.2 billion adults - in developing countries report they would use their savings in case of an emergency. Yet 56 percent of these adults do not save at a financial institution.”

In developed markets, FinTech innovations are primarily aimed at improving the mobile payments user experience. But, for the billions around the world without access to traditional financial services, FinTech innovations go well beyond convenience. FinTech could be their ticket to financial inclusiveness and membership in the global digital economy.

“High unbanked population, weak consumer banks and high mobile phone penetration make emerging markets ripe for FinTech disruptions,…” says Digital Disruption: How FinTech is Forcing Banking to a Tipping Point, an excellent report by Citigroup released earlier this year. “In our view, new entrants have a greater chance of success in markets with underdeveloped or fragmented banking systems accompanied with a high level of digital readiness. Emerging market banks are more at risk of market share shift - or more likely lost future retail growth opportunity. Smartphone penetration is higher than banking penetration in many emerging market countries and many emerging markets are digital leaders while they are banking laggards.”

“Banks, emerging market banks in particular, often tend to focus on the wealthy and mass affluent segment of the population. Where wealth is concentrated in a small segment of the population there is a long tail of lower value bank customers that can be captured by FinTech companies with a lower cost to serve model. Another important factor is the more pragmatic regulatory environment in some emerging market countries such as China and Kenya towards FinTech innovators… Not surprisingly, policymakers look favorably at FinTech as part of the solution to financial inclusion.”

As Clayton Christensen explained in a recent Harvard Business Review article, disruption is “a process whereby a smaller company with fewer resources is able to successfully challenge established incumbent businesses.” Rather than focusing on incremental improvements to existing products and services, disruptive innovations appeal to new or less-demanding customers whose needs have not been served by incumbent providers, mostly because such customers are not as profitable as the incumbents’ existing ones. Startups generally succeed by reaching brand new customers whose needs were previously unserved by existing products, rather than by competing directly against entrenched offerings.

Such a disruption is now taking place in the finance industry, with FinTech startups addressing the large number of potential customers that have long been left behind by traditional financial institutions, especially in emerging economies like Mexico where the need is greatest. The digital revolution is finally reaching the banking industry, and it will be interesting to see whether incumbent institutions embrace FinTech innovations before FinTech startups gain scale and distribution. While it’s difficult to predict how it will all play out over the years, the game is most definitely on, as I learned during my recent trip to Mexico.

Platforms and the New Rules of Strategytag:typepad.com,2003:post-6a00d8341f443c53ef01bb090858ac970d2016-06-14T06:45:16-04:002016-06-17T19:19:14-04:00Last week I wrote about one of the talks at the annual conference of the MIT Initiative on the Digital Economy, - Only Humans Need Apply by Tom Davenport, based on his recently published book of the same title. I...IWB

Marshall Van Alstyne started out his presentation by noting that back in 2007, seven firms controlled 99% of handset profits: Nokia, Samsung, Ericsson, Motorola, LG, RIM and HTC. That same year, Apple’s iPhone was born and began gobbling up market share. By 2015, only one of the former incumbents had any profit at all, while Apple generated 92% of the industry’s global profits.

What happened? “Is it likely all 7 incumbents had failed strategies, run by clueless management, lacking execution capabilities?” he asked. “Or was something more fundamental happening?… Nokia and the others had classic strategic advantages that should have protected them: strong product differentiation, trusted brands, leading operating systems, excellent logistics, protective regulation, huge R&D budgets, and massive scale. For the most part, those firms looked stable, profitable, and well entrenched.” How can we explain their rapid free fall?

We all know the answer to Van Alstyne’s rhetorical questions. “Apple (along with Google’s competing Android system) overran the incumbents by exploiting the power of platforms and leveraging the new rules of strategy they give rise to. Platform businesses bring together producers and consumers in high-value exchanges. Their chief assets are information and interactions, which together are also the source of value they create and their competitive advantage.”

For the past two centuries, the industrial economy has been driven by supply-side economies of scale. Because of the massive fixed costs of physical assets, firms achieving higher volumes have a lower cost of doing business, which allows them to reduce prices and further increase volumes. Market power is thus achieved by controlling resources, increasing efficiency and fending off competition. “The goal of strategy in this world is to build a moat around the business that protects it from competition and channels competition toward other firms,” notes the HBR article.

In the digital economy, on the other hand, the driving force is demand-side economies of scale. That’s what network effects are all about. Scale is what increases a platform’s value. The more products or services a platform offers, the more users it will attract, helping it then attract more offerings, which in turn brings in more users, which then makes the platform even more valuable. Moreover, the larger the network, the more data is available to customize offerings to user preferences and better match supply and demand, further increasing the platform’s value.
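
One way to make the contrast concrete is a toy calculation (my own illustration; the quadratic, Metcalfe-style value function and all the numbers are assumptions, not from the HBR article): under supply-side economies of scale, scale mainly spreads fixed costs over more units, while under demand-side network effects the value of a platform grows roughly with the square of the number of participants.

```python
# Toy comparison of supply-side vs demand-side economies of scale.
# All numbers, and the quadratic "Metcalfe" value function, are illustrative assumptions.

FIXED_COST = 1_000_000.0   # e.g., a factory (supply side)
VARIABLE_COST = 5.0        # cost per unit produced
VALUE_PER_LINK = 0.01      # assumed value of each potential user-to-user interaction

def unit_cost(volume: int) -> float:
    """Supply side: average cost per unit falls as fixed costs are spread over more units."""
    return FIXED_COST / volume + VARIABLE_COST

def platform_value(users: int) -> float:
    """Demand side: value grows with the number of possible connections, roughly n^2."""
    return VALUE_PER_LINK * users * (users - 1) / 2

for n in (1_000, 10_000, 100_000):
    print(f"n={n:>7}: unit cost {unit_cost(n):>10.2f}   platform value {platform_value(n):>14.0f}")

# Increasing volume only pushes unit cost toward its variable-cost floor,
# but doubling participation roughly quadruples the value users get from each other.
```

The asymmetry is the point: scaling a pipeline mostly lowers costs toward a floor, while scaling a platform compounds the value that participants create for one another.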

Platforms have long played a key role in the IT industry. IBM’s System 360 family of mainframes, announced in 1964, became the premier platform for commercial computing over the following 25 years by developing a 3rd party ecosystem of add-on hardware, software and services. In the 1980s, the rapid growth of personal computers was largely driven by the emergence of the Wintel platform, which attracted a large ecosystem of hardware and software developers. Then in the 1990s, the explosive growth of the Internet drove platforms to a whole new level, connecting large numbers of PC users to a wide variety of web sites and online applications. Platforms have grown even more dramatically over the past decade, with billions of users now connecting via smart mobile devices to all kinds of cloud-based applications and services.

IT has been bringing the power of platforms to an increasing number of industries. By reducing their need to own physical assets, “IT makes building and scaling up platforms vastly simpler and cheaper, allows nearly frictionless participation that strengthens network effects, and enhances the ability to capture, analyze, and exchange huge amounts of data that increase the platform’s value to all. You don’t need to look far to see examples of platform businesses, from Uber to Alibaba to Airbnb, whose spectacular growth abruptly upended their industries.”

For almost 20 years now, e-commerce platforms have been giving physical stores and shopping malls a run for their money. More recently, we’ve seen Uber, - with only 5,000 employees and a valuation of $60 billion, - and Airbnb, - with 3,000 employees and a valuation of $21 billion, - disrupt the transportation and hotel industries respectively.

It’s instructive to contrast platform-based strategies with more classic product strategies. In a 2008 HBR article, Harvard professor Michael Porter explained that the key to a successful product strategy is to understand and cope with five key competitive forces:

rivalry among existing competitors;

threat of potential new entrants;

threat from substitute products or services;

bargaining power of suppliers; and

bargaining power of buyers.

“The extended rivalry that results from all five forces defines an industry’s structure and shapes the nature of competitive interaction within an industry,” wrote Porter.

But the nature of competition and strategy is quite different in a platform-based business, said Van Alstyne:

The goal is interactions that yield network effects and provide growth and sustainability, - not protecting market niches or erecting industry barriers.

Industry boundaries can be altered as appropriate, - rather than sticking to sharply defined categories.

Competition is multi-layered, “more like 3D chess”, - rather than just relying on product differentiation or lower costs.

Competitors are turned into complementors that offer their products or services on the platform - there’s no longer a need to own unique, inimitable resources.

In a product based business, “the five forces are relatively defined and stable. If you’re a cement manufacturer or an airline, your customers and competitive set are fairly well understood, and the boundaries separating your suppliers, customers, and competitors are reasonably clear.” But, in a platform business, those boundaries can shift rapidly because of shifting dynamics within the platform ecosystem.

The platform participants, - consumers, producers, providers, - are key to value creation. “But they may defect if they believe their needs can be met better elsewhere… every platform must induce producers and consumers to interact and share their ideas and resources. Effective governance will inspire outsiders to bring valuable intellectual property to the platform… That won’t happen if prospective partners fear exploitation.”

“With platforms, a critical strategic aim is strong up-front design that will attract the desired participants, enable the right interactions (so-called core interactions), and encourage ever-more-powerful network effects… while guarding against threats remains critical, the focus of strategy shifts to eliminating barriers to production and consumption in order to maximize value creation…To that end, platform executives must make smart choices about access (whom to let onto the platform) and governance (or control - what consumers, producers, providers, and even competitors are allowed to do there)…”

The transition from a product-oriented strategy to a platform strategy requires three key shifts, note Van Alstyne et al in their HBR article:

From resource control to resource orchestration. Don’t try to gain competitive advantage by controlling scarce and/or valuable resources. “With platforms, the assets that are hard to copy are the community and the resources its members own and contribute, be they rooms or cars or ideas and information. In other words, the network of producers and consumers is the chief asset.”

From internal optimization to external interaction. Don’t attempt to create value by optimizing the entire chain of product and service activities. “Platforms create value by facilitating interactions between external producers and consumers. Because of this external orientation, they often shed even variable costs of production.”

From a focus on customer value to a focus on ecosystem value. Don’t seek to maximize the lifetime value of individual customers of products and services. “[P]latforms seek to maximize the total value of an expanding ecosystem in a circular, iterative, feedback-driven process.”

“The failure to transition to a new approach explains the precarious situation that traditional businesses - from hotels to health care providers to taxis - find themselves in… the writing is on the wall: Learn the new rules of strategy for a platform world, or begin planning your exit.”

Learning to Work with Our Increasingly Smart Machinestag:typepad.com,2003:post-6a00d8341f443c53ef01bb090c1971970d2016-06-06T06:16:02-04:002016-06-06T12:06:31-04:00A few weeks ago I attended the annual conference of MIT’s Initiative on the Digital Economy. The day-long conference featured a number of interesting talks on the impact of digital technologies on business, the economy and society. Tom Davenport, -...IWB

Davenport started his talk by noting that over the past two centuries we’ve seen three distinct stages of automation, based on the kinds of jobs that were replaced by machines. The machines of the first automation era “relieved humans of work that was manually exhausting,” making up for our physical limitations, - steam engines and electric motors enhanced our physical power while railroads, cars and airplanes helped us go faster and farther.

Next came the automation of jobs involving routine tasks that could be well described by a set of rules and were thus prime candidates for IT substitution. “Era Two automation doesn’t only affect office workers. It washes across the entire services-based economy that arose after massive productivity gains wiped out jobs in agriculture, then manufacturing.” It threatened many transactional service jobs that “are so routinized that they are simple to translate into code,” from bank tellers to airline reservations clerks.

We’ve now entered the third era of automation. Our increasingly smart machines are “now breathing down our necks… This time the potential victims are not tellers and tollbooth collectors, much less farmers and factory workers, but rather all those knowledge workers who assumed they were immune from job displacement by machines,…” including, - as Davenport and Kirby poignantly remind us, - “People like the writers and readers of this book.”

Earlier this year Google’s AlphaGo won a match against one of the world’s top Go players. AI-based machines can now play championship-level Go, assist in the diagnosis and treatment of rare forms of cancer, and navigate our roads as self-driving cars, encroaching into activities that not long ago were viewed as the exclusive domain of humans. “Brilliant technologies can now decide, learn, predict, and even comprehend much faster and more accurately than the human brain, and their progress is accelerating. Where will this leave lawyers, nurses, teachers, and editors?”

Part of the answer is that most jobs involve an amalgam of tasks or processes. Some of these tasks are more routine in nature, while others require judgement, social skills and other human capabilities. The more routine and rules-based the task, the more amenable it is to automation. But the fact that some tasks have been automated does not imply that the whole job has disappeared. To the contrary, automating the more routine parts of a job will often increase the productivity and quality of workers by complementing their skills with machines and computers, as well as enabling them to focus on those aspects of the job that most need their attention.

“Most workers eagerly embrace the machines that save them from the day-in and day-out chores of their jobs that take up time and add nothing to their net knowledge…,” write Davenport and Kirby. “People want the extra productivity they get from state-of-the-art tools because it frees up capacity for them to take on more interesting challenges… automation of one task after another tends not to be seen as the infiltrating enemy by employees.”

“And neither is it seen as a problem by most customers. When a task can be performed well by a machine, they prefer it, too. Obviously, paying customers appreciate when higher productivity means that prices go down; while some people might cherish paying higher prices to enjoy artisanal products and services, most go for the product that does the job at the lowest price possible. But beyond price, automation often improves quality, reliability, and convenience. When ATMs arrived, customers didn’t complain about the automated option. By now, few could imagine life without them.”

Many knowledge work tasks are at risk of being automated, warned Davenport. But while some knowledge workers will lose their jobs, it will likely be on the margins rather than the whole job going away. Perhaps we’ll need 8 lawyers instead of 10. But, there is no room for complacency, he added, listing several examples of tasks within knowledge jobs that are quite susceptible to automation, including online class content within teaching, e-discovery in law firms, and automated cancer detection in radiology.

All in all, four major skills areas emerge where, - at least for now, - humans are still considerably superior to machines:

Expert thinking: People are very good at pattern recognition, allowing us to imagine new ways of solving problems based on our knowledge of what’s worked well in other areas.

Creativity and ideation: People keep coming up with new scientific breakthroughs, gripping novels, and great new business ideas.

Complex communications: Millions of years of evolution enable humans to broadly interpret a situation and read people’s emotions and body language, - skills that are crucial for interpersonal activities like nurturing, coaching, motivating and leading.

Dexterity and mobility: Humans are very good at many tacitly learned, common sense tasks, such as being a waiter, which might involve walking across a crowded restaurant, serving a table, and taking dishes back into the kitchen.

In the end, it all comes down to learning to race with rather than against the machines. Over the past two centuries we’ve successfully adapted to Industrial age machines. It would have made no sense to look at the Industrial Revolution as a race between humans and steam power to see who is stronger, or between humans and cars to see who is faster. Similarly, we must now learn to adapt to and work with our increasingly smart machines.

The key, wrote Davenport and Kirby in a related HBR article, is to “reframe the threat of automation as an opportunity for augmentation… What if, rather than asking the traditional question - What tasks currently performed by humans will soon be done more cheaply and rapidly by machines? - we ask a new one: What new feats might people achieve if they had better thinking machines to assist them? Instead of seeing work as a zero-sum game with machines taking an ever greater share, we might see growing possibilities for employment.”

This is in fact what we’ve been doing for the past couple of centuries, - from the textile machines at the dawn of the Industrial Revolution to the productivity apps of our digital revolution. In such an augmentation environment, humans and machines support each other, with the machines making humans more productive, and the humans ensuring that the computer is doing a good job, looking out for the common-sense mistakes that computers often make, and making sure that the machine keeps learning and improving.

Davenport finished his IDE talk by discussing five key ways for humans to partner with smart machines, so together they can do things much better than either could on its own:

Stepping In - Master the details of the system and learn its strengths and weaknesses, including when it needs to be carefully monitored and improved.

Stepping Up - Move up above the automated systems, developing big-picture insights, decisions and views that are too unstructured and sweeping for computers to make on their own.

Stepping Aside - Focus on areas that humans do better than computers, at least for now, such as selling, motivating people or explaining the decisions that computers have made.

Stepping Narrowly - Find an area that is so specialized and narrow that it’s not worth automating.

Stepping Forward - Build the next generation of computers, robots and AI tools, using the human talent for thinking outside the box to envision new tools, applications and business opportunities that don’t yet exist.

“Today, many knowledge workers are fearful of the rise of the machines,” write Tom Davenport and Julia Kirby in the book’s concluding paragraph. “We should be concerned, given the potential for these unprecedented tools to make us redundant. But we should not feel helpless in the midst of the large-scale change unfolding around us. The steps are there for us to take. It’s up to us, individually and collectively, to strike new, positive relationships with the machines we have made so capable. With our powers combined, we can make our workplaces, and our world, better than they have ever been.”

The Prosperity Puzzletag:typepad.com,2003:post-6a00d8341f443c53ef01b7c85c4034970b2016-05-31T06:43:43-04:002016-05-31T06:43:43-04:00The April 30 issue of The Economist includes a special briefing on The Prosperity Puzzle. The briefing highlights how tricky it is to compare living standards across countries, across economic classes within a country, and, - arguably hardest of all,...IWB

The April 30 issue of The Economist includes a special briefing on The Prosperity Puzzle. The briefing highlights how tricky it is to compare living standards across countries, across economic classes within a country, and, - arguably hardest of all, - across time.

“Which would you prefer to be: a medieval monarch or a modern office-worker?,” it glibly asks. “The king has armies of servants. He wears the finest silks and eats the richest foods. But he is also a martyr to toothache. He is prone to fatal infections. It takes him a week by carriage to travel between palaces. And he is tired of listening to the same jesters. Life as a 21st-century office drone looks more appealing once you think about modern dentistry, antibiotics, air travel, smartphones and YouTube.”

Economist William Nordhaus first calculated the price of light by adding up the prices of the things people bought over the years to make light, - from candles in 1800 to compact fluorescent light bulbs in 1992. This is the traditional way that prices are calculated using GDP, - “a measure of the value of all final goods and services produced in a period (quarterly or yearly).” By this GDP-like measure, the price of light rose by a factor of between three and five over the past two centuries. Next he calculated the change in the true price of light by estimating the price in cents per lumen-hour, a measure that would quantify the considerable innovations in generating light over the same time period. By this measure, the true price of light has plummeted by considerably more than a hundredfold since the early 1800s.
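
To see why the two measures diverge so sharply, here is a small sketch in Python of the lumen-hour calculation, using rough, purely illustrative numbers that I have assumed rather than Nordhaus’s actual data:

```python
# Illustrative only: compare the "goods" price of light with its price per
# lumen-hour. All figures below are rough assumptions, not Nordhaus's data.

def price_per_lumen_hour(price_dollars, lumens, burn_hours):
    """Cost of one lumen-hour of light from a given source."""
    return price_dollars / (lumens * burn_hours)

# A tallow candle (circa 1800): a few cents, dim, burns a few hours.
candle = price_per_lumen_hour(price_dollars=0.02, lumens=13, burn_hours=5)

# A compact fluorescent bulb (circa 1992): bulb plus electricity over its life.
cfl = price_per_lumen_hour(price_dollars=10.00 + 4.00, lumens=900, burn_hours=8000)

print(f"candle: {candle:.6f} dollars per lumen-hour")
print(f"CFL:    {cfl:.8f} dollars per lumen-hour")
print(f"light is roughly {candle / cfl:.0f}x cheaper per lumen-hour")
```

Even with crude numbers, the per-lumen-hour price falls by two orders of magnitude, while a naive index that only tracks what people spend on candles and bulbs would miss the improvement entirely.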

Nordhaus’ research implies that the true price of light “has been underestimated by a factor of between nine hundred and sixteen hundred since the beginning of the industrial age. If the case of light is representative of those products that have caused tectonic shifts in output and consumption, then this raises the question of whether the conventional measures of real-output and real-wage growth over the last two centuries come close to capturing the true growth.”

According to The Economist, not only is GDP a poor measure of material well-being and prosperity, but it’s not even a reliable gauge of production. Our heavy reliance on such a deeply flawed measure may account for our prosperity puzzle, “distorting levels of anxiety in the rich world about everything from stagnant incomes to disappointing productivity growth.”

GDP was first developed in the 1930s at the request of the US Congress to quantify the full economic impact of the Depression. In the 1940s it was used in the US and the UK to help estimate their economies’ capacities to make guns, tanks, airplanes and everything necessary for the second world war. Since then, GDP has become the key measure of a country’s production capacity, while GDP per capita is generally viewed as an indicator of a country’s standard of living.

Most measures of economic performance used by government officials to inform their policies and decisions are based on GDP figures, including setting taxes, fixing unemployment and managing inflation. But, many concerns have been raised about the adequacy of GDP-based measurements given the major structural changes that economies around the world have been experiencing over the past few decades.

GDP does not adequately capture the growing share of services and complex system infrastructures that characterize advanced economies. In the 1950s, the industrial sector made up more than a third of GDP, and the service sector was a bit over 50% in the US and the UK. Today, services account for close to 80% of their economies, while manufacturing has significantly declined.

GDP is a relic of a time dominated by manufacturing, where the production of physical goods was easier to measure. But it’s a less reliable measure of services, where there is much more variation in quality and value. Services are increasingly tailored to individual tastes, with user experience being a growing portion of the value of a service, - be it a meal in a restaurant or an online purchase.

This is a particular concern in our increasingly digital economy. How do you measure the value of the explosive amount of free goods available over the Internet, including Wikipedia articles, Facebook social interactions, Linux open source software and YouTube videos? Since all these services are free, they are excluded from GDP. In addition, many services we used to pay for are now free, including long-distance calls, newspaper articles, maps and recorded music.

“We know less about the sources of value in the economy than we did 25 years ago,” wrote economists Erik Brynjolfsson and Adam Saunders in a 2009 article. “We see the influence of the information age everywhere, except in the GDP statistics. More people than ever are using Wikipedia, Facebook, Craigslist, Pandora, Hulu and Google. Thousands of new information goods and services are introduced each year. Yet, according to the official GDP statistics, the information sector (software, publishing, motion picture and sound recording, broadcasting, telecom, and information and data processing services) is about the same share of the economy as it was 25 years ago - about 4%. How is this possible? Don’t we have access to more information than ever before?”

“What we measure affects what we do; and if our measurements are flawed, decisions may be distorted…” noted the Commission in its report issued in September, 2009. “So too, we often draw inferences about what are good policies by looking at what policies have promoted economic growth; but if our metrics of performance are flawed, so too may be the inferences that we draw.”

The Commission recommended measuring economic activity beyond GDP, looking at income and consumption rather than production because they more closely reflect economic well being, that is, how well off people are. To better track living standards, it called for tracking household income and consumption in addition to aggregate measures of the economy, as well as measuring not just the averages, but also the distribution of income, consumption and wealth. It also called for factoring in measurements of sustainability to help reflect the evolution of the economy into the future.

“Quality of life depends on people’s objective conditions and capabilities,” noted the report. “Steps should be taken to improve measures of people’s health, education, personal activities and environmental conditions. In particular, substantial effort should be devoted to developing and implementing robust, reliable measures of social connections, political voice, and insecurity that can be shown to predict life satisfaction.”

According to The Economist, measuring prosperity better calls for three key changes:

Improve GDP as a gauge of production. “Junking it altogether is no answer: GDP’s enduring appeal is that it offers, or seems to, a summary statistic that tells people how well an economy is doing. Instead, statisticians should improve how GDP data are collected and presented… they should rely more on tax records, internet searches and other troves of contemporaneous statistics, such as credit-card transactions, than on the standard surveys of businesses or consumers.”

Pioneer a new, broader annual measure that would aim to capture production and living standards more accurately. This is particularly important in services-dominated advanced economies. “This new metric - call it GDP-plus - would begin with a long-overdue conceptual change: the inclusion in GDP of unpaid work in the home, such as caring for relatives. GDP-plus would also measure changes in the quality of services by, for instance, recognising increased longevity in estimates of health care’s output. It would also take greater account of the benefits of brand-new products and of increased choice.”

Take stock of a country’s overall prosperity and wealth each decade. “This balance-sheet would include government assets such as roads and parks as well as private wealth. Intangible capital - skills, brands, designs, scientific ideas and online networks - would all be valued. The ledger should also account for the depletion of capital: the wear-and-tear of machinery, the deterioration of roads and public spaces, and damage to the environment.”

“Building these benchmarks will demand a revolution in national statistical agencies as bold as the one that created GDP in the first place,” notes The Economist in conclusion. “Even then, since so much of what people value is a matter of judgment, no reckoning can be perfect. But the current measurement of prosperity is riddled with errors and omissions. Better to embrace a new approach than to ignore the progress that pervades modern life.”

The President’s Commission on Enhancing National Cybersecuritytag:typepad.com,2003:post-6a00d8341f443c53ef01bb0903de30970d2016-05-23T14:45:19-04:002016-11-10T14:15:01-05:00Last February, President Obama issued an Executive Order establishing the Commission on Enhancing National Cybersecurity within the Department of Commerce. The Commission is charged with “recommending bold, actionable steps that the government, private sector, and the nation as a whole...IWB

Last February, President Obama issued an Executive Order establishing the Commission on Enhancing National Cybersecurity within the Department of Commerce. The Commission is charged with “recommending bold, actionable steps that the government, private sector, and the nation as a whole can take to bolster cybersecurity in today’s digital world, and reporting back by the beginning of December.”

To gather the necessary information for its short- and long-term recommendations, the Commission is holding public meetings around the country, each focused on a different sector of the economy. On May 16, it met in New York City to discuss the challenges and opportunities facing the financial sector. The meeting included three panels, one on finance, one on insurance, and the third on research and development.

I was a member of the R&D panel, along with MIT professor Sandy Pentland, IBM Fellow Jerry Cuomo, and Greg Baxter, head of digital strategy at Citigroup. During our 90-minute panel, we each made introductory remarks based on our previously submitted briefing statements and then answered the commissioners’ questions.

In my introductory remarks I noted that, arguably, nowhere is the challenge of cybersecurity greater than in the financial industries. As I’ve learned since becoming involved with digital money ecosystems ten years ago or so, money makes the world go round, as the famous song from Cabaret succinctly puts it, - and it’s been doing so since time immemorial.

Money has played a central role in human affairs through the ages, facilitating transactions, trade and commerce. Gold coins were first developed around 2,500 years ago in Asia Minor. For a long time, money was embodied in precious metals like gold and silver. But with the introduction of banknotes, - most likely around 1000 AD in China, - money started to decouple from physical objects with intrinsic value.

The world’s financial community has developed a very sophisticated ecosystem, including the global payment infrastructures, the management of personal identities and personal data, the global financial flows among institutions and between institutions and individuals, the government regulatory regimes, and so on.

This financial ecosystem has served us well so far, but it’s rather complicated, inefficient and inflexible. We rightfully worry that it may not be up to the scalability, security and privacy requirements of our 21st century digital economy, especially when you include the additional few billion people around the world conducting financial transactions over their smartphones, let alone the tens to hundreds of billions of IoT devices whose transactions have to be carefully validated given their potential impact on our health and safety.

Transforming this highly complex ecosystem has proved to be very difficult. It requires the close collaboration of its various stakeholders, including a large variety of financial institutions, merchants of all sizes, government regulators in just about every country, and huge numbers of individuals around the world. All these stakeholders must somehow be incented to work together in developing and embracing new financial innovations. Not surprisingly, change comes slowly to such a complex ecosystem.

I then mentioned some pertinent lessons that I’ve learned over my long career in the IT industry. In the early decades of the IT industry, different vendors brought to market their own proprietary systems and networks. Just sending an e-mail using a particular vendor application to another user in a different institution using another vendor’s application was quite cumbersome.

The Internet changed all that. Once the Internet was widely embraced in the 1990s, it became no harder to send an e-mail between companies than within a company. Everyone was using the same standards, including open source implementations of key protocols. Rather than developing their own proprietary networks and struggling to interconnect with those of others, institutions now collaborated on developing the common Internet architecture, - and Internet-based applications like e-mail and the Web, - that they all now used.

Much of the success of the Internet, the Web, Linux and other widely used technologies is due to the close collaborations between universities, research labs, companies and government agencies around the world in their development and governance. These collaborations have led to the creation of standards, open source software and governance processes embraced by all participants.

Something similar is beginning to happen with blockchain technologies. The blockchain first came to light around 2008 as the architecture underpinning bitcoin, the best known and most widely held digital currency. Over the years, the blockchain has developed a following of its own as a distributed database architecture with the ability to handle trust-less transactions, where no parties need to know or trust each other for transactions to complete.

Blockchains hold the promise to revolutionize the finance industry and other aspects of the digital economy by bringing one of the most important and oldest concepts, the ledger, to the Internet age.

Ledgers constitute a permanent record of all the economic transactions an institution handles, whether it’s a bank managing deposits, loans and payments; a brokerage house keeping track of stocks and bonds; or a government office recording births and deaths, the ownership and sale of land and houses, or legal identity documents like passports and driver licenses. Over the years, most institutions have automated their original paper-based ledgers with sophisticated IT applications and databases.

But while most ledgers are now digital, their underlying structure has not changed. Each institution continues to own and manage its own ledger, synchronizing its records with those of other institutions, - a cumbersome process that often takes days. According to a 2014 report by the Bank of England, the classic ledger has not changed much since the 16th century. The report called the evolution toward a blockchain-based distributed ledger “a major technological innovation not only for payment systems but for the finance industry as a whole.”
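
To make the ledger idea concrete, here is a minimal sketch of a hash-chained, append-only ledger in Python. The field names are my own, and the example covers only tamper-evidence; a real blockchain adds digital signatures, consensus among peers, and replication across institutions:

```python
# Minimal append-only ledger: each entry records the hash of its predecessor,
# so altering any past entry invalidates everything recorded after it.
import hashlib, json, time

def entry_hash(entry):
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class Ledger:
    def __init__(self):
        self.entries = [{"index": 0, "data": "genesis", "prev": "0" * 64}]

    def append(self, data):
        self.entries.append({
            "index": len(self.entries),
            "data": data,
            "prev": entry_hash(self.entries[-1]),
            "timestamp": time.time(),
        })

    def verify(self):
        # Every entry must point at the hash of the entry before it.
        return all(e["prev"] == entry_hash(p)
                   for p, e in zip(self.entries, self.entries[1:]))

ledger = Ledger()
ledger.append({"from": "alice", "to": "bob", "amount": 100})
ledger.append({"from": "bob", "to": "carol", "amount": 40})
print(ledger.verify())                          # True
ledger.entries[1]["data"]["amount"] = 1_000_000  # tamper with an earlier record
print(ledger.verify())                          # False - the chain no longer matches
```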

I finished my remarks to the Commission by noting that the emergence of an innovative disruptive technology can serve as a catalyst to propel change forward by bringing key stakeholders together, - as has been the case with the Internet, the Web and Linux. We’re hopeful that blockchain could now be such a catalyst for transforming our global financial systems. But, it’s still a bleeding edge technology lacking the robustness of existing payment systems. The evolution of our complex, global financial infrastructures will be a tough and lengthy undertaking, no matter how innovative and exciting the new technologies might be.

Let me now briefly discuss some of the remarks of my fellow R&D panelists.

Jerry Cuomo, - IBM VP of Blockchain Technologies, - reminded us that 80 years ago, “IBM helped the United States government create the Social Security system, which, at the time, was the most complex financial system ever developed. Today, as financial transactions become increasingly digital and networked, government and industry must once again combine forces to make the financial systems of the future more efficient, effective and secure than those of the past. And, just as an individual’s Social Security number became the key to proving identity and accessing that system for generations of Americans, today’s institutions must collaborate to create new methods for establishing identity and managing other aspects of digital transactions.”

Cuomo then added: “At IBM, we believe that blockchain technology is becoming an essential tool as business and society navigate this shift - with the potential for transforming commerce and the interactions between governments and individuals. Blockchain has inherent qualities that provide trust and security, but, to fulfill its promise, the core technology must be further developed using an open source governance model to make it deployable on a grand scale.”

He listed four key areas where government, technology companies and industries should work together:

Proof of Identity. “The Social Security number has been a mainstay of our society for decades, but it’s not secure and certifiable enough to serve as the building block of identity in a blockchain ecosystem. So we believe a new identity management system must be created.”

Data provenance. “To make organizations and individuals comfortable exposing their data through the use of blockchain applications, the systems must automatically track every change that is made to data, so it’s auditable and completely trustworthy.”

Secure transaction processing. “While the parties in a transaction managed using blockchain are known to other participants in the system, the actual details of the transaction should be visible only to those involved (or others who are granted permission). So we have to enable the entities that monitor blockchain transactions to verify that contracts are being fulfilled but without revealing confidential information to them.”

Sharing intelligence. “Amid a rising tide of cyber-crime and fears of cyber-terrorism, the White Hats of the world are under pressure to change the game. Blockchain has the potential to do just that. Not only is it inherently more secure than other types of networks and financial management systems, but blockchain has the potential to be used by multiple parties to share cyber-threat intelligence.”

MIT professor Sandy Pentland talked about the need to develop a 21st century data ecology. Among his various activities, Pentland oversees the Internet Trust Consortium, an MIT initiative whose antecedents created the widely used Kerberos authentication protocols.

“Today’s data ecology is transforming due to the exponential growth of mobile and ubiquitous computing, together with big data analysis,” said Pentland. “These shifts are having a dramatic impact on people’s personal data sharing awareness and sensitivities and on their cybersecurity… We need a new deal on data, where security concerns are matched with transparency, control and privacy, and are designed into the core of any data-driven service.”

“In order to demonstrate that such a sustainable data ecology is possible we have developed Enigma, a decentralized computation platform enabling different parties to jointly store and run computations on data while keeping the data completely private. Enigma enables a sustainable data ecology by supporting the requirements that data be always encrypted, with computation happening on encrypted data only, by allowing owners of the data to control access to their data precisely, absolutely, and auditably, and by reliably enabling payment to data owners for use of their data…”

“Since users in Enigma are owners of their data, we use the blockchain as a decentralized secure database that is not owned by any party. This also allows an owner to designate which services can access its data and under what conditions, and so parties can query the blockchain and ensure that it holds the appropriate permissions. In addition to being a secure and distributed public database, the blockchain is also used to facilitate payments from services to computing parties and owners, while enforcing correct permissions and verifying that queries execute correctly.”
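
As a purely illustrative sketch of the access-control pattern Pentland describes (hypothetical names throughout; this is not Enigma’s actual interface), a shared registry of owner-granted permissions might be consulted before any service is allowed to query someone’s data:

```python
# Hypothetical permission registry (illustrative only, not Enigma's API).
# Owners record which services may access their data, for what purpose, and
# until when; every query is checked against this shared record.
import time
from dataclasses import dataclass

@dataclass
class Permission:
    owner: str
    service: str
    purpose: str
    expires_at: float  # Unix timestamp after which the grant lapses

class PermissionRegistry:
    """Stand-in for the shared, tamper-evident record a blockchain would provide."""
    def __init__(self):
        self.grants = []

    def grant(self, owner, service, purpose, ttl_seconds):
        self.grants.append(Permission(owner, service, purpose, time.time() + ttl_seconds))

    def is_allowed(self, owner, service, purpose):
        return any(g.owner == owner and g.service == service and
                   g.purpose == purpose and g.expires_at > time.time()
                   for g in self.grants)

registry = PermissionRegistry()
registry.grant("alice", "budgeting-app", "spending-analysis", ttl_seconds=3600)
print(registry.is_allowed("alice", "budgeting-app", "spending-analysis"))  # True
print(registry.is_allowed("alice", "ad-network", "profiling"))             # False
```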

Citigroup’s Greg Baxter reminded us that as the digital revolution transforms financial services, we will see “significant opportunities to provide access to financial tools and products that can facilitate individual and collective progress and prosperity…However, the migration to digital also brings new, sophisticated and rapidly growing cyber risks…”

“As innovation changes experiences, journeys, products and platforms, there is a shift in financial services from vertical products to horizontal services, integrated into open, digital ecosystems. While banks may have traditionally operated and protected core platforms and saw edge devices as channels, with distinct cybersecurity implications, the new reality is a much broader ecosystem with many more points for digital access and cyber threats.”

“To protect people and their assets requires new ways of identifying and authenticating customers and devices, new approaches to managing and using payment credentials and more sophisticated monitoring capabilities.”

Baxter recommended three key areas for close collaboration between the public and private sectors:

Intelligence sharing. Increase the speed and quality of two-way information flows, “which is essential for developing an intelligence led approach to cyber protection, and for mounting a holistic defense.”

Research and development. “We need to dramatically increase the speed and scale of cyber innovation in both the private and public sector.”

Workforce development. Companies face a serious shortage of cyber-trained personnel. “We need to increase and maintain the available workforce, which may require greater educational capacity and incentives.”

“From buying products to running businesses to finding directions to communicating with the people we love, an online world has fundamentally reshaped our daily lives,” said the White House in the Fact Sheet accompanying the President’s Executive Order. “But just as the continually evolving digital age presents boundless opportunities for our economy, our businesses, and our people, it also presents a new generation of threats that we must adapt to meet…”

“The President believes that meeting these new threats is necessary and within our grasp. But it requires a bold reassessment of the way we approach security in the digital age. If we’re going to be connected, we need to be protected. We need to join together - Government, businesses, and individuals - to sustain the spirit that has always made America great.”

Can AI Help Translate Technological Advances into Strategic Advantage?tag:typepad.com,2003:post-6a00d8341f443c53ef01bb08f61b05970d2016-05-17T06:48:07-04:002016-05-17T06:50:13-04:00After many years of promise and hype, artificial intelligence is now being applied to activities that not long ago were viewed as the exclusive domain of humans. It wasn’t all that long ago that we were wowed by Watson, Siri,...IWB

After many years of promise and hype, artificial intelligence is now being applied to activities that not long ago were viewed as the exclusive domain of humans. It wasn’t all that long ago that we were wowed by Watson, Siri, and self-driving cars. But it’s getting harder for our smart machines to truly impress us. Earlier this year Google’s AlphaGo won a match against one of the world’s top Go players. Go is a very complex game, for which there are more possible board positions than there are particles in the universe. Yet, we seem to be taking AlphaGo’s impressive achievement in stride.
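
That claim is easy to sanity-check with back-of-the-envelope arithmetic, treating 3^361 as an upper bound on configurations of a 19x19 board and using the commonly cited estimate of roughly 10^80 particles in the observable universe:

```python
# Rough check: each of the 361 points on a 19x19 Go board can be empty, black,
# or white, giving 3^361 configurations as a generous upper bound (not all of
# them are legal positions). Compare with ~10^80 particles in the universe.
from math import log10

board_configurations = 3 ** (19 * 19)
particles_in_universe = 10 ** 80

print(f"about 10^{log10(board_configurations):.0f} board configurations")  # ~10^172
print(board_configurations > particles_in_universe)                        # True
```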

“Any sufficiently advanced technology is indistinguishable from magic,” is one of the most memorable quotes of science fiction writer Arthur C. Clarke. But, as we better understand its promise and limitations, technology becomes just another tool we rely on in our work and daily life. With familiarity, the romance begins to fade, - as was the case with electricity, cars, airplanes and TV in the early decades of the 20th century, and as has been the case more recently with computers, the Internet, and, - increasingly now, - with AI.

Over time, our feelings turn from wonderment and admiration for the seemingly magical achievements of the technology in its childhood years, to the far more practical questions of what the technology can actually achieve when it grows up. This has been a particular issue with information technologies in general, including the Internet and AI.

The IT industry has long been associated with what’s been called the Solow productivity paradox, in reference to Robert Solow's 1987 quip: “You can see the computer age everywhere but in the productivity statistics.” We all thought that the Solow paradox was finally behind us when IT-driven productivity surged between 1996 and 2003. But despite the continuing advances in technology, productivity is now back to its slow pre-1995 levels, for reasons that are still not well understood.

Digital technologies are all around us. But are they a major source of competitive differentiation? Are they of strategic value to business? Can they help increase innovation and productivity and drive long term growth? These questions have no easy answers, as we have learned over the years.

A recent Harvard Business Review article, - Designing the Machines that Will Design Strategy, - takes these questions a few steps further. Given that our advanced AI technologies can now play championship-level Go and assist in the diagnosis and treatment of rare forms of cancer, - can they help us address broad, open-ended and ambiguous problems like developing and executing a competitive business strategy? Can our increasingly smart machines assist us in translating technological advances into strategic advantage?, ask the paper’s authors, Martin Reeves and Daichi Ueda.

On its own, technology does not guarantee competitive advantage, argues the paper. “No matter how advanced technology is, it needs human partners to enhance competitive advantage. It must be embedded in what we call the integrated strategy machine,” which they define as “the collection of resources, both technological and human, that act in concert to develop and execute business strategies.”

In their view business strategy consists of a series of highly interrelated conceptual and analytical operations, including problem definition, signal processing, pattern recognition, abstraction and conceptualization, analysis, and prediction. There are feedback loops between all these various steps. Experimentation and real market data help to continuously redefine and reframe the problems being addressed as well as their solutions. Within this integrated strategy machine, people and technology must closely collaborate, with each playing the particular role they are best at.

“Human beings are still unique in our capacity to think outside the immediate scope of a task or a problem and to deal with ambiguity. Machines are good at executing a well-defined task or solving a well-defined problem, but they can’t think beyond the specified context (at least not currently). Nor can they pose new questions, invent answers beyond what’s being asked, or reframe or connect the problem to a different challenge they’ve previously faced.”

Analysis is essentially rational, quantitative, data-driven decision making and problem solving. Given its reliance on data and algorithms, this is what data science and AI are particularly good at. It’s the standard approach underlying management and engineering practice. It involves a relatively linear set of steps and works quite well when you’re looking for a solution to a relatively well defined problem.

But where do the problems come from in the first place? How do you decide what problems to work on and try to solve? This second kind of innovation, - which the book called interpretation - is very different in nature from analysis. You are not solving a problem but looking for a new insight about customers and the marketplace, a new idea for a product or a service, a new approach to producing and delivering them, a new business model.

Their research showed that interpretive innovation generally takes place through a process of conversations among people and organizations with different backgrounds and perspectives, until the problems can be identified and clarified to the point where a solution can be developed. It requires curiosity, imagination, and a business culture that encourages these conversations and removes the organizational barriers that might prevent them from taking place.

In their HBR article, Reeves and Ueda suggest that business leaders should start the design of an integrated strategy machine by asking a few key questions:

What strategic aims do I want to realize through a technology-enhanced process? “The initial set of questions must always come from human beings. Only people can define the objectives and use the holistic judgment necessary.”

What technology, people, and design do I need to address these aims? Companies must be realistic about what it takes to build a competitive strategy machine. The best such companies are continuously investing in both technology and talent.

How can people and machines interact in ways that augment each other? The objective should be “to enhance rather than marginalize or inhibit human thinking,” by stimulating people’s ability “to create new insights, challenge their own thinking, and continuously reframe their understanding.”

How can the machine evolve and update itself? A well designed AI system includes mechanisms enabling it to get feedback, learn from experience and get better over time.

How can the broader organization embrace the strategy machine? “In the end, a strategy is only valuable to the extent that it’s embraced and leveraged by the organization. Business leaders must pay attention to what can feasibly be achieved within organizational constraints - or have a clear path to removing them.”

“Electricity led to enormous productivity gains only when factory layout was revisited and optimized for the new technology,” write the authors in conclusion. “We believe that the integrated strategy machine can do for information technology what new factory designs did for electricity. In other words, the increasing intelligence of machines will be wasted unless businesses reshape the way they develop and execute strategy. Business leaders must start thinking now about how they can integrate their two key assets - people and technology - or risk falling behind.”

The Public Face of Sciencetag:typepad.com,2003:post-6a00d8341f443c53ef01b8d1b6f7bd970c2016-05-10T02:39:44-04:002016-05-11T12:27:57-04:00Last month, the American Academy of Arts and Sciences launched a three year initiative to address the complex relationship between scientists and the public. The Public Face of Science project will explore the interactions of the general public with science,...IWB

Throughout the 20th century, polls have consistently indicated strong public support for science and technology, especially during the Cold War decades following World War 2. But recent polling data, - and in particular, a recent study conducted by the Pew Research Center, - reveals a more complex relationship between the public and science. While scientific achievements are still recognized and valued, there is a large opinion gap between the general public and scientists on a number of scientific issues.

The Pew findings were released in a January, 2015 report based on two surveys of science-related issues conducted in collaboration with the American Association for the Advancement of Science (AAAS). The first survey is based on a representative sample of 2,002 general public adults and was conducted by landline and mobile phones. The second survey was conducted online, and is based on a representative sample of 3,748 US-based scientists who are members of AAAS.

In its Summary of Findings, the report highlights four major results. Three of them found broadly similar views between the public and scientists on the current, overall place of science in America:

“Science holds an esteemed place among citizens and professionals. Americans recognize the accomplishments of scientists in key fields and… there is broad public support for government investment in scientific research.”

“[B]oth the public and scientists are critical of the quality of science, technology, engineering, and math (STEM subjects) in grades K-12.”

“Compared with five years ago, both citizens and scientists are less upbeat about the scientific enterprise.”

But, the fourth major result found substantial differences:

“Despite broadly similar views about the overall place of science in America, citizens and scientists often see science-related issues through different sets of eyes. There are large differences in their views across a host of issues.”

The survey found sizable opinion differences between the general public and scientists on all 13 issues where a direct comparison was possible, some rather large, others much smaller.

The largest difference concerned the safety of eating genetically modified foods. 88% of scientists said that they were generally safe, compared with 37% of the general public, - a gap of over 50 percentage points. Basically, two-thirds of the public believe that scientists don’t have a clear understanding of the health impact of genetically modified crops.

Here are the other issues where large gaps were found:

Favor the use of animals in research: public 47%; scientists 89%.

Safe to eat foods grown with pesticides: public 28%; scientists 68%.

Climate change is mostly due to human activity: public 50%; scientists 87%.

Humans have evolved over time: public 65%; scientists 98%.

Growing world population will be a major problem: public 59%; scientists 82%.

Favor increased use of bioengineered fuels: public 68%; scientists 78%.

Favor increased use of fracking: public 39%; scientists 31%.

The space station has been a good US investment: public 64%; scientists 68%.

Despite differences across a wide range of topics, both the public and scientists are positive about US scientific achievements. 54% of Americans consider US scientific achievements to be among the best in the world or above average, compared to 92% of the AAAS scientists polled. Similarly, 51% consider US medical treatments in the top tier compared to other industrialized countries, while 64% of scientists do so. It’s thus not surprising that the majority of the public supports government investments in technology, engineering and basic science research, with over 70% saying that these investments benefit society, paying off in the long run.

But both the public and scientists are critical of the quality of K-12 STEM education.

Above average: 29% public; 16% scientists.

Average: 39% public; 38% scientists.

Below average: 29% public; 46% scientists.

How well informed is the general public on science-related issues? 84% of scientists feel that the public’s limited knowledge is a major problem for science in general, and attribute it to four main reasons:

Not enough K-12 STEM education - 75%.

Lack of public interest in science news - 57%.

Lack of media interest in science - 43%.

Too few scientists who communicate findings - 40%.

Which brings us back to The Public Face of Science. The project was launched in part to address the concerns and opinion gaps uncovered by the Pew survey findings. “Many factors inform Americans’ views on these issues, some more strongly than others, including political leanings, age, race, education, and religious beliefs. Any divergence between the views of scientists and the public could have implications for policy development and other public decision-making processes.”

The three-year study will examine a number of key questions regarding the trust and perception of science, including

How individual beliefs and scientific comprehension influence confidence in the scientific process;

How science impacts public policy;

How the physical, social, and life sciences are portrayed in the media; and

How journalistic practices could be refined to better convey the incremental and iterative nature of scientific research.

Science is more important than ever because of its increasing scope. Scientific revolutions are launched when new tools make possible all kinds of new observations and measurements. Our new digital tools, - the Internet, mobile devices, social media, analytics, data science,… - are ushering in an information-based scientific revolution, helping us extract insights from all the data we’re now collecting by applying tried-and-true scientific methods, that is, empirical and measurable evidence subject to testable explanations and predictions.

We’ve long been applying scientific methods in the natural sciences and engineering. But given our newfound ability to gather valuable data on almost any area of interest, we can now bring our tried-and-true scientific methods to just about any domain of knowledge.

As the American Academy of Arts and Sciences succinctly observes about our 21st century digital age: “Scientific and technological innovations touch every corner of American life. By informing the economy, health and medicine, national resources and their use, scientific information deeply influences the choices made by Americans about how they live their lives and contribute to society.”

The Analog Foundations of the Digital Revolutiontag:typepad.com,2003:post-6a00d8341f443c53ef01b8d1b49d38970c2016-05-02T03:05:06-04:002016-05-02T03:05:06-04:00This past January, the World Bank issued the Digital Dividends report, a comprehensive study of the state of digital developments around the world. All in all, it’s a mixed picture. Digital technologies have been rapidly spreading in just about all...IWB

This past January, the World Bank issued the Digital Dividends report, a comprehensive study of the state of digital developments around the world. All in all, it’s a mixed picture. Digital technologies have been rapidly spreading in just about all nations, but their expected digital dividends, - i.e., their broad development benefits, - have lagged behind and are unevenly distributed.

“We find ourselves in the midst of the greatest information and communications revolution in human history,” notes the report in its Foreword. “More than 40 percent of the world’s population has access to the internet, with new users coming online every day. Among the poorest 20 percent of households, nearly 7 out of 10 have a mobile phone…” But, while this is great progress, much remains to be done. Many are still left behind due to their limited connectivity, and are thus unable to fully benefit from the global digital revolution.

While universal connectivity is necessary, it’s far from sufficient, the report adds.“[T]raditional development challenges are preventing the digital revolution from fulfilling its transformative potential… the full benefits of the information and communications transformation will not be realized unless countries continue to improve their business climate, invest in people’s education and health, and promote good governance.”

Digital technologies promote economic development by greatly lowering the overall costs of economic and social transactions. Such lower transaction costs translate into economic development through three key mechanisms:

Inclusion: Digital technologies have empowered consumers around the world, giving them more choices than ever in virtually every category of products and services as well as in the channels used to acquire them. In addition, ubiquitous communications, low transaction costs and highly scalable platforms are giving rise to a new on-demand economy, enabling individuals to exchange goods and services with each other.

Efficiency: By enabling the automation and coordination of business processes across the organization, digital technologies have significantly promoted efficiency across the whole economy. Companies can thus provide lower prices and better services to their customers, and governments can offer more convenient online services for a wide range of tasks.

Innovation: Digital technologies help develop and bring to market all kinds of innovative offerings. This is evident in the many new kinds of products and services we’ve seen over the past 20 years, including e-commerce platforms, digital payment systems, e-books, streaming music, social media, and on-demand companies.

Digital Dividends

The report introduces the concept of digital dividends, that is, the broad development benefits that countries should expect from deploying digital technologies. Three major such digital dividends are identified: business growth, individual opportunities, and public services.

Business growth: Digital technologies, and the Internet in particular, have promoted the inclusion of small firms in the world economy by expanding trade and productivity.

About two years ago, the McKinsey Global Institute published an excellent study, - Global Flows in the Digital Age, - which took an in-depth look at the expansion of cross-border flows in the economy. It carefully analyzed these flows in 5 different categories: goods, services, finance, people, and data and communications.

The study noted that in the not too distant past, global flows were concentrated in the more advanced economies, as well as in large, global companies. But, digital technologies and rising prosperity are combining to significantly disperse global flows, making it possible to include a larger number of countries as well as a larger number of participants across all countries.

“The network of global flows is expanding rapidly as emerging economies join in. Rising incomes in the developing world are creating enormous new centers of consumer demand, global production, and commodities trade, as well as sending more people across borders for business and leisure. Existing routes of flows are broadening and deepening and new ones emerging as more countries participate…”

“Governments and multinational companies were once the only actors involved in cross-border exchanges. But today, digital technologies enable even the smallest company or solo entrepreneur to be a micromultinational, selling and sourcing products, services, and ideas across borders. Individuals can work remotely through online platforms, creating a virtual people flow. Microfinance platforms enable entrepreneurs and social innovators to raise money globally in ever-smaller amounts.”

Individual opportunities: Digital technologies have had a major impact in improving the overall standard of living in emerging and developing economies. Their reduced transaction costs have helped to lower the job barriers for hundreds of millions, and eventually billions, of people around the world.

“Significant numbers of people have been moving from well below the poverty threshold to relatively closer to it due to widespread economic development. Absent a global recession, the number of those living in extreme poverty is poised to decline as incomes continue to rise in most parts of the world. The number could drop by about 50 percent between 2010 and 2030, according to some models. . . Under most scenarios - except the most dire - significant strides in reducing extreme poverty will be achieved by 2030. . .”

“Middle classes most everywhere in the developing world are poised to expand substantially in terms of both absolute numbers and the percentage of the population that can claim middle-class status during the next 15-20 years. Even the more conservative models see a rise in the global total of those living in the middle class from the current 1 billion or so to over 2 billion people. Others see even more substantial rises with, for example, the global middle class reaching 3 billion people by 2030.”

Public services: Digital technologies are also making governments more capable and responsive. Given that governments are not subject to market competition, they have generally lagged business in the efficiency and quality of their service. However, there are quite a number of critical services that only governments can provide to all their citizens. The World Bank report cites India’s Aadhaar digital identification system and Nigeria’s e-ID initiative as examples of government-sponsored programs that expand the participation of all citizens.

“Lack of identity is an impediment for poor people to exercise their basic democratic and human rights. Where civil registration systems are weak or non-existent, many of the poor are simply not counted. Digital identification can help overcome barriers to participation. Many countries have introduced general-purpose digital identity (ID) schemes or specific systems for elections or to manage postconflict transfers - with numerous benefits, including making the public sector more efficient.”

“Nearly 900 million Indians have been issued digital IDs in the past five years, which they are using to open bank accounts, monitor attendance of civil servants, and identify recipients of government subsidies. Nigeria’s e-ID revealed 62,000 public sector ghost workers, saving US$1 billion annually. But the most important benefit may be in better integrating marginalized or disadvantaged groups into society.”

Risks and challenges

The report argues that these important digital dividends are not spreading fast enough for two main reasons. The first is limited connectivity, which makes it difficult for over 50% of people around the world to adequately participate in the digital economy. Persistent digital divides exist between advanced and developing nations, as well as across income, geography, gender and age within nations.

In addition, the benefits of digital dividends are being challenged by new risks:

Excessive concentration: The Internet’s economies of scale can lead to harmful concentration of market power in many sectors, inhibiting competition and innovation.

Automation and inequality: Automation can leave behind workers without the necessary skills, hollowing out labor markets and leading to greater inequality.

Government control: Without the proper accountability, governments can leverage digital technologies to exercise greater control over their citizens, rather than to empower them.

Analog Foundations

Beyond connectivity and access to digital devices, countries must strengthen the analog foundations of the digital revolution to help them realize the benefits of their technology investments. These include:

Regulations that promote competition: Lowering the cost of starting firms, avoiding monopolies, removing barriers to adoption of digital technologies, ensuring the efficient use of technology by businesses, enforcing existing regulations, …

Education and skill development: Basic IT and digital literacy, helping workers adapt to the demands of the digital economy, preparing students, managers and government officials for an increasingly digital world, facilitating life-long learning, …

Institutions that are capable and accountable: Empowering citizens through digital platforms and information, e-government services, digital citizen engagements, increased incentives for good governance in both the public sector and private firms, …

“Digital development strategies need to be broader than ICT strategies. Connectivity for all remains an important goal and a tremendous challenge. But countries also need to create favorable conditions for technology to be effective. When the analog complements are absent, the development impact will be disappointing. But when countries build a strong analog foundation, they will reap ample digital dividends - in faster growth, more jobs, and better services.”

Why Some Work Groups Thrive While Others Faltertag:typepad.com,2003:post-6a00d8341f443c53ef01bb08e972b0970d2016-04-25T16:14:32-04:002016-04-25T16:14:32-04:00I recently read an interesting article published earlier this year in the NY Times Magazine by reporter Charles Duhigg, - What Google Learned From its Quest to Build the Perfect Team. The article is focused on Project Aristotle, an initiative...IWB

“[M]any of today’s most valuable firms have come to realize that analyzing and improving individual workers - a practice known as employee performance optimization - isn’t enough,” writes Duhigg. “As commerce becomes increasingly global and complex, the bulk of modern work is more and more team-based… at many companies, more than three-quarters of an employee’s day is spent communicating with colleagues.”

“In Silicon Valley, software engineers are encouraged to work together, in part because studies show that groups tend to innovate faster, see mistakes more quickly and find better solutions to problems,” he later adds. “Studies also show that people working in teams tend to achieve better results and report higher job satisfaction. In a 2015 study, executives said that profitability increases when workers are persuaded to collaborate more. Within companies and conglomerates, as well as in government agencies and schools, teams are now the fundamental unit of organization. If a company wants to outstrip its competitors, it needs to influence not only how people work but also how they work together.”

The article devotes several paragraphs to the pioneering research on group performance by MIT professor Tom Malone, CMU professor Anita Woolley and their various collaborators over the past 8 years. Their work addressed a very intriguing set of questions: Do groups exhibit characteristic levels of intelligence which can be measured and used to predict the group’s performance across a wide variety of cognitive tasks? If so, can you devise tests to measure the group’s intelligence using methodologies and statistical techniques similar to those that have been applied to measure the IQs of individuals over the past hundred years?

Early in the 20th century, it was discovered that people’s cognitive abilities tend to be similar across a wide variety of tasks. That is, if you perform well in one such cognitive task you are likely to also perform well in others even though they may be quite different. These include solving problems, comprehending complex ideas, reasoning and planning, learning quickly, adapting to new situations, and so on. Such a general cognitive intelligence has long been statistically captured by tests measuring the Intelligence Quotient or IQ of individuals.

In their initial set of studies, they randomly assigned nearly 700 volunteers into groups of two to five members. Each group worked together on a diverse set of short tasks selected to represent the kinds of problems that groups work on in the real world. These included tasks requiring logical analysis such as solving visual puzzles; tasks emphasizing collective brainstorming and moral judgements; and tasks based on coordination and planning such as negotiating over limited resources. They also measured the individual IQs of each of the participants.

They did indeed find a statistically significant collective intelligence factor that predicted how well each group would do on a wide range of tasks. But neither the average intelligence of the individual group members nor the highest individual intelligence was a strong predictor of the group’s overall performance. They also looked at group cohesion, motivation and satisfaction, but none of these was a good predictor either.

Instead, the best performing groups exhibited three key characteristics:

More equal contributions. Group members contributed more equally, instead of letting one or two dominate the conversation.

More women. Groups with more women outperformed groups with more men. This is likely because, as previous research at the Autism Research Center has shown, women generally score higher than men in the Reading-the-Mind social sensitivity tests.

Higher social sensitivity. Members of the best performing groups were better at reading each other’s emotions, as measured by those same social sensitivity tests.

In a later study, Woolley, Malone, et al replicated their earlier findings, but with a twist. They wanted to explore whether groups that worked online instead of face-to-face also exhibited collective intelligence. To do so, they assembled 68 teams, half of which worked face to face like those in their earlier studies, and half worked online with no ability to see the other group members.

They found that whether online or off, some teams consistently outperformed the others. And, just like in the earlier studies, the best performing teams were better at communicating with each other, participating equally in the process and exhibiting higher emotion-reading skills.
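These group studies are, at heart, factor-analytic: a single latent factor is extracted from each group’s scores across many different tasks, much as the individual g factor is extracted from batteries of cognitive tests. The sketch below uses made-up data and a plain first-principal-component computation; it is meant only to illustrate the general technique of pulling one collective-intelligence score per group out of a groups-by-tasks score matrix, not to reproduce the authors’ actual analysis pipeline.

```python
# A minimal sketch (illustrative data, not the researchers' pipeline): extract a single
# "collective intelligence" factor from group scores on several tasks by taking the
# first principal component of the standardized score matrix.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 40 groups x 5 tasks, driven by a shared latent factor plus noise.
latent = rng.normal(size=(40, 1))
scores = latent @ rng.uniform(0.5, 1.0, size=(1, 5)) + rng.normal(scale=0.6, size=(40, 5))

# Standardize each task column, then compute the first principal component.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
first_pc = eigvecs[:, -1]                    # loading of each task on the factor
c_factor = z @ first_pc                      # one "c" score per group
explained = eigvals[-1] / eigvals.sum()      # share of variance the factor explains

print(f"Variance explained by the first factor: {explained:.0%}")
print("Task loadings:", np.round(first_pc, 2))
print("Collective intelligence scores (first 5 groups):", np.round(c_factor[:5], 2))
```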

Harvard economist David Deming’s research found that since 1980, social-skill intensive occupations have enjoyed most of the employment growth across the whole wage spectrum. Employment and wage growth have been particularly strong in jobs requiring both high cognitive and high social skills. But since 2000, they have fallen in occupations with high cognitive but low social skill requirements, - “suggesting that cognitive skills are increasingly a necessary but not sufficient condition for obtaining a high-paying job.”

Deming writes that “computers are still very poor at simulating human interaction. Reading the minds of others and reacting is an unconscious process, and skill in social settings has evolved in humans over thousands of years. Human interaction in the workplace involves team production, with workers playing off of each other’s strengths and adapting flexibly to changing circumstances. Such nonroutine interaction is at the heart of the human advantage over machines. The growing importance of social skills can potentially explain a number of other trends in educational outcomes and the labor market, such as the narrowing - and in some cases reversal - of gender gaps in completed education and earnings.”

As work becomes more team-based, you’d expect workers with strong social skills, who are better able to work well with others, to become more valuable. Good teamwork increases productivity through comparative advantage, that is, the notion that different members of a team should specialize in those tasks that they’re relatively best at. Workers with good social skills should be better able to play off each other’s strengths, quickly learn which tasks they are each best at, and flexibly adapt to changing circumstances.
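To make the comparative advantage point concrete, here is a toy calculation with entirely hypothetical numbers (the tasks, rates and hours are my own illustration, not from the article): even when one team member is absolutely better at every task, total output rises when each person specializes where their opportunity cost is lowest.

```python
# A toy sketch of comparative advantage in a two-person team. Worker A is absolutely
# better at both tasks, yet total output rises when each person specializes where
# their *relative* advantage is greatest.

rates = {  # hypothetical units produced per hour
    "A": {"analysis": 6.0, "writing": 4.0},
    "B": {"analysis": 1.0, "writing": 3.0},
}
hours = 8  # hours available to each worker

# Opportunity cost of analysis = units of writing foregone per unit of analysis produced.
for worker, r in rates.items():
    print(f"{worker}: 1 unit of analysis costs {r['writing'] / r['analysis']:.2f} units of writing")
# A's opportunity cost (0.67) is lower than B's (3.0), so A should specialize in
# analysis and B in writing.

even_split = sum(rates[w][t] * hours / 2 for w in rates for t in ("analysis", "writing"))
specialized = rates["A"]["analysis"] * hours + rates["B"]["writing"] * hours

# Treating both kinds of output as equally valuable, specialization yields more in total.
print(f"Total output, even split:  {even_split:.0f}")
print(f"Total output, specialized: {specialized:.0f}")
```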

It found that the traditional hard skills typically provided by engineering and business schools must be complemented with a set of so-called soft skills or attributes. Five such attributes were identified in the study: adaptability, cultural competence, 360-degree thinking, intellectual curiosity, and empathy.

Empathy turned out to be the most important of the five attributes. “Frankly, when empathy kept coming up in our research, I was surprised,” said Dean Wilson. “All of the people we interviewed were serious business executives. Empathy was not the first virtue I associated with the rough and tumble of today’s highly competitive business world. I expected to hear about boldness, perseverance, and toughness.”

Toward the end of his article, Duhigg notes the irony inherent in the findings of Google’s Project Aristotle. “The technology industry is not just one of the fastest growing parts of our economy; it is also increasingly the world’s dominant commercial culture. And at the core of Silicon Valley are certain self-mythologies and dictums: Everything is different now, data reigns supreme, today’s winners deserve to triumph because they are cleareyed enough to discard yesterday’s conventional wisdoms and search out the disruptive and the new.”

“The paradox, of course, is that Google’s intense data collection and number crunching have led it to the same conclusions that good managers have always known. In the best teams, members listen to one another and show sensitivity to feelings and needs.”

Could Blockchain Prove to Be “The Next Big Thing”?tag:typepad.com,2003:post-6a00d8341f443c53ef01bb08dc01ce970d2016-04-18T07:16:14-04:002016-04-18T07:16:14-04:00Over the past several decades, information technologies (IT) have been fundamentally transforming companies, industries and the economy in general. In its early years, - ’60s, ’70s, ’80s - companies deployed IT primarily to automate their existing processes, - leaving the...IWB

Over the past several decades, information technologies (IT) have been fundamentally transforming companies, industries and the economy in general. In its early years, - ’60s, ’70s, ’80s - companies deployed IT primarily to automate their existing processes, - leaving the underlying structure of the business in place. It wasn’t until the 1990s, - with the pioneering work of Michael Hammer and others on business process reengineering, - that companies realized that just automating existing processes wasn’t enough. Rather, to achieve the promise of IT, it was necessary to fundamentally redesign their operations, examine closely the flow of work across the organization, and eliminate legacy processes that no longer added value to the business.

Organizational transformation was then taken beyond the boundaries of the company with the explosive growth of the Internet. The Internet made it significantly easier to obtain goods and services outside the firm, enabling companies to rely on business partners for many of the functions once done in-house. To compete effectively in an increasingly interconnected global economy, companies now had to optimize not only the flow of work within their own organizations but across their supply chain ecosystems. Over the past 20 years, such ecosystem-wide transformations have been disrupting the business models of industry after industry, - from retail and manufacturing to media and entertainment.

The banking industry has long been one of the major users of IT, - among the first to automate its back-end and front-office processes and to later embrace the Internet and smartphones. However, banking has been relatively less disrupted by digital transformations than other industries. In particular, change has come rather slowly to the world’s banking infrastructure.

“With advances in technology, the relationship that customers have with their bank and with their finances has changed,…” notes a recently released Citigroup report, - Digital Disruption: How FinTech is Forcing Banking to a Tipping Point. “So far these have been seen more as additive to a customer's banking experience… Despite all of the investment and continuous speculation about banks facing extinction, only about 1% of North American consumer banking revenue has migrated to new digital models,… we have not yet reached the tipping point of digital disruption in either the US or Europe.”

Last week I discussed some of the highlights of Citi’s excellent FinTech report. Investments in financial technologies have increased by a factor of 10 over the past 5 years. The majority of these investments have been concentrated in consumer payments, particularly on the user experience at the point of sale, while continuing to rely on the existing legacy payment infrastructures. I’d like to now focus on the potential evolution of the backbone payment infrastructures.

Transforming this highly complex global payment ecosystem has proved to be very difficult. It requires the close collaboration of its various stakeholders, including a variety of financial institutions, merchants of all sizes, government regulators in just about every country, and huge numbers of individuals around the world. All these stakeholders must somehow be incented to work together in developing and embracing new payment innovations. Not surprisingly, change comes slowly to such a complex ecosystem.

But sometimes, the emergence of an innovative disruptive technology can help propel change forward. The Internet proved to be such a catalyst in the transformation of global supply chain ecosystems. Could blockchain technologies now become the needed catalyst for the evolution of legacy payment ecosystems?

The blockchain first came to light around 2008 as the architecture underpinning bitcoin, the best known and most widely held digital currency. Over the years, blockchain has developed a following of its own as a distributed database architecture with the ability to handle trustless transactions, where the parties need neither know nor trust each other for transactions to complete. Blockchain holds the promise to revolutionize the finance industry and other aspects of the digital economy by bringing one of the most important and oldest concepts, the ledger, to the Internet age.

Ledgers constitute a permanent record of all the economic transactions an institution handles, whether it’s a bank managing deposits, loans and payments; a brokerage house keeping track of stocks and bonds; or a government office recording births and deaths, the ownership and sale of land and houses, or legal identity documents like passports and driver’s licenses. Over the years, institutions have automated their original paper-based ledgers with sophisticated IT applications and databases.

But while most ledgers are now digital, their underlying structure has not changed. Each institution continues to own and manage its own ledger, synchronizing its records with those of other institutions as appropriate, - a cumbersome process that often takes days. While these legacy systems operate with a high degree of robustness, they’re rather inflexible and inefficient.

“Since the dawn of record-keeping, ledgers have been where receipts and disbursements of cash and goods are recorded. If you buy a sandwich at a deli, for example, several ledgers come into play. The money you carry to pay for it registers a slight deduction on your personal ledger. The deli owner’s ledger records a gain in one place and in another a payment for the ham it bought from a supplier. A ledger at the bank records the deposit it receives from the deli owner.”

“In a world where every business has its own books, payments tend to stop and start between different ledgers. An overseas transfer leaves the ledger of one business, then goes on another ledger at a domestic bank. It then might hit the ledger of a bank in the international transfer system. It travels to another bank in the foreign country, before ending up on the ledger of the company being paid. Each time it moves to a different ledger, the money has a different identity, taking up time and potentially causing confusion. For some companies, it is a nightmare that can’t end soon enough.”
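The deli example can be made concrete in a few lines of code. The sketch below, with hypothetical parties and amounts, models today’s siloed arrangement: a single payment creates separate entries in each institution’s independent books, and keeping those books consistent is left to later reconciliation - exactly the friction that a shared ledger is meant to remove.

```python
# A minimal sketch (hypothetical example) of today's siloed ledgers: one payment
# creates separate entries in each institution's books, reconciled only after the fact.
from dataclasses import dataclass, field

@dataclass
class Ledger:
    owner: str
    entries: list = field(default_factory=list)

    def record(self, description: str, amount: float):
        self.entries.append((description, amount))

buyer, deli, bank = Ledger("buyer"), Ledger("deli"), Ledger("deli's bank")

# One $8 sandwich purchase touches three independent ledgers.
buyer.record("sandwich at deli", -8.00)
deli.record("sandwich sold", +8.00)
deli.record("cash deposited at bank", -8.00)
bank.record("deposit from deli", +8.00)

# Each institution only sees its own view; reconciliation across books happens later.
for ledger in (buyer, deli, bank):
    print(ledger.owner, ledger.entries)
```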

Blockchain-based distributed ledgers could do for global financial systems what the Internet has done for global supply chain systems. As Citi’s Digital Disruption report notes, blockchain technologies “could replace the current payment rail of centralized clearing with a distributed ledger for many aspects of financial services, especially in the B2B world… But even if Blockchain does not end up replacing the core current financial infrastructure, it may be a catalyst to rethink and re-engineer legacy systems that could work more efficiently.” The report goes on to explain why the blockchain might well prove to be a kind of Next Big Thing.

“Blockchain is a distributed ledger database that uses a cryptographic network to provide a single source of truth. Blockchain allows untrusting parties with common interests to co-create a permanent, unchangeable, and transparent record of exchange and processing without relying on a central authority. In contrast to traditional payment model where a central clearing is required to transfer money between the sender and the recipient, Blockchain relies on a distributed ledger and consensus of the network of processors, i.e. a super majority is required by the servers for a transfer to take place. If the Internet is a disruptive platform designed to facilitate the dissemination of information, then Blockchain technology is a disruptive platform designed to facilitate the exchange of value.”

Automation: Programmability enables automation of capabilities on the ledger (e.g. smart contracts) that can be executed once agreed-upon conditions are met.

Certainty: System-wide audit trails make it possible to track the ownership history of an asset, providing irrefutable proof of existence, proof of process and proof of provenance.
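The two mechanisms in the Citi description above - an append-only record whose entries are cryptographically linked, and transfers that are accepted only by a supermajority of the network - can be sketched in a few dozen lines. The following toy example (illustrative only: in-memory votes, no networking, and nothing like a real consensus protocol or any particular platform’s implementation) shows why tampering with an already-recorded transaction is detectable by anyone who recomputes the hashes.

```python
# A toy illustration of a hash-linked, append-only ledger with a supermajority check.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "prev_hash": "0" * 64, "tx": "genesis"}]

def append_block(tx: str, votes: dict, quorum: float = 2 / 3):
    """Append a transaction only if more than `quorum` of the validators approve it."""
    approvals = sum(votes.values()) / len(votes)
    if approvals <= quorum:
        raise ValueError(f"rejected: only {approvals:.0%} of validators approved")
    chain.append({"index": len(chain), "prev_hash": block_hash(chain[-1]), "tx": tx})

def verify(chain: list) -> bool:
    """Any party can recompute the hashes to check that history was not altered."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

append_block("Alice pays Bob 10", votes={"v1": True, "v2": True, "v3": True, "v4": False})
append_block("Bob pays Carol 4", votes={"v1": True, "v2": True, "v3": True, "v4": True})
print("chain valid:", verify(chain))                    # True

chain[1]["tx"] = "Alice pays Bob 1000"                   # tampering with an earlier record...
print("chain valid after tampering:", verify(chain))     # ...breaks the links that follow it
```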

But much, much work remains to be done. Blockchain is still at the bleeding edge, lacking the robustness of legacy payment systems. Distributed ledger systems have only been around for less than a decade, and are thus quite immature compared to the existing, decades-old financial infrastructures. While legacy payment infrastructures are complicated, inefficient and inflexible, they actually work quite well, being both safe and fast. Replacing them will be a tough and lengthy undertaking, no matter how innovative and exciting the new technologies might be.

It’s too early to know if the blockchain will join the pantheon of Next Big Things and become a major transformational innovation. As we’ve seen with other such successful innovations, - e.g., the Internet, the Web, Linux, - collaborations between universities, research labs, companies and government agencies are absolutely essential. So are close collaborations among technology developers and users in order to get the architecture right, agree on open standards, develop open source platforms and set up governance processes embraced by all.

In a short number of years, blockchain technologies have made a lot of progress. We might well be close to an ecosystem-wide FinTech tipping point. It will be fascinating to see how it all plays out in the years to come.

Is FinTech Forcing Banking to a Tipping Point?tag:typepad.com,2003:post-6a00d8341f443c53ef01b8d1ba0272970c2016-04-11T14:21:22-04:002016-04-15T17:35:41-04:00For the past few decades, digital technologies have been systematically transforming one industry after another. The transformations have generally proceeded along three different stages. First comes the use of IT to improve the productivity and quality of production-oriented, back-end processes....IWB

For the past few decades, digital technologies have been systematically transforming one industry after another. The transformations have generally proceeded along three different stages. First comes the use of IT to improve the productivity and quality of production-oriented, back-end processes. Distribution comes next, leveraging the universal reach and connectivity of the Internet over the past 20 years. The transformation then reaches a tipping point when technology radically changes the user experience, - as has happened with the rise of smartphones over the past decade, - leading to a fundamental disruption of the industry and its business models.

While this digital disruption journey is ultimately inevitable, the pace varies widely across industries. The IT industry has been the most disrupted, - often by its own digital creations in a kind of sorcerer’s apprentice scenario. Over my long career, I’ve seen many once powerful IT companies done in by technology and market changes, and either disappear altogether or become shadows of their former selves.

Beyond IT, few industries have felt the impact of digital forces like media. Everything seems to be changing at once, from the way content is produced and delivered, to the sources of revenue and profits. In less than two decades, the global recorded music industry has lost over half its revenues, while the drop in newspaper advertising revenue in the US has been even steeper. Retail has also been undergoing major changes with the rise of e-commerce, as has telecommunications with the transition to mobile phones and wireless data.

How about the banking industry, which has long been a major user of information technologies, - including back- and front-office automation, ATMs, Internet banking, data-driven risk management, fraud detection, and mobile financial apps?

“Despite all of the investment and continuous speculation about banks facing extinction, only about 1% of North American consumer banking revenue has migrated to new digital models,” says an excellent report that was recently released by Citigroup, - Digital Disruption: How FinTech is Forcing Banking to a Tipping Point. “Although FinTech companies have the advantage of new innovation, incumbent financial institutions still have the upper hand in terms of scale and we have not yet reached the tipping point of digital disruption in either the US or Europe. Given the growth in FinTech investment, this isn't likely to continue for long.”

In the last few years, FinTech, - short for Financial Technology, - has become a widely used term for technology-based innovations in financial services. FinTech companies have generally been startups looking to disrupt larger companies, although incumbent financial companies have started to establish their own FinTech units. Last October, for example, Citigroup launched the Citi FinTech division, aimed at developing new mobile banking services and business models.

“Silicon Valley is coming,” wrote JPMC CEO Jamie Dimon in his 2015 Shareholder Letter. “There are hundreds of startups with a lot of brains and money working on various alternatives to traditional banking.” Competitors are coming in the payment area as well as in the lending business. “[T]hey can make loans in minutes, which might take banks weeks… there is much for us to learn in terms of real-time systems, better encryption techniques, and reduction of costs and pain points for customers.”

Citi’s report highlights a number of important FinTech trends:

FinTech investments increased 10X in the past 5 years. Investments have risen “from $1.8 billion in 2010 to $19 billion in 2015 - with over 70% of this investment focusing on the last mile of user experience in the consumer space. The majority of this investment has also been concentrated in the payments area and this is where banks are seeing the most competition with new entrants.” Competitors are emerging in established digital markets, such as PayPal and Square in e-commerce, as well as in underserved segments like micropayments and small businesses.

FinTech is targeting the most profitable areas of global banking. Citi Research analysts estimate that over 70% of FinTech investments have been aimed at individuals and small and medium enterprises, segments which account for about half of banking’s profit pool. Given the growing importance of smartphones in financial transactions, it’s not surprising that Business-to-Consumer (B2C) innovations dominate, mostly aimed at improving user experiences.

The US and Europe are at the tipping point. Greg Baxter, - Citi’s Global Head of Digital Strategy, - notes that in the US and other developed markets “We are not even at the end of the beginning.” While it is really difficult to forecast this early in the cycle, his Citi team estimates that “currently only about 1% of North American consumer banking revenue has migrated to new digital business models (either at new entrants or incumbents) but that this will increase to about 10% by 2020 and 17% by 2023.”

China is already past the tipping point. Top Chinese FinTech companies, - such as Alipay or Tencent, - already have as many clients as, if not more than, the major banks around the world. “China's FinTech companies have grown fast due to a combination of: (1) high national Internet and mobile penetration, (2) a large e-commerce system with domestic Internet companies focused on payments, (3) relatively unsophisticated incumbent consumer banking, and (4) accommodative regulations.”

Emerging Markets are going through a financial inclusion revolution. Emerging markets are ripe for FinTech disruptions due to “a high percentage of unbanked population, relatively weak consumer banks, and a high penetration of mobile phones.” Kenya has been the leader in mobile phone-based financial services since the launch of M-PESA in 2007, which currently has 23 million active customers in 11 countries. In Somalia, - despite or because of its serious political instability, - over 40% of adults use mobile money. For billions around the world, FinTech innovations are their tickets to financial inclusion in the global digital economy.

India is likely the next major FinTech frontier. India represents a major opportunity space for FinTech, given its large population of over 1.2 billion, its low number of banking accounts and its relatively high digital penetration. This opportunity is helped along by three major enablers: the Aadhaar national biometric identity program, which now covers over 900 million people; the Jan Dhan financial inclusion initiative, which has already opened almost 200 million new bank accounts; and the ubiquity of mobile phones, with ~80% penetration. In addition, India’s central bank, - the Reserve Bank of India, - has been aggressively expanding the banking ecosystem.

Digital payments face intense competition. The payment space is being challenged by competitors from a variety of industries, including e-commerce (Alipay, Paypal), technology (Apple, Google), telcos (MPESA, SoftCard), and merchants (MCX, Walmart Pay). “Although payment is a relatively small part of banks’ revenue pool (~7%), the incumbent banks are at risk of losing important customer transaction data and client relationships.” These competitors are a potential threat to incumbent banks due to their technical, financial and market strengths.

Blockchain could be a catalyst for transforming legacy infrastructures. Payment innovations have mostly focused on the user experience at the point of sale, while continuing to rely on the existing backbone payment infrastructures. But over time, blockchain technologies could start replacing these legacy, proprietary infrastructures with distributed shared infrastructures more appropriate for dealing with the huge scalability and security challenges of the digital economy.

Blockchain could be “a catalyst for the transformation of many existing legacy systems that operate with a high degree of robustness but may not be the most cost or capital efficient way of doing business.” But blockchain is still at the bleeding edge, lacking the robustness of legacy payment systems.

Banks face an Uber Moment. As Jonathan Larsen, - Citi’s Global Head of Retail Banking, - said: “The future of the branches is about advisory and consultation rather than transactions… clients over time will move almost entirely to mobile channels. Banks will look like Uber. Uber has nothing to do with cars. It created an entirely new user experience. It tracks all your transaction histories, expenses, drivers’ ratings and so on. It created needs you never had.”

“What it means to banks is first and foremost the centrality of mobile as the main channel of interaction between customers and the bank. More importantly, there is a diminishing return on physical assets - especially the branch network. I won’t say that banks won’t have a balance sheet in the future, but the way customers interface with the bank will be revamped.”

The diminishing importance of physical branches, the explosive growth of the mobile Internet, and increased competition from FinTech startups are putting huge pressure on banks, - leading to a kind of Uber moment, when technology makes the old ways of doing business obsolete. To survive, banks will be compelled to embrace many of the FinTech innovations introduced by startups, and staffing levels could decline by up to 30% over the next 10 years.

“The death of banks has been much foretold in recent decades,” most notably by Bill Gates in a 1994 article where he argued “that the world needed banking services but not necessarily banks…” Since then, “we have seen digital disruption fundamentally erode value across many industries including: music sales, video rentals, travel booking, and newspapers. In each of these cases, incumbents either transformed or became marginalized.”

Is the digital revolution finally reaching the banking industry? Will incumbent banks embrace FinTech innovations before FinTech startups gain scale and distribution? How will it all play out, - fierce competition, an increasing number of partnerships, or some combination of both? There are so many factors at play that it’s extremely difficult to predict how this digital disruption will unfold over the years. But, as Citi’s report says in its very title, all the signs seem to indicate that “FinTech is [Finally] Forcing Banking to a Tipping Point.”

Computing Beyond Moore’s Law tag:typepad.com,2003:post-6a00d8341f443c53ef01bb08ca123f970d2016-04-05T06:27:58-04:002016-04-05T06:27:58-04:00The March 12 issue of The Economist includes a special report on the future of computing after the very impressive 50-years run of Moore’s Law. In his now legendary 1965 paper, Intel co-founder Gordon Moore first made the empirical observation...IWB

The March 12 issue of The Economist includes a special report on the future of computing after the very impressive 50-year run of Moore’s Law.

In his now legendary 1965 paper, Intel co-founder Gordon Moore first made the empirical observation that the number of components in integrated circuits had doubled every year since their invention in 1958, and predicted that the trend would continue for at least ten years, a prediction he subsequently changed to a doubling every two years. The semi-log graphs associated with Moore’s Law have since become a visual metaphor for the technology revolution unleashed by the exponential improvements of just about all digital components, from processing speeds and storage capacity to networking bandwidth and pixels.

The 4004, Intel’s first commercial microprocessor, was launched in November, 1971. The 4-bit chip contained 2,300 transistors. The Intel Skylake, launched in August, 2015, contains 1.75 billion transistors, which collectively deliver about 400,000 times more computing power than the 4004. Moore’s Law has had quite a run, but like all good things, especially those based on exponential improvements, it must eventually slow down and flatten out.
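As a quick back-of-the-envelope check (my own arithmetic, not The Economist’s), the transistor counts of those two chips are roughly consistent with Moore’s revised two-year doubling cadence:

```python
# Back-of-the-envelope check of the doubling cadence implied by the 4004 and Skylake.
import math

t_4004, n_4004 = 1971, 2_300              # Intel 4004: launch year, transistor count
t_sky, n_sky = 2015, 1_750_000_000        # Intel Skylake

doublings = math.log2(n_sky / n_4004)     # ~19.5 doublings of transistor count
years = t_sky - t_4004                    # 44 years
print(f"{doublings:.1f} doublings in {years} years "
      f"=> one doubling every {years / doublings:.1f} years")
# Roughly one doubling every ~2.3 years - close to Moore's two-year prediction.
```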

In its overview article, The Economist reminds us that Moore’s Law was never meant to be a physical law like Newton’s Laws of Motion, but rather “a self-fulfilling prophecy - a triumph of central planning by which the technology industry co-ordinated and synchronised its actions.” It also reminds us that its demise has been long anticipated: for a while now, the number of people predicting the death of Moore’s Law has also been doubling every two years.

Cells, the basic building blocks of all life, first emerged on Earth around 4 billion years ago. Evolution continued to perfect the cell over the next few billion years, giving rise to a variety of single-celled organisms, followed later by multi-celled organisms of various types. Then around 550 million years ago, a dramatic change took place in life on Earth, - the Cambrian Explosion. Evolution took off in a totally different direction. Cells were now good enough, - not worth devoting evolution’s precious energies to their continuing optimization. The time had come to use the cells as building blocks and, essentially, go up-the-stack.

Over the next 70 to 80 million years, evolution accelerated by an order of magnitude, ushering in a diverse set of organisms far larger and more complex than anything that existed before, and organizing living organisms into mutually dependent ecosystems. By the end of the Cambrian geological period, the diversity and complexity of life began to resemble that of today.

Something similar has been taking place in the world of IT. For the past 50-60 years, we’ve been perfecting our digital components - microprocessors, memory chips, disks, and so on. Initially, we used these components to develop what in retrospect were not-very-powerful, stand-alone computer systems. High component costs and a lack of industry standards made it difficult to cluster and/or network large numbers of individual computers into larger, more powerful systems.

But this all started to change about 20-25 years ago. First, as the cost of components continued to drop, we were able to develop far more powerful computers. At the same time, the Internet brought a culture of standards to just about all aspects of the IT industry. Our Internet-based systems were now increasingly made up of a diverse set of interconnected computers of all sizes, most notably, highly parallel systems in centralized data centers supporting huge numbers of distributed mobile and IoT devices.

Once IT started moving up-the-stack, we became less dependent on the exponential improvements in components, relying more on innovations in systems architecture, algorithms and applications. The slowdown and potential end of Moore’s Law is no longer as big a deal as it once would have been because, as The Economist points out, “the future of computing will be defined by improvements in three other areas, beyond raw hardware performance:” software, specialized architectures, and cloud computing.

Software

As Marc Andreessen noted in a 2011 essay, Software is Eating the World. “My own theory is that we are in the middle of a dramatic and broad technological and economic shift in which software companies are poised to take over large swathes of the economy. More and more major businesses and industries are being run on software and delivered as online services - from movies to agriculture to national defense.” Entrepreneurial companies all over the world are disrupting established industries with innovative software-based solutions.

“Why is this happening now?” he asked. “Six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.”

Software has long been applied to digitize and automate an increasing number of everyday activities, from back- and front-office business processes to personal payments and navigation. And, as digitization increasingly permeates every nook and cranny of society, including our personal lives, it’s generating vast amounts of data.

All this data is now enabling us to better understand many aspects of the world that have never been quantified before. A whole new round of tools and applications has been emerging to augment our intelligence by analyzing vast amounts of information. Software is now being increasingly applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans.

We were all wowed when in 1997 Deep Blue won a celebrated chess match against then reigning champion Garry Kasparov, and when in 2011, Watson won the Jeopardy! Challenge against the two best human Jeopardy! players. But AI wowing has now become commonplace. A few weeks ago, AlphaGo claimed victory against Lee Sedol, - one of the world’s top Go players, - in a best-of-five match, winning four games and losing only one. Go is a much more complex game than chess, with more possible board positions than there are particles in the universe.
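A rough sense of the scale behind that claim (my own arithmetic, not from the article): each of the 361 points on a 19x19 Go board can be empty, black or white, giving an upper bound of 3^361 configurations, on the order of 10^172, which dwarfs the commonly cited ~10^80 particles in the observable universe.

```python
# Rough arithmetic behind the "more positions than particles" claim. 3**361 is an upper
# bound on board configurations; the count of strictly legal positions is smaller, but
# still astronomically large.
upper_bound = 3 ** 361
digits = len(str(upper_bound)) - 1          # order of magnitude of the upper bound

print(f"3^361 is roughly 10^{digits}")       # about 10^172
print(f"That is about 10^{digits - 80} times the ~10^80 particles in the universe")
```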

“As a result, a Go-playing system cannot simply rely on computational brute force, provided by Moore’s law, to prevail,” notes The Economist. “AlphaGo relies instead on deep learning technology, modelled partly on the way the human brain works. Its success… shows that huge performance gains can be achieved through new algorithms. Indeed, slowing progress in hardware will provide stronger incentives to develop cleverer software.”

Specialized architectures

Tissues, organs and organ systems have evolved in living organisms to organize cells so that together they can better carry out a variety of common biological functions. In mammals, organ systems include the cardiovascular, digestive, nervous, respiratory and reproductive systems, each of which is composed of multiple organs.

General purpose computers have long included separate architectures for their input/output functions. Supercomputers have long relied on vector architectures to significantly accelerate the performance of numerically-intensive calculations. Graphics processing units (GPUs) are used today in a number of high performance PCs, servers, and game consoles. Most smartphones include a number of specialized chips for dealing with multimedia content, user interactions, security and other functions. Neural network architectures are increasingly found in advanced AI systems.

As in evolution, innovations in special-purpose chips and architectures will be increasingly important as Moore’s Law fades away.

Cloud computing

“When computers were stand-alone devices, whether mainframes or desktop PCs, their performance depended above all on the speed of their processor chips,” observes The Economist. “Today computers become more powerful without changes to their hardware. They can draw upon the vast (and flexible) number-crunching resources of the cloud when doing things like searching through e-mails or calculating the best route for a road trip. And interconnectedness adds to their capabilities: smartphone features such as satellite positioning, motion sensors and wireless-payment support now matter as much as processor speed.”

As is the case in biology, most computing now takes place in mutually dependent ecosystems. User-oriented and IoT devices get the bulk of their services over the Internet, - from clouds of all sorts out there. In the digital economy, clouds are essentially the production plants of services, supporting billions of mobile devices and 10s of billions of IoT devices. To achieve the required scalability, security and efficiency, cloud-based data centers have had to become much more disciplined in every aspect of their operations.

“The twilight of Moore’s law… will bring change, disorder and plenty of creative destruction,” writes The Economist in conclusion. “An industry that used to rely on steady improvements in a handful of devices will splinter. Software firms may begin to dabble in hardware; hardware makers will have to tailor their offerings more closely to their customers’ increasingly diverse needs.”

But, in the end, consumers do not care about Moore’s Law, transistors, or computers per se. “They simply want the products they buy to keep getting ever better and more useful. In the past, that meant mostly going for exponential growth in speed. That road is beginning to run out. But there will still be plenty of other ways to make better computers.”

How is the Real America Doing?tag:typepad.com,2003:post-6a00d8341f443c53ef01b7c81b22ef970b2016-03-28T06:42:14-04:002016-03-28T06:31:31-04:00Listening to cable news, talk radio, and our heated election campaigns, you might quickly conclude that the US is going to hell, that the best days of America are behind us and that the country is unraveling right in front...IWB

Listening to cable news, talk radio, and our heated election campaigns, you might quickly conclude that the US is going to hell, that the best days of America are behind us and that the country is unraveling right in front of our eyes. To a greater or lesser extent, these sentiments are to be expected in a presidential-election year, when the out-of-power party always argues that things are bad and getting worse. But for the last several years, the apocalyptic sentiments have been coming not only from most on the right but from many on the left as well.

Beltway Washington is going through a particularly polarized, volatile phase, as has sometimes been the case through the country’s history. Moreover, the amplifying effects of social networks and 24/7 information channels make it even harder to find common ground and get anything done.

But, what’s going on beyond the beltway, - in cities and towns across America? Beyond the surreal world of Washington politics, is the country truly falling apart? How are Americans faring, despite the hard times so many have been going through? What’s going on in the real America, where real people live and work and where the proverbial rubber meets the road?

A few years ago, James Fallows, - author, journalist, and national correspondent for The Atlantic, - decided to explore these questions by embarking on a trip across America. In 2013, he asked Atlantic readers to recommend what cities he should visit that would give him a good picture of the actual state of the country. He specifically avoided the big urban centers along the East and West Coasts, which are generally doing quite well. He wanted to visit smaller cities and towns, especially those that had suffered economic upheavals and other hardships over the past few decades.

There’s a long tradition of road trips as a means of discovering the real America, going back to French historian Alexis de Tocqueville, who in 1831 spent 9 months traveling through the young country. He later published his famous book, Democracy in America, an insightful analysis of why representative democracy had succeeded in the US while failing in so many other places.

Fallows received over 1,000 responses to his request, and in 2013, he and his wife Deb embarked on a three-year trip across America in their small, single-engine propeller plane. They spent 10-14 days in two dozen cities and towns across the country, and had shorter visits in two dozen more. The resulting chronicle of his travels, How America is Putting Itself Back Together, was recently published in The Atlantic.

Fallows grew up in Redlands, California, next-door to San Bernardino, a town we’ve all now heard of because of last December’s terrorist attack. He spent time in San Bernardino before the shootings, and found it to be “a poor, troubled town that sadly managed to combine nearly every destructive economic, political, and social trend of the country as a whole. San Bernardino went into bankruptcy in 2012 and was only beginning to emerge at the time of the shootings. Crime is high, household income is low, the downtown is nearly abandoned in the daytime and dangerous at night, and unemployment and welfare rates are persistently the worst in the state.”

“But that was not the only thing, or even the most interesting thing, that we saw during our time there. If news is what you didn’t know before you went to look, the news of San Bernardino, from our perspective, was not the unraveling but the reverse. The familiar background was the long decline. The surprise was how wide a range of people, of different generations and races and political outlooks, believed that the city was on the upswing, and that their own efforts could help speed that trend… From a distance, the San Bernardino story is of wall-to-wall failure. From the inside, the story includes rapidly progressing civic and individual reinvention…”

The Fallows found similar reinvention stories just about everywhere they went, - from Duluth, Minnesota to northeastern Mississippi. “What Americans have heard and read about the country since Deb and I started our travels is the familiar chronicle of stagnation and strain. The kinds of things we have seen make us believe that the real news includes a process of revival and reinvention that has largely if understandably been overlooked in the political and media concentration on the strains of this Second Gilded Age.”

The Gilded Age is the name given to the era of rapid economic and population growth following the Civil War, from the 1870s to about 1900. It was a time of major industrial and technological advances, including railroads, steel and factories of all sorts. The telephone, electricity and automobiles were invented and initially deployed during this period.

In an excellent 2010 essay, Stanford University historian David Kennedy examined whether the country had gone through times in the past as volatile and polarized as our present ones.

“Explanations for our current political volatility abound: toxic partisanship, the ever more fragmented and strident news media, high unemployment, economic upheaval and the clamorous upwelling of inchoate populist angst,” he wrote. “But the political instability of our own time pales when compared with the late 19th century. In the Gilded Age the American ship of state pitched and yawed on a howling sea of electoral turbulence. For decades on end, divided government was the norm. In only 12 of the 30 years after 1870 did the same party control the House, the Senate and the White House.”

The country was then going through seriously challenging times. Foremost among them was navigating the transition from an agricultural to an industrial economy, and from a rural to an urban society; managing the labor unrest in this new industrialized, urban economy, as workers worried about their jobs, pay and working conditions; absorbing millions of immigrants; and recovering from the wounds of the Civil War.

Some of the major challenges we face today are similar, some are new. We’re once more navigating an economic transition, this time to an information-based digital economy; dealing with the employment challenges caused by this historical transition; and trying to come up with a reasonable immigration policy. Other challenges are particular to our times, such as increased global competition; high healthcare costs; and energy, sustainability and climate issues.

“What’s instructive to us now is the similarity between the Gilded Age’s combination of extraordinary social and economic dynamism and abject political paralysis,” added Kennedy. “In the face of all those challenges, like our Gilded Age forebears, we have a political system that manages to be both volatile and gridlocked - indeed, it may be gridlocked not least because it is so volatile. And, like their 19th-century forebears, today’s politicians have great difficulty gaining traction on any of those challenges. Now as then, it’s hard to lead citizens who are so eager to ‘throw the bums out’ at every opportunity.”

Yet, the country kept moving on and making progress, including the major advances in what is often referred to as the Second Industrial Revolution, as well as the development of the world’s most comprehensive public education system. By the beginning of the 20th century, the US emerged as an increasingly prosperous country, one of the world’s industrial and economic powers.

It was a tale of two Americas, as is the case today. In March, 2013, The Economist published a special report on US competitiveness, - The America that Works. “Luckily, dysfunction in Washington is only one side of America’s story,” noted the report in its overview article.

Inside-the-Beltway America deals primarily with issues concerning the federal government, the many lobbyists and contractors surrounding it, and the 24/7 media that covers it. “Its debt is rising, its population is ageing in a budget-threatening way, its schools are mediocre by international standards, its infrastructure rickety, its regulations dense, its tax code byzantine, its immigration system hare-brained - and it has fallen from first position in the World Economic Forum’s competitiveness rankings to seventh in just four years…”

“Yet there is also another America, where things work. One hint comes from what those bosses like to call the real economy. Recent numbers from the jobs market and the housing sector have been quite healthy. Consumer balance-sheets are being repaired. The stockmarket has just hit a record high. Some of this is cyclical: the private sector is rebounding from the crunch. But it also reflects the fact that, beyond the District of Columbia, the rest of the country is starting to tackle some of its deeper competitive problems. Businesses and politicians are not waiting for the federal government to ride to their rescue. Instead,… they are getting to grips with the failings Congress is ignoring.”

What about immigration, - an issue that’s quite personal for me? Along with my family, I came to the US in 1960 as a 15-year-old refugee from Cuba. I’ve always felt welcome, as have the many millions of immigrants that have long been coming to our country. But listening to the current immigration debates, you might conclude that it’s time to retire the Statue of Liberty, or at least to replace Emma Lazarus’ iconic words, - “Give me your tired, your poor/Your huddled masses yearning to breathe free” - with something closer to “Go Away.”

But, that’s not what Fallows found in his travels across the country. Far from it. “Almost every place we went, the changes in America’s ethnic makeup were obvious. Almost no place did this come up as an economic, cultural, or political emergency, or even as the most pressing local issue. Based on everything we could see, the problems of immigration that presidential candidates have seized on for political advantage were largely another rest of America problem… If you hadn’t heard the speeches and read the stories about an immigration-driven crisis in America, you might conclude city by city that the American assimilation machine was still functioning.”

He found such a spirit, for example, in Sioux Falls, South Dakota, a city that’s nearly 90% white. But walking the streets of Sioux Falls, he came across a substantial number of people from Somalia, Sudan, Nepal, Burma and other sites of recent turmoil. “Sioux Falls, despite being relatively nondiverse and remote, is a city with one of the best records of absorbing refugees (Burlington, Vermont, is another). The civic and business leaders of Sioux Falls we spoke with, most of them white, seemed proud rather than beleaguered about their city’s new role as a melting pot.”

Let me conclude with these words by James Fallows that truly lifted my spirits.

“As a whole, the country may seem to be going to hell. That jeremiad view is a great constant through American history… But here is what I now know about America that I didn’t know when we started these travels, and that I think almost no one would infer from the normal diet of news coverage and political discourse… Many people are discouraged by what they hear and read about America, but the closer they are to the action at home, the better they like what they see…”

“After our current Gilded Age, the national mood will change again. When it does, a new set of ideas and plans will be at hand. We’ve seen them being tested in places we never would have suspected, by people who would never join forces in the national capital. But their projects, the progress they have made, and their goals are more congruent than even they would ever imagine. Until the country’s mood does change, the people who have been reweaving the national fabric will be more effective if they realize how many other people are working toward the same end.”

Amen.

Anticipating the Future of the IT Industrytag:typepad.com,2003:post-6a00d8341f443c53ef01bb08c3d4ef970d2016-03-22T06:07:11-04:002016-03-22T05:43:18-04:00How can we best anticipate the future of a complex, fast changing industry like IT? Which hot technology innovations, - e.g., artificial intelligence, the blockchain, cloud computing, - will end up having a big impact and which are destined to...IWB

How can we best anticipate the future of a complex, fast changing industry like IT? Which hot technology innovations, - e.g., artificial intelligence, the blockchain, cloud computing, - will end up having a big impact and which are destined to fizzle? What can we learn from the IT industry’s 60-year history that might help us better prepare for whatever lies ahead?

A major way of anticipating the future of any economic or social entity, - be it a company, industry, university, government agency or city, - is to explore and learn from its history.While there’s no guarantee that historical patterns will continue to apply going forward, they might well be our most important guides as we peer into an otherwise unpredictable future.

I’ve been involved with computers since the early 1960s, first as a student at the University of Chicago, then in my long career at IBM, and subsequently through my relationship with a number of companies and universities. I’ve thus had a ringside seat from which to observe the journey the IT industry’s been on since those early days.

Let me share some of my personal impressions of this journey through the lens of three key areas, each of which has played a major role throughout IT’s history, and will continue to do so well into its future: data centers, transaction processing, and data analysis.

Data Centers → Cloud

Back in 1962, the year I entered college, I got a part-time programming job in the newly created university computation center. In those days, I keypunched my programs onto the 80-column so-called IBM cards, brought the card decks to the machine room and gave them to an operator, who then submitted them to the computer via a card reader. Jobs were run one-at-a-time, generally resulting in a multi-hour wait. I nostalgically associate those early computer days with Chicago-style pizza, because many a night, while waiting for my job to run, I went with friends to Uno’s or Due’s, - which are still serving their famous deep-dish pizza in Chicago’s near north side.

What happened to those computers and machine rooms that we rarely see these days? In a special report on Corporate IT published by The Economist in October of 2008, technology editor Ludwig Siegele offered a very elegant answer to this question:

“In the beginning computers were human. Then they took the shape of metal boxes, filling entire rooms before becoming ever smaller and more widespread. Now they are evaporating altogether and becoming accessible from anywhere.”

“That is about as brief a history of computers as anyone can make it. The point is that they are much more than devices in a box or in a data centre. Computing has constantly changed shape and location - mainly as a result of new technology, but often also because of shifts in demand.”

“Now… computing is taking on yet another new shape. It is becoming more centralised again as some of the activity moves into data centres. But more importantly, it is turning into what has come to be called a cloud, or collections of clouds. Computing power will become more and more disembodied and will be consumed where and when it is needed.”

As Siegele predicted, computers and data centers have been disappearing into the cloud. It’s what has always happened when consumer-oriented technologies become ubiquitous in society, - electricity, telephones, television. While consumer devices are now everywhere, their back-ends are nowhere to be seen. They are quietly doing their highly disciplined work of generating and transmitting their output to the many billions of users and devices out there. With the success of cloud computing, IT has now been undergoing a similar transformation.

Cloud represents the industrialization of the data center. Many data centers evolved over the years with limited architectural discipline or company-wide governance. IT organizations often spent the bulk of their energies on the maintenance and integration of their legacy applications, some having been developed by different departments within the business and some having been inherited through mergers and acquisitions. Not surprisingly, these older companies were then challenged to respond to fast changing market conditions, especially when it came to new customer-facing applications, which generally require massive scalability, flexibility and agility.

IT has had to become much more disciplined in every aspect of its operations. Data centers have now become the production plants of cloud-based services, a transformation that’s been pioneered by born-to-the-cloud companies like Amazon, Google, and Salesforce. Cloud computing requires well-engineered infrastructures, applications and services.

The architectural standards and management disciplines of public cloud providers are being embraced by many older companies, as they develop private clouds so they too can efficiently deliver high quality services to their own customers, business partners and employees. At the same time, companies are acquiring IT or business services from the growing number of companies offering such services in the marketplace. Most companies are embracing a hybrid model - delivering some services from their own data centers, and acquiring others from service providers.

Transaction Processing → Blockchain

Transaction volumes have been rapidly increasing over the past 20 years. First came the explosive growth of the Internet and World Wide Web, attracting tens to hundreds of millions of users to the world of computing, and giving rise to all kinds of online transactions.

Several years later the Internet transitioned to its mobile phase, based on billions of users around the world connected via smart personal devices and broadband wireless networks, giving rise to many innovative apps. Mobile digital payments, for example, are still in their early stages, but over time will likely generate huge volumes of transactions all around the world.

More recently, we’ve seen the rise of the Internet of Things, supporting tens of billions of smart devices on its way to hundreds of billions, and generating an ever growing volume of transactions from our newly digitized physical world, including homes, transportation, cities and even our bodies.

In addition to keeping up with the explosive scalability requirements, IT infrastructures face very serious challenges in the areas of security, privacy and robustness. Many of these new applications, - dealing with payments, health, cities and other areas, - are mission critical and must be up just about all the time, even while withstanding constant cyber attacks. In addition, IT is increasingly operating in an environment where a significant number of players cannot be trusted.

We need mission critical IT infrastructures that are capable of near unlimited scalability, near 100% reliability, privacy, and security, and the ability to deal with transactions where the parties involved don’t necessarily trust each other. Blockchains and blockchain-inspired distributed ledger architectures are among the most exciting technologies with the potential to revolutionize transaction processing.
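
To make the core idea a bit more concrete, here is a minimal, purely illustrative Python sketch of the mechanism at the heart of a blockchain-style ledger: every block commits to a cryptographic hash of the block before it, so any tampering with past transactions is detectable without the parties having to trust one another. All names and transactions below are hypothetical.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    # Hash a block's contents deterministically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    # Each new block commits to the hash of the previous block,
    # chaining the entire history together.
    prev = chain[-1] if chain else None
    chain.append({
        "index": len(chain),
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": block_hash(prev) if prev else "0" * 64,
    })

def verify(chain: list) -> bool:
    # Recompute every link; any altered block breaks the chain.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger: list = []
add_block(ledger, [{"from": "alice", "to": "bob", "amount": 10}])
add_block(ledger, [{"from": "bob", "to": "carol", "amount": 4}])
print(verify(ledger))                                   # True
ledger[0]["transactions"][0]["amount"] = 1_000_000      # tamper with history
print(verify(ledger))                                   # False
```

Real distributed ledgers add digital signatures, consensus protocols and replication across many untrusting nodes on top of this simple hash chain, which is where the scalability, reliability and trust properties described above would have to come from.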

Data Analysis → Data Science, Cognitive, AI

The relationship between computing and data goes back to the early days of what we then referred to as the data processing industry. Beyond their use in operations, the data generated by transactional applications were also used to improve the efficiency, financial performance, and overall management of the organization. The information was generally collected in data warehouses, and a variety of business intelligence tools were developed to analyze the data and generate the appropriate management reports.

These early analytics applications dealt mostly with structured information. But at the same time, research communities were developing methodologies for dealing with high volumes of unstructured data, as well as analytical techniques, like data mining, for discovering patterns and extracting insights from all that data.

The explosive growth of the Internet since the mid-1990s has taken data analysis to whole new levels. Data is now being generated by just about everything and everybody around us, including the growing volume of online and offline transactions, web searches, social media interactions, billions of smart mobile devices and tens of billions of IoT smart sensors.

As a result, we can now capture as data many aspects of our world that have never been quantified before. All this data is now enabling us to better understand the world’s physical, economic and social infrastructures, as well as to infuse information-based intelligence into every aspect of their operations. It’s making it possible not just to better understand what’s happening in the present, but also to make more accurate predictions about the future.

In the end, innovation is a journey into the future. It’s not possible to anticipate where we might be heading without understanding where we’ve come from. It’s been quite an incredible journey. Over the decades, we’ve seen digital technologies increasingly permeate just about every nook and cranny of the economy, society and our personal lives. Given the dramatic technology advances still to come, we can expect an equally exciting journey well into the future.

The MIT Inclusive Innovation Competition

About three years ago, MIT launched the Initiative on the Digital Economy (IDE), a major effort focused on the broad changes brought about by the relentless advances of digital technologies. As its website explains:

“While digital technologies are rapidly transforming both business practices and societies and are integral to the innovation-driven economies of the future, they are also the core driver of the great economic paradox of our time. On one hand, productivity, wealth, and profits are each at record highs; on the other hand, the median worker in America is poorer than in 1997, and fewer people have jobs. Rapid advances in technology are creating unprecedented benefits and efficiencies, but there is no economic law that says everyone, or even a majority of people, will share in these gains.”

The future of work and jobs is one of the major areas being addressed by IDE. What will the workforce of the future look like? Where will jobs come from in the coming years? Will the nature of work be significantly different in the digital economy? How can we accelerate the transformation of institutions, organizations, and human skills to keep up with the quickening pace of digital innovation?

To help come up with breakthrough answers to these very challenging questions, IDE just launched its first annual Inclusive Innovation Competition. The competition aims to identify, celebrate and award prizes to “organizations that are inventing a more sustainable, productive, and inclusive future for all by focusing on improving economic opportunity for middle- and base-level income earners.”

The competition is open to for-profit and non-profit organizations of any size, age or type, in any nation around the world. It seeks creative solutions in four major categories:

Skills: Prepare members of the workforce for opportunities of the future, including the necessary education to help them acquire new skills as well as improve their existing ones.

Matching: Help qualified unemployed or underemployed individuals gain access to meaningful, productive and engaging work by improving the matching of labor supply with demand.

Humans + Machines: Use technology to augment human labor so that the outcome is greater than either human or machine could achieve alone, and develop innovative offerings that improve the human capacity for effective physical or cognitive work.

New Models: Come up with innovative jobs, business models and operational practices that will revolutionize the labor markets, help create new economic opportunities, and enable workers to succeed in meaningful ways.

A total of $1 million will be awarded. In each of the four categories, the grand prize winner will receive $125,000, and four runners-up will each receive $25,000. In addition, a handful of awards will be given to organizations deemed by the judges to be uniquely inventive.

Let me briefly discuss why I think that the Inclusive Innovation Competition is such an important initiative.

Few topics are as critical, - and as challenging to anticipate, - as the future of jobs in the digital economy. Along with its many benefits, the digital revolution has resulted in enormous dislocations in labor markets and a sharp polarization in job opportunities over the past several decades.

Jobs requiring expert problem solving and complex communications skills have significantly expanded, with the earnings of the college-educated workers needed to fill such jobs rising steadily. But, opportunities have significantly declined for middle-skill jobs dealing with the kinds of routine physical or cognitive tasks that can be well described by a set of rules and have thus been prime candidates for technology substitution. Low-skill jobs involving physical tasks have been growing, but their wages have been stagnant or declining. Moreover, as intelligent machines become more capable and less expensive, they will increasingly compete with and replace unskilled human labor all around the world.

Harvard economics professor Larry Summers noted in a 2014 article that, from time immemorial, the greatest economic problem had been coping with scarcity, as humanity could not produce enough to satisfy everybody. But the problem has now been changing, initially in advanced economies, and over time in most of the world. “The economic challenge of the future will not be producing enough. It will be providing enough good jobs.”

Technology has been replacing workers and improving productivity ever since the advent of the Industrial Revolution in the second half of the 18th century. In past technology-based economic revolutions, the periods of creative destruction and high unemployment eventually worked themselves out. Over time, these same disruptive technologies and innovations led to the transformation of the economy and the creation of new industries and new jobs.

“Previous technological innovation has always delivered more long-run employment, not less. But things can change,” notes a 2014 Economist article on the future of jobs. “Nowadays, the majority of economists confidently wave such worries away. By raising productivity, they argue, any automation which economises on the use of labour will increase incomes. That will generate demand for new products and services, which will in turn create new jobs for displaced workers… Yet some now fear that a new era of automation enabled by ever more powerful and capable computers could work out differently.”

The machines of the industrial economy made up for our physical limitations, - steam engines enhanced our physical power, railroads and cars helped us go faster, and airplanes gave us the ability to fly. The machines of the digital economy are now making up for our cognitive limitations, augmenting our intelligence and our ability to process vast amounts of information. They are now being increasingly applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans. These technology advances are truly pushing the boundaries between humans and machines.

Many workers are learning to co-evolve with our intelligent machines and, as has been the case in the past, may well be ready for whatever new jobs are created. But the fear is that this time is different, - that the long predicted era of technological unemployment is finally upon us, that technology advances are running so far ahead that large numbers of people may not be able to keep up, and that the future will bring even more serious economic disruptions.

Just before the launch of the Initiative on the Digital Economy, I participated in an MIT roundtable on the future of jobs that discussed many of the issues IDE is now addressing. After listening to a number of experts throughout the day, the main conclusion I took away from the roundtable is that, - while having lots of ideas, hypotheses and hopes, - we truly don’t know where jobs will come from in the coming years, particularly for the middle- and low-level income earners being left behind by the digital revolution.

That’s why it’s particularly heartening that MIT, which has been at the forefront of many of these technology innovations, is also seriously addressing their painful impact on so many, and searching for breakthrough innovations that will improve the economic prospects of workers everywhere.

The deadline for submitting applications for the Inclusive Innovation Competition is June 15. The winners will be announced toward the end of September. I’m truly looking forward to the creative solutions they all come up with.

The Coming Robotics Revolution

Hi, Robot: Work and Life in the Age of Automation, reads the Hamlet-inspired cover of the July 2015 issue of Foreign Affairs. The clever cover highlights the issue’s main theme: artificial intelligence and robotics.

As its introductory article reminds us, “new technologies have been revolutionizing the world for centuries, transforming life and labor and enabling an extraordinary flourishing of human development. Now some argue that advances in automation and artificial intelligence are causing us to take yet another world-historical leap into the unknown.”

Speculations about a future radically transformed by technology are nothing new. But, with AI and robots seemingly everywhere, the concerns surrounding their long term impact may well be in a class by themselves. Like no other technologies, AI and robots force us to explore the boundaries between machines and humans. Will they turn out like other major innovations, e.g., steam power, electricity, cars, - highly disruptive in the near term, but ultimately beneficial to society? Or, will we see a more dystopian future, as smart machines increasingly encroach on activities requiring intelligence and cognitive capabilities that not long ago were the exclusive domain of humans? Opinions abound, but in the end, we don’t really know.

Daniela Rus, - MIT professor and Director of the Computer Science and AI Lab (CSAIL), - is in the techno-optimist camp, - as am I. In her Foreign Affairs article, - The Robots Are Coming: How Technological Breakthroughs Will Transform Everyday Life, - Rus writes that “Robots have the potential to greatly improve the quality of our lives at home, at work, and at play. Customized robots working alongside people will create new jobs, improve the quality of existing jobs, and give people more time to focus on what they find interesting, important, and exciting… Yet the objective of robotics is not to replace humans by mechanizing and automating tasks; it is to find ways for machines to assist and collaborate with humans more effectively… By working together, robots and humans can augment and complement each other’s skills.”

Robots, in her opinion, are a major part of the natural evolution of computing. In the beginning, all computers were relatively expensive big boxes. Then came personal computers, which over time more than fulfilled Bill Gates’ once quixotic dream of “a computer on every desk and in every home.” Over the past decade, the mantle has been passed to the billions of smartphones carried by a large portion of the world.

In a 2014 interview, Professor Rus said that in 10 - 15 years she expects robots to be as commonplace as smartphones, “with personal robots that can help with everything from doing search-and-rescue operations to folding the laundry.” Her MIT research group, the Distributed Robotics Lab, has built robots that can “tend a garden, bake cookies from scratch, cut a birthday cake, fly in swarms without human aid to perform surveillance functions, and dance with humans.” The line between consumer appliances and robotics is already blurring with products like the Roomba vacuum cleaner. Moderately priced programmable, collaborative robots, initially aimed at manufacturing, are already available from a number of companies.

“Still, there are significant gaps between where robots are today and the promise of a future era of pervasive robotics, when robots will be integrated into the fabric of daily life, becoming as common as computers and smartphones are today, performing many specialized tasks, and often operating side by side with humans,” adds Rus. “Current research aims to improve the way robots are made, how they move themselves and manipulate objects, how they reason, how they perceive their environments, and how they cooperate with one another and with humans.”

Why is robotics such a hot discipline? All computers, - whether mainframes, PCs or smartphones, - are defined by what their brains, - that is, their hardware and software, - are capable of computing and controlling. Robots are computers that have both a brain and a body. A robot’s capabilities are defined by what its brain and body can jointly do.

Digital technology advances have greatly benefited the robot’s brains, as they have those of all other computers. In addition, the electromechanical components used in robotic devices are also advancing rapidly, making it possible to imagine the leap to whole families of personal robots in the not too distant future. According to Rus, the integration of robots into everyday life requires progress in three key areas.

It takes too long to make new robots. For the most part, general purpose computers all use similar hardware and software components. But, there are many different kinds of robots available today, each one having a body specifically designed for its task. As a result, each kind of robot body must be individually designed and produced, and its body capabilities are difficult to extend to new tasks and applications.

“What’s needed are design and fabrication tools that will speed up the customized manufacturing of robots,” notes Rus. We need something like a robot compiler “that could take a particular specification - for example, I want a robot to tidy up the room - and compute a robot design, a fabrication plan, and a custom programming environment for using the robot.”

Robots have a limited ability to perceive and reason about their surroundings. Computers have made huge advances in automating those human tasks that can be well described by a set of rules. But, despite continuing advances in AI, the challenges of applying computers and robots to tasks requiring flexibility, judgment, and common sense are still quite large.

The reason is that our actions are guided by two very different kinds of knowledge. Explicit knowledge is formal, codified, and can be readily explained to people and captured in a computer program. Tacit knowledge, on the other hand, is the kind of knowledge we are often not aware we have, and is therefore difficult to transfer to another person, let alone to a machine. Tacit knowledge is generally learned through personal interactions and practical experiences. Everyday examples include speaking a language, riding a bike, driving a car, and easily recognizing many different objects and animals.

“Today’s robots can perform only limited reasoning due to the fact that their computations are carefully specified. Everything a robot does is spelled out with simple instructions, and the scope of the robot’s reasoning is entirely contained in its program. Furthermore, a robot’s perception of its environment through its sensors is quite limited. Tasks that humans take for granted - for example, answering the question, Have I been here before? - are extremely difficult for robots… it is hard for a machine to differentiate between features that belong to a scene it has already observed and features of a new scene that happens to contain some of the same objects.”

Robotic communication is not reliable. Much progress is also needed to significantly improve communications among robots and between robots and humans. Robots can’t function effectively without adequate bandwidth, or if attempting to communicate in a noisy environment where extraneous signals make it difficult to send and receive commands. This is a particularly serious problem when trying to use robots in the kind of general environment humans live in, as would be the case with transportation or search-and-rescue missions.

“Personal computers, wireless technology, smartphones, and easy-to-download apps have already democratized access to information and computation and transformed the way people live and work. In the years to come, robots will extend this digital revolution further into the physical realm and deeper into everyday life, with consequences that will be equally profound.”

What’s the Value of a Liberal Arts Education in Our 21st Century Digital Economy?

A few years ago, I came across a very interesting article about the efforts of Roger Martin to transform business education. At the time, Martin was the Dean of the Rotman School of Management at the University of Toronto. He had long been advocating “that students needed to learn how to think critically and creatively every bit as much as they needed to learn finance or accounting. More specifically, they needed to learn how to approach problems from many perspectives and to combine various approaches to find innovative solutions.”

“Learning how to think critically - how to imaginatively frame questions and consider multiple perspectives - has historically been associated with a liberal arts education, not a business school curriculum, so this change represents something of a tectonic shift for business school leaders.” Achieving this goal would require business schools to move into territory “more traditionally associated with the liberal arts: multidisciplinary approaches, an understanding of global and historical context and perspectives, a greater focus on leadership and social responsibility and, yes, learning how to think critically.”

Similarly, in a 2006 report, the National Academy of Engineering called for reforming engineering education. “New graduates were technically well prepared but lacked the professional skills for success in a competitive, innovative, global marketplace. Employers complained that new hires had poor communication and teamwork skills and did not appreciate the social and nontechnical influences on engineering solutions and quality processes.”

More recently, USC’s Annenberg School of Communications and Journalism conducted a study to better understand the key competencies companies were looking for, and whether their talent requirements were being adequately addressed by universities. Future leaders, the study found, must be strong in quantitative, technical and business skills. But to advance in their careers, they also need to be good strategic thinkers and must have strong social and communications skills.

In other words, a good education should include soft as well as hard competencies. Business and engineering schools do a pretty good job when it comes to teaching hard skills. But they don’t do so well with the softer competencies companies are also looking for. Finding and retaining talented individuals with this mix of capabilities is a challenge regardless of geography or industry.

Given that we generally associate these kinds of competencies with the liberal arts, you would expect a slew of articles singing the praises of a solid liberal arts education. But, that’s not the case. There’s been an ongoing debate about the value of the liberal arts in today’s economy, especially when compared to the STEM disciplines - science, technology, engineering and math. As this recent NY Times article noted in its very title, we’re seeing “A Rising Call to Promote STEM Education and Cut Liberal Arts Funding.”

This is a topic I’m very interested in. I went to the University of Chicago, - which to this day takes great pride in being a top liberal arts school, - as both an undergraduate and graduate student. And, I just joined the Board of Trustees of Manhattanville, - a small liberal arts college in the New York metropolitan area.

Let me first take a brief look at what is meant by the liberal arts. According to Wikipedia, “The liberal arts are those subjects or skills that in classical antiquity were considered essential for a free person to know in order to take an active part in civic life, something that (for Ancient Greece) included participating in public debate, defending oneself in court, serving on juries, and most importantly, military service.”

I particularly like the following, much more up-to-date and practical definition. “A liberal arts education is by nature broad and diverse, rather than narrow and specialized… [it] is not intended to train you for a specific job, though it does prepare you for the world of work by providing you with an invaluable set of employability skills, including the ability to think for yourself, the skills to communicate effectively, and the capacity for lifelong learning.”

These are very important points. My degrees were in physics, but I then switched to computer science when I joined IBM’s research labs in 1970. Over the course of my long career, I’ve held a number of positions that required management, communications and business skills in addition to STEM skills. In retrospect, the biggest benefit of my excellent University of Chicago education has been less the physics, math, and computer skills that I learned, and more the ability to address complex problems, to work with a variety of people, and to keep acquiring new knowledge throughout my career.

In a recent, provocative article, Is majoring in liberal arts a mistake for students?, entrepreneur, investor and technologist Vinod Khosla raises a number of important issues. “Little of the material taught in Liberal Arts programs today is relevant to the future…” he writes in his opening paragraph. “I feel that liberal arts education in the United States is a minor evolution of 18th century European education. The world needs something more than that. Non-professional undergraduate education needs a new system that teaches students how to learn and judge using the scientific process on issues relating to science, society, and business.”

Khosla is very concerned about the current state of the liberal arts, which he feels is too focused on past knowledge at the expense of the many advances in the sciences, economics, psychology and other disciplines. In his opinion, liberal arts programs aren’t adequately dealing with the pressing problems and exciting opportunities of the 21st century. “Though Jane Austen and Shakespeare might be important, they are far less important than many other things that are more relevant to make an intelligent, continuously learning citizen, and a more adaptable human being in our increasingly more complex, diverse and dynamic world.”

For him, the true test of a good basic education would be quite simple: “at the end of an undergraduate education, is a student roughly able to understand and discuss the Economist, end-to-end, every week? This modern, non-professional education would meet the original Greek life purpose of a liberal arts education, updated for today’s world.” He references a 2014 New Republic article by Harvard professor Steven Pinker, who nicely summarized what an educated person should know:

“It seems to me that educated people should know something about the 13-billion-year prehistory of our species and the basic laws governing the physical and living world, including our bodies and brains. They should grasp the timeline of human history from the dawn of agriculture to the present. They should be exposed to the diversity of human cultures, and the major systems of belief and value with which they have made sense of their lives. They should know about the formative events in human history, including the blunders we can hope not to repeat. They should understand the principles behind democratic governance and the rule of law. They should know how to appreciate works of fiction and art as sources of aesthetic pleasure and as impetuses to reflect on the human condition.”

“On top of this knowledge, a liberal education should make certain habits of rationality second nature. Educated people should be able to express complex ideas in clear writing and speech. They should appreciate that objective knowledge is a precious commodity, and know how to distinguish vetted fact from superstition, rumor, and unexamined conventional wisdom. They should know how to reason logically and statistically, avoiding the fallacies and biases to which the untutored human mind is vulnerable. They should think causally rather than magically, and know what it takes to distinguish causation from correlation and coincidence. They should be acutely aware of human fallibility, most notably their own, and appreciate that people who disagree with them are not stupid or evil. Accordingly, they should appreciate the value of trying to change minds by persuasion rather than intimidation or demagoguery.”

In his article, Khosla proposes that every student should master what he calls a Liberal Science curriculum:

“The fundamental tools of learning and analysis, primarily critical thinking, the scientific process or methodology, and approaches to problem solving and diversity.”

“Knowledge of a few generally applicable topics and knowledge of the basics such as logic, mathematics, and statistics to judge and model conceptually almost anything one might run into over the next few decades.”

“The skills to dig deep into their areas of interest in order to understand how these tools can be applied to one domain and to be equipped to change domains every so often.”

“Preparation for jobs in a competitive and evolving global economy or preparation for uncertainty about one’s future direction, interest, or areas where opportunities will exist.”

“Preparation to continuously evolve and stay current as informed and intelligent citizens of a democracy.”

Other than his point about Austen and Shakespeare, - I went to the University of Chicago, after all, - I agree with much of what Khosla has to say. I would expect that many educators would also agree with his views of what a proper 21st century education should encompass. I suspect that the major area of disagreement might likely be on the actual state of a liberal arts education. Are liberal arts programs really stuck in the past? Are they relevant for the future? Do they properly prepare students for our increasingly complex, diverse and dynamic world? I frankly don’t know the answers to these important questions.

In the end, it’s all a matter of balance. As Lisa Dolling, - Manhattanville College Provost, - recently wrote: “Among the many false dichotomies fostered by the continuing debates surrounding higher education, one that I find especially disconcerting is that which pits the professional against the personal. While it is expressed in a variety of ways, it boils down to this: Either you believe the purpose of going to college is to be able to secure a (preferably high-paying) job, or you think there is something more intrinsically valuable to be gained from the years spent earning a degree. My question is: When did these become mutually exclusive?”

The Fourth Industrial Revolution

The Fourth Industrial Revolution: what it means, how to respond was the central theme of the 2016 World Economic Forum (WEF) that took place earlier this year in Davos, Switzerland. The theme was nicely explained by Klaus Schwab, the WEF’s founder.

Dr. Schwab positions the Fourth Industrial Revolution within the historical context of three previous industrial revolutions. The First, - in the last third of the 18th century, - introduced new tools and manufacturing processes based on steam and water power, ushering in the transition from hand-made goods to mechanized, machine-based production. The Second, - a century later, - revolved around steel, railroads, cars, chemicals, petroleum, electricity, the telephone and radio, leading to the age of mass production. The Third, - starting in the 1960s, - saw the advent of digital technologies, computers, the IT industry, and the automation of processes in just about all industries.

“Now a Fourth Industrial Revolution is building on the Third, the digital revolution that has been occurring since the middle of the last century,” he noted. “It is characterized by a fusion of technologies that is blurring the lines between the physical, digital, and biological spheres.”

Most everyone agrees that there was a major qualitative distinction between the First and Second Industrial Revolutions. While some believe that the Fourth is merely the evolution of the Third, Schwab argues that they’re qualitatively different for three major reasons:

Velocity: Compared to the previous three revolutions, “the Fourth is evolving at an exponential rather than a linear pace.”

Scope: Disruptions are taking place in “almost every industry in every country.”

Systems impact: “The breadth and depth of these changes herald the transformation of entire systems of production, management, and governance.”

He further adds that there are no historical precedents for the opportunities ahead. Technology advances keep expanding the benefits of the digital revolution across the planet. “The possibilities of billions of people connected by mobile devices, with unprecedented processing power, storage capacity, and access to knowledge, are unlimited. And these possibilities will be multiplied by emerging technology breakthroughs in fields such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3-D printing, nanotechnology, biotechnology, materials science, energy storage, and quantum computing.”

This digital revolution could significantly improve the quality of life of billions around the world. But, it will be accompanied by serious challenges. We must not forget that technological revolutions are highly disruptive to economies and societies. In particular, we’ve already seen major dislocations in labor markets over the past few decades, benefiting workers in low-cost developing nations while severely impacting workers in advanced economies. Moreover, as intelligent machines become more capable and less expensive, they will increasingly replace unskilled human labor even in developing economies.

“We cannot foresee at this point which scenario is likely to emerge, and history suggests that the outcome is likely to be some combination of the two,” writes Schwab. “However, I am convinced of one thing - that in the future, talent, more than capital, will represent the critical factor of production. This will give rise to a job market increasingly segregated into low-skill/low-pay and high-skill/high-pay segments, which in turn will lead to an increase in social tensions.”

The article examines the impact of the Fourth Industrial Revolution on three major segments: business, government and individuals.

Impact on Business. Adapting to the ongoing digitization of the economy is already a major challenge for many companies. The accelerating velocity, expanded scope and systemic impact of the Fourth Industrial Revolution will lead to constant surprises and serious disruptions, including:

Customer expectations - No transformation is more challenging than meeting the service expectations of digitally empowered customers. Digital technologies enable companies to better engage with their customers and offer them a superior experience at affordable costs. But, providing such an experience to increasingly savvy, - and fickle, - customers is getting harder.

Product enhancements - Technology advances are giving rise to a large variety of smart connected products and services, combining sensors, software, data, analytics and connectivity in all kinds of ways. These innovative offerings are restructuring industry boundaries and leading to the creation of whole new industries.

Collaborative innovations - Companies have to become much more innovative to better respond to the fast-changing, highly competitive business environment. Collaboration is indispensable for innovation, both within the company’s own boundaries and beyond, - including customers, partners, startups, universities and research communities.

Organizational forms - Company structures and culture must be rethought to better deal with new market environments and business models. The hierarchical organization that prevailed in the 20th century was appropriate to a production-oriented, industrial economy, but it will not work so well in the more global and fast-changing digital economy.

Impact on Government. Aligning government with 21st century technological, economic and social realities will require innovations at least as disruptive and profound as those embraced by the private sector.

Realizing the potential of the Fourth Industrial Revolution, - e.g., Internet of Things, health analytics, smart cities, autonomous vehicles, - requires considerable support and actions from policy makers. These include policies to protect the privacy and rights of consumers and businesses; stronger security requirements for critical devices and systems; incentives that promote fair data sharing across companies; and new regulations to help us deal with increasingly intelligent machines.

Support from policy makers is also required to help address job creation, - one of the toughest challenges facing nations around the world. Higher educational attainments are necessary for many of the kinds of jobs that will be available over the next few decades, thus requiring a national commitment to education. Creating an environment that fosters a talented workforce, well-paying jobs and a decent standard of living is one of the primary responsibilities of government.

“Ultimately, the ability of government systems and public authorities to adapt will determine their survival,…” writes Schwab. “How, then, can they preserve the interest of the consumers and the public at large while continuing to support innovation and technological development? By embracing agile governance, just as the private sector has increasingly adopted agile responses to software development and business operations more generally. This means regulators must continuously adapt to a new, fast-changing environment, reinventing themselves so they can truly understand what it is they are regulating. To do so, governments and regulatory agencies will need to collaborate closely with business and civil society.”

Impact on People. “The Fourth Industrial Revolution, finally, will change not only what we do but also who we are. It will affect our identity and all the issues associated with it: our sense of privacy, our notions of ownership, our consumption patterns, the time we devote to work and leisure, and how we develop our careers, cultivate our skills, meet people, and nurture relationships. It is already changing our health and leading to a quantified self, and sooner than we think it may lead to human augmentation. The list is endless because it is bound only by our imagination.”

What will digital life be like in the decades ahead? A couple of years ago, the Pew Research Center conducted a study, Digital Life in 2025, where it asked over 1,800 experts what will likely be the most significant impact of the Internet on humanity by 2025.

The study’s overriding consensus was that “the Internet will become like electricity - less visible, yet more deeply embedded in people’s lives for good and ill… Information sharing over the Internet will be so effortlessly interwoven into daily life that it will become invisible, flowing like electricity, often through machine intermediaries.” But, along with its many benefits, a ubiquitous digital life will be accompanied by serious issues, especially the loss of privacy: “you may be tracked/watched/recorded without you even knowing it.”

“In the end, it all comes down to people and values,” writes Schwab in his concluding paragraph. “We need to shape a future that works for all of us by putting people first and empowering them. In its most pessimistic, dehumanized form, the Fourth Industrial Revolution may indeed have the potential to robotize humanity and thus to deprive us of our heart and soul. But as a complement to the best parts of human nature - creativity, empathy, stewardship - it can also lift humanity into a new collective and moral consciousness based on a shared sense of destiny. It is incumbent on us all to make sure the latter prevails.”

Shaping the Future of the Internet

The Winter 2016 issue of Dædalus, the Journal of the American Academy of Arts and Sciences, is devoted to the Internet. Its seven articles explore some of the major issues shaping the Internet’s future. The issue was curated by Harvard professor Yochai Benkler and MIT research scientist David Clark, whose joint introduction and individual papers frame the overall discussion.

“The Internet was born in 1983,” write Benkler and Clark in the Introduction. That year saw the adoption of TCP/IP, - the communications protocol that defines the Internet to this day. Its technical directions and overall culture have been largely determined by the academic and research communities, who were also its early users. With strong government support, they developed a system that “only a researcher could love: general, abstract, optimized for nothing, and open to exploration of more or less anything imaginable using connected computers.”

The Internet has since evolved “from a network that primarily delivered email among academics and government employees, to a network over which the World Wide Web arose, to the video and mobile platform it has become - and the control network for embedded computing that it is fast becoming.” From its niche research beginnings, it’s become the universal platform for collaborative innovation, enabling startups, large institutions, and everyone in between to quickly develop and bring to market many new digital products and services.

What design choices have led to the Internet as we know it? Could it still evolve into a fundamentally different platform than the one we use today? What decisions could significantly shape its future? These are among the very important questions explored in the Dædalus issue. In their Introduction, Benkler and Clark summarized some of the key themes that emerged from the various papers, - a few of which I’d now like to discuss.

The Technical is Political

As is the case with any complex system, the Internet’s design has been the result of many different choices.Some have involved technical matters, e.g. scalability, performance, robustness. Other choices have been more political in nature, reflecting the conflicting requirements of its various stakeholders in government, business and user communities. Most have been part technical, part political.

“[T]he character of the Internet as we experience it today is, in fact, contingent on key decisions made in the past by its designers, those who have invested in it, and those who have regulated it,” wrote Clark in The Contingent Internet. “With different choices, we might have a very different Internet today.” It’s important to understand the impact of past decisions on today’s Internet as a lens through which to explore how decisions made today might shape its future, “including how open, how diverse, how funded, and how protective of the rights of its users it may be.”

Fundamentally, the Internet is a general purpose data network supporting a remarkable variety of applications. Being general purpose is a major design choice, best appreciated when considering the alternatives, such as the telephone network, which was designed specifically to carry telephone calls. This generality has enabled the Internet to become a powerful platform for innovation, but it’s come at a price: it’s good enough but not optimal for any one application.

Over the years, the Internet has faced a number of serious challenges. For example, until the adoption of IPv6 in the late 1990s, there was a serious concern that we would soon run out of IP addresses. IPv6 can theoretically support around 3.4×10³⁸ addresses, compared to the 4.3×10⁹ addresses previously supported by IPv4. Bandwidth has been another major concern, especially given the rapidly growing requirements for streaming high-quality video.
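
Those figures follow directly from the width of the address fields: IPv4 addresses are 32 bits wide and IPv6 addresses are 128 bits wide, so the two address spaces hold 2^32 and 2^128 addresses respectively. A quick back-of-the-envelope check in Python:

```python
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits wide.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128
print(f"IPv4: {ipv4_addresses:.1e} addresses")   # about 4.3 x 10^9
print(f"IPv6: {ipv6_addresses:.1e} addresses")   # about 3.4 x 10^38
```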

So far, the Internet has been up to its challenges. A major reason for its adaptability is that it’s stuck to its basic data-transport mission, i.e., just moving bits around. The Internet has no idea what the bits mean or what they’re trying to accomplish. That’s all the responsibility of the applications running on top of it.
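
As a toy illustration of that division of labor, - the port number and the tiny request/reply “protocol” below are made up for the example, - the transport layer simply delivers opaque bytes, and whatever those bytes mean is decided entirely by the application code at each end:

```python
import socket
import threading

ready = threading.Event()

def server():
    # The network stack just delivers bytes; only this application code
    # decides that b"TIME?" is a request and b"12:00" is a reply.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 9099))   # hypothetical local port
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024)
            conn.sendall(b"12:00" if request == b"TIME?" else b"UNKNOWN")

threading.Thread(target=server, daemon=True).start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", 9099))
    cli.sendall(b"TIME?")             # to TCP/IP these are just bits to move
    print(cli.recv(1024).decode())    # the client, not the network, interprets the reply
```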

Consequently, there’s no one overall owner responsible for security, - arguably the biggest challenge facing the Internet. The responsibility for security is divided among several actors, making it significantly harder to achieve. As Clark points out, “the design decisions that shaped the Internet as we know it likely did not optimize secure and trustworthy operation.”

“The design of the original Internet was biased in favor of decentralization of power and freedom to act,” notes Benkler in Degrees of Freedom, Dimensions of Power. “As a result, we benefited from an explosion of decentralized entrepreneurial activity and expressive individual work, as well as extensive participatory activity. But the design characteristics that underwrote these gains also supported cybercrime, spam, and malice.”

Centralized power is generally in the hands of governments or large companies; decentralized power tends to be distributed among individuals or loose collectives. It’s quite possible that a more centralized Internet architecture and/or governance could have avoided some of the major security, cybercrime and spam issues we currently face. But at what cost?

“To imagine either that all centralized power is good and all decentralized power is criminal and mob-like, or that all decentralized power is participatory and expressive and all centralized power is extractive and authoritarian is wildly ahistorical…” adds Benkler. “If we allow that power can be good or bad, whether centralized or decentralized, and that existing dynamics are tending toward greater centralization and stabilization of power, then we are left with a singular task: to design a system that will disrupt forms of power - old and new - as they emerge, and that will provide a range of degrees of freedom, allowing individuals and groups to bob and weave among the sources and forms of power that the Internet is coming to instantiate.”

Smartphones and Things

The majority of Internet users now connect via smartphones. There are already close to 3.5 billion Internet users and around 2.5 billion smartphone subscriptions around the world. Over the past decade, we’ve been transitioning from the connected Internet of PCs, browsers and web servers to the hyperconnected Internet of smart mobile devices, cloud computing and broadband wireless networks. This transition has major implications for the future of the Internet.

The connected phase of the Internet was based on an open business model. Neither PC nor server vendors, nor Internet service providers could control the applications offered on their platforms by independent software vendors (ISVs). This open model is now changing in the world of smartphones, app stores and cloud-based service providers. Smartphone vendors, - e.g., Apple, Google’s Android, Samsung, - can control the apps running on their platforms and distributed through their cloud-based app stores. Consequently, the smartphone-based Internet is becoming a less open platform, potentially controlled by a small set of large and powerful institutions.

Benkler fears that “While the Internet protocol itself remain open, as does the IETF [Internet Engineering Task Force], other control points counter the dynamics of the early Internet… Unless something dramatic changes these trends, the future of conscious Internet use is based in handheld devices running apps. Moreover, as connected sensors and controllers (origin of the Internet of Things as a concept) become pervasive, an increasingly larger portion of Internet use will not be conscious at all.”

The explosive growth of the Internet in the 1990s took the digital revolution to a whole new level. Now, the Internet is ushering in an equally impactful data revolution. Data is being generated by just about everything and everybody around us, including not only the growing volume of online and offline transactions, but also web searches, social media interactions, billions of smart mobile devices and tens of billions of IoT smart sensors.

“Who owns these data and how they are secured so that unauthorized actors do not have the capacity to act maliciously from a distance are central to the questions of security, privacy, and control,” note Benkler and Clark in their introduction. “No less important, a mixture of data-analysis techniques and the personalized data available from Internet use today makes data about individuals actionable.” Great care must be taken to prevent governments and companies from misusing the powerful insights embodied in all that data.

“The Internet started its life as public infrastructure, largely dedicated to communications among academic and public institutions… It has become indispensable to an ever growing range of human activity. Understanding the design challenges these changes pose, subjecting them to continuous critical reflection informed by real-world analysis of the rapidly changing character of the Internet, and insisting on open, rational, democratic debate over the implications of our choices is perhaps the most important role of academic reflection about the past and future Internet.”

The Rise of the Platform Economy

What do we mean by platform? I particularly like this definition by MIT Professor Michael Cusumano: “A platform or complement strategy differs from a product strategy in that it requires an external ecosystem to generate complementary product or service innovations and build positive feedback between the complements and the platform. The effect is much greater potential for innovation and growth than a single product-oriented firm can generate alone.”

The importance of platforms is closely linked to the concept of network effects - the more products or services a platform offers, the more users it will attract. Scale increases the platform’s value, helping it attract more complementary offerings, which in turn bring in more users, which then makes the platform even more valuable… and on and on and on.
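
A toy model helps make that feedback loop concrete. In the sketch below, - purely illustrative, with arbitrary starting values and growth coefficients that are assumptions rather than data from any study, - each side of the platform grows in proportion to the size of the other side, so the yearly gains keep getting larger as the platform scales:

```python
# Toy two-sided network-effects loop: users attract complementary offerings,
# and offerings attract users. All numbers are made up for illustration.
users, complements = 1_000.0, 50.0
for year in range(1, 6):
    new_users = 2.0 * complements      # more offerings draw in more users
    new_complements = 0.01 * users     # a bigger user base draws in more offerings
    users += new_users
    complements += new_complements
    print(f"year {year}: users ~{users:,.0f}, complementary offerings ~{complements:,.0f}")
```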

Platforms have long played a key role in the IT industry. IBM’s System 360 family of mainframes, announced in 1964, featured a common hardware architecture and operating system, enabling customers to upgrade their systems with no need to rewrite their applications. The ecosystem of add-on hardware, software and services that developed around System 360 helped it become the premier platform for commercial computing over the next 25 years.

In the 1980s, the explosive growth of personal computers was largely driven by the emergence of the Wintel platform based on Microsoft’s operating systems and Intel’s microprocessors, which attracted a large ecosystem of hardware and software developers.

The 1990s saw the commercial success of the Internet and World Wide Web, driving platforms to a whole new level. Internet-based platforms connected large numbers of PC users to a wide variety of web sites and online applications. The power of platforms has grown even more dramatically over the past decade, with billions of users now connecting via smart mobile devices to all kinds of cloud-based applications and services.

What’s the current state and growth potential of platform companies? How many large platforms are currently operating around the world? What’s their impact on established enterprises? These are among the questions addressed in a recent report, The Rise of the Platform Enterprise: A Global Survey, led by Peter Evans and Annabelle Gawer and sponsored by the Center for Global Enterprise. The report is based on a comprehensive survey of the 176 platform companies around the world with an individual valuation exceeding $1 billion. Their aggregate market value was over $4.3 trillion.

The study identified four major types of platforms.

Innovation platforms serve as the foundation on top of which developers offer complementary products and services. Innovation platforms enable the platform leaders to attract a very large pool of external innovators, in what is called an innovation ecosystem. S/360 and Wintel platforms developed such innovation ecosystems around mainframes and PCs respectively. More recently, Apple’s iOS and Google’s Android have established very large innovation ecosystems of app developers for their various mobile devices.

Transaction platforms help individuals and institutions find each other, facilitating their various interactions and commercial transactions. In the 1990s, the Internet led to the creation of e-commerce platforms, - e.g., Amazon, eBay, Ticketmaster, LL Bean, Lands End. The last few years have seen the emergence of so-called on-demand platforms, - e.g., Uber, Lyft, Airbnb, Zipcar, Etsy, - which enable the exchange of goods and services between individuals. These platforms are giving rise to a new class of on-demand companies, which are exerting considerable pressure on more traditional firms.

A few large companies offer the capabilities of both transaction and innovation platforms in their integration platforms. Apple and Google, for example, have established innovation platforms for their developer ecosystems, whose apps are then made available in their respective transactional platforms, - the App Store and Google Play. Similarly, Amazon and Alibaba serve as transactional platforms for their individual users, and as innovation platforms for the many vendors who also sell their wares on their e-commerce platforms.

Finally, some of the companies included in the survey are essentially investment platforms, which have invested in, and/or are managing, a portfolio of platform companies. The Priceline Group, for example, is focused on online travel and related services, including Priceline, Kayak and OpenTable.

While predominant in the US and China, platform companies have an increasing global presence. Of the 176 platform companies included in the survey, Asia has the largest number with 82, - 64 of which are in China. North America has 64, - 63 in the US. Europe is a major consumer of platform services, but it’s home to relatively few platform companies, - 27 spread across 10 countries, 9 of which are in the UK. Africa and Latin America have a number of small platform companies, only 3 of which have met the $1 billion valuation threshold for inclusion in the survey.

Asia has the largest number of platforms, but the US leads in aggregate market value. Of the $4.3 trillion total global valuation, US platform companies comprise over 70%, compared to Asia’s 22% and Europe’s 4%. The valuation of the 44 platform companies headquartered in the San Francisco Bay area alone represents over 50% of the global market value.

160 of the 176 companies included in the survey run transactional platforms, although since many of them are relatively small, their aggregate market value is around 25% of the total. The 5 platforms in the innovation category, - Microsoft, Oracle, Intel, SAP and Salesforce, - comprise 21% of the total market value. Not surprisingly, the 6 companies offering both innovation and transactional platforms are the giants of the industry, - Apple, Google, Facebook, Amazon, Alibaba and Xiaomi, - with almost 50% of the global market value.

Platform companies are major drivers of innovation. The top such companies, - e.g., Amazon, Google, Facebook, Apple, - are setting the standards for the digital transformation taking place around the world. Traditional companies are challenged to keep up or risk being left behind. A number of such companies have been setting up their own platforms, - often through acquisitions and alliances, - offering a variety of cloud-based apps and services, many of which are aimed at business (BtoB) clients.

For established firms, managing a platform ecosystem raises a number of organizational and governance issues, including “who has access to the platform, how to divide value, and how to resolve a conflict,” notes the CGE report.“The goal is to arrange complementors and consumer rules to maximize ecosystem profits…All of this must be done recognizing that the platform leader is orchestrating free agents rather than directing employees in a hierarchical command-and-control structure.”

“Approaches to platform governance must also consider the way value is created.While traditional business models would incent managers to maximize the price of each product or services, different approaches are needed to manage platforms…Pursuing broader ecosystem profits over specific products and services may only be achieved with significant changes to managerial incentives and organizational culture.”

“There is clearly a rising platform economy shaping our global business landscape and affecting the lives of citizens worldwide,” says the report in its concluding paragraphs. “This new form of organization seems to be a robust - some would even say dominant - form of business enterprise in the digital economy…While significant challenges lie ahead, the opportunities that platforms reveal are enormous, tapping into an unprecedented level of global Internet connectivity, and a large supply of talent and software skills…to develop the platforms of tomorrow.”

Leadership in a Changing Worldtag:typepad.com,2003:post-6a00d8341f443c53ef01b8d1930be8970c2016-02-02T06:11:02-05:002016-02-02T05:51:41-05:00What are the critical competencies needed to lead in our fast-changing business environment? This question was the focus of a study jointly conducted by the professional services firm Heidrick & Struggles and the University of Oxford Said Business School. The...IWB

“Declining tenure rates and levels of public trust suggest that CEO leadership has not kept pace with increased expectations,…” notes the report in its opening paragraphs. “Indeed, the role of the CEO is becoming more complex as competing and increasingly vocal stakeholders permeate organizations.” Leading in today’s world requires new thinking: “past experience is no longer a reliable guide for future action…success as a CEO today hinges on continual growth in the role, even more than on the preparation beforehand.”

The key challenge confronting CEOs, - and senior executives in general, - is the relentless pace of change. But, while speed is the major dimension of change, it’s not the only one. Beyond speed, executives must also focus on scope and significance. They must carefully consider a few fundamental questions about each:

Speed. Is the change evolutionary or revolutionary in nature? Sustaining or disruptive? While ignoring major changes is bad, acting before understanding their implications could be worse. Organizations need a modicum of stability to function effectively. Wise leaders consider the implications of a major change, formulate the appropriate strategy, and then act.

Scope. Is the change contained within a technology, product or business unit, or is it systemic in nature, that is, affecting the overall business, perhaps even a whole industry sector? While systemic changes require a business-wide strategy, others may generate substantial short-term attention but end up having a more contained impact. It’s not so easy to discern the scope of a potentially disruptive change in its early days.

Significance. Are we talking about a foundational change, e.g., the Internet, mobile devices, cloud computing? How deep is its potential impact on the business? Is it the kind of change that will inevitably happen, whether we like it or not, so in the end we have no choice but to embrace it? If so, how can we best embrace the change without abandoning our existing assets and customer base?

Company-wide, transformational innovations require the attention of the CEO and other senior executives, while evolutionary, incremental ones are best handled within a line of business. “Understanding the potential scope and significance of change, in addition to speed, allows CEOs to prioritize where and on what they spend their time. How far-reaching is it and how deeply will it shake the business?”

Leading in turbulent times is very difficult indeed. Based on its more than 150 CEO interviews, the study identified 4 critical competencies that transformational leaders must have: ripple intelligence, power of doubt, authenticity and balancing paradoxes. Let’s take a close look at each.

Ripple Intelligence. Ripple intelligence is “the ability to predict how trends and contexts may intersect, interact, and change direction, helping CEOs anticipate disruptions, make time to plan, and protect against unexpected events.” It’s the ability to see through the turbulence of multiple interacting changes, - as if they were ripples spreading across a pond. Ripple intelligence acts as a kind of early warning system, helping leaders understand the impact and risks of a systemic, fundamental transformation.

In general, executives get promoted because they’re good at managing operations. Operational excellence requires detailed analysis of technologies, quality, processes, competitors, customer satisfaction and market segments. But, the skills that served them well when dealing with relatively concrete, well-defined operations are much less effective when dealing with highly complex, fast-changing problems, where it’s often not clear what’s going on in the present, let alone how things will evolve going forward.

A more holistic, collaborative approach is now required to pull together everything that’s known about a problem.Leaders must be able to think systemically to help them appreciate what in the world is going on.“Anticipating complex interactions gives CEOs the time to plan a range of responses, and then choose and execute as appropriate.Faster identification of incoming ripples could help mitigate delays in execution.Moreover, developing a keener sense of ripple intelligence assists in strategy development, as it helps CEOs better understand risks and connect the dots to reveal new business opportunities at the confluence of multiple emergent trends.”

The Power of Doubt. “Doubts are to CEOs what nerves are to elite athletes: a source of focus and insight when harnessed constructively, a threat to peak performance when not.” We often think of doubt as a weakness to be overcome. Why then is doubt an important capability when facing really difficult decisions?

The power of doubt, the report explains, can best be appreciated by looking at a 2x2 matrix, with fearlessness and anxiety along one dimension, and knowing and not-knowing along the other. Understanding which quadrant a problem fits in helps executives better tailor how to go about making a difficult decision.

Fearlessness & Knowing:Key here is to avoid hubris and overconfidence by encouraging a diversity of views and open debate, and strongly challenging the information and advice they’ve been given to make sure it stands up to scrutiny.

Fearlessness & Not-Knowing:Sometimes we have a strong feeling about a decision, - i.e., we know what to do despite limited information, based on intuition rather than logical reasoning and analysis.Such cases call for additional gut checks, such as discussions with experts and other people whose opinions we trust, as well as additional preparations to protect against the negative consequences of a wrong decision.

Anxiety & Knowing:What if you’re not comfortable with the decision despite the availability of good information?This often happens with decisions on a subject beyond our comfort zone.Additional validation is then required to make sure that we are making the right decision, including discussions with trusted colleagues, mentors, friends or family; and benchmarking against what others have done under comparable circumstances.

Anxiety & Not-Knowing:What should you do when a decision must be made despite insufficient information and no good gut-instinct?Such cases call for expanding your horizons, gathering information from a wide selection of sources, and talking to a diverse set of people whose different points of view might shed light on the problem at hand.

Authenticity: The CEO interviews didn’t include questions dealing directly with authenticity. Yet the topic kept coming up, either explicitly or implicitly, in discussions of leadership qualities like integrity, honesty, personal values, self-awareness, and trustworthiness. As the report succinctly puts it - “authenticity is the fuel that drives trust.”

But, it’s not so easy to be an authentic leader. Stakeholders often view authenticity as a commitment to stay the course. This runs counter to the equally important requirement to be pragmatic and capable of adapting to a changing environment. Given the competing demands of an ever-widening group of stakeholders, leaders have to carefully navigate these authenticity-adaptability conflicts.

“The capacity to adapt authentically is critical for maintaining trust and buy-in when responding to competing demands and volatility.” Doing so successfully requires an overriding sense of purpose; without one, leaders risk becoming everything to everyone. In the end, authenticity means being true to one’s personal purpose while avoiding indiscriminate adaptation.

Balancing paradoxes: Balancing authenticity-adaptability conflicts is just one of the seemingly irreconcilable demands faced by leaders. Almost two-thirds of the CEOs interviewed raised the challenge of balance as a major concern.

“As more stakeholders make competing - yet equally valid - demands, CEOs face perplexing choices between right and right, as one put it, rather than simply right and wrong. It is these dilemmas that make decisions so vexing and alignment so difficult. How can CEOs give their various internal and external stakeholders confidence that they are choosing the right right, and get support for their decisions?”

To raise the organization’s confidence in their decisions, leaders must carefully balance the various personal paradoxes involved in the decision-making process, including:

patience - the right pacing or timing of decisions versus detrimental haste and hesitation.

“The CEOs we interviewed lead some of the largest companies around the world,” notes the H&S-Oxford report in its concluding section. “Collectively, their conduct and performance affect everyone: employees, investors, consumers - all citizens of an increasingly interconnected world. Future leaders therefore need to embrace the challenge - but especially the responsibility - of leadership to do justice to a multitude of accountabilities and societal expectations.”

Corporate Survival: Lessons from Biologytag:typepad.com,2003:post-6a00d8341f443c53ef01b8d18ecee2970c2016-01-26T06:07:25-05:002016-01-26T06:07:25-05:00Having lived through IBM’s near-death experience in the early 1990s, respect for the forces of the marketplace is edged deep in my psyche. It’s frankly sobering how many once powerful IT companies are no longer around or are shadows of...IWB

Having lived through IBM’s near-death experience in the early 1990s, respect for the forces of the marketplace is etched deep in my psyche. It’s frankly sobering how many once powerful IT companies are no longer around or are shadows of their former selves. The carnage might be more pronounced in the fast-changing IT industry, but no industry is immune. It would seem as if Darwinian principles apply in business almost as much as they do in biology.

A few weeks ago I discussed a paper published last year, The Mortality of Companies, by physicist Geoffrey West and his collaborators at the Santa Fe Institute.Based on their extensive analysis of data about publicly traded US companies, they were surprised to discover that a typical firm lasts about ten years before it gets merged, acquired or liquidated, and that a firm’s mortality rate is independent of its age, how well established it is or what it does.While beyond the scope of their study, the authors speculated that biological ecosystems are likely to shed valuable insights into their findings.

“Some business thinkers have argued that companies are like biological species and have tried to extract business lessons from biology, with uneven success,” note the authors.“We stress that companies are identical to biological species in an important respect: Both are what’s known as complex adaptive systems.Therefore, the principles that confer robustness in these systems, whether natural or manmade, are directly applicable to business.”

After analyzing the longevity of more than 30,000 public US firms over a 50-year span, the authors found that companies are disappearing faster than ever before. “Public companies have a one in three chance of being delisted in the next five years, whether because of bankruptcy, liquidation, M&A, or other causes. That’s six times the delisting rate of companies 40 years ago. Although we may perceive corporations as enduring institutions, they now die, on average, at a younger age than their employees. And the rise in mortality applies regardless of size, age, or sector. Neither scale nor experience guards against an early demise.”

Complex systems, whether natural, social or engineered, are composed of many parts. But it’s not the mere number of component parts that makes them complex. After all, a stone or a table is composed of huge numbers of molecules, yet we would not consider them complex. A truly complex system consists of many different kinds of parts, intricate organizations and highly different structures at different levels of scale. Humans, bacteria, advanced microprocessors, modern airplanes, global enterprises, urban environments and national economies are all examples of complex systems exhibiting these massive, heterogeneous, intricate characteristics.

What makes such systems complex is not their basic functionality but their adaptability, that is, the various mechanisms that help the system evolve and survive in a changing environment.But these adaptive mechanisms will in turn render the system more complex.They often lead to unintended consequences such as new failure modes, which are then corrected over time with additional adaptive mechanisms, which then further add to the complexity of the system, and so on and on and on.

The HBR paper notes that “companies are dying younger because they are failing to adapt to the growing complexity of their environment. Many misread the environment, select the wrong approach to strategy, or fail to support a viable approach with the right behaviors and capabilities.”

Corporate mortality is further exacerbated by three broad trends: a more diverse, harsher and less predictable business environment; the faster pace of technological innovation, which forces companies to adapt to new business cycles at about twice the rate of 30 years ago; and the increasing integration of business ecosystems and global markets, which, while good for economic vitality, adds to the risks of system-wide shocks.

“In a complex adaptive system, local events and interactions among the agents, whether ants, trees, or people, can cascade and reshape the entire system - a property called emergence.The system’s new structure then influences the individual agents, resulting in further changes to the overall system. Thus the system continually evolves in hard-to-predict ways through a cycle of local interactions, emergence, and feedback…In business we see workers and management, through their local actions and interactions, shape the overall structure, behavior, and performance of a firm…Whether we look at team dynamics, the evolution of strategies, or the behavior of markets, the pattern of local interactions, emergence, and feedback is apparent.”

What does this mean for business leaders? Drawing on their research at the intersection of business, biology, and complex systems, the authors identify six key principles to help increase the robustness of complex business systems.

Maintain heterogeneity of people, ideas, and endeavors. Diversity is key to evolutionary adaptation in biology. Similarly, diversity increases the flexibility of business systems, helping them adapt to unpredictable changes from within or outside their industry that might render their business models obsolete. While diversity might decrease short-term efficiency, it’s absolutely essential for a company’s survival in the longer term.

Diversity is necessary, but not sufficient. A commitment to innovation is essential. While it may not always help the business stay out of serious trouble, anticipating major changes and understanding the firm’s various options will help the company adapt to new market realities. Doing so effectively requires the kind of leadership that will attract and retain the best possible talent and create a culture of innovation that encourages them to come up with creative solutions to the challenges the firm faces.

Sustain a modular structure of loosely connected components. The more integrated its components, the higher the risks that a shock in one part of the system will rapidly spread across to other parts. A modular adaptive system, that is, one consisting of more loosely connected components, is in a better position to resist the spread of shocks and make the overall system more robust.

But, there are trade-offs. Within a company, integration across business units or geographical regions can enhance information flows, efficiency gains, and innovation. So can integration with other business stakeholders in the broader ecosystem. However, such tighter interconnections also amplify risk, making the company vulnerable to adverse events. “Despite the trade-offs, modularity is a defining feature of robust systems. A bias against it for the sake of short-term gains carries long-term risks.”

Preserve redundancy among components. Over millions of years, biological organisms have developed redundant mechanisms to increase their chances of surviving disease or accidents. When one mechanism fails, another can step in. The immune system, for example, has developed multiple, overlapping lines of defense against pathogens of all sorts.

Companies often view redundancy as the antithesis of efficiency. But, as business systems become more complex, the interactions between their internal components and those of the environment in which they operate dominate the overall design. Serious problems and system-wide shocks can increasingly occur from the unanticipated interactions among these various components. Resiliency must therefore be built into the design of the system, including alternative mechanisms to keep the system operating despite accidents or shocks. A resilient company must make the proper trade-offs between efficiency and redundancy.

Expect surprise, but reduce uncertainty. Few things are harder to predict than the potential impact of new technologies, business models or market changes. Every company, no matter how large or small, should view itself as being but one asteroid away from trouble or oblivion, even more so now given our accelerated pace of change. But while the future is indeed unpredictable, we can continuously do the equivalent of scanning the sky to see if any asteroids might be heading our way.

Doing so requires talent and R&D investments. Talented people can generally anticipate disruptive changes years before they happen through their own research as well as their interactions with research communities, universities and startups. They can take preemptive actions against potential threats by developing alternative products and business strategies.

Create feedback loops and adaptive mechanisms. Natural selection is the process by which “traits that enhance survival and reproduction become more common in successive generations of a population.” Mutation is nature’s way of conducting many diverse experiments, and natural selection is the feedback mechanism used to amplify those traits most likely to survive.

Companies have traditionally relied on forecasting and planning to develop their long-term strategies. But such exercises are less relevant and reliable in a fast-changing, unpredictable business world. They must instead embrace a more iterative innovation process based on marketplace experimentation, feedback loops and adaptive mechanisms, - the business equivalent of natural selection.

Mutual trust is also essential to the robustness of complex adaptive systems. The individual interests of a firm will often conflict with the interests of the overall business system. If enough individual firms pursue their own selfish interests, the overall system becomes weaker and everyone suffers. While seeking to maximize their own profits, leaders should consider how their firms are adding value and contributing to the overall business ecosystem.

“Rising corporate mortality is an increasing threat, and the forces driving it - the dynamism and complexity of the business environment - are likely to remain strong for the foreseeable future,” note the authors in conclusion… “Understanding the principles that confer robustness in complex systems can mean the difference between survival and extinction.”

The (Uneven) Digitization of the US Economytag:typepad.com,2003:post-6a00d8341f443c53ef01b8d18ab215970c2016-01-19T06:24:49-05:002016-01-19T06:24:49-05:00Digital technologies are all around us, - increasingly ubiquitous and commoditized. But, are they a major source of competitive differentiation? Are they still a strategic value to business? Can digital innovation drive long term economic growth? Several weeks ago, the...IWB

Digital technologies are all around us, - increasingly ubiquitous and commoditized. But, are they a major source of competitive differentiation? Are they still of strategic value to business? Can digital innovation drive long-term economic growth?

Several weeks ago, the McKinsey Global Institute (MGI) published a report addressing these questions. Digital America: A tale of the haves and have-mores aims to quantify the state of digitization of the US economy. The report introduces the MGI Industry Digitization Index, a methodology for exploring the various ways US companies are going about their digital journey, - based on 27 indicators that measure how they’re building digital assets, expanding digital usage, and creating a digital workforce.

“Digital innovation, adoption, and usage are evolving at a supercharged pace across the US economy,” notes the report in its opening paragraph. “As successive waves of innovation expand the definition of what is possible, the most sophisticated users have pulled far ahead of everyone else in the race to keep up with technology and devise the most effective business uses for it.”

Let me discuss a few of the findings in this extensive report.

The ICT (information and communications technologies) sector is the engine of digitization for the broader economy. ICT made up approximately 5% of the US GDP in 2014. But, unlike the prices of goods and services in most other industries, ICT prices declined by 63% between 1983 and 2010, driven by the dramatic advances in digital technologies over that time span. Accounting for this price decline and adjusting for its impact on other sectors, MGI estimates that the ICT sector represents around 10% of the 2014 US GDP.

Digitization, like electricity a century ago, is a general-purpose technology that underpins a very large share of economic activity. Digital technologies now touch the economy in myriad ways. Some 64% of adults use smartphones and 84% use the Internet. Usage has been steadily going up in digital applications like e-commerce, social media, digital payments, video streaming, e-filed tax returns and online freelance work.

The 27 metrics of the Industry Digitization Index are organized into three major categories: digital assets - a measure of the digital share of a company’s total spending and assets; digital usage, - e.g., digital transactions, external communications, customer service, back- and front-office processes; and digital labor, - e.g., digital expenditures and assets per worker, share of tasks and jobs that are digital. The Index is then used to quantify the extent of digitization in 22 industry sectors.
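To make the mechanics of such a composite index concrete, here is a minimal Python sketch. The three category names follow the report, but the individual indicators, their values and the equal weighting are purely hypothetical illustrations of how 27 indicators might be rolled up into three category scores and a single sector score; MGI’s actual indicator definitions, normalization and weights are not reproduced here.

# Hypothetical sketch: roll indicator scores (0-100) up into category and sector scores.
# The three categories mirror the MGI Index; the indicators, numbers and equal weights are made up.
def category_score(indicators):
    """Average a category's indicator scores (each normalized to 0-100)."""
    return sum(indicators.values()) / len(indicators)

sector = {
    "digital assets": {"digital share of capital stock": 40, "digital share of spending": 35},
    "digital usage":  {"digital transactions": 55, "digital customer service": 30},
    "digital labor":  {"digital spend per worker": 25, "share of digital tasks": 20},
}

category_scores = {name: category_score(ind) for name, ind in sector.items()}
sector_index = sum(category_scores.values()) / len(category_scores)

print(category_scores)          # per-category scores
print(round(sector_index, 1))   # overall digitization score for this hypothetical sector

The same roll-up, applied consistently across 22 sectors, is what makes the cross-sector comparisons in the report possible.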

The report’s overriding finding is that the US economy is digitizing unevenly, with large disparities among sectors. Not surprisingly, ICT is the most digitized economic sector, followed by media, professional services and financial services. On the other hand, health care, hospitality, construction and agriculture are the least digitized sectors.

A number of sectors are poised to significantly increase their digital capabilities over the next several years. The rapid growth of the Internet of Things should help industries like manufacturing, utilities and mining by enabling them to digitize and interconnect their physical assets. Smart devices and sensors of all sorts will also assist in the digitization of labor-intensive sectors like retail and health care.

“The gap between those on the frontier and the rest of the economy is about the sophistication of digital usage,” notes the report.“Much has been written over the years about the digital divide and the Americans who remain offline, but now a new and more pervasive dynamic appears to be at work.The gap between the digital haves and have-mores is growing as the most advanced users pull away from everyone else.They have moved beyond expanding access and adding users; now they are focused on deepening engagement and capabilities.”

The Index quantifies this growing gap between the most digitized sectors, - the have-mores, - and the rest, - the haves. In 1997, the haves’ overall Index was only 8% of the have-mores’ Index. By 2005, the have-mores’ Index had increased by a factor of 1.7, and by 2013 it was over 4 times its 1997 value. The Index of the haves also increased in 2005 and 2013, but still comprised only a fraction of the have-mores’ Index, at 12% and 14% respectively.

“The have-mores continue to push the boundaries of digitization, particularly in terms of augmenting what their workers do, while everyone else scrambles to keep up with them.This gap points to substantial room for much of the economy to boost productivity.In fact, since some of the lagging sectors are the largest in terms of GDP contribution and employment, we find that the US economy as a whole is reaching only 18 percent of its digital potential (defined as the upper bounds of digitization in the leading sectors).”

A similar conclusion was reached by Erik Brynjolfsson and Adam Saunders in their 2009 book Wired for Innovation: How Information Technology is Reshaping the Economy. “Although some say that technology has matured and become commoditized in business, we see the technological revolution as just beginning. Our reading of the evidence suggests that the strategic value of technology to business is still increasing. For example, since the mid 1990s there has been a dramatic widening in the disparity in profits between the leading and lagging firms in industries that use technology intensively (as opposed to producing technology). Non-IT intensive industries have not seen a comparable widening of the performance gap - an indication that deployment of technology can be an important differentiator of firms’ strategies and their degree of success…”

“The companies with the highest returns on their technology investments did more than just buy technology; they invested in organizational capital to become digital organizations. Productivity studies at both the firm level and the establishment (or plant) level during the period 1995-2008 reveal that the firms that saw high returns on their technology investments were the same firms that adopted certain productivity-enhancing business practices.”

What about the impact of digitization on future economic growth? McKinsey estimates that digitization could increase the 2015 GDP by over $2 trillion based on its impact on three major areas of the economy:

Multifactor productivity: Big data and analytics, IoT, mobile devices, cloud computing, AI and other technology advances could lead to major new innovations, faster product development, improved energy efficiency and smarter overall operations across just about all industry sectors.

But the report also warns that companies must adapt or risk being left behind in our rapidly advancing digital economy, and lists some of the most pressing issues that companies need to consider to keep up, including:

Prepare for tougher, 360-degree competition. Sector boundaries mean little in a digital world where new competitors can become market leaders practically overnight.

Take advantage of new innovation models. Embrace open, collaborative innovation across the whole organization, supply chain partners, research communities and customers.

Emphasize agility and learning. In a fast-changing marketplace, agility is more critical than long-term forecasting exercises.

Think differently about your workforce. To keep up with fast-changing technologies, companies need to invest in talent and learning programs.

“Keeping up with the relentless pace of digital innovation is both a sprint and a marathon… There is no room for inertia on the digital frontier. It takes investment, agility, and relentless focus to stay ahead, but the organizations and individuals that can establish themselves as digital leaders can find outsized opportunities.”

The Blockchain and Open Innovationtag:typepad.com,2003:post-6a00d8341f443c53ef01b7c7fef1f3970b2016-01-12T05:58:14-05:002016-01-12T05:47:41-05:00Transformational innovations don’t always play out as originally envisioned. Once in the marketplace, they seem to acquire a life of their own. Lest we forget, the Internet started out as a DARPA sponsored project aimed at developing a robust, fault-tolerant...IWB

Transformational innovations don’t always play out as originally envisioned. Once in the marketplace, they seem to acquire a life of their own. Lest we forget, the Internet started out as a DARPA-sponsored project aimed at developing a robust, fault-tolerant computer network. ARPANET was launched in 1969, and by the mid-1980s, it had grown and evolved into NSFNET, a network widely used in the academic and research communities. And, the World Wide Web was first developed by Tim Berners-Lee at CERN in the late 1980s to facilitate the sharing of information among researchers around the world. They’ve both gone on to change the world, - to say the least.

The blockchain first came to light around 2008 as the architecture underpinning bitcoin, the best known and most widely held digital currency. But, as with the Internet, the Web and other major technologies, the blockchain has now transcended its original objective. It has the potential to revolutionize the finance industry and transform many aspects of the digital economy.

Two press announcements released in mid-December are serious milestones in its evolution. Let me explain.

The Internet and World Wide Web introduced a culture of standards to the IT industry. Until the early 1990s, it was quite difficult to get different IT systems to talk to each other. Just sending an e-mail across two different applications from two different vendors was quite a chore, as was sharing information across disparate systems.

Then in the 1990s, the open Internet protocol, - TCP/IP, - was widely embraced by the general IT marketplace, making it possible to interconnect systems and applications from any vendors. Internet e-mail protocols, - SMTP, MIME, POP, IMAP, - enabled people to easily communicate with anyone on any system. At the same time, the Web’s open set of standards, - HTML, HTTP, URLs, - and the easy-to-use, graphical Mosaic web browser enabled any PC connected to the Internet to access information on any web server anywhere in the world.

Much of the success of the Internet and Web is due to the international organizations created to oversee their evolution, - IETF, the Internet Engineering Task Force, and W3C, the World Wide Web Consortium, founded in 1986 and 1994 respectively. Similarly, Linux has been successfully overseen by the Linux Foundation, whose antecedent organizations were established in 2000. More recently, the cloud community came together to develop OpenStack, an open source cloud computing platform, and in 2012 founded the OpenStack Foundation. In addition to developing standards and organizing promotional activities, these various organizations make available open source implementations of their software releases, thus encouraging collaborative, open innovation.

In the last few years, the blockchain has been gathering momentum beyond its original bitcoin role, - promising, in particular, to propel the ledger into the Internet age. Ledgers constitute a permanent record of all the economic transactions an institution handles, whether it’s a bank managing deposits, loans and payments; a brokerage house keeping track of stocks and bonds; or a government office recording births and deaths, the ownership and sale of land and houses, or legal identity documents like passports and driver licenses. They’re one of the oldest and most important concepts in finance and other mission-critical transaction applications.

A 2014 report by the Bank of England notes that the classic ledger that’s long been used in payment systems has not changed much since the 16th century. Financial institutions have transformed their paper-based ledgers into highly sophisticated IT applications and databases, but while their ledgers and transaction applications are now digital, their underlying structure has not changed much. Each institution continues to own and manage its own proprietary ledger, synchronizing its records with those of other institutions as appropriate, - a cumbersome process that often takes days. As the Bank of England reminds us, for this system to work, “people must trust that these centralized ledgers will be maintained in a reliable, timely and honest manner.”

With a blockchain-based financial system, users own their personal assets and payments are made directly between payer and payee, removing the risk that a financial institution may become insolvent.The main responsibility of the institutions involved is to oversee the trustworthiness, security and efficiency of the distributed ledger system, ensuring that the cryptographic technologies and protocols have been properly implemented. And, like the Internet, such a distributed system should be significantly more resilient than our current banking systems, given the large numbers of redundant blockchains and pathways in the network.

The blockchain offers a way for ledgers to now follow the collaborative, distributed, standard-based approach of the Internet and other major digital innovations, enabling financial institutions “to implement a fully decentralized payment system, in which copies of the ledger are shared between all participants, and a process is established by which users agree on changes to the ledger (that is, on which transactions are valid). Since anybody can check any proposed transaction against the ledger, this approach removes the need for a central authority and thus for participants to have confidence in the integrity of any single entity.”
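To make the idea of a shared, independently checkable ledger concrete, here is a minimal Python sketch of a hash-chained ledger. It illustrates only the general blockchain idea described above, not the designs of any particular system; in particular, it omits the consensus process by which distributed participants agree on which transactions get appended, and the transaction fields are made-up examples.

import hashlib, json

def block_hash(block):
    """Hash a block's contents, including the hash of the previous block."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append(ledger, transaction):
    """Append a transaction, chaining it to the previous block's hash."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    block = {"tx": transaction, "prev": prev}
    block["hash"] = block_hash({"tx": transaction, "prev": prev})
    ledger.append(block)

def verify(ledger):
    """Any participant can recompute the chain and detect tampering."""
    prev = "0" * 64
    for block in ledger:
        if block["prev"] != prev or block["hash"] != block_hash({"tx": block["tx"], "prev": block["prev"]}):
            return False
        prev = block["hash"]
    return True

ledger = []
append(ledger, {"from": "Alice", "to": "Bob", "amount": 100})
append(ledger, {"from": "Bob", "to": "Carol", "amount": 40})
print(verify(ledger))   # True; altering any earlier block breaks every hash that follows

Because every participant holds a copy and can rerun verify(), no single central authority has to be trusted to keep the record honest, which is precisely the property the Bank of England passage highlights.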

On December 17, R3 CEV, a consortium of banks working on blockchain standards, announced that 42 banks have now joined its distributed ledger initiative, including most of the world’s global banks. Starting in January, R3 plans to invite the broader financial services community to join the initiative. In a separate press release the same day, the Linux Foundation announced a new collaborative effort to develop an enterprise-grade, open source distributed ledger framework based on blockchain technologies. R3 and a number of banks have committed to join this project, as have several technology companies including Accenture, Cisco, IBM, Intel and VMware.

“Many of the founding members are already investing considerable research and development efforts exploring blockchain applications for industry,” said the Linux Foundation announcement. “IBM intends to contribute tens of thousands of lines of its existing codebase and its corresponding intellectual property to this open source community. Digital Asset is contributing the Hyperledger mark, which will be used as the project name, as well as enterprise grade code and developer resources. R3 is contributing a new financial transaction architectural framework designed to specifically meet the requirements of its global bank members and other financial institutions…”

“With distributed ledgers, virtually anything of value can be tracked and traded. The application of this emerging technology is showing great promise in the enterprise. For example, it allows securities to be settled in minutes instead of days. It can be used to help companies manage the flow of goods and related payments or enable manufacturers to share production logs with OEMs and regulators to reduce product recalls…”

“Distributed ledger systems today are being built in a variety of industries but to realize the promise of this emerging technology, an open source and collaborative development strategy that supports multiple players in multiple industries is required. This development can enable the adoption of blockchain technology at a pace and depth not achievable by any one company or industry. This type of shared or external Research & Development (R&D) has proven to deliver billions in economic value.”

It’s too early to know if the blockchain will become another major transformational innovation. But these two recent announcements represent a serious milestone in that direction. As we’ve seen with other successful innovations, collaborations among its technology developers and users are absolutely necessary to get the architecture right, as are open standards, open source implementations, and a governance process embraced by all. In a short number of years, the blockchain has already transcended its original objectives. It will be very interesting to see how it all plays out in the years to come.

Why People and Companies Die While Cities Keep Growingtag:typepad.com,2003:post-6a00d8341f443c53ef01b8d18730e9970c2016-01-05T06:30:46-05:002016-01-05T06:30:46-05:00The Winter issue of strategy+business includes a very interesting article on theoretical physicist Geoffrey West, - The Fortune 500 Teller. Dr. West is Distinguished Professor and Past President of the Santa Fe Institute (SFI), a non-profit research institute dedicated to...IWB

About 20 years ago, West got interested in whether some of the techniques and principles from the world of physics could be applied to complex biological and social systems. In particular, he wondered if we could apply empirical, quantifiable and predictive scientific methods to help us better understand complex biological organisms and social organizations like cities and companies.

In the 1990s, his attention first turned to biology. There are enormous variations in the characteristics of living creatures, their life spans, pulse rates, metabolism, and so on. How do these characteristics change with body size? Why do human beings live roughly 80 to 100 years, while mice live only two to three years? Are there some common principles that apply to all living creatures regardless of size? Can we find empirical mathematical models that might allow scientists to ask big questions about life, aging and death?

As it turns out, such a model was developed in the 1930s by biologist Max Kleiber. Kleiber observed that for the vast majority of animals, their metabolic rate, the amount of energy expended by the animal, is proportional to its mass M raised to the ¾ power, that is M¾. Kleiber’s Law applies to an amazing range of sizes, from bacteria to blue whales.

Because the scaling is sublinear, that is, the exponent is less than 1, larger species are more efficient than smaller ones, needing less energy per pound. While an elephant is 10,000 times the size of a guinea pig, it needs only 1000 times as much energy. Furthermore, the bigger the organism, the longer it lives, and the longer it takes to grow and mature, all predicted by the same sublinear power law. This simple scaling applies to a large number of physiological variables besides metabolic rate, including how long the organism lives, how long it takes to mature, its growth rate and so on.
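The arithmetic behind the elephant example is easy to check. Here is a short Python sketch; the masses are illustrative round numbers chosen only so that the size ratio is exactly 10,000, not actual animal weights.

# Kleiber's Law: metabolic rate B scales as mass M to the 3/4 power, B = c * M**0.75.
guinea_pig_kg = 1.0        # illustrative round number
elephant_kg = 10_000.0     # illustrative round number, 10,000x the mass

mass_ratio = elephant_kg / guinea_pig_kg          # 10,000x the mass
metabolic_ratio = mass_ratio ** 0.75              # 10,000^0.75 = 1,000x the energy
per_kg_ratio = metabolic_ratio / mass_ratio       # ~0.1: a tenth of the energy per kilogram

print(mass_ratio, round(metabolic_ratio), round(per_kg_ratio, 2))   # 10000.0 1000 0.1

The sublinear exponent is what produces the efficiency gain: the energy ratio (1,000) grows much more slowly than the mass ratio (10,000).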

Along with SFI colleagues, West studied these scaling laws, and concluded that they were due to the internal structure that makes life possible, - the nutrient networks that have to reach every cell and capillary in a living organism. They modeled such networks, assuming that evolution would arrive at the most efficient structures possible, and came up with the ¾ power scaling between metabolic rate and mass that Max Kleiber had empirically observed in the 1930s.

Their theory also explained why organisms grow rapidly when young but eventually stop growing. As the number of cells doubles, the number of capillaries rises by only 75 percent. As the organisms grow larger, the delivery system fails to keep up with the growth in cells, so eventually the growth must stop.

He then turned his attention from biology to social organizations. Could you view cities and companies as large-scale organisms, each with its own internal infrastructure connecting all its various components? Do similar scaling laws apply? “Is New York just actually, in some ways, a great big whale? And is Microsoft a great big elephant?”, he asked in a fascinating video conversation, - Why Cities Keep Growing, Corporations and People Always Die, and Life Gets Faster. “Metaphorically we use biological terms, for example the DNA of the company or the ecology of the marketplace. But are those just metaphors or is there some serious substance that we can quantify with those?”

West and his SFI collaborators analyzed an extensive body of data about cities around the world to explore the scaling relations between population and a wide range of attributes, including energy consumption, economic activity, demographics, infrastructure, innovation, employment, and patterns of human behavior.

They discovered that the measurable infrastructure of cities, - e.g., the lengths of roadways and electrical lines, the number of gas stations, - scales sublinearly, just like in biological organisms, but with a scaling factor of .85. That means that if the population of a city doubles, its infrastructure needs to grow by a factor of only about 1.8 rather than 2. This .85 scaling factor was true for cities of any size across the world as well as for any measurable infrastructure. Cities are real centers of sustainability, enjoying a roughly 15% benefit in economies of scale.

The results were different for socioeconomic measures associated with the interactions of city residents. They also scale with population, but instead of following a sublinear scale of .85, socioeconomic attributes scale at a superlinear factor of 1.15.

“That says that systematically, the bigger the city, the more wages you can expect, the more educational institutions in principle, more cultural events, more patents are produced, it's more innovative and so on,” explains West in the aforementioned video conversation. “Remarkably, all to the same degree. There was a universal exponent which turned out to be approximately 1.15 which translated to English says something like the following: If you double the size of a city from 50,000 to a hundred thousand, a million to two million, five million to ten million, it doesn't matter what, systematically, you get a roughly 15 percent increase in productivity, patents, the number of research institutions, wages and so on, and you get systematically a 15 percent saving in length of roads and general infrastructure.”

“However, some bad and ugly come with it. And the bad and ugly are things like a systematic increase in crime and various diseases, like AIDS, flu and so on. Interestingly enough, it scales all to the same 15 percent, if you double the size. Or put slightly differently, another way of saying it is, if you have a city of a million people and you broke it down into ten cities of a hundred thousand, you would require for that ten cities of a hundred thousand, 30 to 40 percent more roads, and 30 to 40 percent general infrastructure. And you would get a systematic decrease in wages and productivity and invention. Amazing. But you’d also get a decrease in crime, pollution and disease, systematically. So there are these trade-offs.”
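A quick back-of-the-envelope check of these claims is sketched below in Python. The 0.85 and 1.15 exponents come from the work described above; the population figures are just examples, and the exact per-doubling factors (about 1.8 and 2.2) are what the rounded "15 percent" rule of thumb in the quotes is summarizing.

# Urban scaling: infrastructure ~ N**0.85 (sublinear), socioeconomic outputs ~ N**1.15 (superlinear).
def scale(ratio, exponent):
    """Growth factor of a quantity when the population grows by 'ratio'."""
    return ratio ** exponent

doubling = 2.0
print(round(scale(doubling, 0.85), 2))   # ~1.80: a doubled city needs well under twice the infrastructure
print(round(scale(doubling, 1.15), 2))   # ~2.22: its socioeconomic outputs grow faster than its population

# Splitting one city of 1,000,000 into ten cities of 100,000 costs extra infrastructure:
one_big = 1_000_000 ** 0.85
ten_small = 10 * (100_000 ** 0.85)
print(round(ten_small / one_big, 2))     # ~1.41: roughly 40% more roads and general infrastructure overall

The last number matches the "30 to 40 percent more roads" figure in the quote: ten small cities need about 41% more total infrastructure than one large city of the same combined population.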

West and his SFI colleagues then went on to explore another social organization - the firm. How come cities live forever, while companies do not? What’s the lifespan of a typical firm? Do scaling laws apply to companies? To find answers to these questions, they analyzed data about 30,000 publicly traded US companies from 1950 to 2009 across multiple industry sectors. Earlier this year, they published their findings in The Mortality of Companies.

The study found that scaling laws apply to companies as well. But unlike cities, which produce more per capita as they grow bigger, companies scale sublinearly, becoming somewhat less efficient as they get bigger. Revenue per employee and profits as a percentage of sales also decrease systematically, - with the exception of outliers like Apple.

Most surprisingly, their analysis led to a result that no one had predicted. Using a statistical technique called survival analysis, the SFI researchers discovered that “a firm’s mortality rate - its risk of dying in, say, the next year - had nothing to do with how long it had already been in business or what kinds of products it produced.” With the exception of outliers that’ve been around for a very long time, - DuPont was founded in 1802, Citi in 1812, GE in 1892, IBM in 1911, - “the team estimated that the typical company lasts about ten years before it’s bought out, merges, or gets liquidated.”
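For readers who want the math behind "mortality independent of age": a risk of dying that stays the same every year is exactly what an exponential survival curve describes. The short Python sketch below simply plugs in the roughly ten-year figure as the half-life to show the shape of such a curve; it is an illustration of constant-hazard survival in general, not a reproduction of the study's estimates.

import math

# With a constant hazard, survival follows S(t) = exp(-hazard * t),
# and the half-life is ln(2) / hazard, regardless of how old the firm already is.
half_life_years = 10.0                      # the "typical company lasts about ten years" figure
hazard = math.log(2) / half_life_years      # ~0.07, i.e. roughly a 7% chance of disappearing each year

for t in (5, 10, 20):
    print(t, round(math.exp(-hazard * t), 2))   # ~0.71, 0.5, 0.25 of firms still "alive" after t years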

The study showed that mergers and acquisitions are a more common reason for a company’s disappearance than outright liquidation. 45% of firms cease to exist because they’re either acquired or involved in a merger, so they persist in some form as part of some other entity. Rather than being a business failure, a merger or acquisition may actually make the resulting organization stronger and more productive.

Why does the typical firm live around 10 years irrespective of how well-established it is or what it actually does? The reasons for this finding are beyond the study’s scope, although the article hints that the competition for resources in biological ecosystems might provide some sort of insight. Perhaps companies can be seen as “competing for finite resources within a complex ecology of economic interactions… and their longevity is the result of their successes of learning and adaptation in these environments.”

“West sees his role as continually prodding others to look deeper, to apply more mathematical rigor, and to try to understand the big picture of life in scientific terms,” notes the s+b article in conclusion. “Otherwise, he says, ‘one is doomed to failure.Understanding is critical to mitigating problems, to innovating, to sustainability.You can argue that is why we are here.We are the universe’s way of knowing itself, we are that vehicle, and to play a role in that is wonderful.’”

Learning to Apply Data Science to Business Problemstag:typepad.com,2003:post-6a00d8341f443c53ef01bb089e2cb1970d2015-12-29T07:02:37-05:002015-12-29T07:02:37-05:00This past semester I was involved in an interesting course at MIT’s Sloan School of Management, - Analytics Labs (A-Lab). A-Lab’s objective is to teach students how to use data sets and analytics to address real-world business problems. Companies submit...IWB

This past semester I was involved in an interesting course at MIT’s Sloan School of Management, - Analytics Labs (A-Lab).A-Lab’s objective is to teach students how to use data sets and analytics to address real-world business problems. Companies submit project proposals prior to the start of the class, including the business problem to be addressed and the data on which the project will be based. Students are then matched with the project they’re most interested in and grouped into teams of 3-4 students.

A-Lab received over 20 project proposals from different companies, of which 13 were selected by the students. Each project team was assigned a research mentor to provide guidance as appropriate. I mentored a 3-student team that worked on a project sponsored by MasterCard. The students explored the possibility of improving on predictions of the economic performance of emerging markets by coupling existing economic indicators data with consumer behavior based on MasterCard’s transaction data. This is a particularly interesting project because economic data in emerging markets is often not as reliable as the data in more advanced markets.
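To give a flavor of what "coupling" two data sources looks like in practice, here is a heavily simplified Python sketch: a plain least-squares model fed both an official statistic and card-spend features. The feature names, the toy numbers and the model choice are all hypothetical illustrations; they are not the team's actual data or method, which remain confidential.

import numpy as np

# Hypothetical example: predict a growth indicator from an official statistic plus card-spend features.
# Columns: [official_indicator, card_spend_growth, cross_border_tx_share]  (all toy numbers)
X = np.array([
    [2.1, 0.05, 0.10],
    [3.4, 0.12, 0.18],
    [1.2, 0.01, 0.07],
    [4.0, 0.15, 0.22],
    [2.8, 0.08, 0.12],
])
y = np.array([2.3, 3.8, 1.0, 4.4, 3.0])   # toy "actual" growth outcomes

# Ordinary least squares via numpy, with an intercept column added to the features.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

new_market = np.array([1.0, 2.5, 0.09, 0.14])   # intercept term plus features for a new observation
print(coef)                                      # fitted weights on each signal
print(float(new_market @ coef))                  # combined-signal prediction

The interesting question in such a project is whether the transaction-based columns add predictive power beyond the official indicator alone, which is exactly where unreliable official statistics in emerging markets leave room for improvement.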

But more important for the students, the various A-Lab projects served as a concrete learning experience on what data science is all about, - how to leverage messy, incomplete, real-world data to shed light on a complex and not-so-well-defined problem.

A-Lab is taught by professors Erik Brynjolfsson and Sinan Aral. The course was first given in 2014, so this is only its second year. The 2015 Syllabus offers a good overview of the class, including the various companies that submitted project proposals. Projects are considered confidential unless the companies involved give permission to talk about them publicly, as several did in 2014.

Amazon, for example, sponsored a project on how to raise the share of wallet of Amazon Prime customers, based on the analysis of over 200 million anonymized data points. And IBM sponsored a project to uncover a potential Watson application. The students recommended using Watson as a kind of regulatory analyst assistant, to help financial institutions better understand how to comply with the over 1700 pages of regulations in the Dodd-Frank legislation.

The 2015 A-Lab class culminated with short presentations of each of the student projects before an audience that included company sponsors and mentors.As I listened to the 13 presentations, I was impressed by the potential of applying big data and analytics to all kinds of business problems, from the tactical decisions every business makes as part of its normal operations to the strategic decisions they must also make to help them better compete in a fast-changing marketplace.But, at the same time, the presentations once more reminded me that we’re still in the early stages of data science as a profession and academic discipline. We still have much to learn.

In recent years, data science has emerged as a hot new profession.A 2012 Harvard Business Review article named Data Scientist: the Sexiest Job of the 21st Century in its very title.Its authors, Tom Davenport and D. J. Patil, defined data scientist as “a high-ranking professional with the training and curiosity to make discoveries in the world of big data,… Their sudden appearance on the business scene reflects the fact that companies are now wrestling with information that comes in varieties and volumes never encountered before.”

The demand for data scientists has been racing ahead of supply. People with the necessary skills are scarce, primarily because the discipline is so new. Universities only started offering courses and advanced degrees in data science in the past 5 years.

Furthermore, data science is a highly complex discipline, a veritable mashup of several different fields. The data part of its name refers to acquiring, ingesting, transforming, storing and retrieving vast volumes and varieties of information, whereas the science part seeks to extract insights from the data by applying tried-and-true scientific methods, that is, empirical and measurable evidence subject to testable explanations and predictions.

It’s very exciting to contemplate the emergence of a major new discipline. It reminds me of the advent of computer science in the 1960s and 1970s. In its early years, the field attracted people from a variety of other disciplines who started out using computers in their work or studies, and eventually switched to computer science from their original field.

This was the case with me. I used computers extensively while a physics student at the University of Chicago in the 1960s. When the time came to look for a job, I realized that I had enjoyed the computing side of my research more than the physics, and in 1970 joined the computer science department at IBM’s Watson Research Center.

Like data science, computer science also had its roots in a number of disciplines, primarily math for its more theoretical foundations, and engineering and business for its more applied aspects. Computer science became an established, widely accepted discipline around the mid-late 1970s, and expanded in multiple directions over the next decades with the advent of personal computers in the 1980s and the Internet in the 1990s. Computing has become an integral part of many disciplines, given that digital technologies now permeate just about every nook and cranny of business and society.

As has been the case with IT over the past several decades, we’re now seeing the growing value of capturing as data many aspects of business, society, and of our very lives that’ve not been quantified before. Data is now being generated by just about everything and everybody around us, including web searches, social media interactions, financial transactions, mobile devices and IoT smart sensors. All this data is enabling us to better understand the world’s physical, economic and social infrastructures, and to infuse information-based intelligence into every aspect of their operations. It’s making it possible not just to better understand what’s happening in the present, but also to make more accurate predictions about the future.

One of the most exciting parts of data science is that it can be applied to many domains of knowledge, given our newfound ability to gather valuable data on almost any topic. But, doing so effectively requires domain expertise to identify the important problems to solve in a given area, the kinds of questions we should be asking and the kinds of answers we should be looking for, as well as how to best present whatever insights are discovered so they can be understood by domain practitioners in their own terms. We’re just learning how to do this.

“The growth in big data and analytics is transforming decision-making, operations, marketing, finance, and product innovation,” notes the A-Lab Syllabus as it describes the objectives of the course. “Businesses across the world are wrestling with challenges and opportunities that call for the application of analytics. We are on the cusp of a second machine age - a digital era that holds opportunities and challenges for both individuals and the economy. Workers and professionals in all fields are racing to acquire the skills and capabilities necessary to survive and thrive in this digital revolution.” It’ll be fascinating to see where this will lead us as the discipline matures over the next few decades.