A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

April 02, 2018

One can look at the transformational impact of IT on companies and industries through a variety of lenses, e.g., products and services, revenue and profit, sales and support, market strategy. One of the most important such lenses is the impact of IT on business processes, that is, on how work is actually done across the various functions of an organization.

Examining the impact of IT over the past several decades from a process point of view, one can identify three distinct phases: first came process automation in the early years of IT, followed around the 1990s by enterprise-wide process reengineering and management. The third phase, now starting, is focused on the processes and transactions that determine how institutions interact with each other around the world. Let me discuss each of these phases.

March 26, 2018

How does human intelligence work, in biological as well as in engineering terms? And how can we use such an understanding of human intelligence to build wiser and more useful machines? On February 1, MIT launched the Intelligence Quest (MIT IQ), an initiative aimed at addressing these big questions by advancing the science and engineering of both human and machine intelligence. MIT IQ aims “to discover the foundations of human intelligence and drive the development of technological tools that can positively influence virtually every aspect of society.”

MIT has been deeply involved in artificial intelligence since the field’s inception in the 1950s. MIT professors John McCarthy and Marvin Minsky were among the founders and most prominent leaders of the new discipline. AI was one of the most exciting areas in computer science in the 1960s and 1970s. Many of the AI leaders in those days were convinced that a machine as intelligent as a human being would be developed within a couple of decades. They were trying to do so by somehow programming the machines to exhibit intelligent behavior, even though to this day we have little idea what intelligence is about, let alone how to translate intelligence into a set of instructions to be executed by a machine. Eventually, all these early AI approaches met with disappointment and were abandoned in the 1980s. After years of unfulfilled promises, a so-called AI winter of reduced interest and funding set in that nearly killed the field.

AI was reborn in the 1990s when it adopted a more applied, engineering-oriented paradigm. The new AI paradigm enabled computers to acquire intelligent capabilities by ingesting and analyzing large amounts of data using powerful computers and sophisticated algorithms. Instead of trying to explicitly program intelligence, this new approach was based on feeding lots and lots of data to the machine, and then letting the algorithms discover patterns and extract insights from all that data.
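To make the contrast with explicit programming concrete, here is a minimal sketch of that data-driven approach, my own illustration using scikit-learn rather than anything from the post: no digit-recognition rules are written by hand; the model simply fits itself to labeled examples.

```python
# A minimal sketch of the data-driven approach: instead of hand-coding rules
# for recognizing digits, we show the algorithm labeled examples and let it
# discover the patterns itself. (scikit-learn is used purely for illustration.)
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()                      # ~1,800 labeled 8x8 images of handwritten digits
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=5000)   # no digit-recognition rules written by hand
model.fit(X_train, y_train)                 # "learning" = fitting parameters to the data

print(f"Accuracy on unseen examples: {model.score(X_test, y_test):.2f}")
```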

Such a data-driven, machine learning approach produced something akin to intelligence or knowledge. Moreover, unlike the explicit programming-based approaches, the statistics-based ones scaled very nicely: the more data you had, the more powerful the supercomputers, and the more sophisticated the algorithms, the better the results. Machine learning and related advances like deep learning have played a major role in AI’s recent achievements.

March 19, 2018

The January-February issue of the Harvard Business Review spotlights The Culture Factor, with five articles on the subject. “Culture is the tacit social order of an organization: It shapes attitudes and behaviors in wide-ranging and durable ways,” notes the issue’s lead article. “Cultural norms define what is encouraged, discouraged, accepted, or rejected within a group. When properly aligned with personal values, drives, and needs, culture can unleash tremendous amounts of energy toward a shared purpose and foster an organization’s capacity to thrive.”

In Who Says Elephants Can't Dance?, his excellent chronicle of the IBM transformation he led as Chairman and CEO from 1993 to 2002, Lou Gerstner wrote: “I came to see in my time at IBM that culture isn’t just one aspect of the game - it is the game. In the end, an organization is nothing more than the collective capacity of its people to create value. Vision, strategy, marketing, financial management - any management system, in fact - can set you on the right path and can carry you for a while. But no enterprise - whether in business, government, education, health care or any area of human endeavor - will succeed over the long haul if those elements aren't part of its DNA.”

March 12, 2018

More than ever, scientific and technological innovations touch every aspect of our lives and influence the choices we make in a wide variety of areas. A 2015 study by the Pew Research Center revealed that most Americans have a positive view of science and of the accomplishments of scientists. But, despite broadly similar views about the place of science in the country, there were large differences between the general public and scientists across a number of important issues, including genetically modified foods, vaccine safety and climate change.

What do we mean by science? “The term science encompasses a range of disciplines in the physical, social, and life sciences, along with applied fields, such as engineering and medicine,” notes the report. “Science can be defined as ‘the use of evidence to construct testable explanations and predictions of natural phenomena, as well as the knowledge generated through this process.’” But, one must keep in mind that the word science is interpreted differently by different individuals, e.g., as medical treatments, technological advances, or fundamental research. “When interpreting the available survey data, it is important to consider the range of responses a question might prompt.”

March 05, 2018

A killer app is an IT application whose value is simple to explain, whose use is fairly intuitive, and which turns out to be so useful that it helps propel the success of a product or service. Spreadsheets and word processing were major factors in the 1980s adoption of personal computers, for example. A decade later, e-mail and the World Wide Web played crucial roles in the mainstream success of the Internet. While there are a number of strong candidates for smartphone killer apps, navigation and music are among the most widely used.

Blockchain has been in the news lately, but beyond knowing that it has something to do with payments and digital currencies, most people don’t know what blockchain is or why they should care. A major part of the reason is that we still don’t have the kind of easy-to-explain blockchain killer-apps that propelled the Internet forward.

Blockchain has yet to cross the chasm from technology enthusiasts and visionaries to the wider marketplace that’s more interested in business value and applications. There’s considerable research on blockchain technologies, platforms and applications as well as market experimentation in a number of industries - roughly where the Internet was in the mid-late 1980s: full of promise but still confined to a niche audience.

In addition, outside of digital currencies, a somewhat exotic topic of interest to a relatively small audience, blockchain applications are primarily aimed at institutions. And, given that blockchain is all about the creation, exchange and management of valuable assets, its applications are significantly more complex to understand and explain than Internet applications.

The management of information is quite different from the management of transactions. The latter, especially for transactions dealing with valuable or sensitive assets, requires deep contractual negotiations among companies and jurisdictional negotiations among governments. Moreover, since blockchain is inherently multi-institutional in nature, its applications involve close collaboration among companies, governments and other entities.

But, despite its market acceptance, a recent McKinsey report found that AI adoption is still at an early, experimental stage, especially outside the tech sector. Based on a survey of over 3,000 AI-aware C-level executives across 10 countries and 14 sectors, the report found that 20 percent of respondents had adopted AI at scale in a core part of their business, 40 percent were partial adopters or experimenters, while another 40 percent were still waiting to take their first steps.

The report adds that the gap between the early AI adopters and everyone else is growing. While many companies have yet to be convinced of AI’s benefits, leading edge firms are charging ahead. Companies need to start experimenting with AI and get on the learning curve, or they risk falling further behind.

AI will likely become the most important technology of our era as it’s improved upon over time, but we’re still in the early stages of deployment. It’s only been in the last few years that complementary innovations, especially machine learning, have taken AI from the lab to early marketplace adopters. And, history shows that even after technologies start crossing over into mainstream markets, it takes considerable time, often decades, for the new technologies and business models to be widely embraced by companies and industries across the economy.

February 19, 2018

The January 21 issue of the NY Times Magazine included a very good article, Beyond the Bitcoin Bubble, by science writer Steven Johnson. His article aims to explain what blockchain, cryptocurrencies and related technologies are all about as well as their potential impact on the economy and society. These are not only complex but esoteric subjects, as nicely illustrated in this funny video. While blockchain and Bitcoin have been frequently in the news lately, beyond knowing that they have something to do with money, most people don’t really know what they are or why they should care.

This is not surprising. In its early years, it’s hard for most people to appreciate the potential of a transformative technology. There’s generally a considerable time lag between when we start reading articles about a technology’s exciting potential and its adoption by companies and industries across the marketplace. This is particularly the case for general purpose technologies, like electricity, the Internet and blockchain, which are the most potentially transformative due to their widespread use. Their deployment time lags are longer because attaining their full benefits requires a number of complementary co-inventions and investments, including additional technologies, applications, processes, business models, and regulatory policies.

February 12, 2018

It wasn’t that long ago that we didn’t have much use for probabilities in our daily life, outside of weather reports, election predictions, baseball and financial markets. But that’s all changed with the growing datafication of the economy and society. Probabilities have been playing an increasing role in our work and personal life given our newfound ability to quantify just about anything. In all kinds of everyday situations, from medical diagnoses to financial decisions, we now have to accept the fact that it’s impossible to predict what will actually happen. Instead, we have to get used to living in a complex world of uncertainties and probabilities. We have to learn how to deal with the very messy world of big data, and how to best apply our learning to make good decisions and predictions.

Physics went through such a transition about 100 years ago. In the deterministic world of classical mechanics, there’s always a real truth. The same objects, subject to the same forces, will always yield the same results. Elegant mathematical models can be used to make perfect predictions within the accuracy of their human-scale measurements. Early in the 19th century, French mathematician and scientist Pierre-Simon Laplace observed that if we knew the precise state of the universe as represented by the position and speed of every one of its particles, classical mechanics would enable us to calculate all past and future states of the universe.

But, this predictable world began to fall apart in the early 20th century. 19th century classical physics was replaced by a kind of 20th century magical mystery tour, ruled by the new principles of quantum mechanics and relativity. Classical mechanics could not explain the counter-intuitive and seemingly absurd behavior of energy and matter at atomic and cosmological scales.

At these scales, the world is intrinsically unpredictable. Instead of a deterministic world, we now have a world based on probabilities. You cannot predict all the future states of an electron, for example, based on its present state. You can map out its behavior, but only as probability distributions of all the possible states it could be in. Moreover, the Heisenberg uncertainty principle tells you that it’s impossible to know the exact state of a particle. You cannot simultaneously determine its exact position and velocity, no matter how good your measurement tools are.
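The trade-off behind that last statement can be written compactly. A standard textbook form of the uncertainty principle (my addition, not part of the original post) is:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

where Δx is the uncertainty in position, Δp the uncertainty in momentum, and ℏ the reduced Planck constant: the more precisely one quantity is pinned down, the less precisely the other can be known.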

February 05, 2018

Will there be enough work in the future? Opinions are fairly divided between those who believe that technology advances will reduce human jobs, and those who believe that technology advances will produce as many jobs as they displace. It’s easier to predict the jobs that will be automated away by technology, but much more difficult to predict the new jobs that these same technologies will create. In the end, we don’t really know.

In December of 2017, the McKinsey Global Institute published Jobs Lost, Jobs Gained: Workforce Transition in a Time of Automation, a report that directly addresses this question. The McKinsey study examined in great detail the work that’s likely to be displaced by automation through 2030, as well as the jobs that are likely to be created over the same period. It analyzed data from 46 countries comprising almost 90 percent of global GDP, focusing particularly on six countries: China, Germany, India, Japan, Mexico and the US. For each of these six countries, the study modeled the potential for employment changes in more than 800 occupations based on different scenarios for the pace of automation adoption and for future labor demand.

The report’s overall conclusion is that a growing technology-based economy will create a significant number of new occupations, as has been the case in the past, which will more than offset declines in occupations displaced by automation. However, “while there may be enough work to maintain full employment to 2030 under most scenarios, the transitions will be very challenging - matching or even exceeding the scale of shifts out of agriculture and manufacturing we have seen in the past.”

January 29, 2018

The 2018 MIT CIO Symposium will take place on May 23. This year’s Symposium will be focused on the challenges of executing a digital transformation strategy. It’s no longer enough to have formulated a digital strategy for the business. As a number of recent reports have indicated, leading edge companies are pulling ahead of everyone else by pushing the boundaries of digitization. And, having a strong digital foundation gives those companies a leg up in embracing data analytics, AI, blockchain, and other advanced technologies, further widening their advantage. “The time for merely thinking digital has passed; the future belongs to the doers,” notes the Symposium’s website.

Why do so many organizations struggle to implement their well-formulated strategies, especially transformational strategies like the transition to digital, which encompass new technologies, business models and management practices? This is a question I’ve long been thinking about, having lived through IBM’s near-death experience in the early 1990s.

Is it that, in spite of all their efforts in strategy formulation, their management was unable to anticipate the new technology and market changes and was caught by surprise? Or is it that, like actors in a kind of Greek tragedy, they saw the changes coming and understood what had to be done, but were somehow unable to execute their strategies?

MIT’s Donald Sull has been exploring these questions for over 20 years. As this 2015 HBR article notes, we know a lot more about strategy than about translating strategy into results. “Books and articles on strategy outnumber those on execution by an order of magnitude…A recent survey of more than 400 global CEOs found that executional excellence was the number one challenge facing corporate leaders in Asia, Europe, and the United States, heading a list of some 80 issues, including innovation, geopolitical instability, and top-line growth. We also know that execution is difficult. Studies have found that two-thirds to three-quarters of large organizations struggle to implement their strategies.”

January 22, 2018

This past October, the Pew Research Center released Automation in Everyday Life, a report on what Americans think about advanced technologies like artificial intelligence and robotics, and the impact they expect them to have on their everyday lives. The report is based on a national survey of 4,135 randomly selected American adults. To help gauge their opinions on such complex topics, the survey questions were framed around four specific scenarios related to these advanced technologies: driverless cars, workplace automation, robot caregivers, and computer algorithms that evaluate and hire job applicants.

“Americans anticipate significant impacts from various automation technologies in the course of their lifetime,” noted the report. “Although they expect certain positive outcomes from these developments, their attitudes more frequently reflect worry and concern over the implications of these technologies for society as a whole.”

January 15, 2018

In 2008 I gave a talk at a conference on The Promise and Reality of Cloud Computing. In his closing remarks, the conference organizer noted that most everyone had agreed that something big and profound was going on, but they weren’t quite sure what it was they were excited about. “There is a clear consensus that there is no real consensus on what cloud computing is,” he said.

At the time, most people associated cloud with utility computing and IT-as-a-Service. Technology writer Nicholas Carr was also a speaker at the conference. In his talk, based on his then recently published book The Big Switch, Carr nicely framed the historical shift to cloud computing, comparing the evolution of IT to that of power plants. In the early days, companies usually generated their own power with steam engines and dynamos. But with the rise of highly sophisticated, professionally run electric utilities, companies stopped generating their own power and plugged into the newly built electric grid.

IT was then undergoing a similar transformation, as cloud service providers were starting to offer IT-as-a-Service with near unlimited scalability at very attractive prices, based on a flexible pay-as-you-go delivery model. Early cloud adopters were mostly focused on improving the economics of IT, giving them the ability to launch new offerings or expand their current ones without major investments in additional IT infrastructure while paying for whatever cloud resources they actually used.

Cloud continued to evolve and advance over the ensuing years. As an article noted about a 2013 cloud conference: “There wasn’t a single speaker who started off a session by saying, ‘Let's define cloud computing.’ That gets tiresome when seen in session after session, year after year, so its absence is gratefully received. This is a clear indication that the industry has moved beyond elementary knowledge-gathering and onto the practicalities associated with cloud implementation and rollout.” Still, debates continued on the true definition of cloud computing. Do private and hybrid clouds qualify or only public ones?

The paper’s authors, MIT professor Erik Brynjolfsson, MIT PhD candidate Daniel Rock, and University of Chicago professor Chad Syverson, note that “aggregate labor productivity growth in the U.S. averaged only 1.3% per year from 2005 to 2016, less than half of the 2.8% annual growth rate sustained from 1995 to 2004…What’s more, real median income has stagnated since the late 1990s and non-economic measures of well-being, like life expectancy, have fallen for some groups.”

This productivity puzzle isn’t confined to AI. “Few topics in economics today in most large economies generate as much debate as the productivity puzzle,” said McKinsey in a March 2017 report. “In the United States, productivity growth has declined sharply since 2004 yet digital technology has been widely apparent during this period…The answer to this puzzle holds the key to future prosperity because now more than ever our economy depends on productivity improvements for long-term economic growth. Economists have proposed competing explanations for declining productivity growth and so far have failed to reach a consensus.”

January 02, 2018

Just about every industry has been significantly transformed in the past few decades. But few have been as disrupted as the music industry. Everything seems to be changing at once, from the way content is produced and delivered, to the sources of revenue and profits. Digital technologies, the Internet, smartphones, cloud computing and more, have literally turned dollars into pennies. Now, blockchain and related technologies may once more play a major role in the music industry, this time helping to turn those pennies back into dollars.

We’re truly surrounded by music as never before, in a wide variety of styles; in physical and digital formats; over the Internet, satellite, and broadcasts; in mobile devices and home music systems. But, the shift from physical to digital, and then from downloads to streaming, has wreaked havoc on the business of music. US retail revenues of recorded music were close to $14 billion in 1998 before starting their decline. According to the Recording Industry Association of America (RIAA), revenues fell from roughly $12 billion in 2006 to around $7 billion in 2010. They stayed flat at $7 billion through 2015, starting to increase in 2016 mostly due to a growth in paid streaming music subscriptions. Revenues are expected to be around $8 billion in 2017.

Turning dollars into pennies succinctly captures the music industry’s precipitous revenue declines. As this article noted, “During the heyday of the 80s, the music industry was rolling in so much money it could afford to leave some on the table… which it often did…There were so many dollars coming in, they didn’t bother paying attention to the pennies. Well, as we all know those days are gone. And after a decade of losses from its traditional revenue stream (sales), we’ve seen the music industry forced to look for revenue in new places and in different ways. In the absence of dollars, the pennies now matter.”

As a result, the music industry now finds itself trying to figure out how to keep track of all those pennies in the hope of turning them back into dollars. In the first half of 2017, streaming services accounted for 62% of the total market, with digital downloads and physical sales accounting for 19% and 16% respectively, said the RIAA. But streaming pays royalties in fractions of pennies, so legislation, new business models and technological innovations are needed to make sure that all the necessary data are efficiently collected so the various artists involved in a song are paid.
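A back-of-the-envelope sketch helps convey what “fractions of pennies” means in practice. The per-stream rate and album price below are assumed, illustrative figures, not RIAA numbers or the actual rates of any particular service:

```python
# Back-of-the-envelope illustration of "fractions of pennies": the per-stream
# royalty and album price below are assumed ballpark figures, for illustration only.
per_stream_royalty = 0.004   # assumed payout per stream, in dollars
album_price = 10.00          # assumed revenue from one album download or CD, in dollars

streams_needed = album_price / per_stream_royalty
print(f"Streams needed to match one ${album_price:.2f} album sale: {streams_needed:,.0f}")
# -> roughly 2,500 streams under these assumptions, each generating a tiny payment
#    that has to be tracked and split among the artists involved in the song
```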

“The most important and innovative industries and the most talented, most ambitious, and wealthiest people are converging as never before in a relative handful of leading superstar cities that are knowledge and tech hubs,” wrote Florida. “This small group of elite places forge ever forward, while most others struggle, stagnate, or fall behind. This process is one I like to call winner-take-all urbanism.”

Winner-take-all urbanism is another manifestation of the winner-take-all economics of the past few decades, in which a relatively small number of players reap a very large share of the rewards, upending whole industries with their outsized returns.

New technologies have been replacing workers and transforming economies over the past two centuries. But, over time, these same technologies led to the creation of whole new industries and new jobs. While the technologies of the industrial economy helped to make up for our physical limitations, the technologies of the digital economy are now enhancing our cognitive capabilities. They’re being increasingly applied to activities that not long ago were viewed as the exclusive domain of humans. Will the AI revolution play out like past technology revolutions, short-term disruptions followed by long-term benefits, or will this time be different?

Conference participants generally agreed that AI will have a major impact on jobs and the very nature of work. But, for the most part, they viewed AI as mostly augmenting rather than replacing human capabilities, automating the more routine parts of a job and increasing the productivity and quality of workers, so they can focus on those aspects of the job that most require human attention. Overall, a small percentage of jobs will be fully automated, while many more will be significantly transformed.

Conference participants also generally agreed that the more advanced AI-based transformations will not happen rapidly, but are likely decades away. Much progress has been recently made in the ability to extract features from all the data we now have access to, as well as in machine learning algorithms that give computers the ability to learn by ingesting large amounts of data instead of being explicitly programmed. While such statistical pattern recognition approaches can be applied to many tasks, they’re no substitute for model formation, the main approach used by humans, from toddlers to physicists, to understand how the world works. We’re a long way from the development of AIs that truly learn and reason like people.

December 11, 2017

Over the past few years, Industry 4.0, aka the 4th Industrial Revolution, has been the subject of several articles, studies, and surveys, as well as a book published earlier this year by Klaus Schwab, founder and executive chairman of the World Economic Forum (WEF). They all attempt to describe and put a name to the disruptive changes taking place all around us within the context of the past 250 years.

“Industrial revolutions are momentous events,” said A Strategist’s Guide to Industry 4.0, a 2016 article in strategy+business. “By most reckonings, there have been only three.” The First Industrial Revolution, starting in the last third of the 18th century, introduced new tools and manufacturing processes based on steam and water power, ushering in the transition from hand-made goods to mechanized, machine-based production. The Second, starting a century later, brought us steel, cars, chemicals, petroleum, electricity, the telephone and radio while creating the age of mass production. The Third, following World War II, saw the advent of computers, digital technologies, the IT industry, and the automation of processes in just about all industries.

There’s general agreement that the 4th Industrial Revolution is primarily driven by a fusion of once separate technologies that, when joined together, are integrating the physical and digital worlds. But, there’s a spectrum of opinions as to the scope of its impact, some arguing that Industry 4.0 applies primarily to manufacturing technologies and industries, while others argue that it’s profoundly transforming our economies and societies.

According to Perez, since the onset of the Industrial Revolution we’ve had 5 major technology-driven economic cycles, each one lasting roughly 50-60 years. First was the age of machines, factories and canals starting in 1771. This was followed by the age of steam, railways, iron and coal, starting in 1829; steel, electricity and heavy engineering in 1875; oil, automobiles, and mass production in 1908; and our present ICT-based digital age starting in 1971.

Each economic cycle is composed of two very different periods, each lasting roughly 20 to 30 years. The installation period is the time of creative destruction, when new technologies emerge from the labs into the marketplace, entrepreneurs start many new businesses based on the new technologies, VCs encourage business model experimentation, and the new ventures attract considerable investments and financial speculation. Inevitably this all leads to a financial bubble, which eventually bursts in spectacular fashion, leading to a time of crisis.

After the crash comes the deployment period, which Perez views as a time of economic transformation and institutional recomposition. The now well accepted technologies and business paradigms become the norm; infrastructures and industries start getting better defined and more stable; and production capital drives long-term growth and expansion by spreading and multiplying the successful business models.

“Perhaps the most critical function of any organization or society is its decision systems,” wrote Pentland. “In modern societies, decision systems provide a leader the ability to make informed and timely decisions, supported by a complex enterprise of distributed information and communication systems that provide situational awareness. Traditionally, decision systems have been confined to individual physical domains, such as logistics, physical plant, and human resources, and more recently virtual domains such as cyber, by both policy and technology, resulting in challenges with the integration of information across disparate domains.”

But, despite the increasingly complex decisions that organizations are called upon to make, decision-making remains human-intensive and anecdotal. Few organizations have applied social network analysis to help them scale the size and expertise of the decision-making group. Nor have they integrated the large amounts of data, analytical tools and powerful AI systems now at our disposal into their decision-making systems.

According to Arthur, the digital revolution has morphed through three distinct eras over the past several decades. The first era, in the 1970s and 1980s, brought us Moore’s law and the dramatic advances in semiconductor technologies. From mainframes and supercomputers to PCs and workstations, IT was now being used in a wide variety of applications, from financial services and oil exploration, to computer-aided design and office systems. “The economy for the first time had serious computational assistance.”

Then came the second era in the 1990s and 2000s, which enabled us to link computers together, share information, and connect digital processes. “Everything suddenly was in conversation with everything else,” giving rise to “the virtual economy of interconnected machines, software, and processes…, where physical actions now could be executed digitally.”

We’re now in the third era, which began roughly in the 2010s. It’s brought us smartphones, ubiquitous sensors, IoT devices and oceans and oceans of data. Powerful computers and intelligent algorithms are enabling us to make sense of all that data by searching for patterns and doing something with the results, including computer vision, natural-language processing, language translation, face recognition and digital assistants.

November 06, 2017

The MIT Initiative on the Digital Economy (IDE) was organized in 2013 by Erik Brynjolfsson and Andy McAfee to examine the impact of digital technologies on the world. Understanding the future of work and jobs is one of the major areas of research being addressed by the IDE. What will the workforce of the future look like? Where will jobs come from in the coming years, especially for the workers most impacted by automation? How can we accelerate the transformation of institutions, organizations, and human skills to keep up with the quickening pace of digital innovation?

To help come up with breakthrough, real-world answers to these tough questions, IDE launched the MIT Inclusive Innovation Challenge (IIC) last year. The Challenge aims to identify, celebrate and award prizes to organizations around the world that are developing innovative approaches for improving the economic opportunities of middle- and base-level workers.

Now in its second year, the IIC introduced the winners of its 2017 competition at an event held on October 12 as part of Boston’s annual HUBweek. Over $1 million was awarded to the winners in four categories: job creation and income growth, skills development and matching, technology access, and financial inclusion. The grand prize winner in each category received $150,000, while the three runners-up in each category received $35,000. The awards were funded with support from Google.org, The Joyce Foundation, ISN and Joseph Eastin.

October 30, 2017

Gross domestic product (GDP), the basic measure of a country’s overall economic output, is generally used by governments to inform their policies and decisions. “What we measure affects what we do; and if our measurements are flawed, decisions may be distorted,” noted a 2009 Commission convened to look at the adequacy of GDP as an indicator of economic performance, which was led by Nobel-prize winning economists Joseph Stiglitz and Amartya Sen.

GDP is essentially a measure of production. While suitable when economies were dominated by the production of physical goods, GDP doesn’t adequately capture the growing share and variety of services and the development of increasingly complex solutions in our 21st century digital economy.

Digital Spillover, a recent report co-developed by Huawei and Oxford Economics, explores how to better define and measure the true impact of the digital economy. It argues that the scope of the digital economy is expanding and proposes a novel way of measuring its impact.

“A truly digital economy is one in which businesses from across the industrial spectrum are investing in digital and making the most productive use of it,” notes the report. “The mechanisms by which this is happening are complex and evolving. Over and above the direct productivity boost that companies enjoy from digital technologies, a more profound chain of indirect benefits also takes place, as the impact spills over within a firm, to its competitors, and throughout its supply chain. These digital spillover effects materialize through numerous channels, and are integral to understanding the role digital technologies play in the economy.”

October 23, 2017

Two years ago, a group of leading CEOs from around the world met in China to discuss the major issues facing their companies in the global marketplace. The meeting was organized by the Center for Global Enterprise (CGE), the nonprofit research institution founded by former IBM Chairman and CEO Sam Palmisano to study the contemporary corporation, globalization, economic trends, and their impact on society. The CEOs at the meeting in China concluded that their highest research priority was the evolution toward Digital Supply Chains (DSC), and their potential to transform organizations and the conduct of business around the world. In response, the Digital Supply Chain Institute (DSCI) was launched a few months later.

In October of 2016, the DSCI, in partnership with CREATe.org, published its first research paper, Digital Supply Chains: A Frontside Flip. The paper explains the key differences between traditional and digital supply chains, and provides practical advice to help companies prepare for the digital supply chains of the future.

The traditional supply chain has evolved over the years as a crucial process used by many companies for the production, handling, and/or distribution of their products or services. Traditional supply chains generally comprise five separate back-office functions: transport, warehousing, purchasing, marketing and finance. Their key objective has been to improve the efficiency of getting goods and services from suppliers to customers, through a series of intermediate steps including manufacturers, distributors and retailers. But, these supply chains were quite fragmented, with limited interactions and information sharing among their various functions and steps.

Over the past couple of decades, the explosive growth of the Internet has made it much easier for companies to transact with each other around the world. The connectivity and universal reach of the Internet has enabled companies to integrate and better coordinate all their various processes, as well as to go beyond the boundaries of the firm and develop highly sophisticated global supply chains. Vertically integrated firms have evolved into virtual enterprises, increasingly relying on supply chain partners for many of the manufacturing and services functions once done in-house.

The DSCI paper argues that it’s time to focus on the transformation of the whole supply chain process, not just on further increases in efficiency. The traditional linear nature of supply chains is increasingly misaligned with the more networked nature of today’s global production.

October 16, 2017

Earlier this year the communications firm Burson-Marsteller along with research firm PSB interviewed 1,500 Americans from all walks of life to shed light on their feelings about the current state of the economy and their expectations for the future. “Americans are concerned about the present, but optimistic about the future,” was the survey’s overriding finding. “However, opinions about the economy and the future are somewhat driven by education level.”

While only 33% said that the country is headed in the right direction, 67% said that they were somewhat or very optimistic about the future. To a large extent, this optimism is based on the belief that their jobs are safe over the next 5 years, with 62% believing that they won’t be laid off and 60% that a machine couldn’t replace them. In their opinion, the high cost of living, unemployment and jobs, the gap between rich and poor, and taxes are the top issues facing the US economy, ahead of corporate corruption, loss of manufacturing jobs, and falling or stagnant wages by a fairly wide margin.

Their feelings about the future differed by educational level. 71% of Americans with a college education or more said that they have the right skills to succeed in the 21st century, compared to 42% of those with a high school education or less. Only 14% of survey participants with a college education worried that their jobs could be automated within five years, compared with 30% of those with high school or less. And 55% of Americans with college or more felt that technology will improve overall employment, compared to 45% of those with high school or less.

Let me attempt to explore these questions based on relatively recent academic research in three key areas: gender-based behavioral differences; how to create smarter working groups; and the changing skills requirements in the digital economy.

“Business leaders are scrambling to adjust to a world few imagined possible just a year ago,” notes the article’s opening sentence. “The myth of a borderless world has come crashing down. Traditional pillars of open markets - the United States and the UK - are wobbling, and China is positioning itself as globalization’s staunchest defender. In June 2016, the Brexit vote stunned the European Union, and the news coverage about globalization turned increasingly negative in the U.S. as the presidential election campaign progressed.”

What should companies do? Should they pack up and return home? Is this the end of globalization, only a dozen years after Thomas Friedman’s The World is Flat became a prominent international best-seller? “Not according to my research,” says Ghemawat.

I now want to focus my attention on the provocative questions raised in the book’s latter chapters. “In this era of powerful new technologies, do we still need companies?” How are firms likely to evolve given the continuing advances of Internet-based technologies, as well as emerging blockchain-based technologies such as distributed ledgers and smart contracts? “Are companies passé” in an increasingly decentralized economy?

Following the Great Depression and WW2, the country welcomed the stability promised by corporate capitalism. Big, multinational companies dominated most industries - GM, Ford and Chrysler in cars, Citibank, American Express and Morgan Stanley in finance, and Esso/Exxon, Mobil and Texaco in oil and gas. It was an era characterized by bureaucratic corporate cultures, not unlike military hierarchies, focused on organizational power and orderly prosperity.

This all started to change a few decades later with the advent of a more innovative, fast-moving entrepreneurial economy. The 1980s saw the rise of young, high-tech companies, e.g., Microsoft, Apple, Oracle and Sun Microsystems, and Silicon Valley became the global hub for innovation, emulated by regions around the world. Turtlenecks and jeans replaced blue suits, white shirts, and ties.

The advent of the Internet and e-commerce pushed all these trends into hyperdrive in the 1990s. The Internet made it much easier for companies to transact with each other around the world. Vertically integrated firms evolved into virtual enterprises, increasingly relying on supply chain partners for many of the functions once done in-house. Management experts noted that large firms were no longer necessary and would in fact be at a disadvantage in the Internet era when competing against agile, innovative smaller companies.

The book is organized into three sections, each focused on a major trend that’s reshaping the business world: the rapidly expanding capabilities of machines; the emergence of large, asset-light platform companies; and the ability to now leverage the knowledge, expertise and enthusiasm of the crowd. These three trends are combining into a triple revolution, causing companies to rethink the balance between minds and machines; between products and platforms; and between the core and the crowd.

I cannot possibly do justice to all three trends in one blog, so let me summarize the key themes of the Mind and Machine section, which I found to be an excellent explanation of the current state of AI.

September 11, 2017

AI is now seemingly everywhere. In the past few years, the necessary ingredients have come together to propel AI beyond the research labs into the marketplace: powerful, inexpensive computer technologies; advanced algorithms and models; and most important, oceans and oceans of data.

“Artificial intelligence is getting ready for business, but are businesses ready for AI?” asks McKinsey in a recently published report, Artificial Intelligence: The Next Digital Frontier. “AI adoption outside of the tech sector is at an early, often experimental stage,” is the report’s succinct answer. “Few firms have deployed it at scale.”

The report is based on a survey of over 3,000 AI-aware C-level executives across 10 countries and 14 sectors. Only 20 percent of respondents had adopted AI at scale in a core part of their business; 40 percent were partial adopters or experimenters, while another 40 percent were essentially contemplators.

AI encompasses a broad range of technologies and applications. It’s often viewed as the leading edge of IT: as soon as AI is successfully applied to a problem, the problem is no longer considered part of AI. The McKinsey report is focused on five technologies being increasingly deployed in business that most everyone agrees are part of AI: robotics and autonomous vehicles, computer vision, language, virtual agents, and machine learning.

September 05, 2017

“In primitive economies, people traded mostly with members of their village and community,” wrote David Brooks in a June 2014 NY Times OpEd, The Evolution of Trust. “Trust was face to face. Then, in the mass economy we’ve been used to, people bought from large and stable corporate brands, whose behavior was made more reliable by government regulation. But now there is a new trust calculus, powered by both social and economic forces.”

Brooks’ article focused on the rise of the on-demand economy, aka the collaborative, sharing, peer-to-peer economy, and in particular on the surprising success of Airbnb. Rooms for rent in boarding houses and child and pet care services are nothing new. What’s new is the impact of technology, platforms and blockchain in particular, on the growing on-demand economy. Companies are being disrupted as consumers are now able to deal with each other, bypassing traditional hotels and taxi services. All kinds of on-demand products and services are now coming to market.

The on-demand economy wouldn’t be possible without the mobile devices and platforms that enable peer-to-peer transactions among individuals any time and place; the digital payment systems that reliably and securely broker the transactions between buyers and sellers; and the social reputation systems, where people rank buyers and sellers, a critical requirement for the smooth functioning of collaborative markets like Airbnb. In the digital economy, our online reputations follow us everywhere, whether we are the renters or the ones renting, the service providers or the service consumers.

August 28, 2017

“The rise of artificial intelligence is the great story of our time,” notes What to Do When Machines Do Everything in its Preface. The book was published earlier this year by Malcolm Frank, Paul Roehrig and Ben Pring of Cognizant’s Center for the Future of Work. “Artificial intelligence has left the laboratory (and the movie lot)… It’s pervading all the institutions that drive our global economy.… And this is just the beginning.”

The book deals with a number of important questions: When machines do everything, what am I going to do? Will a robot take my job away? How are humans going to make a living? What will my industry look like in 10 years? Will my children be better off than I am?

Overall, the authors are optimistic about the future, reminding us that we’ve been here before. Automation anxieties have been with us ever since the 1810s, when the so-called Luddites smashed the new weaving machines that were threatening their textile jobs, and they continued to resurface over the past two centuries right along with advances in technology.

Over 60 years ago, the advent of computers led to a new round of automation fears. In 1965, Newsweek devoted a special issue to The Challenge of Automation, which it called “the most controversial economic concept of the age. Businessmen love it. Workers fear it. The government frets and investigates and wonders what to do about it.”

August 21, 2017

In a recent Harvard Business Review article, The Error at the Heart of Corporate Leadership, Harvard Business School professors Joseph Bower and Lynn Paine raised a number of important questions. What’s the core priority of corporate management, the near-term gains of shareholders or the long-term health of the company? Do shareholders generally share common investment objectives? Can a governance model focused on maximizing shareholder value threaten a company’s overall health and financial performance? These questions go to the heart of what a corporation is all about as well as to the very nature of modern capitalism.

“In a free-enterprise, private-property system, a corporate executive is an employee of the owners of the business. He has direct responsibility to his employers. That responsibility is to conduct the business in accordance with their desires, which generally will be to make as much money as possible while conforming to the basic rules of the society, both those embodied in law and those embodied in ethical custom,” wrote Milton Friedman in his famous 1970 essay. He believed that business concerns beyond making a profit, such as “promoting desirable social ends,” or “providing employment, eliminating discrimination, avoiding pollution and whatever else,” amounted to “preaching pure and unadulterated socialism.”

August 14, 2017

A few weeks ago I wrote about the current state of blockchain. In particular, I asked whether blockchain is ready to cross the chasm from its early adopters, who are already involved with blockchain in one way or another, to the considerably larger set of mainstream users who may be considering blockchain but are waiting for results and assurances before jumping in.

I explored this question by comparing blockchain today to the Internet in the early-mid 1990s, which is roughly when the Internet made its own transition from early adopters to mainstream users. My conclusion was that blockchain isn’t quite ready for this important transition. It’s still in its early adopter stage, perhaps akin to the Internet in the late 1980s. A lot is going on with blockchain, so much so that last year the World Economic Forum (WEF) named The Blockchain one of its Top Ten Emerging Technologies for 2016. But much remains to be done in key areas, including standards, applications and governance. Blockchain has the potential to make our digital infrastructures much more secure, efficient and trustworthy, but it will take time.

The Internet in the early-mid 1990s was clearly more advanced than blockchain is today. On the other hand, the business environment in 2017 is quite different from the one back then.The Internet and associated technologies like smartphones and cloud computing have significantly lowered the costs of collaboration and experimentation, enabling innovations to emerge more rapidly.In addition to providing a good set of lessons for blockchain’s adoption, the Internet is smoothing the way for the development of blockchain platforms, applications and ecosystems.

What should your company do? Get on the blockchain learning curve now or wait until the technology is more mature before jumping in? Become an early adopter of this potentially transformative technology, or run the risk of being left behind by more aggressive competitors? How should your company decide when and how to best embrace a disruptive new technology like blockchain?

August 07, 2017

A couple of weeks ago I wrote about The Productivity Puzzle, namely that despite our continuing technology advances, the US and other developed economies have experienced a sharp decline in productivity growth over the past 10 to 15 years. In an article published earlier this year, McKinsey offered three main possible explanations for this productivity growth decline: the difficulty of measuring productivity in the digital economy, the shortage of demand and investment opportunities, and the impact, or lack thereof, of technological innovations. “So far, though, economists have failed to reach consensus on the causes of the productivity growth slowdown or indeed the relative significance of the various arguments,” wrote McKinsey.

But another article, The New Class War by Michael Lind, offers a different, rather provocative explanation for our low productivity growth. According to Lind, the last few decades have seen the rise of managerial elites in advanced economies who have been pursuing profits by methods other than technology-driven innovation and productivity growth. Instead, they’ve been taking advantage of the significant reductions of trade barriers over this timeframe to pursue profits based on global labor arbitrage, as well as tax arbitrage and other forms of financial engineering.

Lind references James Burnham, John Kenneth Galbraith, and other political scientists and economists whose works I’m not familiar with. I found the article quite interesting because Lind’s explanation for our declining economic and productivity growth seems quite different from those of most other economists. Let me attempt to summarize his key arguments.

July 31, 2017

Earlier this year, IBM conducted the largest study to date on the state of adoption of blockchain. The study interviewed almost 3,000 C-Suite executives from over 80 countries and 20 industries to learn about their companies’ blockchain plans. Survey responses were classified into three groups: explorers, investigators and passives.

Explorers are already involved in blockchain pilots and experiments. They make up an average of 8 percent across all industries, with higher activity in certain industries like financial services. Another 25 percent of organizations are investigators, considering but not yet ready to deploy blockchains, and 67 percent are passives, not considering blockchains so far.

These survey results are not surprising for a technology as new and complex as blockchain. With such technologies, you typically have a relatively small number of early adopters. A larger number of followers are waiting to learn from the early adopters’ experiences, while the laggards are not yet sure what the technology is about or if it applies to them.

July 24, 2017

Last year, Foreign Affairs focused its March/April issue on Surviving Slow Growth. “The first decade of the twenty-first century was a time of unprecedented economic growth,” said the issue’s introductory article. “The rich world got richer, and the developing world raced ahead: by 2007, the emerging-market growth rate had hit 8.7 percent… Then came the fall… growth has ground to a halt almost everywhere, and economists, investors, and ordinary citizens are starting to confront a grim new reality: the world is stuck in the slow lane and nobody seems to know what to do about it.”

Economic growth has two main components: productivity growth and the growth of the labor force. Fewer workers is one of the key reasons for our stagnant economic growth. The labor force is still growing in some developing countries like India, Nigeria and the Philippines, but it’s already shrinking in China, Japan and Germany. In the US, the labor force is growing very slowly.
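That decomposition can be written as a simple growth-accounting identity (a standard approximation, my addition rather than something from the Foreign Affairs issue): since output equals output per worker times the number of workers, the two growth rates roughly add.

```latex
\frac{\Delta Y}{Y} \;\approx\; \frac{\Delta (Y/L)}{Y/L} \;+\; \frac{\Delta L}{L}
\qquad \text{(GDP growth $\approx$ productivity growth $+$ labor force growth)}
```

Here Y is total output and L the size of the labor force, so when L stagnates or shrinks, nearly all economic growth has to come from productivity.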

Over the coming decades, the labor force is expected to shrink in most parts of the world as fertility rates continue to decline. The 2016 US fertility rate was the lowest it’s ever been. Increasing productivity is thus more crucial than ever to promote economic growth. But, in the US and other advanced economies, productivity growth has significantly slowed down over the past few decades.

July 17, 2017

Is the Robocalypse upon us? asked MIT economist David Autor in his presentation at a recent Forum of European central bankers. His presentation was based on a paper co-written with Utrecht University economist Anna Salomons. “Is productivity growth inimical to employment?” they asked in the paper’s abstract. “Canonical economic theory says no, but much recent economic theory says maybe - that is, rapid advances in machine capabilities may curtail aggregate labor demand as technology increasingly encroaches on human job tasks.”

Fears that machines will put humans out of work are not new. Throughout the Industrial Revolution there were periodic panics about the impact of automation on jobs, going back to the Luddites, textile workers who in the 1810s smashed the new machines that were threatening their jobs. But, “In the end, the fears of the Luddites that machinery would impoverish workers were not realized, and the main reason is well understood,” noted a 2015 article on the history of technological anxiety.

“The mechanization of the early 19th century could only replace a limited number of human activities. At the same time, technological change increased the demand for other types of labor that were complementary to the capital goods embodied in the new technologies. This increased demand for labor included such obvious jobs as mechanics to fix the new machines, but it extended to jobs for supervisors to oversee the new factory system and accountants to manage enterprises operating on an unprecedented scale. More importantly, technological progress also took the form of product innovation, and thus created entirely new sectors for the economy, a development that was essentially missed in the discussions of economists of this time.”

Automation fears have understandably accelerated in recent years, as our increasingly smart machines are now being applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans. “Previous technological innovation has always delivered more long-run employment, not less. But things can change,” said a 2014 Economist article. “Nowadays, the majority of economists confidently wave such worries away… Yet some now fear that a new era of automation enabled by ever more powerful and capable computers could work out differently.”

“However, this extraordinary technology may be stalled, sidetracked, captured or otherwise suboptimized depending on how all the stakeholders behave in stewarding this set of resources - i.e. how it is governed,” they add in a strong note of caution. “How we govern the internet of information as a global resource serves as a model for how to govern this new resource: through a multi-stakeholder approach using what we call global governance networks.”

I totally agree.

The Internet and World Wide Web brought a badly needed culture of collaboration and standards to the IT industry. In the 1980s, it was quite difficult to get different IT systems to talk to each other. Just sending an e-mail across two different applications from two different vendors was quite a chore, as was sharing information across disparate systems. Then in the 1990s, the open Internet protocol, - TCP/IP, - was widely embraced by the general IT marketplace, making it possible to interconnect systems and applications from any vendor. Internet e-mail protocols, - SMTP, MIME, POP, IMAP, - enabled people to easily communicate with anyone on any system. At the same time, the Web’s open standards, - HTML, HTTP, URLs, - enabled any PC connected to the Internet to access information on any web server anywhere in the world.
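To make that interoperability concrete, here is a minimal sketch in Python using only its standard library. Because HTTP and HTML are open standards, any client can fetch a page from any web server, regardless of who built either end; the URL below is just a placeholder.

```python
# Open standards in action: any HTTP client can talk to any web server.
# example.com is a placeholder; any public web server would do.
from urllib.request import urlopen

with urlopen("http://example.com/") as response:
    print(response.status)                      # 200 if the request succeeded
    html = response.read().decode("utf-8")      # the page, in standard HTML

print(html[:120])                               # first characters of the returned page
```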

A similar story played out with Unix. In the 1980s, Unix became a popular operating system for technical workstations, supercomputers and other kinds of systems. Every vendor had developed its own version of Unix, - IBM’s AIX, Sun’s Solaris, HP’s HP-UX, and several others, - and they were all somewhat different and incompatible, making it difficult to port applications across these different flavors of Unix. Attempts to unify Unix were not successful. Finally, in the 1990s Linux emerged as a Unix-like operating system that over time was embraced by just about all vendors.

Lots has been written about the critical importance of a good education in our knowledge-based digital economy. At the same time, lots of questions are being raised about college and careers. Is a four-year college education necessary to attain a good job and to have a successful long-term career? Is it sufficient? Are colleges properly preparing students for the job market? Is there a role for liberal arts colleges? Is college still worth it, given its increasing costs and the student loan debt crisis? Should we continue to encourage young people to get a college degree?

The Chronicle article addresses a number of these questions. It writes that “Apprenticeships - long embraced as a work-force-training method in other countries - may now have an opening to become a far more common pathway to education and employment in the United States. The enthusiasm for apprenticeships represents a kind of refutation of college education - a recognition that something about the path from college to career is not working for many people.” The article explains why vocational apprenticeships have not fared as well in the US as they have in other advanced economies, and why apprenticeships and similar dual-track systems are increasingly important for all careers, including white-collar and professional careers requiring a bachelor’s or higher professional degree.

“Contrary to the more fantastic predictions for AI in the popular press, the Study Panel found no cause for concern that AI is an imminent threat to humankind. No machines with self-sustaining long-term goals and intent have been developed, nor are they likely to be developed in the near future. Instead, increasingly useful applications of AI, with potentially profound positive impacts on our society and economy are likely to emerge between now and 2030, the period this report considers. At the same time, many of these developments will spur disruptions in how human labor is augmented or replaced by AI, creating new challenges for the economy and society more broadly.”

Just because experts conclude that, - at least for the foreseeable future, - AI does not pose an imminent threat to humanity, doesn’t mean that such a powerful technology isn’t accompanied by serious challenges that require our attention.

June 19, 2017

In the Fall of 1995, IBM made the decision to embrace the Internet and make it the centerpiece of its strategic directions. As the general manager of the newly formed IBM Internet Division, I spent a lot of time thinking about how best to articulate the promise of the Internet. Looking back, I often described that promise using some variation of: The Internet has the potential to transform the economy, society and our personal lives.

It’s taken a few additional innovations, - e.g., smartphones, social media, cloud computing, big data, IoT, - for its historical impact to become fully evident, but by now most everyone agrees that the Internet has become one of the most transformative technologies the world has ever seen, - right up there with electricity, telecommunications, cars and airplanes.

About a year ago, the World Economic Forum (WEF) published its annual list of the Top Ten Emerging Technologies for 2016, and named the blockchain as one of them. The WEF report compared the blockchain to the Internet, noting that “Like the Internet, the blockchain is an open, global infrastructure upon which other technologies and applications can be built. And like the Internet, it allows people to bypass traditional intermediaries in their dealings with each other, thereby lowering or even eliminating transaction costs.”
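The core mechanism behind that infrastructure can be illustrated with a toy Python sketch: blocks are chained together by cryptographic hashes, so tampering with any earlier record invalidates everything after it. This is a conceptual illustration of the general idea only, not of any particular blockchain platform, which would add consensus protocols, digital signatures and much more.

```python
# Toy blockchain: each block carries the hash of its predecessor, so
# altering any earlier record breaks the chain. Concept sketch only.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "Alice pays Bob 10")
add_block(chain, "Bob pays Carol 4")
print(is_valid(chain))                      # True

chain[0]["data"] = "Alice pays Bob 1000"    # tamper with history
print(is_valid(chain))                      # False - the chain no longer verifies
```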

As Don Tapscott and Alex Tapscott wrote in the opening paragraph of their recently published book Blockchain Revolution, “It appears that once again, the technological genie has been unleashed from its bottle. Summoned by an unknown person or persons with unclear motives, at an uncertain time in history, the genie is now at our service for another kick of the can - to transform the economic power grid and the old order of human affairs for the better. If we will it.”

June 12, 2017

Despite dramatic advances in technology, most of the world’s economies have been stuck in a long period of slow growth and sluggish productivity. This is one of the most serious challenges facing our 21st century economy. Opinions abound, but there’s little consensus on its causes, and nobody seems to know what to do about it or how long it will likely last, - years or decades.

In a recent article in the MIT Sloan Management Review, MIT Research Fellow Michael Schrage proposed a provocative and counterintuitive approach for enhancing innovation and productivity through man-machine collaborations. Schrage’s approach has been more influenced by behavioral economics than by technology or algorithmic advances. Instead of just asking how can people create more valuable innovation?, why not also ask how can innovation create more valuable people? Don’t just leverage advanced technologies, - e.g., bots, software agents, and digital assistants, - to automate away a large portion of the workforce, but also focus on enhancing innovation and productivity by leveraging technology to create higher-performance versions of employees.

“Designing and training smarter algorithms may be cheaper and easier than retraining smart people,” wrote Schrage. “Advocates of autonomous systems and machine learning typically innovate to minimize or marginalize human involvement in business processes. For them, people are part of the problem, not the solution. Organizations that take productivity seriously, however, understand that false dichotomies make poor investments: Smarter machines can - and should - be keys to unlocking greater returns from human capital.”

June 05, 2017

Apple Pay was launched in September of 2014. It was positioned as a relatively easy, secure way to pay for purchases in physical stores as well as online using a variety of supported Apple devices including iPhones, iPads, Macs and Apple Watch. The general reaction to Apple’s announcement was quite positive: “[F]or now, at least, analysts believe if there is any company to persuade consumers of the mobile wallet’s value, it is Apple.”

The payments community welcomed Apple with open arms, hoping that Apple could play a major role in bringing together the different players in the fragmented payments ecosystem, much as it had previously done with iTunes for music and the App Store for smartphone apps. Apple Pay embraced a number of industry standards, including NFC (near-field communications) for contactless payments; the secure element, a dedicated device chip; and network-based tokenization to protect sensitive financial data. In addition, Apple collaborated with the credit card networks, several banks and a number of merchants in the development and deployment of Apple Pay, and published APIs so that developers could embed Apple Pay services in their own apps.
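The tokenization idea mentioned above is worth a small illustration: the merchant never handles the real card number, only a surrogate token that maps back to it inside the payment network's secure vault. The Python sketch below is a conceptual illustration of tokenization in general, not Apple's or the card networks' actual implementation, and the card number shown is hypothetical.

```python
# Conceptual sketch of payment tokenization: merchants see only a surrogate
# token; the real card number stays in the network's vault. Not Apple Pay's
# actual implementation - just the general idea, with made-up data.
import secrets

class TokenVault:
    def __init__(self):
        self._vault = {}                       # token -> real card number

    def tokenize(self, card_number: str) -> str:
        token = secrets.token_hex(8)           # random surrogate value
        self._vault[token] = card_number
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # hypothetical card number
print(token)                                   # what the merchant sees and stores
print(vault.detokenize(token))                 # only the network can map it back
```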

“Say hello to Apple Pay. It’s the new kid on the payments block and, depending on how things unfold, it could be the new Gorilla in the mobile payments ecosystem,” wrote Karen Webster, CEO of Market Platform Dynamics and an expert in digital payments. But she also added a prescient note of caution: “As great as this sounds, there are two limitations of Apple Pay right now for consumers and merchants. And it’s that old chicken and egg issue that gets in the way of every new payments system.” The chicken and egg issue is that there aren’t enough consumer devices supporting Apple Pay, and there aren’t enough merchants that accept it. Both consumers and merchants have to be convinced to move away from payment systems they know well and feel comfortable with.

May 29, 2017

McKinsey recently published an article on the long-term trends that are reshaping the business environment. The article reminds us that “the trend is your friend” is an old adage that applies to business strategy as well as to investing. The ability to anticipate where your industry is moving and to begin to reposition your company ahead of competitors is one of the most important elements of business strategy. Companies that correctly identify and ride the tailwinds of technology and market trends will generally perform significantly better than companies for whom these same trends become tough headwinds.

But the full version of the phrase is “The trend is your friend, until the end when it bends.” Identifying long-term trends is easier said than done. Major trends involve complex, competing forces interacting with each other, making it even more difficult to ascertain where things are heading. And they will eventually change direction. It can be just as painful to miss an important trend as to stay with a trend once it starts to bend.

The article discusses nine major global forces organized into three distinct mega-trends: global growth shifts, accelerating industry disruption, and a new societal deal. Let’s take a closer look at each of them.

May 22, 2017

The Economist’s May 6 issue referred to data on its cover as the world’s most valuable resource. Its lead article, - Data is giving rise to a new economy, - called it the fuel of the future. “Data are to this century what oil was to the last one: a driver of growth and change.” Both oil refineries and data centers fulfill similar roles: “producing crucial feedstocks for the world economy.” Much of modern life would not exist without the cars, plastics and drugs made possible by oil refineries. Similarly, data centers “power all kinds of online services and, increasingly, the real world as devices become more and more connected.”

Data has been closely intertwined with our digital technology revolution since the early days of the IT industry. Data processing was the term then used to describe the applications of IT to automate highly structured business processes, e.g., financial transactions, inventory management, airline reservations. Over time, increasingly sophisticated applications were developed to better manage key business operations along with their associated data, including enterprise resource planning, customer relationship management and human resources. Beyond their use in operations, the information generated by these various applications was collected in data warehouses, and a variety of business intelligence tools were used to analyze the data and generate management reports.
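The kind of structured-data reporting described above can be sketched in a few lines of Python: operational records are rolled up into a simple management report. The transactions below are made up and the example is deliberately simplistic; real business intelligence tools work over data warehouses rather than in-memory lists.

```python
# Toy illustration of early "business intelligence": aggregate operational
# transaction records into a simple management report. Data is made up.
from collections import defaultdict

transactions = [
    {"region": "East", "product": "Widget", "revenue": 1200.0},
    {"region": "East", "product": "Gadget", "revenue": 450.0},
    {"region": "West", "product": "Widget", "revenue": 980.0},
]

revenue_by_region = defaultdict(float)
for t in transactions:
    revenue_by_region[t["region"]] += t["revenue"]

for region, total in sorted(revenue_by_region.items()):
    print(f"{region}: {total:,.2f}")    # e.g., East: 1,650.00
```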

These commercial applications dealt mostly with structured information in those early years of computing. But at the same time, the scientific community was developing tools for managing, analyzing and visualizing the high volumes of much more unstructured data generated by their experiments and observations. Physicists, astronomers, biologists, and other scientists and engineers were developing methodologies and architectures for dealing with very large volumes of unstructured data, as well as analytical techniques, like data mining, for discovering patterns and extracting insights from all that data.

Then came the explosive growth of the Internet in the mid-1990s. Ever since, digital technologies have been permeating just about every nook and cranny of the economy, society and our personal lives. Data is now being generated by just about everything and everybody around us, including the growing volume of online and offline transactions, web searches, social media interactions, billions of smart mobile devices and tens of billions of smart IoT sensors.

May 15, 2017

Every four years since 1997, the US National Intelligence Council has been publishing a Global Trends report on the key trends that will shape the world over the following twenty years. This unclassified strategic report is intended to provide the incoming administration and other senior leaders with a framework for long-range policy assessment.

Global Trends 2030: Alternative Worlds, released four years ago, identified four overarching megatrends that are expected to shape and transform the world by 2030: the empowerment of individuals, - which will accelerate the reduction of poverty and raise the standard of living around the world; the diffusion of power among states, - which will shift power to networks and coalitions in a multipolar world; changing demographic patterns, - especially rapid aging, urbanization, and increased migration; and growing demands on resources such as food, water and energy, - which might lead to scarcities.

The latest Global Trends report, Paradox of Progress, was released earlier this year. It begins by explaining its title. “We are living a paradox: The achievements of the industrial and information ages are shaping a world to come that is both more dangerous and richer with opportunity than ever before… The progress of the past decades is historic - connecting people, empowering individuals, groups, and states, and lifting a billion people out of poverty in the process. But this same progress also spawned shocks like the Arab Spring, the 2008 Global Financial Crisis, and the global rise of populist, anti-establishment politics. These shocks reveal how fragile the achievements have been, underscoring deep shifts in the global landscape that portend a dark and difficult near future.”

But now, “The biggest business idea of the past three decades is in deep trouble,” said The Economist. Palmisano pointed out that the world stands at a major crossroads. “A rising chorus of nationalism echoes across developed countries; it calls for tighter borders and restrictions on immigration. Global trade negotiations have essentially ceased, and regional trade deals face strong headwinds of opposition.”

May 01, 2017

The February 23 issue of the NY Times Magazine was devoted to the future of work, and in particular to what it called the new working class. The Oxford Dictionary defines working class as “The social group consisting of people who are employed for wages, especially in manual or industrial work.” The working class includes many traditional blue-collar occupations, - factory workers, truck drivers, electricians, plumbers, - but it also now includes an increasing number of white-collar and service jobs.

While the new working class comprises a variety of jobs, most discussions on the topic are generally centered on the deindustrialization of America over the past few decades, as well as on the ensuing decline in both the number and the wages of manufacturing jobs. “Now when politicians invoke the working class, they are likely to gesture, anachronistically, to an abandoned factory,” notes the issue’s introductory article.

The ongoing transformation of manufacturing jobs was the subject of Learning to Love Our Robot Co-Workers by Kim Tingley.Her article nicely explained the major changes taking place in US manufacturing, especially the appearance of a new generation of flexible, programmable, moderately priced robots that are designed to safely collaborate with humans, - each doing what they do best.

Robotics is an exciting, fast-moving discipline, which is playing an increasingly important role in the future of manufacturing. All computers are defined by what their brains, - that is, their hardware and software, - are capable of computing and controlling. Robots are computers that have both a brain and a body. A robot’s capabilities are defined by what its brain and body can jointly do.
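That brain-plus-body framing can be pictured as a simple sense-decide-act loop, sketched below in Python. The function names, threshold and sensor readings are all hypothetical, and no real robot API is implied; the point is only that what the robot can do depends jointly on the decisions its software makes and the sensing and actuation its body provides.

```python
# Schematic sense-decide-act loop: the "brain" (software) decides, the
# "body" (sensors and actuators) perceives and acts. Conceptual sketch
# only, with made-up readings - not a real robot API.

def brain(distance_to_obstacle: float) -> str:
    """Decide on a command given a sensor reading (threshold is arbitrary)."""
    return "stop" if distance_to_obstacle < 0.5 else "move_forward"

def body(command: str) -> None:
    """Carry out the command with the robot's actuators (here, just print)."""
    print(f"actuators executing: {command}")

# A few passes through the loop with simulated sensor readings.
for reading in [2.0, 1.1, 0.3]:
    body(brain(reading))
```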

April 24, 2017

I recently attended a very interesting talk, - Exploring the Impact of Artificial Intelligence: Prediction versus Judgment, - by University of Toronto professor Avi Goldfarb. The talk was based on recent research conducted with his UoT colleagues Ajay Agrawal and Joshua Gans. In addition to an in-depth paper aimed at a research audience, they’ve explained their work in two more general-interest articles, one in the Harvard Business Review and the second in the MIT Sloan Management Review.

In their opinion, “the best way to assess the impact of radical technological change is to ask a fundamental question: How does the technology reduce costs? Only then can we really figure out how things might change.” For example, the semiconductor revolution can be viewed as being all about the dramatic reductions in the cost of arithmetic calculations. Before the advent of computers, arithmetic was done by humans with the aid of various kinds of devices, from the abacus to mechanical and electronic calculators.

Then came digital computers, which are essentially powerful calculators whose cost of arithmetic operations has precipitously decreased over the past several decades thanks to Moore’s Law. Over the years, we’ve learned to define all kinds of tasks in terms of such digital operations, e.g., inventory management, financial transactions, word processing, photography. Similarly, the economic value of the Internet revolution can be described as reducing the cost of communications and of search, thus enabling us to easily find and access all kinds of information, - including documents, pictures, music and videos.

How does this framing now apply to our emerging AI revolution? After decades of promise and hype, AI seems to have finally arrived, - driven by the explosive growth of big data, inexpensive computing power and storage, and advanced algorithms like machine learning that enable us to analyze and extract insights from all that data. Agrawal, Gans and Goldfarb provide an elegant answer to this question in their HBR article. “Machine intelligence is, in its essence, a prediction technology, so the economic shift will center around a drop in the cost of prediction.”
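A minimal sketch of “prediction” in that economic sense: fit a simple model to past data, after which each additional prediction costs essentially nothing to produce. The data and the choice of model below (scikit-learn’s LinearRegression) are purely illustrative and are not drawn from the paper.

```python
# Minimal illustration of machine prediction: once a model is fit to past
# data, additional predictions are essentially free to generate.
# Data and model choice are purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: advertising spend vs. units sold.
X = np.array([[1.0], [2.0], [3.0], [4.0]])   # ad spend, in $ thousands
y = np.array([12.0, 19.0, 31.0, 42.0])       # units sold, in hundreds

model = LinearRegression().fit(X, y)
print(model.predict(np.array([[5.0], [6.0]])))   # cheap predictions for new cases
```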