A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

March 04, 2019

Blockchain was created around a decade ago as the public, distributed ledger for the Bitcoin cryptocurrency. Most everyone agrees that it’s a truly brilliant architecture, built on decades-old fundamental research in cryptography, distributed data, distributed computing, game theory and other advanced technologies. When it first came to light, blockchain had no broader goals beyond supporting Bitcoin. But, as was the case with the Internet and World Wide Web, blockchain soon transcended its original objectives.
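As a rough sketch of the core idea, - not of Bitcoin's actual implementation, - here is a minimal hash-linked ledger in Python; the field names are hypothetical:

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Each block commits to its contents and to the previous block's hash,
    so tampering with any earlier block invalidates every block after it."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

genesis = make_block(["genesis"], prev_hash="0" * 64)
block1 = make_block(["Alice pays Bob 1 BTC"], prev_hash=genesis["hash"])
```

Real blockchains layer proof-of-work, peer-to-peer replication and consensus rules on top of this simple chaining, which is where the research in distributed computing and game theory comes in.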

In 2016, blockchain made the list of the World Economic Forum’s Top Ten Emerging Technologies. The WEF report compared blockchain to the Internet, noting that “Like the Internet, the blockchain is an open, global infrastructure upon which other technologies and applications can be built. And like the Internet, it allows people to bypass traditional intermediaries in their dealings with each other, thereby lowering or even eliminating transaction costs.”

That same year, blockchain made its first appearance in Gartner’s yearly hype cycles, where it’s remained for the past three years. In 2017, Gartner noted that “blockchain might seem like it’s just around the corner. However, most initiatives are still in alpha or beta stage… Long-term, Gartner believes this technology will lead to a reformation of whole industries.”

In a recent article in the MIT Technology Review, associate editor Mike Orcutt succinctly summarized the current state of blockchain: “In 2017, blockchain technology was a revolution that was supposed to disrupt the global financial system. In 2018, it was a disappointment. In 2019, it will start to become mundane… After the Great Crypto Bull Run of 2017 and the monumental crash of 2018, blockchain technology won’t make as much noise in 2019. But it will become more useful.”

February 04, 2019

AI is seemingly everywhere. In the past few years, the necessary ingredients have finally come together to propel AI beyond early adopters to a broader marketplace: powerful, inexpensive computer technologies; advanced algorithms; and huge amounts of data on almost any subject. Newspapers and magazines are full of articles about the latest advances in machine learning and related AI technologies.

Two recent reports concluded that, over the next few decades, AI will be the biggest commercial opportunity for companies and nations. AI advances have the potential to increase global GDP by up to 14% between now and 2030, the equivalent of an additional $14-15 trillion contribution to the world’s economy, and an annual average contribution to productivity growth of about 1.2 percent.

Over time, AI could become a transformative, general purpose technology like the steam engine, electricity, and the Internet. AI marketplace adoption will likely follow a typical S curve pattern, - that is, a relatively slow start in the early years, followed by a steep acceleration as the technology matures and firms learn how to best leverage AI for business value.
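As a back-of-the-envelope illustration of that S curve, adoption is commonly modeled with a logistic function; the parameters below are purely hypothetical placeholders, not forecasts:

```python
import numpy as np

def adoption_share(year, rate=0.5, midpoint=2026, ceiling=0.9):
    """Logistic S curve: slow early growth, steep acceleration around the
    midpoint year, then saturation near the ceiling."""
    return ceiling / (1.0 + np.exp(-rate * (year - midpoint)))

years = np.arange(2019, 2035)
shares = adoption_share(years)  # hypothetical share of firms using AI each year
```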

To get a better sense of the current state of AI adoption, McKinsey recently conducted a global online survey on the topic, garnering responses from over 2,000 participants across 10 industry sectors, 8 business functions and a wide range of regions and company sizes. The survey asked about their progress in deploying nine major AI capabilities, including machine learning, computer vision, natural language text and speech processing, and robotic process automation.

January 14, 2019

In the 1990s, the Internet was supposed to usher in a much more open, decentralized, democratic economy and society. Startups with innovative business models were now able to reach customers anywhere, anytime. Companies, from the largest to the smallest, could now transact with anyone around the world. Vertically integrated firms became virtual enterprises, increasingly relying on supply chain partners for many of the functions once done in-house. Experts noted that large firms were no longer necessary and would in fact be at a disadvantage in the emerging digital economy when competing against agile, innovative smaller companies.

Some even predicted that the Internet would lead to the decline of cities. People could now work and shop from home; be in touch with their friends over e-mail, video calls, and text messaging; and get access to all kinds of information and entertainment online. Why would anyone choose to live in a crowded, expensive, crime-prone urban area when they could lead a more relaxing, affordable life in an outer suburb or small town?

January 07, 2019

Artificial intelligence is rapidly becoming one of the most important technologies of our era. Every day we can read about the latest AI advances from startups and large companies. AI technologies are approaching or surpassing human levels of performance in vision, speech recognition, language translation, and other human domains. Machine learning advances, like deep learning, have played a central role in AI’s recent achievements, giving computers the ability to be trained by ingesting and analyzing large amounts of data instead of being explicitly programmed.

Deep learning is a powerful statistical technique for classifying patterns using large training data sets and multi-layer AI neural networks. It’s essentially a method for machines to learn from all kinds of data, whether structured or unstructured, that’s loosely modeled on the way a biological brain learns new capabilities. Each artificial neural unit is connected to many other such units, and the links can be statistically strengthened or weakened based on the data used to train the system. Each successive layer in a multi-layer network uses the output from the previous layer as input.
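As a minimal sketch of that layered structure, - a toy forward pass, not a production deep learning system, - consider:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Weights on the links between layers; training strengthens or weakens
# these values based on the data, typically via gradient descent.
W1 = rng.normal(size=(4, 8))  # input layer (4 features) -> hidden layer (8 units)
W2 = rng.normal(size=(8, 1))  # hidden layer -> output layer (1 unit)

def forward(x):
    hidden = np.maximum(0, x @ W1)  # each unit sums many weighted inputs (ReLU activation)
    return hidden @ W2              # the next layer takes the previous layer's output as input

x = rng.normal(size=(1, 4))  # a single example with 4 input features
print(forward(x))
```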

Machine learning can be applied to just about any domain of knowledge given our ability to gather valuable data in almost any area of interest. But, machine learning methods are significantly narrower and more specialized than humans. There are many tasks for which they’re not effective given the current state of the art. In an article recently published in Science, professors Erik Brynjolfsson and Tom Mitchell identified the key criteria that help distinguish tasks that are particularly suitable for machine learning from those that are not (a short code sketch follows the list). These include:

Tasks that map well-defined inputs to well-defined outputs, - e.g., labeling images of specific animals, the probability of cancer in a medical record, the likelihood of defaulting on a loan application;

Large data sets exist or can be created containing such input-output pairs, - the bigger the training data sets, the more accurate the learning;

The capability being learned should be relatively static, - if the function changes rapidly, retraining is typically required, including the acquisition of new training data; and

No need for detailed explanation of how the decision was made, - the methods behind a machine learning recommendation, - subtle adjustments to the numerical weights that interconnect its huge number of artificial neurons, - are difficult to explain because they’re so different from those used by humans.
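To make the first two criteria concrete, here is a toy supervised-learning sketch using scikit-learn; the features, labels and numbers are invented for illustration:

```python
from sklearn.linear_model import LogisticRegression

# Toy input-output pairs: loan-application features -> defaulted (1) or repaid (0).
X = [
    [25, 0.40, 2],    # income ($ thousands), debt-to-income ratio, years employed
    [90, 0.10, 10],
    [40, 0.55, 1],
    [120, 0.05, 15],
]
y = [1, 0, 1, 0]      # the "output" half of each input-output pair

model = LogisticRegression(max_iter=1000).fit(X, y)  # learn the input-output mapping

# Estimated probability of default for a new applicant.
print(model.predict_proba([[60, 0.30, 4]])[0][1])
```

The last criterion cuts the other way: the learned mapping lives in the model's numerical weights, which is precisely why its individual decisions are hard to explain.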

December 24, 2018

People have long feared that machines are coming for our jobs. Throughout the Industrial Revolution there were periodic panics about the impact of automation on work, going back to the so-called Luddites, - textile workers who in the 1810s smashed the new machines that were threatening their jobs.

In a recent article in Foreign Affairs, World Bank Group President Jim Yong Kim warned that the world is facing a Human Capital Gap. “Governments in pursuit of economic growth love to invest in physical capital - new roads, beautiful bridges, gleaming airports, and other infrastructure. But they are typically far less interested in investing in human capital, which is the sum total of a population’s health, skills, knowledge, experience, and habits. That’s a mistake, because neglecting investments in human capital can dramatically weaken a country’s competitiveness in a rapidly changing world, one in which economies need ever-increasing amounts of talent to sustain growth.”

December 17, 2018

A recent issue of The Economist included a special report on cryptocurrencies and blockchains. The Economist’s overall conclusion was that “Bitcoin has been a failure as a means of payment, but thrilling for speculators.” Its assessment of blockchain was somewhat more positive: “For blockchains, the jury is still out… For all the technology’s potential, though, most attempts to use it remain tentative… The advantages of blockchains are often oversold.”

Is there too much hype surrounding blockchain? Absolutely, but that’s not surprising. All potentially transformative technologies are oversold in their early stages; remember the dot-com bubble of the late 1990s. Blockchain is still in its early phases of experimentation and adoption. Much work remains to be done on standards, platforms, interoperability, applications and governance.

But, does blockchain have the potential to become a truly transformative technology over time? Yes, said McKinsey in a recent article on the strategic value of blockchain beyond the hype. The article starts out by acknowledging that all the hype around blockchain makes it difficult to nail down not just blockchain’s long-term strategic value, but also what it actually is and isn’t in the present. To make sure everyone is on the same page, the article first recaps what it calls “the nuts and bolts of blockchain.” Let me start by summarizing those nuts and bolts.

December 10, 2018

How will labor markets evolve in our 21st century digital economy? What’s the likely future of jobs, given that our increasingly smart machines are now being applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans? How will AI, robotics and other advanced technologies transform the very nature of work?

Over the past few years, a number of papers, reports and books have addressed these very important questions. They generally conclude that AI will have a major impact on jobs and the very nature of work. For the most part, they view AI as mostly augmenting rather than replacing human capabilities, automating the more routine parts of a job and increasing the productivity and quality of workers, so they can focus on those aspects of the job that most require human attention. Overall, few jobs will be entirely automated, but automation will likely transform the vast majority of occupations.

A recent McKinsey report noted that while there will likely be enough work to maintain full employment by 2030, the transition will be very challenging, “on a scale not seen since the transition of the labor force out of agriculture in the early 1900s in the United States and Europe, and more recently in China.” The report estimated that “up to 375 million workers, or 14 percent of the global workforce, may need to change occupations - and virtually all workers may need to adapt to work alongside machines in new ways.”

Given these predictions about the changing nature of work, what should companies do? How should firms prepare for a brave new world where we can expect major economic dislocations along with the creation of new jobs, new business models and whole new industries, and where many people will be working alongside smart machines in whole new ways?

December 03, 2018

A few weeks ago, the World Economic Forum (WEF) released The Future of Jobs Report 2018. The report takes an in-depth look at the world of work in what the WEF calls the Fourth Industrial Revolution. The report is based on a survey of over 300 chief human resource officers and top strategy executives from large global companies across 12 industry sectors. These companies collectively represent more than 15 million employees, and conduct business in 20 developed and emerging economies which collectively account for about 70% of global GDP.

The WEF first introduced the concept at its 2016 annual Davos Forum, positioning the Fourth Industrial Revolution within the historical context of three previous industrial revolutions. The First, - in the last third of the 18th century, - ushered in the transition from hand-made goods to mechanized, machine-based production based on steam and water power. The Second, - a century later, - revolved around steel, railroads, cars, chemicals, petroleum, electricity, the telephone and radio, leading to the age of mass production. The Third, - starting in the 1960s, - saw the advent of digital technologies, computers, the IT industry, and the automation of processes in just about all industries.

“Now a Fourth Industrial Revolution is building on the Third…” wrote WEF founder and chairman Klaus Schwab in a December, 2015 Foreign Affairs article. “The possibilities of billions of people connected by mobile devices, with unprecedented processing power, storage capacity, and access to knowledge, are unlimited. And these possibilities will be multiplied by emerging technology breakthroughs in fields such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3-D printing, nanotechnology, biotechnology, materials science, energy storage, and quantum computing.”

The Future of Jobs 2018 examines the potential of these advanced technologies to eliminate and create new jobs, as well as to improve the current workplace and prepare people for emerging jobs. Rather than looking at the future of work over the longer term, the report chose to examine the 2018-2022 period, which is within the planning horizon of the companies and executives that participated in the survey.

“The IIC believes that Inclusive Innovation is an economic and moral imperative, and that the key question of our era isn’t what technology is going to do to our economy and society, but what we will do with technology. By identifying and promoting the powerful global community of future of work visionaries, the IIC proactively accelerates the technology-driven solutions enabling greater economic opportunity for working people around the world facing the challenge of rapidly advancing digital progress.”

Entrepreneurs from around the globe compete for awards in four categories: income growth and job creation, skills development and opportunity matching, technology access, and financial inclusion. Approximately 1,500 organizations registered to compete this year, and 500 judges reviewed their applications.

In 2018, rather than having one global competition as in the previous two years, there were five separate regional competitions, - North America, Latin America, Europe, Africa, and Asia, - with each region selecting its own category winners. A Grand Prize winner was then selected in each category from among the regional winners and awarded $250,000. The winners were announced at a gala event that took place at MIT on November 8.

November 05, 2018

Darwinian principles seem to apply in business almost as much as in biology. After analyzing the longevity of more than 30,000 public US firms over a 50-year span, Martin Reeves, Simon Levin, and Daichi Ueda noted in a 2016 HBR article, The Biology of Corporate Survival, that companies are disappearing faster than ever before. “Public companies have a one in three chance of being delisted in the next five years, whether because of bankruptcy, liquidation, M&A, or other causes. That’s six times the delisting rate of companies 40 years ago… Neither scale nor experience guards against an early demise.”

Biological systems have long been an inspiration in the study of complex systems. In their HBR article, the authors argued that companies are not just like biological species, but in some important respects they’re actually identical to biological species. Companies and biological systems are both what’s known as complex adaptive systems, that is, systems in which a perfect understanding of their individual components does not automatically lead to a perfect understanding of their overall system behavior.

Companies are an example of sociotechnical systems, that is, systems that have to deal with complex technologies and infrastructures, and the even more complex issues associated with human and organizational behaviors. Other examples include cities, government agencies, industries and economies. The dynamic nature of their technology and human components, as well as their intricate interrelationships, renders such systems increasingly unpredictable and accounts for their emergent behavior and unintended consequences.

October 29, 2018

In Reinvent Your Business Model, his recently published book, Mark Johnson argues that digital transformation and business model innovation are not the same thing. New technology alone, - no matter how transformative, - is not enough to propel a business into the future. The business model wrapped around the technology is the key to its success or failure. Johnson is Senior Partner at Innosight, the strategy consulting firm he cofounded with Harvard Business School professor Clayton Christensen in 2000. He’s been conducting research on business model innovation for over a decade, and in 2010 published Seizing the White Space.

Business model innovation has long been the domain of disruptive startups looking to compete against established companies by changing the rules of the game, - and, hopefully, creating new markets and reshaping entire industries. But, it’s no longer enough for established companies to just roll out improved products and services based on their once-reliable business models.

“Building a great business and operating it well no longer guarantees you’ll be around in a hundred years, or even twenty,” notes Johnson. “In 1965, the average length of time a company remained on the S&P 500 was thirty-three years. By 1990, it had dropped to twenty years; in 2012, it was just eighteen. Based on the 2017 churn rate, it is forecast that half of the S&P 500 will be replaced over the next ten years.”

Examples abound. In the 1970s, Xerox PARC famously developed, - but didn’t commercialize, - some of the key innovations of the PC era, including the graphical user interface, the mouse and local area networks. In 2010, Blockbuster filed for bankruptcy, a victim of Netflix’s new business models. Done in by Apple’s iPhone and Android-based smartphones, the Blackberry, - once the undisputed leader in its market, - fell into a death spiral from which it never recovered.

October 22, 2018

PwC recently released a report on the potential economic value of AI to different regions and industry sectors around the world. The report defined AI “as a collective term for computer systems that can sense their environment, think, learn, and take action in response to what they’re sensing and their objectives.” By this broad definition, AI includes the automation of physical and cognitive tasks; assisting people to perform tasks better and faster; helping humans make better decisions; and automating decision making with no human intervention.

The report’s overriding finding is that AI is the biggest commercial opportunity for companies, industries and nations over the next few decades. PwC estimates that AI advances will increase global GDP by up to 14% between now and 2030, the equivalent of an additional $15.7 trillion contribution to the world’s economy.

Around $6.6 trillion of the expected GDP growth will come from productivity gains, especially in the near term. These include the continued automation of routine tasks, and the development of increasingly sophisticated tools to augment human capabilities. Companies that are slow to adopt these AI-based productivity improvements will find themselves at a serious competitive disadvantage.

But, over time, increased consumer demand for AI-enhanced offerings will overtake productivity gains and result in an additional $9.1 trillion of GDP growth by 2030. Moreover, network effects, - that is, the more data and better insights companies are able to gather, the more appealing the products and services they’ll be able to develop, - will further increase consumer demand. AI front-runners will gain an enormous competitive advantage through their ability to leverage this rich supply of customer data to shape product developments and business models, making it harder for slower moving competitors to catch up.

“In the beginning computers were human. Then they took the shape of metal boxes, filling entire rooms before becoming ever smaller and more widespread. Now they are evaporating altogether and becoming accessible from anywhere… Now… computing is taking on yet another new shape. It is becoming more centralised again as some of the activity moves into data centres. But more importantly, it is turning into what has come to be called a ‘cloud’, or collections of clouds. Computing power will become more and more disembodied and will be consumed where and when it is needed… It will allow digital technology to penetrate every nook and cranny of the economy and of society…”

Serverless is an approach to software development for quickly building and deploying applications as a collection of loosely coupled modules and microservices. According to Forrester, “Firms report a better software development experience, rapid scaling of services that compose applications, lower costs, and better infrastructure utilization when workloads are variable. They also spend less time maintaining cloud infrastructure.”
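As a minimal sketch of what this looks like in practice, assuming an AWS Lambda-style Python handler (the event fields are hypothetical):

```python
import json

def handler(event, context):
    """A function-as-a-service unit: the cloud provider provisions, scales,
    and bills it per invocation; there is no server to manage or patch."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```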

October 08, 2018

In 2015, the McKinsey Global Institute launched a multi-year study to explore the potential impact of automation technologies on jobs, organizations and the future of work. In the intervening three years, the study has published a number of reports on the subject.

The first report, published in November, 2015, explored whether we can look forward to vast improvements in productivity and quality of life, or whether automation will mostly threaten jobs, disrupt organizations, and strain the social fabric. Based on an analysis of around 2,000 work activities across 800 different occupations, the report concluded that “fewer than 5 percent of occupations can be entirely automated using current technology. However, about 60 percent of occupations could have 30 percent or more of their constituent activities automated.” In other words, relatively few jobs will be entirely automated, but automation will likely transform the vast majority of occupations.

McKinsey’s December, 2017 report analyzed whether there will be enough work in the future. The overall conclusion was that a growing technology-based economy will create a significant number of new occupations, - as has been the case in the past, - which will more than offset declines in occupations displaced by automation. However, “while there may be enough work to maintain full employment to 2030 under most scenarios, the transitions will be very challenging - matching or even exceeding the scale of shifts out of agriculture and manufacturing we have seen in the past.”

Let me now discuss the more recent report published earlier this year, which examined the changes in skills required of human workers over the next 10-15 years. To do so, the study analyzed how the total number of hours worked in 25 different skill areas has changed between 2002 and 2016 and estimated the expected change in hours worked by 2030. This was done for the US and 5 Western European countries. I’ll focus my discussion on the US skill changes, as the European changes were similar.

October 01, 2018

A few weeks ago I wrote about social physics, a new discipline that aims to help us better understand and predict the behavior of human groups. Social physics is based on the premise that all event-data representing human activity, - e.g., phone call records, credit card purchases, taxi rides, web activity, - contain a special set of group behavior patterns. As long as the data involves human activity, - regardless of the type of data, the demographic of the users or the size of the data sets, - similar behavioral dynamics apply. These patterns can be used to detect emerging behavioral trends before they can be observed by other data analytics techniques.

Physics, biology and other natural sciences have long relied on universal patterns or principles to detect a faint signal within a large data set, - i.e., the proverbial needle in a haystack. It’s what has enabled the discovery of very short-lived elementary particles in physics, - like the Higgs boson in 2013, - amidst the huge amounts of data generated by high energy particle accelerators. In biology, it’s given rise to DNA sequencing and its growing list of applications in medicine, biotechnology and other disciplines.

It’s not surprising that evolutionary biology and natural selection have led to similar universal patterns in the behavior of human crowds. Humans and our ancestors have evolved with the drive to learn from each other because it’s been a major part of our survival over millions of years. And if a new behavior, - whether the result of an innovative idea like the discovery of tools, or a mutation like a larger brain size, - helps a human group better adapt to a changing environment, natural selection will favor the survival of that group over others.

Social physics originated in MIT’s Human Dynamics Lab based on research by professor Alex (Sandy) Pentland, his then postdoctoral associate Yaniv Altshuler and their various collaborators. In 2014, Pentland and Altshuler co-founded Endor, an Israel-based startup that leverages social physics methods to make fast, accurate predictions by analyzing data derived from human behavior.

September 24, 2018

Bitcoin was created about a decade ago, with the release of the original design paper, Bitcoin: A Peer-to-Peer Electronic Cash System, under the pen name of Satoshi Nakamoto. Bitcoin aimed to become a decentralized cryptocurrency and digital payment system that would enable the secure exchange of digital money without the need for central government authorities or financial intermediaries. It introduced the blockchain, - which most everyone agrees is a truly brilliant architecture built on decades-old fundamental research in cryptography, distributed data, distributed computing, game theory and other advanced technologies.

How are they doing after 10 years? The September 1 issue of The Economist includes a special report on cryptocurrencies and blockchains, with nine articles on the subject. The report’s overview article doesn’t mince words in articulating its overall assessment: “Bitcoin and other cryptocurrencies are useless. For blockchains, the jury is still out.”

“Bitcoin has been a failure as a means of payment, but thrilling for speculators,” it further added. Its price has been subject to wild fluctuations over the past two years. It rose from around $600 in September, 2016, to $6,000 in September, 2017, reaching a peak of over $19,000 in December, 2017, before falling to around $7,000 earlier this year. It’s been hovering between $6,500 and $7,500 for the past few months. Other cryptocurrencies have experienced similar price fluctuations. “That has made a few people very rich (just 100 accounts own 19% of all existing bitcoin), encouraged others to play for quick gains and left some nursing substantial losses.”

September 10, 2018

Social physics first emerged over 200 years ago as an attempt to understand society and human behavior using laws similar to those of the physical sciences. But, it wasn’t until the past two decades that we finally had enough data, powerful computers and sophisticated mathematical algorithms to develop quantitative theories of human social interactions. We were now able to reliably predict how large groups of people make decisions by analyzing how information and ideas flow from person to person.

“The engine that drives social physics is big data: the newly ubiquitous digital data now available about all aspects of human life,” wrote MIT professor Alex (Sandy) Pentland in his 2014 book Social Physics: How Good Ideas Spread - The Lessons from a New Science. “Social physics functions by analyzing patterns of human experience and idea exchange within the digital bread crumbs we all leave behind as we move through the world, - call records, credit card transactions, and GPS location fixes, among others. These data tell the story of everyday life by recording what each of us has chosen to do… Who we actually are is more accurately determined by where we spend our time and which things we buy, not just by what we say we do.”

The book was the result of over a decade’s worth of research in the Human Dynamics group at MIT’s Media Lab. Pentland, along with his graduate students and research associates, partnered with a variety of companies to obtain and analyze real-world data, properly anonymized to protect the privacy of the individuals whose behavior was reflected in the data. They eventually discovered that all event-data representing human activity contain a special set of social activity patterns regardless of what the data is about. These patterns are common across all human activities and demographics, and can be used to detect emerging behavioral trends before they can be observed by any other technique.

August 20, 2018

AI is rapidly becoming one of the most important technologies of our era. Every day we can read about the latest AI advances from startups and large companies, including applications that not long ago were viewed as the exclusive domain of humans. Over the past few years, the necessary ingredients have finally come together to propel AI beyond the research labs into the marketplace: powerful, inexpensive computer technologies; huge amounts of data; and advanced algorithms.

Machine learning and related algorithms, like deep learning, have played a major role in AI’s recent achievements. Machine learning gives computers the ability to learn by ingesting and analyzing large amounts of data instead of being explicitly programmed. It’s enabled the construction of AI algorithms that can be trained with lots and lots of sample inputs, which are subsequently applied to difficult AI problems like language translation, natural language processing, and playing championship-level Go.

Over the past several decades, AI has been besieged by rounds of hype which over-promised, under-delivered, and nearly killed the field. Once more, a recent Gartner article positioned machine and deep learning at the top of their hype cycle, - when all the excitement, publicity and promising potential often leads to a peak of inflated expectations, risking falling into the trough of disillusionment if the technology fails to deliver. Now that AI is finally reaching a tipping point of market acceptance, it’s particularly important to be cautious and not repeat past mistakes.

In Artificial Intelligence - The Revolution Hasn’t Happened Yet, UC Berkeley professor Michael I. Jordan aims to inject such a note of caution. “AI has now become the mantra of our current era… The idea that our era is somehow seeing the emergence of an intelligence in silicon that rivals our own entertains all of us - enthralling us and frightening us in equal measure. And, unfortunately, it distracts us… Whether or not we come to understand intelligence any time soon, we do have a major challenge on our hands in bringing together computers and humans in ways that enhance human life.”

August 06, 2018

I’ve long believed that customer self-service was key to the success of e-business, - the IBM Internet strategy I was closely involved with in the mid-late 1990s. It was quite revolutionary how easy it was to now do for yourself so many ordinary activities that previously required a phone call during office hours or a trip to a store or office.

Moreover, such e-business applications were relatively easy to develop. By integrating their existing transaction and data base applications with a web front-end, - a strategy we succinctly described as Web + IT, - any business could now be in touch with its customers, employees, suppliers and partners at any time of the day or night, no matter where they were. Companies were able to engage in their core activities in a more productive way by web-enabling their back-end systems, that is, linking their web front-ends to the presentation services of their back-end applications.

I was reminded of IBM’s e-business strategy when recently reading about Robotic Process Automation (RPA), a technology for automating business processes based on emulating the manual actions of a human at a keyboard. RPA aims to improve the operational efficiency of office and service workers by automating tedious, repetitive tasks, such as those associated with widely used horizontal processes in HR, Finance & Accounting and IT services.

RPA enables process automation at a fraction of the cost and time of classic software development. Rather than automating a process by redesigning the overall back-end application, RPA interfaces with the back-end system by performing the same actions that a human does via the application’s user interface. RPA thus creates software robots, or bots for short, that work alongside humans.
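As a toy illustration of the idea, using the pyautogui library to stand in for a commercial RPA platform, - the screen coordinates and invoice data are placeholders:

```python
import time
import pyautogui  # pip install pyautogui

# Invoices a human clerk would otherwise re-key into a legacy application.
invoices = [("INV-1001", "149.95"), ("INV-1002", "87.50")]

for invoice_id, amount in invoices:
    pyautogui.click(400, 300)        # focus the app's "Invoice ID" field (placeholder coordinates)
    pyautogui.typewrite(invoice_id)  # type exactly as a human would
    pyautogui.press("tab")           # move to the "Amount" field
    pyautogui.typewrite(amount)
    pyautogui.press("enter")         # submit through the application's existing UI
    time.sleep(1)                    # give the legacy application time to respond
```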

July 09, 2018

Given the pace of technological change, we tend to think of our age as the most innovative ever. But over the past several years, a number of economists have argued that increasing R&D efforts are yielding decreasing returns.

Has the ideas machine broken down?, asked The Economist in a January, 2012 article that examined the growing concerns that we may be in a long-term period of slow innovation despite our rapidly advancing technologies. “With the pace of technological change making heads spin, we tend to think of our age as the most innovative ever,” said The Economist. “We have smartphones and supercomputers, big data and nanotechnologies, gene therapy and stem-cell transplants. Governments, universities and firms together spend around $1.4 trillion a year on R&D, more than ever before.” But, perhaps these don’t quite compare with modern sanitation, electricity, cars, planes, the telephone, and antibiotics. These innovations, first developed in the late 19th and early 20th century, have long been transforming the lives of billions.

In a September, 2012 paper, Northwestern University economist Robert Gordon questioned the generally accepted assumption that economic growth is a continuous process that will persist forever. He wrote that the slow growth we’ve been experiencing in the US and other advanced economies isn’t cyclical, but rather evidence that long-term economic growth may be grinding to a halt.

The rapid growth and rising per-capita incomes we experienced at the height of the Industrial Revolution, - between 1870 and 1970, - may have been a unique episode in human history. Since the 1970s, US productivity and income growth have dipped sharply except for an Internet-driven productivity boost between 1996 and 2004. Innovation may be hitting a wall of diminishing returns. There was little growth before 1800, and there might conceivably be little growth in the future.

A similar pessimistic view was expressed by George Mason University economist Tyler Cowen in his 2011 book The Great Stagnation. According to Cowen, over the past two centuries the US economy has enjoyed lots of low-hanging fruit, including a vast, resource-rich land, waves of immigrant labor, access to education and the technological advances of the Industrial Revolution. But, Cowen believes that we are at a technological plateau, and wonders whether long-term growth is still possible because the supply of low-hanging economic fruit is nearly exhausted.

July 03, 2018

After decades of promise and hype, AI is now seemingly everywhere. Over the past several years, the necessary ingredients have come together to propel AI beyond the research labs into the marketplace: powerful, inexpensive computer technologies; huge amounts of data; and advanced algorithms including machine learning.

While AI is likely to become one of the most important technologies of our era, we’re still in the early stages of deployment, especially outside leading-edge technology companies. But, as AI continues its rapid progress, it’s not too early to ask a few important questions: What is AI’s overall value to the economy? What are the biggest application opportunities? And what are AI’s most serious challenges and limitations?

To address these questions, McKinsey recently published a discussion paper on the marketplace potential of AI. The paper is particularly focused on machine learning and related technologies, and is based on a detailed analysis of more than 400 use cases across 19 industries and 9 business functions.

“Two-thirds of the opportunities to use AI are in improving the performance of existing analytical use cases,” is the paper’s overriding finding. This is a very interesting insight. AI is now being successfully applied to tasks that not long ago were viewed as the exclusive domain of humans, - machine translation, natural language processing, defeating the world’s top Go players, - but only 16% of the use cases studied by McKinsey are greenfield cases, where only machine learning techniques can be used. For the vast majority of the use cases, 69%, the key value of machine learning is to improve performance beyond that provided by traditional analytic techniques. And, in the remaining 15% of cases, machine learning provided limited additional performance over existing analytical methods.

June 11, 2018

For the past couple of centuries, general-purpose technologies (GPTs) have been the key drivers of productivity and economic growth, - thanks to their rapid improvements in price and performance, pervasive applications, and ability to spur complementary innovations across a wide variety of industries. The steam engine, electricity and the internal combustion engine are prominent examples of GPTs from the 18th and 19th centuries. More recently, semiconductors, computers and the Internet have led to the digital revolution of the past several decades.

Beyond innovations in existing sectors, the rapidly improving price/performance of GPTs has led over time to the creation of whole new applications and industries. For example, the steady declines in the price of electricity-generated power and the improvement in the efficiency of electric motors led to the radical transformation of manufacturing in the early part of the 20th century with the advent of the assembly line. It also led to the creation of the consumer appliance industry, e.g., refrigerators, dishwashers, washing machines. Similarly, as the semiconductor industry took off, it led to the historical transition from the industrial economy of the past two centuries to our ongoing digital economy.

How about artificial intelligence? Beyond its use by leading-edge technology companies, we’re still in the early stages of AI deployment. It’s only been in the last few years that major advances in machine learning have taken AI from the lab to early adopters in the marketplace. While considerable innovations and investments are required for its wider deployment, AI is likely to become one of the most important GPTs in the 21st century.

May 28, 2018

Autonomous vehicles (AVs) may well be the quintessential symbol of our AI/robotics age. Cars are a major part of our daily lives. A self-driven car is a concept that requires little explanation, something we can all quickly grasp. It wasn’t that long ago that the notion of an AV driving us around while we read or sleep would have felt like the stuff of science fiction. Having experimental AVs coursing through public roads in Silicon Valley, Pittsburgh and Phoenix is concrete evidence that our smart machines are achieving human-like intelligence, raising a number of important questions: How long before AVs are all around us? How will they impact our lives? What unintended consequences might we have to deal with? And what should be done to ensure that they arrive as safely and smoothly as possible?

These are among the questions addressed in the March 1 issue of The Economist, which includes a comprehensive special report on autonomous vehicles with seven articles on the subject. The special report starts out with the assumption that whatever technological hurdles lie in their way will be eventually overcome. But there are wider economic, social and public policy issues to be explored, starting with: what can we learn from the transition to horseless carriages in the 20th century that can now be applied to the transition to driverless cars?

May 21, 2018

Will there be enough work in the future? What’s the likely impact of our continuing technology advances on jobs? How will they impact productivity? These are important questions to reflect on as our increasingly smart technologies are now being applied to activities that not long ago were viewed as the exclusive domain of humans.

While no one really knows the answer to these questions, most studies of the subject have concluded that relatively few occupations, 10% or less, will be entirely automated and disappear over the next 10-15 years. Instead, a growing percentage of occupations will significantly change as technologies automate the more routine tasks within those occupations. People will still be involved, but their jobs will be transformed by the advanced tools they now have to master. Moreover, a growing technology-based economy will likely create all kinds of new occupations, which will more than offset declines in occupations displaced by automation, - as has been the case over the past couple of centuries.

One would also expect that technology advances will increase the productivity of the workers involved in these new or transformed occupations. But, if we look at the past 10-15 years for guidance, we’ll find that productivity growth has significantly declined over this timeframe, notwithstanding huge technology advances like smartphones, cloud computing, big data and artificial intelligence. Economists have proposed competing explanations for the declining productivity growth, but have so far failed to reach consensus. Understanding this productivity puzzle may well hold the key to future productivity improvements and long-term economic growth.

May 07, 2018

I recently participated in a blockchain panel at a conference of senior supply chain executives from a variety of companies. I welcomed the opportunity to learn how these real-world executives feel about blockchain, as well as how to best communicate the promise and current state of this emerging and potentially transformative technology. Let me share a few of my observations.

Before our panel started, the executives were asked how they viewed blockchain. In general, they felt that there was too much hype surrounding blockchain, were skeptical about its value to supply chain systems, but expressed a wait-and-see attitude. They were then asked how many were conducting blockchain prototypes in their companies. The majority raised their hands.

I thought that their views and subsequent questions and comments during the panel succinctly captured the current state of blockchain in the business world: skeptical, turned off by the hype, but hedging their bets by getting on the learning curve with prototypes and experimentation.

The moderator first asked each panelist to address the audience’s skepticism about blockchain. What, in our opinion, is the key value of blockchain? I said that I look at the value of blockchain through two complementary lenses: enhancing the security of the Internet, and reducing the inefficiencies and overheads in applications, like supply chains, involving multiple institutions.

April 30, 2018

In March of 2017, Northeastern University launched a new interdisciplinary initiative, the Global Resilience Institute (GRI). GRI is a university-wide research and educational effort to advance the resilience of individuals, communities, economies and societies around the world by strengthening their capacity to adapt to an increasingly turbulent world. Its key mission is to develop and deploy practical tools, applications and skills to help effectively respond to human-made and naturally-occurring disruptions and disasters.

GRI’s Founding Director is Northeastern professor Stephen Flynn. In his Director’s Welcome web page, Flynn writes that “Building resilience within and across multiple levels, from individual to societal, requires a comprehensive effort that matches the complexity of the increasingly interdependent systems and networks we all rely on… Our aim is to serve as both a channel and a catalyst for experts in industry, academia, and government to collaborate on solving the world’s most pressing resilience challenges. These include both slowly emerging disruptions as well as shocks and sudden disasters.”

Most everyone agrees that dealing with our increasingly turbulent, complex world requires significant enhancements to the resilience of our societies and systems. But, five critical barriers stand in the way:

Widespread risk illiteracy and a limited understanding of the new dependencies and interdependencies that pervade our more connected lives;

Inadequate designs for embedding resilience into systems, networks, and infrastructure at multiple levels;

April 23, 2018

Identity plays a major role in everyday life. Think about going to an office, getting on a plane, logging in to a website or making an online purchase. Identity is the key that determines the particular transactions in which we can rightfully participate as well as the information we’re entitled to access. But, we generally don’t pay much attention to the management of our identity credentials unless something goes seriously wrong.

For much of history, our identity systems have been based on face-to-face interactions and on physical documents and processes. But, the transition to a digital economy requires radically different identity systems. In a world that’s increasingly governed by digital transactions and data, our existing methods for managing security and privacy are proving inadequate. Data breaches, large-scale fraud, and identity theft are becoming more common. In addition, a significant portion of the world’s population lacks the credentials needed to participate in the digital economy. Our existing methods for managing digital identities are far from adequate.

As explained in A Blueprint for Digital Identity, - a 2016 report by the World Economic Forum, - identity is essentially a collection of information or attributes associated with a specific individual. These attributes fall into three main categories: inherent - attributes intrinsic to an individual, - e.g., age, height, date of birth, fingerprints, color of eyes, retinal scans; assigned - attributes attached to but not intrinsic to the individual, - e.g., e-mail address, telephone numbers, social security number, driver’s license, passport number; and accumulated - attributes gathered or developed over time, - e.g., health records, job history, home addresses, schools attended.
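One simple way to picture these categories is as a record with three buckets. The sketch below is illustrative only; the field names are hypothetical, not from the WEF report:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalIdentity:
    inherent: dict = field(default_factory=dict)     # intrinsic: date of birth, fingerprints
    assigned: dict = field(default_factory=dict)     # attached: passport number, e-mail address
    accumulated: dict = field(default_factory=dict)  # built up over time: job history, health records

alice = DigitalIdentity(
    inherent={"date_of_birth": "1990-05-17"},
    assigned={"passport_number": "X1234567"},
    accumulated={"current_employer": "Acme Corp"},
)
```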

While mostly associated with individuals, identities can also be assigned to legal entities like corporations, partnerships and trusts; to physical entities like cars, buildings, smartphones and IoT devices; and to digital entities like patents, software programs and data sets.

Data attributes are generally siloed within different private and public sector institutions, each using its data for its own purposes. But to reach a higher level of privacy and security, we need to establish trusted data ecosystems, which requires the exchange and sharing of data across a variety of institutions. The more data sources a trusted ecosystem has access to, the higher the probability of detecting fraud and identity theft while reducing false positives. In addition, an ecosystem with a variety of data sources can help foster economic inclusiveness by certifying the identities and creditworthiness of poor people with no banking affiliation.

However, safeguarding the data used to validate identities creates security and privacy issues of its own. It’s unsafe to gather all the needed attributes in one institution or central data location, which would make it a prime target for data breaches. It’s also highly impractical, as few institutions will let their critical data out of their premises.

April 16, 2018

About five years ago, McKinsey conducted an online survey of over 850 CEOs and other senior executives to find out how their companies were faring in their implementation of digital technologies and strategies. The data showed that executives were generally optimistic, but they still had much to do to achieve their digital business objectives. Organizational alignment and leadership were the critical factors in the success or failure of their digital strategies.

More specifically, the survey asked them about their progress in embracing five major digital trends: big data and advanced analytics, digital engagement of customers, digital engagement of employees and external partners, automation, and digital innovation. While all five trends were important, executives said that digital customer engagement was their most important competitive differentiator and where they were expecting the largest financial returns.

This is not surprising, even more so now than at the time of the survey. A company can differentiate itself from competitors in one of two key ways: by providing a superior customer experience or by offering the lowest prices. For companies that prefer the former, digital channels are, far and away, the most cost-effective way of reaching out to their clients. The explosive growth of mobile devices means that you can be engaged with your customers whether they are at home, at work, in your store or anywhere else.

But, digital customers can be fickle and hard to satisfy. It’s quite difficult to keep up with their fast changing behaviors and expectations. New products and services are hitting the market faster than ever, brand loyalty keeps decreasing, and the increased competition continues to shift power from institutions to individuals.

April 02, 2018

One can look at the transformational impact of IT on companies and industries through a variety of lenses, - e.g., products and services, revenue and profit, sales and support, market strategy. One of the most important such lenses is the impact of IT on business processes, that is, on how work is actually done across the various functions of an organization.

Examining the impact of IT over the past several decades from a process point of view, one can identify three distinct phases: first came process automation in the early years of IT, followed around the 1990s by enterprise-wide process reengineering and management. The third phase, now starting, is focused on the processes and transactions that determine how institutions interact with each other around the world. Let me discuss each of these phases.

March 05, 2018

A killer app is an IT application whose value is simple to explain, whose use is fairly intuitive, and which turns out to be so useful that it helps propel the success of a product or service. Spreadsheets and word processing were major factors in the 1980s adoption of personal computers, for example. A decade later, e-mail and the World Wide Web played crucial roles in the mainstream success of the Internet. While there are a number of strong candidates for smartphone killer-apps, navigation and music are among the most widely used.

Blockchain has been in the news lately, but beyond knowing that it has something to do with payments and digital currencies, most people don’t know what blockchain is or why they should care. A major part of the reason is that we still don’t have the kind of easy-to-explain blockchain killer-apps that propelled the Internet forward.

Blockchain has yet to cross the chasm from technology enthusiasts and visionaries to the wider marketplace that’s more interested in business value and applications. There’s considerable research on blockchain technologies, platforms and applications as well as market experimentation in a number of industries - roughly where the Internet was in the mid-late 1980s: full of promise but still confined to a niche audience.

In addition, outside of digital currencies, - a somewhat exotic topic of interest to a relatively small audience, - blockchain applications are primarily aimed at institutions. And, given that blockchain is all about the creation, exchange and management of valuable assets, its applications are significantly more complex to understand and explain than Internet applications.

The management of information is quite different from the management of transactions. The latter, especially for transactions dealing with valuable or sensitive assets, requires deep contractual negotiations among companies and jurisdictional negotiations among governments. Moreover, since blockchain is inherently multi-institutional in nature, its applications involve close collaboration among companies, governments and other entities.

January 22, 2018

This past October, the Pew Research Center released Automation in Everyday Life, a report on what Americans think about advanced technologies like artificial intelligence and robotics, and the impact they expect them to have on their everyday lives. The report is based on a national survey of 4,135 randomly selected American adults. To help gauge their opinion on such complex topics, the survey questions were framed around four specific scenarios related to these advanced technologies: driverless cars, workplace automation, robot caregivers, and computer algorithms that evaluate and hire job applicants.

“Americans anticipate significant impacts from various automation technologies in the course of their lifetime,” noted the report. “Although they expect certain positive outcomes from these developments, their attitudes more frequently reflect worry and concern over the implications of these technologies for society as a whole.”

January 15, 2018

In 2008 I gave a talk at a conference on The Promise and Reality of Cloud Computing. In his closing remarks, the conference organizer noted that most everyone had agreed that something big and profound was going on, but they weren’t quite sure what it was they were excited about. “There is a clear consensus that there is no real consensus on what cloud computing is,” he said.

At the time, most people associated cloud with utility computing and IT-as-a-Service. Technology writer Nicholas Carr was also a speaker at the conference. In his talk, based on his then recently published book The Big Switch, Carr nicely framed the historical shift to cloud computing, comparing the evolution of IT to that of power plants. In the early days, companies usually generated their own power with steam engines and dynamos. But with the rise of highly sophisticated, professionally run electric utilities, companies stopped generating their own power and plugged into the newly built electric grid.

IT was then undergoing a similar transformation, as cloud service providers were starting to offer IT-as-a-Service with near unlimited scalability at very attractive prices, based on a flexible pay-as-you-go delivery model. Early cloud adopters were mostly focused on improving the economics of IT, giving them the ability to launch new offerings or expand their current ones without major investments in additional IT infrastructure while paying for whatever cloud resources they actually used.

Cloud continued to evolve and advance over the ensuing years. As an article noted about a 2013 cloud conference: “There wasn’t a single speaker who started off a session by saying, ‘Let's define cloud computing.’ That gets tiresome when seen in session after session, year after year, so its absence is gratefully received. This is a clear indication that the industry has moved beyond elementary knowledge-gathering and onto the practicalities associated with cloud implementation and rollout.” Still, debates continued on the true definition of cloud computing. Do private and hybrid clouds qualify, or only public ones?

The paper’s authors, - MIT professor Erik Brynjolfsson, MIT PhD candidate Daniel Rock, and University of Chicago professor Chad Syverson, - note that “aggregate labor productivity growth in the U.S. averaged only 1.3% per year from 2005 to 2016, less than half of the 2.8% annual growth rate sustained from 1995 to 2004… What’s more, real median income has stagnated since the late 1990s and non-economic measures of well-being, like life expectancy, have fallen for some groups.”

This productivity puzzle isn’t confined to AI. “Few topics in economics today in most large economies generate as much debate as the productivity puzzle,” said McKinsey in a March 2017 report. “In the United States, productivity growth has declined sharply since 2004 yet digital technology has been widely apparent during this period… The answer to this puzzle holds the key to future prosperity because now more than ever our economy depends on productivity improvements for long-term economic growth. Economists have proposed competing explanations for declining productivity growth and so far have failed to reach a consensus.”

January 02, 2018

Just about every industry has been significantly transformed in the past few decades. But few have been as disrupted as the music industry. Everything seems to be changing at once, from the way content is produced and delivered, to the sources of revenue and profits. Digital technologies, from the Internet to smartphones to cloud computing, have in effect turned dollars into pennies. Now, blockchain and related technologies may once more play a major role in the music industry, this time helping to turn those pennies back into dollars.

We’re truly surrounded by music as never before: in a wide variety of styles; in physical and digital formats; over the Internet, satellite, and broadcasts; on mobile devices and home music systems. But the shift from physical to digital, and then from downloads to streaming, has wreaked havoc on the business of music. US retail revenues of recorded music were close to $14 billion in 1998 before starting their decline. According to the Recording Industry Association of America (RIAA), revenues fell from roughly $12 billion in 2006 to around $7 billion in 2010. They stayed flat at $7 billion through 2015, starting to increase in 2016 mostly due to growth in paid streaming music subscriptions. Revenues are expected to be around $8 billion in 2017.

Turning dollars into pennies succinctly captures the music industry’s precipitous revenue declines. As this article noted, “During the heyday of the 80s, the music industry was rolling in so much money it could afford to leave some on the table… which it often did… There were so many dollars coming in, they didn’t bother paying attention to the pennies. Well, as we all know those days are gone. And after a decade of losses from its traditional revenue stream (sales), we’ve seen the music industry forced to look for revenue in new places and in different ways. In the absence of dollars, the pennies now matter.”

As a result, the music industry now finds itself trying to figure out how to keep track of all those pennies in the hope of turning them back into dollars. In the first half of 2017, streaming services accounted for 62% of the total market, with digital downloads and physical sales accounting for 19% and 16% respectively, said the RIAA. But streaming pays royalties in fractions of pennies, so legislation, new business models and technological innovations are needed to make sure that all the necessary data are efficiently collected and the various artists involved in a song get paid.
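
To make the fractions-of-pennies arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The per-stream royalty, album price, and revenue splits are hypothetical round numbers of my own, not actual figures from the RIAA or any streaming service.

```python
# Back-of-the-envelope illustration of streaming economics.
# All rates and splits below are hypothetical round numbers,
# not actual figures from any streaming service or the RIAA.

PER_STREAM_ROYALTY = 0.004   # dollars per stream (hypothetical)
ALBUM_PRICE = 10.00          # dollars per album sale (hypothetical)

# How many streams does it take to match one album sale?
streams_per_album = ALBUM_PRICE / PER_STREAM_ROYALTY
print(f"Streams to equal one album sale: {streams_per_album:,.0f}")

# A single stream's royalty, split among the parties to a song
splits = {"label": 0.55, "performer": 0.20, "publisher": 0.15,
          "songwriter": 0.10}  # hypothetical shares
for party, share in splits.items():
    print(f"{party}: ${PER_STREAM_ROYALTY * share:.5f} per stream")
```

At these assumed rates, a song needs 2,500 streams to generate what a single $10 album sale once did, and each party’s share of a stream is a small fraction of a penny, which is why accurate, fine-grained data collection matters so much.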

“The most important and innovative industries and the most talented, most ambitious, and wealthiest people are converging as never before in a relative handful of leading superstar cities that are knowledge and tech hubs,” wrote Florida. “This small group of elite places forge ever forward, while most others struggle, stagnate, or fall behind. This process is one I like to call winner-take-all urbanism.”

Winner-take-all urbanism is another manifestation of the winner-take-all economics of the past few decades, in which a relatively small number of players reap a very large share of the rewards, upending whole industries with their outsized returns.

December 11, 2017

Over the past few years, Industry 4.0, aka the 4th Industrial Revolution, has been the subject of several articles, studies, and surveys, as well as a book published earlier this year by Klaus Schwab, founder and executive chairman of the World Economic Forum (WEF). They all attempt to describe and put a name to the disruptive changes taking place all around us within the context of the past 250 years.

“Industrial revolutions are momentous events,” said A Strategist’s Guide to Industry 4.0, a 2016 article in strategy+business. “By most reckonings, there have been only three.” The First Industrial Revolution, starting in the last third of the 18th century, introduced new tools and manufacturing processes based on steam and water power, ushering in the transition from hand-made goods to mechanized, machine-based production. The Second, starting a century later, brought us steel, cars, chemicals, petroleum, electricity, the telephone and radio, while creating the age of mass production. The Third, following World War II, saw the advent of computers, digital technologies, the IT industry, and the automation of processes in just about all industries.

There’s general agreement that the 4th Industrial Revolution is primarily driven by a fusion of once separate technologies that, joined together, are integrating the physical and digital worlds. But there’s a spectrum of opinions as to the scope of its impact, with some arguing that Industry 4.0 applies primarily to manufacturing technologies and industries, and others that it’s profoundly transforming our economies and societies.

“Perhaps the most critical function of any organization or society is its decision systems,” wrote Pentland. “In modern societies, decision systems provide a leader the ability to make informed and timely decisions, supported by a complex enterprise of distributed information and communication systems that provide situational awareness. Traditionally, decision systems have been confined to individual physical domains, such as logistics, physical plant, and human resources, and more recently virtual domains such as cyber, by both policy and technology, resulting in challenges with the integration of information across disparate domains.”

But despite the increasingly complex decisions that organizations are called upon to make, decision-making remains human-intensive and anecdotal. Few organizations have applied social network analysis to help them scale the size and expertise of the decision-making group. Nor have they integrated the large amounts of data, analytical tools and powerful AI systems now at our disposal into their decision-making systems.
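
As a rough illustration of what such social network analysis might look like in practice, here is a minimal sketch using the open-source networkx library. The collaboration graph and the use of betweenness centrality to surface well-connected candidates for a decision-making group are my own illustrative assumptions; in a real organization the edges would come from email, chat, or project records.

```python
# Minimal sketch: using social network analysis to surface
# well-connected people for a decision-making group.
# The graph is illustrative; real edges would come from
# collaboration data such as email or project records.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("ana", "ben"), ("ana", "carla"), ("ben", "carla"),
    ("carla", "dev"), ("dev", "eli"), ("eli", "fay"),
    ("carla", "fay"),
])

# Betweenness centrality flags people who bridge otherwise
# separate parts of the organization.
centrality = nx.betweenness_centrality(g)
candidates = sorted(centrality, key=centrality.get, reverse=True)[:3]
print("Candidate bridges for the decision group:", candidates)
```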

According to Arthur, the digital revolution has morphed through three distinct eras over the past several decades. The first era, in the 1970s and 1980s, brought us Moore’s Law and dramatic advances in semiconductor technologies. From mainframes and supercomputers to PCs and workstations, IT was now being used in a wide variety of applications, from financial services and oil exploration, to computer-aided design and office systems. “The economy for the first time had serious computational assistance.”

Then came the second era in the 1990s and 2000s, which enabled us to link computers together, share information, and connect digital processes.“Everything suddenly was in conversation with everything else,” giving rise to “the virtual economy of interconnected machines, software, and processes…, where physical actions now could be executed digitally.”

We’re now in the third era, which began roughly in the 2010s. It’s brought us smartphones, ubiquitous sensors, IoT devices and oceans and oceans of data. Powerful computers and intelligent algorithms are enabling us to make sense of all that data by searching for patterns and doing something with the results, including computer vision, natural-language processing, language translation, face recognition and digital assistants.

November 06, 2017

The MIT Initiative on the Digital Economy (IDE) was organized in 2013 by Erik Brynjolfsson and Andy McAfee to examine the impact of digital technologies on the world. Understanding the future of work and jobs is one of the major areas of research being addressed by the IDE. What will the workforce of the future look like? Where will jobs come from in the coming years, especially for the workers most impacted by automation? How can we accelerate the transformation of institutions, organizations, and human skills to keep up with the quickening pace of digital innovation?

To help come up with breakthrough, real-world answers to these tough questions, the IDE launched the MIT Inclusive Innovation Challenge (IIC) last year. The Challenge aims to identify, celebrate and award prizes to organizations around the world that are developing innovative approaches for improving the economic opportunities of middle- and base-level workers.

Now in its second year, the IIC introduced the winners of its 2017 competition at an event held on October 12 as part of Boston’s annual HUBweek. Over $1 million was awarded to the winners in four categories: job creation and income growth, skills development and matching, technology access, and financial inclusion. The grand prize winner in each category received $150,000, while the three runners-up in each category received $35,000. The awards were funded with support from Google.org, The Joyce Foundation, ISN and Joseph Eastin.

October 30, 2017

Gross domestic product (GDP), the basic measure of a country’s overall economic output, is generally used by governments to inform their policies and decisions. “What we measure affects what we do; and if our measurements are flawed, decisions may be distorted,” noted a 2009 commission convened to look at the adequacy of GDP as an indicator of economic performance, led by Nobel Prize-winning economists Joseph Stiglitz and Amartya Sen.

GDP is essentially a measure of production. While suitable when economies were dominated by the production of physical goods, GDP doesn’t adequately capture the growing share and variety of services and the development of increasingly complex solutions in our 21st century digital economy.

Digital Spillover, a recent report co-developed by Huawei and Oxford Economics, explores how to better define and measure the true impact of the digital economy. It argues that the scope of the digital economy is expanding and proposes a novel way of measuring its impact.

“A truly digital economy is one in which businesses from across the industrial spectrum are investing in digital and making the most productive use of it,” notes the report. “The mechanisms by which this is happening are complex and evolving. Over and above the direct productivity boost that companies enjoy from digital technologies, a more profound chain of indirect benefits also takes place, as the impact spills over within a firm, to its competitors, and throughout its supply chain. These digital spillover effects materialize through numerous channels, and are integral to understanding the role digital technologies play in the economy.”
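
One stylized way to picture these spillovers, my own illustration rather than the report’s actual model, is a production function in which a firm’s output depends not only on its own digital capital but also on the digital capital accumulated by the firms around it:

```latex
% Stylized illustration only, not the Digital Spillover report's model.
% Y_i: firm i's output; K_i, L_i: ordinary capital and labor;
% D_i: the firm's own digital capital; \bar{D}: the digital capital
% of its industry and supply chain; \delta > 0 captures the spillover.
\[
  Y_i = A \, K_i^{\alpha} \, L_i^{\beta} \, D_i^{\gamma} \, \bar{D}^{\delta}
\]
```

In this framing, the direct productivity boost the report describes works through the firm’s own exponent γ, while the indirect, economy-wide benefits show up as the spillover term δ.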

October 23, 2017

Two years ago, a group of leading CEOs from around the world met in China to discuss the major issues facing their companies in the global marketplace. The meeting was organized by the Center for Global Enterprise (CGE), the nonprofit research institution founded by former IBM Chairman and CEO Sam Palmisano to study the contemporary corporation, globalization, economic trends, and their impact on society. The CEOs at the meeting in China concluded that their highest research priority was the evolution toward Digital Supply Chains (DSC) and its potential to transform organizations and the conduct of business around the world. In response, the Digital Supply Chain Institute (DSCI) was launched a few months later.

In October of 2016, the DSCI, in partnership with CREATe.org, published its first research paper, Digital Supply Chains: A Frontside Flip. The paper explains the key differences between traditional and digital supply chains, and provides practical advice to help companies prepare for the digital supply chains of the future.

The traditional supply chain has evolved over the years as a crucial process used by many companies for the production, handling, and/or distribution of their products or services. Traditional supply chains generally comprise five separate back-office functions: transport, warehousing, purchasing, marketing and finance. Their key objective has been to improve the efficiency of getting goods and services from suppliers to customers, through a series of intermediate steps including manufacturing, distributors and retailers. But these supply chains were quite fragmented, with limited interactions and information sharing among their various functions and steps.

Over the past couple of decades, the explosive growth of the Internet has made it much easier for companies to transact with each other around the world. The connectivity and universal reach of the Internet have enabled companies to integrate and better coordinate all their various processes, as well as to go beyond the boundaries of the firm and develop highly sophisticated global supply chains. Vertically integrated firms have evolved into virtual enterprises, increasingly relying on supply chain partners for many of the manufacturing and services functions once done in-house.

The DSCI paper argues that it’s time to focus on the transformation of the whole supply chain process, not just on further increases in efficiency. The traditional linear nature of supply chains is increasingly misaligned with the more networked nature of today’s global production.

Let me attempt to explore these questions based on relatively recent academic research in three key areas: gender-based behavioral differences; how to create smarter working groups; and the changing skills requirements in the digital economy.

September 11, 2017

AI is now seemingly everywhere. In the past few years, the necessary ingredients have come together to propel AI beyond the research labs into the marketplace: powerful, inexpensive computer technologies; advanced algorithms and models; and most important, oceans and oceans of data.

“Artificial intelligence is getting ready for business, but are businesses ready for AI?” asks McKinsey in a recently published report, Artificial Intelligence: The Next Digital Frontier. “AI adoption outside of the tech sector is at an early, often experimental stage,” is the report’s succinct answer. “Few firms have deployed it at scale.”

The report is based on a survey of over 3,000 AI-aware C-level executives across 10 countries and 14 sectors. Only 20 percent of respondents had adopted AI at scale in a core part of their business. Some 40 percent were partial adopters or experimenters, while another 40 percent were essentially contemplators.

AI encompasses a broad range of technologies and applications. It’s often viewed as the leading edge of IT: as soon as AI is successfully applied to a problem, the problem is no longer considered part of AI. The McKinsey report focuses on five technologies that are increasingly deployed in business and that most everyone agrees are part of AI: robotics and autonomous vehicles, computer vision, language, virtual agents, and machine learning.

July 17, 2017

Is the Robocalypse upon us? asked MIT economist David Autor in his presentation at a recent forum of European central bankers. His presentation was based on a paper co-written with Utrecht University economist Anna Salomons. “Is productivity growth inimical to employment?” they asked in the paper’s abstract. “Canonical economic theory says no, but much recent economic theory says maybe - that is, rapid advances in machine capabilities may curtail aggregate labor demand as technology increasingly encroaches on human job tasks.”

Fears that machines will put humans out of work are not new. Throughout the Industrial Revolution there were periodic panics about the impact of automation on jobs, going back to the Luddites, textile workers who in the 1810s smashed the new machines that were threatening their jobs. But, “In the end, the fears of the Luddites that machinery would impoverish workers were not realized, and the main reason is well understood,” noted a 2015 article on the history of technological anxiety.

“The mechanization of the early 19th century could only replace a limited number of human activities. At the same time, technological change increased the demand for other types of labor that were complementary to the capital goods embodied in the new technologies. This increased demand for labor included such obvious jobs as mechanics to fix the new machines, but it extended to jobs for supervisors to oversee the new factory system and accountants to manage enterprises operating on an unprecedented scale. More importantly, technological progress also took the form of product innovation, and thus created entirely new sectors for the economy, a development that was essentially missed in the discussions of economists of this time.”

Automation fears have understandably accelerated in recent years, as our increasingly smart machines are now being applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans. “Previous technological innovation has always delivered more long-run employment, not less. But things can change,” said a 2014 Economist article. “Nowadays, the majority of economists confidently wave such worries away… Yet some now fear that a new era of automation enabled by ever more powerful and capable computers could work out differently.”

June 12, 2017

Despite dramatic advances in technology, most of the world’s economies have been stuck in a long period of slow growth and slow productivity. This is one of the most serious challenges in our 21st century economy. Opinions abound, but there’s little consensus on its causes, and nobody seems to know what to do about it, or how long it will likely last: years or decades.

In a recent article in the MIT Sloan Management Review, MIT Research Fellow Michael Schrage proposed a provocative and counterintuitive approach for enhancing innovation and productivity through man-machine collaborations. Schrage’s approach has been more influenced by behavioral economics than by technology or algorithmic advances. Instead of just asking how can people create more valuable innovation?, why not also ask how can innovation create more valuable people? Don’t just leverage advanced technologies, e.g., bots, software agents, and digital assistants, to automate away a large portion of the workforce; also focus on enhancing innovation and productivity by leveraging technology to create higher-performance versions of employees.

“Designing and training smarter algorithms may be cheaper and easier than retraining smart people,” wrote Schrage. “Advocates of autonomous systems and machine learning typically innovate to minimize or marginalize human involvement in business processes. For them, people are part of the problem, not the solution. Organizations that take productivity seriously, however, understand that false dichotomies make poor investments: Smarter machines can - and should - be keys to unlocking greater returns from human capital.”

June 05, 2017

Apple Pay was launched in September of 2014. It was positioned as a relatively easy, secure way to pay for purchases in physical stores as well as online using a variety of supported Apple devices including iPhones, iPads, Macs and Apple Watch. The general reaction to Apple’s announcement was quite positive: “[F]or now, at least, analysts believe if there is any company to persuade consumers of the mobile wallet’s value, it is Apple.”

The payments community welcomed Apple with open arms, hoping that Apple could play a major role in bringing together the different players in the fragmented payments ecosystem, much as it had previously done with iTunes for music and the App Store for smartphone apps. Apple Pay embraced a number of industry standards, including NFC (near-field communications) for contactless payments; the secure element, a dedicated device chip; and network-based tokenization to protect sensitive financial data. In addition, Apple collaborated with the credit card networks, several banks and a number of merchants in the development and deployment of Apple Pay, and published APIs so developers could embed Apple Pay services in their own apps.
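
To illustrate the tokenization idea, here is a toy Python sketch; it is a conceptual illustration of network-based tokenization in general, not Apple’s or any card network’s actual implementation. The point is simply that the merchant handles only a surrogate token, never the real card number.

```python
# Toy illustration of network tokenization: the merchant never
# sees the real card number (PAN), only a surrogate token that
# the card network can map back to it. This is a conceptual
# sketch, not Apple's or any network's actual implementation.
import secrets

class TokenVault:
    """Stands in for the token service a card network operates."""
    def __init__(self):
        self._vault = {}  # token -> real PAN, held by the network

    def tokenize(self, pan: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")   # a standard test PAN
print("Merchant sees only:", token)
print("Network resolves to card ending:", vault.detokenize(token)[-4:])
```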

“Say hello to Apple Pay. It’s the new kid on the payments block and, depending on how things unfold, it could be the new Gorilla in the mobile payments ecosystem,” wrote Karen Webster, CEO of Market Platform Dynamics and an expert in digital payments. But she also added a prescient note of caution: “As great as this sounds, there are two limitations of Apple Pay right now for consumers and merchants. And it’s that old chicken and egg issue that gets in the way of every new payments system.” The chicken and egg issue is that there aren’t enough consumer devices supporting Apple Pay, and there aren’t enough merchants that accept it. Both consumers and merchants have to be convinced to move away from payment systems they know well and feel comfortable with.

April 24, 2017

I recently attended a very interesting talk, Exploring the Impact of Artificial Intelligence: Prediction versus Judgment, by University of Toronto professor Avi Goldfarb. The talk was based on recent research conducted with his University of Toronto colleagues Ajay Agrawal and Joshua Gans. In addition to an in-depth paper aimed at a research audience, they’ve explained their work in two more general-interest articles, one in the Harvard Business Review and the second in the MIT Sloan Management Review.

In their opinion, “the best way to assess the impact of radical technological change is to ask a fundamental question: How does the technology reduce costs? Only then can we really figure out how things might change.” For example, the semiconductor revolution can be viewed as being all about the dramatic reductions in the cost of arithmetic calculations. Before the advent of computers, arithmetic was done by humans with the aid of various kinds of devices, from the abacus to mechanical and electronic calculators.

Then came digital computers, which are essentially powerful calculators whose cost of arithmetic operations has precipitously decreased over the past several decades thanks to Moore’s Law. Over the years, we’ve learned to define all kinds of tasks in terms of such digital operations, e.g., inventory management, financial transactions, word processing, photography. Similarly, the economic value of the Internet revolution can be described as reducing the cost of communications and of search, thus enabling us to easily find and access all kinds of information, including documents, pictures, music and videos.

How does this framing now apply to our emerging AI revolution? After decades of promise and hype, AI seems to have finally arrived, driven by the explosive growth of big data, inexpensive computing power and storage, and advanced algorithms like machine learning that enable us to analyze and extract insights from all that data. Agrawal, Gans and Goldfarb provide an elegant answer to this question in their HBR article. “Machine intelligence is, in its essence, a prediction technology, so the economic shift will center around a drop in the cost of prediction.”
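
A small sketch helps make the cheap-prediction point concrete. The data and model below are entirely synthetic and of my own choosing; the idea is only that training a model is a one-time fixed cost, after which each additional prediction is a nearly free function call.

```python
# Illustration of prediction-as-a-commodity: training is the
# one-time fixed cost; each additional prediction is nearly free.
# Data and model are synthetic, chosen only for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                        # e.g., customer features
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(int)  # known outcomes

model = LogisticRegression().fit(X, y)  # the fixed training cost

# The marginal cost of prediction: one cheap call per new case
new_cases = rng.normal(size=(5, 3))
print("Predictions:", model.predict(new_cases))
```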

January 02, 2017

In February of 2016, the President issued an Executive Order establishing the Commission on Enhancing National Cybersecurity within the Department of Commerce. The Commission was charged with “recommending bold, actionable steps that the government, private sector, and the nation as a whole can take to bolster cybersecurity in today’s digital world, and reporting back by the beginning of December.”

On December 1 the Commission issued its final Report on Securing and Growing the Digital Economy. The Commission urged the incoming administration to strengthen cybersecurity efforts within its first 100 days. The report includes 16 major recommendations and related actions that can hopefully serve as a useful framework for significantly enhancing the Nation’s cybersecurity.

The widespread success of the Internet in the 1990s has led to a historic transition from the industrial age of the past two centuries to an economy and society increasingly based on global, digital interactions. This transition has significantly accelerated over the past decade with the advent of billions of smartphones, tens of billions of IoT devices and huge amounts of data, all now connected via Internet-based broadband networks.

At the same time, Internet threats have been growing. Large-scale fraud, data breaches, and identity thefts are becoming more common. Companies are finding that cyber-attacks are costly to prevent and recover from. As we move from a world of physical interactions and paper documents to a world primarily governed by digital data and transactions, our existing methods for protecting identities and data are proving inadequate.

December 12, 2016

I was recently involved in two different meetings, each convened by an IT research organization focused on providing strategic advice to CIOs. My participation in each of the CIO meetings was in the form of a fireside chat with a senior analyst of the organization. Given that I’ve closely followed cloud computing over the years, the state of cloud was one of the main topics we discussed.

For a while now I’ve thought of cloud as a new model of computing in the IT world. For the IT industry, a new computing model is a very big deal. In the sixty years or so since there’s been an IT industry, this would be only the third such model, centralized and client-server computing being the two previous ones. Mainframes and PCs were the defining technologies of the centralized and client-server models respectively.

The Internet is the defining technology of the cloud computing model. There are clearly other major technologies around us, e.g., smartphones, IoT, analytics, and AI, but when you think about it, all of them rely on their connections to cloud-based data, applications, and services for much of their functionality.

Cloud computing has gone through three major stages over the past decade: IT-as-a-service in its initial years, then application innovation in its next stage, and now it’s just entering the business transformation stage. Let me say a few words about the first two before turning to business transformation.

November 28, 2016

A few weeks ago I first learned about a relatively new concept: the Digital Twin. A Digital Twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city. It’s a highly realistic, one-to-one digital model of each such specific physical entity.

The Digital Twin helps bring the physical and digital worlds closer to each other. It’s intertwined with and complementary to the Internet of Things (IoT). The huge amounts of data now collected by IoT sensors on physical objects, personal devices and smart systems make it possible to represent their near real-time status in their Digital Twin alter egos.
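
As a minimal sketch of the idea, here is a toy digital twin in Python: an in-memory mirror of one specific physical asset, kept current by ingesting (simulated) IoT sensor readings. The field names and the inspection threshold are illustrative assumptions, not any vendor’s actual schema.

```python
# Toy digital twin: an in-memory, one-to-one mirror of a specific
# physical asset, updated from (simulated) IoT sensor readings.
# Field names and thresholds are illustrative assumptions.
import time

class EngineTwin:
    """One-to-one digital model of a specific jet engine."""
    def __init__(self, serial: str):
        self.serial = serial
        self.state = {"temp_c": None, "rpm": None, "updated": None}

    def ingest(self, reading: dict) -> None:
        """Apply a sensor reading to keep the twin near real time."""
        self.state.update(reading, updated=time.time())

    def needs_inspection(self) -> bool:
        """Flag the physical asset based on the twin's latest state."""
        temp = self.state["temp_c"]
        return temp is not None and temp > 900  # illustrative threshold

twin = EngineTwin("ESN-12345")
twin.ingest({"temp_c": 915, "rpm": 10200})  # simulated IoT message
print(twin.serial, "needs inspection:", twin.needs_inspection())
```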

“The myriad possibilities that arise from the ability to monitor and control things in the physical world electronically have inspired a surge of innovation and enthusiasm,” said a 2015 McKinsey report on the Internet of Things. Experts estimate that the number of connected things or devices will reach 50 billion by 2020, growing to hundreds of billions in the decades ahead. The economic potential of the smart solutions this makes possible is enormous, possibly reaching several trillion dollars within a decade.