A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

March 18, 2019

The world has been stuck in an era of slow economic growth over the past decade. Top economists and policy makers have proposed a number of explanations for the economic slowdown, but in the end, there’s no consensus on the reasons, on how long the slowdown will likely last, or on what to do about it.

The world’s current population is 7.7 billion. The UN Population Division estimates that global population will reach 9.8 billion in 2050, with a growth rate of 0.5%, and will peak around 2100 at roughly 11.2 billion, with a growth rate of 0.1%. Other projections estimate that the population will grow more slowly, peak at 9.4 billion around 2070 and then decline to around 9 billion in 2100. Some project that after reaching a peak of 9 billion, the world’s population will decline back to today’s levels, - around 7 billion, - by 2100.

Japan and a few other countries are already experiencing population declines. The US population is currently around 330 million, and is projected to reach 390 million in 2050 and 450 million in 2100. While the US birthrate hit a historic low in 2017, the population has continued to grow due to immigration. The US growth rate is 0.71% including immigration and 0.43% without.

The global labor force grew at an average of 1.8% per year between 1960 and 2005, but since then it’s been growing at just 1.1% per year. The labor force is still growing in some developing countries like India, Nigeria and the Philippines, but it’s already shrinking in China, Japan and Germany. In the US, the labor force is growing very slowly, - 0.5 percent per year over the past decade, compared with 1.7 percent from 1960 to 2005. Given the continuing decline in fertility rates in most parts of the world, global labor force growth is expected to slip further in the coming decades.

March 11, 2019

A few months ago, MIT president Rafael Reif announced the launch of the Stephen A. Schwarzman College of Computing in an e-mail to the MIT community. “This new College is our strategic response to a global phenomenon – the ubiquity of computing and the rise of AI…” wrote President Reif. “To state the obvious, AI in particular is reshaping geopolitics, our economy, our daily lives and the very definition of work. It is rapidly enabling new research in every discipline and new solutions to daunting problems. At the same time, it is creating ethical strains and human consequences our society is not yet equipped to control or withstand.”

The new College of Computing is made possible by a $350 million foundational gift from Stephen Schwarzman, - chairman, CEO and co-founder of Blackstone, one of the world’s leading investment firms. The gift is part of a $1 billion commitment to reshape MIT to help it better address the continuing advances in technology in our 21st century digital economy, as well as their many applications, - both the exciting opportunities and the difficult challenges.

March 04, 2019

Blockchain was created around a decade ago as the public, distributed ledger for the Bitcoin cryptocurrency. Most everyone agrees that it’s a truly brilliant architecture, built on decades-old fundamental research in cryptography, distributed data, distributed computing, game theory and other advanced technologies. When it first came to light, blockchain had no broader goals beyond supporting Bitcoin. But, as was the case with the Internet and World Wide Web, blockchain soon transcended its original objectives.
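
To make the core idea concrete, here is a minimal, illustrative Python sketch of a hash-linked ledger - a toy built on my own assumptions, not Bitcoin’s actual implementation, and it omits the mining, consensus and peer-to-peer machinery that make a real blockchain trustworthy. Each block records the cryptographic hash of its predecessor, so tampering with any historical entry breaks every link that follows:

    import hashlib, json, time

    def block_hash(block):
        # Deterministically hash a block's contents
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_block(chain, transactions):
        prev = chain[-1]
        chain.append({
            "index": prev["index"] + 1,
            "timestamp": time.time(),
            "transactions": transactions,
            "prev_hash": block_hash(prev),   # the link to the previous block
        })

    def is_valid(chain):
        # Every block must reference the correct hash of its predecessor
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    chain = [{"index": 0, "timestamp": 0, "transactions": [], "prev_hash": ""}]
    add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
    add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
    print(is_valid(chain))                         # True
    chain[1]["transactions"][0]["amount"] = 500    # tamper with an earlier block
    print(is_valid(chain))                         # False - the hash links no longer match

In a real deployment the ledger is replicated across many independent nodes, which is what removes the need for a single trusted intermediary.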

In 2016, blockchain made the list of the World Economic Forum’s Top Ten Emerging Technologies. The WEF report compared blockchain to the Internet, noting that “Like the Internet, the blockchain is an open, global infrastructure upon which other technologies and applications can be built. And like the Internet, it allows people to bypass traditional intermediaries in their dealings with each other, thereby lowering or even eliminating transaction costs.”

That same year, blockchain made its first appearance in Gartner’s yearly hype cycles, where it’s remained for the past three years. In 2017, Gartner noted that “blockchain might seem like it’s just around the corner. However, most initiatives are still in alpha or beta stage… Long-term, Gartner believes this technology will lead to a reformation of whole industries.”

In a recent article in the MIT Technology Review, associate editor Mike Orcutt succinctly summarized the current state of blockchain: “In 2017, blockchain technology was a revolution that was supposed to disrupt the global financial system. In 2018, it was a disappointment. In 2019, it will start to become mundane… After the Great Crypto Bull Run of 2017 and the monumental crash of 2018, blockchain technology won’t make as much noise in 2019. But it will become more useful.”

February 25, 2019

Innovation has been a hot topic for the past few decades. Companies all over the world have integrated innovation into their overall strategies and marketing campaigns. Nations and regions have launched innovation initiatives in an attempt to attract such companies and their accompanying well-paying jobs. But, as I have learned over my long career, managing innovation initiatives is actually quite hard, much harder than it may at first appear.

I was reminded of this point as I read The Hard Truth About Innovative Cultures, an article in a recent issue of the Harvard Business Review (HBR) by Harvard professor Gary Pisano. “A culture conducive to innovation is not only good for a company’s bottom line,” writes Pisano. “It also is something that both leaders and employees value in their organizations.”

Most everyone agrees that such a culture entails five key behaviors: tolerance for failure, willingness to experiment, psychological safety, collaboration, and a non-hierarchical structure. “But despite the fact that innovative cultures are desirable and that most leaders claim to understand what they entail, they are hard to create and sustain. This is puzzling. How can practices apparently so universally loved - even fun - be so tricky to implement?”

The reason, answers Pisano, is that these much-liked behaviors are only one side of the coin, which must be counterbalanced with some tougher behaviors. People love solving problems that seem impossible to everyone else and creating something qualitatively different from anything that came before. But they will rarely perform at the necessary levels if they’re relaxed and happy. True innovation is, in fact, not all that much fun. In order to reach inside yourself and find that something extra needed for true innovation, you have to feel the stress that comes when you know that what you’re doing is absolutely crucial. In the end, necessity is the mother of true innovation.

February 18, 2019

The World is Getting Quietly, Relentlessly Better is the title of a recent WSJ article by its chief economics commentator Greg Ip. “If you spent 2018 mainlining misery about global warming, inequality, toxic politics or other anxieties, I’m here to break your addiction with some good news: The world got better last year, and it is going to get even better this year,” writes Ip in his opening paragraph. “Poverty around the world is plummeting; half the world is now middle class; and illiteracy, disease and deadly violence are receding. These things don’t make headlines because they are gradual, relentless and unsurprising.”

Our World in Data (OWID) analyzes longitudinal data on how the world has been changing over decades and centuries, as well as explaining the causes and consequences of those changes in its accompanying articles. It makes extensive use of visual aids, including interactive graphs and maps, to present its analyses and insights. Its content is organized into 15 main sections: population, health, food, energy, environment, technology, growth & inequality, work & life, public sector, global connections, war & peace, politics, violence & rights, education, media and culture, - each of which further includes a number of subsections.

The belief that efficiency is fundamental to competitive advantage has turned management into a science, - taught in every business school on the planet, - whose objective is the elimination of waste, - whether of time, materials, or capital.

“Why would we not want managers to strive for an ever-more-efficient use of resources?” asks Roger Martin. Of course we do. But, an excessive focus on efficiency can produce startlingly negative effects. To counterbalance such potential negative effects, companies should pay just as much attention to a less appreciated source of competitive advantage: resilience, - “the ability to recover from difficulties - to spring back into shape after a shock.”

“Think of the difference between being adapted to an existing environment (which is what efficiency delivers) and being adaptable to changes in the environment. Resilient systems are typically characterized by the very features - diversity and redundancy, or slack - that efficiency seeks to destroy.”

February 04, 2019

AI is seemingly everywhere. In the past few years, the necessary ingredients have finally come together to propel AI beyond early adopters to a broader marketplace: powerful, inexpensive computer technologies; advanced algorithms; and huge amounts of data on almost any subject. Newspapers and magazines are full of articles about the latest advances in machine learning and related AI technologies.

Two recent reports concluded that, over the next few decades, AI will be the biggest commercial opportunity for companies and nations. AI advances have the potential to increase global GDP by up to 14% between now and 2030, the equivalent of an additional $14-15 trillion contribution to the world’s economy, and an annual average contribution to productivity growth of about 1.2 percent.

Over time, AI could become a transformative, general purpose technology like the steam engine, electricity, and the Internet. AI marketplace adoption will likely follow a typical S curve pattern, - that is, a relatively slow start in the early years, followed by a steep acceleration as the technology matures and firms learn how to best leverage AI for business value.
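
The S-curve itself is easy to picture with a logistic function. The following small Python sketch is purely illustrative - the midpoint year and steepness are my own assumptions, not forecasts from either report:

    import math

    def adoption(year, midpoint=2025.0, steepness=0.8):
        # Logistic S-curve: fraction of eventual adopters reached by a given year
        return 1.0 / (1.0 + math.exp(-steepness * (year - midpoint)))

    for year in range(2019, 2032, 2):
        print(year, f"{adoption(year):.0%}")
    # Prints a slow start in the early years, a steep climb around the midpoint,
    # and a flattening as the market saturates.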

To get a better sense of the current state of AI adoption, McKinsey recently conducted a global online survey on the topic, garnering responses from over 2,000 participants across 10 industry sectors, 8 business functions and a wide range of regions and company sizes. The survey asked about their progress in deploying nine major AI capabilities, including machine learning, computer vision, natural language text and speech processing, and robotic process automation.

January 28, 2019

In his seminal 1945 report, Science, The Endless Frontier, presidential science advisor Vannevar Bush laid out the blueprint for R&D in post-war America: “New knowledge can be obtained only through basic scientific research” conducted in universities and research labs, which is then applied to develop new products by the private sector and new and improved weapons by the defense sector. The report was quite influential and led to the considerable expansion of university research, much of it supported by the National Science Foundation (NSF), the National Institutes of Health (NIH) and other US federal agencies.

A number of large industrial companies also embraced these recommendations, and launched or expanded corporate research labs, such as AT&T’s Bell Labs, GE Research, Xerox PARC and IBM Research. Their job was to push the frontiers of knowledge by conducting both basic and applied research, which would hopefully lead to innovative technologies and products.

A doctoral degree or PhD has long been a key requirement for most research positions, whether in universities, or corporate and government labs. Thus, since the post-war expansion of research, the number of PhDs has been steadily rising. NSF data shows that the number of PhD recipients per year from US universities has more than quintupled, - from under 10,000 in 1958 to almost 55,000 in 2017. Over three quarters of the degrees awarded in 2017 were in STEM disciplines: 23% in Life Sciences, 18% in Engineering, 17% in Social Sciences, 11% in Physical Sciences, and 7% in Math and Computer Sciences. Beyond STEM, 10% of 2017 PhDs were in the Humanities and Arts, 9% in Education, and 5% in various other disciplines.

Data from the US Bureau of Labor Statistics (BLS) shows the value of a PhD education. According to 2017 BLS data, recipients of doctoral degrees had among the lowest unemployment rates at 1.5%, compared with 2.2% for those with master’s degrees, 2.5% for those with bachelor’s degrees and an overall unemployment rate of 3.6%. Holders of doctoral degrees were also among the best paid, with weekly median earnings of $1,743, slightly below holders of professional degrees, - e.g. physicians, dentists, lawyers, - at $1,836, and higher than the $1,401 for master’s degrees and $1,173 for bachelor’s.

Given these positive figures, one would expect near universal agreement on the value of obtaining a PhD. But, in fact, this is not the case. A number of recent articles have been questioning whether a PhD is worth it. Let’s take a closer look at their arguments.

January 21, 2019

After decades of promise and hype, artificial intelligence is finally becoming one of the most important technologies of our era. As AI continues to both spread and advance, will it enhance human capacities or will it lessen human autonomy and agency? By 2030, will most people be better or worse off than they are today?

Overall, 63% of respondents were optimistic, predicting that most individuals will mostly be better off, while 37% thought otherwise.But optimistic or not, most experts expressed concerns about the long-term impact of AI on “the essential elements of being human.”

The survey further asked the experts to explain their hopes and fears, to sketch out how they envision human-machine interactions over the coming decade, and what actions should be taken to assure the best possible future. Many answered, and their detailed comments are organized into the report’s three main sections: concerns; suggested solutions; and expectations for the future. Let me briefly discuss the key points in each section.

January 14, 2019

In the 1990s, the Internet was supposed to usher in a much more open, decentralized, democratic economy and society. Startups with innovative business models were now able to reach customers anywhere, anytime. Companies, from the largest to the smallest, could now transact with anyone around the world. Vertically integrated firms became virtual enterprises, increasingly relying on supply chain partners for many of the functions once done in-house. Experts noted that large firms were no longer necessary and would in fact be at a disadvantage in the emerging digital economy when competing against agile, innovative smaller companies.

Some even predicted that the Internet would lead to the decline of cities. People could now work and shop from home; be in touch with their friends over e-mail, video calls, and text messaging; and get access to all kinds of information and entertainment online. Why would anyone choose to live in a crowded, expensive, crime-prone urban area when they could lead a more relaxing, affordable life in an outer suburb or small town?

January 07, 2019

Artificial intelligence is rapidly becoming one of the most important technologies of our era. Every day we can read about the latest AI advances from startups and large companies. AI technologies are approaching or surpassing human levels of performance in vision, speech recognition, language translation, and other human domains. Machine learning advances, like deep learning, have played a central role in AI’s recent achievements, giving computers the ability to be trained by ingesting and analyzing large amounts of data instead of being explicitly programmed.

Deep learning is a powerful statistical technique for classifying patterns using large training data sets and multi-layer AI neural networks. It’s essentially a method for machines to learn from all kinds of data, whether structured or unstructured, that’s loosely modeled on the way a biological brain learns new capabilities. Each artificial neural unit is connected to many other such units, and the links can be statistically strengthened or weakened based on the data used to train the system. Each successive layer in a multi-layer network uses the output from the previous layer as input.
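
As a minimal illustration of that layered structure - a toy sketch in plain NumPy, not any particular deep learning framework - the forward pass below feeds each layer’s output into the next, and the weights are exactly the numbers that training would repeatedly adjust:

    import numpy as np

    rng = np.random.default_rng(0)

    def layer(inputs, weights, biases):
        # One layer: weighted sums of the previous layer's outputs, passed through a nonlinearity
        return np.tanh(inputs @ weights + biases)

    # A tiny network: 4 inputs -> 8 hidden units -> 8 hidden units -> 2 outputs
    shapes = [(4, 8), (8, 8), (8, 2)]
    weights = [rng.normal(size=s) for s in shapes]
    biases = [np.zeros(s[1]) for s in shapes]

    x = rng.normal(size=(1, 4))          # one example with 4 input features
    for w, b in zip(weights, biases):    # each layer consumes the previous layer's output
        x = layer(x, w, b)
    print(x)                             # the (untrained) network's output

    # Training would nudge the weights and biases, over many labeled examples,
    # to reduce the gap between these outputs and the desired answers.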

Machine learning can be applied to just about any domain of knowledge given our ability to gather valuable data in almost any area of interest. But, machine learning methods are significantly narrower and more specialized than humans. There are many tasks for which they’re not effective given the current state of the art. In an article recently published in Science, professors Erik Brynjolfsson and Tom Mitchell identified the key criteria that help distinguish tasks that are particularly suitable for machine learning from those that are not. These include the following (a brief illustrative sketch follows the list):

Tasks that map well-defined inputs to well-defined outputs, - e.g., labeling images of specific animals, the probability of cancer in a medical record, the likelihood of defaulting on a loan application;

Large data sets exist or can be created containing such input-output pairs, - the larger the training data set, the more accurate the learning;

The capability being learned should be relatively static, - if the function changes rapidly, retraining is typically required, including the acquisition of new training data; and

No need for a detailed explanation of how the decision was made, - the methods behind a machine learning recommendation, - subtle adjustments to the numerical weights that interconnect its huge number of artificial neurons, - are difficult to explain because they’re so different from those used by humans.
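
To make the first two criteria concrete, here is a small, hypothetical sketch using scikit-learn; the “loan application” features and default labels are synthetic, invented purely for illustration. It also hints at the last criterion: what the model learns is a set of numerical weights, not a human-readable rationale for each decision.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Hypothetical input-output pairs: (income, debt ratio, years employed) -> defaulted?
    X = rng.normal(size=(1000, 3))
    y = (X[:, 1] - 0.5 * X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

    model = LogisticRegression().fit(X, y)      # learn the input-output mapping from examples
    print(model.predict_proba(X[:3])[:, 1])     # estimated likelihood of default for three inputs
    print(model.coef_)                          # the learned numerical weights - not an explanation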

December 31, 2018

Last week I wrote about a recently published World Bank report, The Changing Nature of Work. The report highlights the increasing importance of investing in human capital by both the public and private sectors. All around the world, the demand for workers who are good at complex problem-solving, interactions with clients and colleagues, and able to adapt to new tools and technologies is rising, while the demand for less advanced skills that can be replaced by technology is declining. Neglecting investments in human capital, it warned, can dramatically weaken a country’s competitiveness.

“The world is healthier and more educated than ever…,” notes the World Bank in a companion report, The Human Capital Project. “But a large and unfinished agenda remains. Life expectancy in the developing world still lags far behind that of rich countries… Worldwide, more than 260 million children and youth are not in school. Meanwhile, nearly 60 percent of primary school children in developing countries fail to achieve minimum proficiency in learning.”

Why should countries invest in human capital? Can early health care and education prepare children to succeed and prosper as adults in a rapidly changing world? What are the barriers to nurturing human capital and how can countries overcome them? To address these important questions, the World Bank introduced the Human Capital Index (HCI). The HCI is designed to quantify the amount of human capital that a child born today can expect to attain by age 18 given the prevailing health and education conditions in the country where the child lives. It will be updated periodically and refined as data measures improve. This video nicely explains the HCI and what it measures.

December 24, 2018

People have long feared that machines are coming for our jobs. Throughout the Industrial Revolution there were periodic panics about the impact of automation on work, going back to the so-called Luddites, - textile workers who in the 1810s smashed the new machines that were threatening their jobs.

In a recent article in Foreign Affairs, World Bank Group President Jim Yong Kim warned that the world is facing a Human Capital Gap. “Governments in pursuit of economic growth love to invest in physical capital - new roads, beautiful bridges, gleaming airports, and other infrastructure. But they are typically far less interested in investing in human capital, which is the sum total of a population’s health, skills, knowledge, experience, and habits. That’s a mistake, because neglecting investments in human capital can dramatically weaken a country’s competitiveness in a rapidly changing world, one in which economies need ever-increasing amounts of talent to sustain growth.”

December 17, 2018

A recent issue of The Economist included a special report on cryptocurrencies and blockchains. The Economist’s overall conclusion was that “Bitcoin has been a failure as a means of payment, but thrilling for speculators.” Its assessment of blockchain was somewhat more positive. “For blockchains, the jury is still out… For all the technology’s potential, though, most attempts to use it remain tentative… The advantages of blockchains are often oversold.”

Is there too much hype surrounding blockchain? Absolutely… but not surprising. All potentially transformative technologies are oversold in their early stages. Remember the dot-com bubble of the late 1990s. Blockchain is still in its early phases of experimentation and adoption. Much work remains to be done on standards, platforms, interoperability, applications and governance.

But, does blockchain have the potential to become a truly transformative technology over time? Yes, said McKinsey in a recent article on the strategic value of blockchain beyond the hype. The article starts out by acknowledging that all the hype around blockchain makes it difficult to nail down not just blockchain’s long-term strategic value, but also what it actually is and what it isn’t in the present. To make sure everyone is on the same page, the article first recaps what it calls “the nuts and bolts of blockchain.” Let me start by summarizing blockchain’s nuts and bolts.

December 10, 2018

How will labor markets evolve in our 21st century digital economy? What’s the likely future of jobs, given that our increasingly smart machines are now being applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans? How will AI, robotics and other advanced technologies transform the very nature of work?

Over the past few years, a number of papers, reports and books have addressed these very important questions. They generally conclude that AI will have a major impact on jobs and the very nature of work. For the most part, they view AI as mostly augmenting rather than replacing human capabilities, automating the more routine parts of a job and increasing the productivity and quality of workers, so they can focus on those aspects of the job that most require human attention. Overall, few jobs will be entirely automated, but automation will likely transform the vast majority of occupations.

A recent McKinsey report noted that while there will likely be enough work to maintain full employment by 2030, the transition will be very challenging, “on a scale not seen since the transition of the labor force out of agriculture in the early 1900s in the United States and Europe, and more recently in China.” The report estimated that “up to 375 million workers, or 14 percent of the global workforce, may need to change occupations - and virtually all workers may need to adapt to work alongside machines in new ways.”

Given these predictions about the changing nature of work, what should companies do? How should firms prepare for a brave new world where we can expect major economic dislocations along with the creation of new jobs, new business models and whole new industries, and where many people will be working alongside smart machines in whole new ways?

December 03, 2018

A few weeks ago, the World Economic Forum (WEF) released The Future of Jobs Report 2018. The report takes an in-depth look at the world of work in what the WEF calls the Fourth Industrial Revolution. The report is based on a survey of over 300 chief human resource officers and top strategy executives from large global companies across 12 industry sectors. These companies collectively represent more than 15 million employees, and conduct business in 20 developed and emerging economies which collectively account for about 70% of global GDP.

First introduced at its 2016 annual Davos Forum, the WEF positions the Fourth Industrial Revolution within the historical context of three previous industrial revolutions. The First, - in the last third of the 18th century, - ushered in the transition from hand-made goods to mechanized, machine-based production based on steam and water power. The Second, - a century later, - revolved around steel, railroads, cars, chemicals, petroleum, electricity, the telephone and radio, leading to the age of mass production. The Third, - starting in the 1960s, - saw the advent of digital technologies, computers, the IT industry, and the automation of processes in just about all industries.

“Now a Fourth Industrial Revolution is building on the Third…” wrote WEF founder and chairman Klaus Schwab in a December, 2015 Foreign Affairs article. “The possibilities of billions of people connected by mobile devices, with unprecedented processing power, storage capacity, and access to knowledge, are unlimited. And these possibilities will be multiplied by emerging technology breakthroughs in fields such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3-D printing, nanotechnology, biotechnology, materials science, energy storage, and quantum computing.”

The Future of Jobs 2018 examines the potential of these advanced technologies to eliminate existing jobs and create new ones, as well as to improve the current workplace and prepare people for emerging jobs. Rather than looking at the future of work over the longer term, the report chose to examine the 2018-2022 period, which is within the planning horizon of the companies and executives that participated in the survey.

November 26, 2018

In mid-September, the World Economic Forum (WEF) released Building Block(chain)s for a Better Planet, a report that examined how blockchain technologies could be harnessed to address serious environmental issues, better manage our shared global environment and help drive sustainable growth and value creation. The report outlines some of the world’s most-pressing environmental challenges and highlights eight blockchain-based game changers that could lead to transformative solutions to these pressing problems.

The WEF report includes a short blockchain tutorial. Blockchain is essentially “a new, decentralized and global computational infrastructure that could transform many existing processes in business, governance and society.” Blockchain technologies promise to significantly improve the efficiency and security of business transactions and data sharing among participants in a global ecosystem, such as supply chains, financial services, and similar complex applications involving multiple institutions.

But the hype surrounding blockchain can tempt solution designers to try to use it for applications that it’s not suitable for. The report recommends that designers first consider three key questions:

“The IIC believes that Inclusive Innovation is an economic and moral imperative, and that the key question of our era isn’t what technology is going to do to our economy and society, but what we will do with technology. By identifying and promoting the powerful global community of future of work visionaries, the IIC proactively accelerates the technology-driven solutions enabling greater economic opportunity for working people around the world facing the challenge of rapidly advancing digital progress.”

Entrepreneurs from around the globe compete for awards in four categories: income growth and job creation, skills development and opportunity matching, technology access, and financial inclusion. Approximately 1,500 organizations registered to compete this year, and 500 judges reviewed their applications.

In 2018, rather than having one global competition as in the previous two years, there were five separate regional competitions, - North America, Latin America, Europe, Africa, and Asia, - with each region selecting its own category winners. A Grand Prize winner was then selected in each category from among the regional winners, and each was awarded $250,000. The winners were announced at a gala event that took place at MIT on November 8.

November 12, 2018

A few weeks ago I wrote about the economic value of AI based on a recently published report by PwC. The report’s overriding finding was that AI technologies and applications will increase global GDP by up to 14% between now and 2030, the equivalent of an additional $15.7 trillion contribution to the world’s economy. According to PwC, AI is the biggest economic opportunity over the next 10-15 years.

The McKinsey report is based on simulation models of the impact of AI at the country, sector, company and worker levels. It looked at their adoption of five broad categories of AI technologies: computer vision, natural language, virtual assistants, robotic process automation, and advanced machine learning. Its data sources included survey data from approximately 3,000 firms in 14 different sectors; around 400 potential AI use cases across a variety of industries and functions; AI’s potential to automate and transform 800 existing occupations in 46 countries; and economic data from a number of organizations including the United Nations, the World Bank, the Organization for Economic Co-operation and Development (OECD), and the World Economic Forum.

November 05, 2018

Darwinian principles seem to apply in business almost as much as in biology. After analyzing the longevity of more than 30,000 public US firms over a 50-year span, Martin Reeves, Simon Levin, and Daichi Ueda noted in a 2016 HBR article, The Biology of Corporate Survival, that companies are disappearing faster than ever before. “Public companies have a one in three chance of being delisted in the next five years, whether because of bankruptcy, liquidation, M&A, or other causes. That’s six times the delisting rate of companies 40 years ago… Neither scale nor experience guards against an early demise.”

Biological systems have long been an inspiration in the study of complex systems. In their HBR article, the authors argued that companies are not just like biological species, but in some important respects they’re actually identical to biological species. Companies and biological systems are both what’s known as complex adaptive systems, that is, systems in which a perfect understanding of their individual components does not automatically lead to a perfect understanding of their overall system behavior.

Companies are an example of sociotechnical systems, that is, systems that have to deal with complex technologies and infrastructures, and the even more complex issues associated with human and organizational behaviors. Other examples include cities, government agencies, industries and economies. The dynamic nature of their technology and human components, as well as their intricate interrelationships, renders such systems increasingly unpredictable and accounts for their emergent behavior and unintended consequences.

October 29, 2018

In Reinvent Your Business Model, his recently published book, Mark Johnson argues that digital transformation and business model innovation are not the same thing. New technology alone, - no matter how transformative, - is not enough to propel a business into the future. The business model wrapped around the technology is the key to its success or failure. Johnson is a senior partner at Innosight, the strategy consulting firm he cofounded with Harvard Business School professor Clayton Christensen in 2000. He’s been conducting research on business model innovation for over a decade, and in 2010 published Seizing the White Space.

Business model innovation has long been the domain of disruptive startups looking to compete against established companies by changing the rules of the game, - and, hopefully, creating new markets and reshaping entire industries. But, it’s no longer enough for established companies to just roll out improved products and services based on their once-reliable business models.

“Building a great business and operating it well no longer guarantees you’ll be around in a hundred years, or even twenty,” notes Johnson. “In 1965, the average length of time a company remained on the S&P 500 was thirty-three years. By 1990, it had dropped to twenty years; in 2012, it was just eighteen. Based on the 2017 churn rate, it is forecast that half of the S&P 500 will be replaced over the next ten years.”

Examples abound. In the 1970s, Xerox PARC famously developed, - but didn’t commercialize, - some of the key innovations of the PC era, including the graphical user interface, the mouse and local area networks. In 2010, Blockbuster filed for bankruptcy, a victim of Netflix’s new business models. Done in by Apple’s iPhone and Android-based smartphones, the BlackBerry, - once the undisputed leader in its market, - fell into a death spiral from which it never recovered.

October 22, 2018

PwC recently released a report on the potential economic value of AI to different regions and industry sectors around the world. The report defined AI “as a collective term for computer systems that can sense their environment, think, learn, and take action in response to what they’re sensing and their objectives.” By this broad definition, AI includes the automation of physical and cognitive tasks; assisting people to perform tasks better and faster; helping humans make better decisions; and automating decision making with no human intervention.

The report’s overriding finding is that AI is the biggest commercial opportunity for companies, industries and nations over the next few decades. PwC estimates that AI advances will increase global GDP by up to 14% between now and 2030, the equivalent of an additional $15.7 trillion contribution to the world’s economy.

Around $6.6 trillion of the expected GDP growth will come from productivity gains, especially in the near term. These include the continued automation of routine tasks, and the development of increasingly sophisticated tools to augment human capabilities. Companies that are slow to adopt these AI-based productivity improvements will find themselves at a serious competitive disadvantage.

But, over time, increased consumer demand for AI-enhanced offerings will overtake productivity gains and result in an additional $9.1 trillion of GDP growth by 2030. Moreover, network effects, - that is, the more data and better insights companies are able to gather, the more appealing the products and services they’ll be able to develop, - will further increase consumer demand. AI front-runners will gain an enormous competitive advantage through their ability to leverage this rich supply of customer data to shape product developments and business models, making it harder for slower moving competitors to catch up.

“In the beginning computers were human. Then they took the shape of metal boxes, filling entire rooms before becoming ever smaller and more widespread. Now they are evaporating altogether and becoming accessible from anywhere…Now,… computing is taking on yet another new shape. It is becoming more centralised again as some of the activity moves into data centres. But more importantly, it is turning into what has come to be called a ‘cloud’, or collections of clouds. Computing power will become more and more disembodied and will be consumed where and when it is needed…It will allow digital technology to penetrate every nook and cranny of the economy and of society…”

Serverless is an approach to software development for quickly building and deploying applications as a collection of loosely coupled modules and microservices. According to Forrester, “Firms report a better software development experience, rapid scaling of services that compose applications, lower costs, and better infrastructure utilization when workloads are variable. They also spend less time maintaining cloud infrastructure.”
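
As a minimal sketch of what this looks like for a developer - written here in the style of an AWS Lambda Python handler, with the event fields as illustrative assumptions - the application code shrinks to a single function, while provisioning, scaling and metering of the underlying servers are left to the cloud platform:

    import json

    def lambda_handler(event, context):
        # The platform invokes this function on demand for each incoming request;
        # there are no servers for the developer to provision or patch.
        params = event.get("queryStringParameters") or {}
        name = params.get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"Hello, {name}"}),
        }

Many small functions like this one, each triggered by its own events, are then composed into the loosely coupled microservices Forrester describes.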

October 08, 2018

In 2015, the McKinsey Global Institute launched a multi-year study to explore the potential impact of automation technologies on jobs, organizations and the future of work. In the intervening three years, the study has published a number of reports on the subject.

The first report, published in November, 2015, explored whether we can look forward to vast improvements in productivity and quality of life, or whether automation will mostly threaten jobs, disrupt organizations, and strain the social fabric. Based on an analysis of around 2,000 work activities across 800 different occupations, the report concluded that “fewer than 5 percent of occupations can be entirely automated using current technology. However, about 60 percent of occupations could have 30 percent or more of their constituent activities automated.” In other words, relatively few jobs will be entirely automated, but automation will likely transform the vast majority of occupations.

McKinsey’s December, 2017 report analyzed whether there will be enough work in the future. The overall conclusion was that a growing technology-based economy will create a significant number of new occupations, - as has been the case in the past, - which will more than offset declines in occupations displaced by automation. However, “while there may be enough work to maintain full employment to 2030 under most scenarios, the transitions will be very challenging - matching or even exceeding the scale of shifts out of agriculture and manufacturing we have seen in the past.”

Let me now discuss the more recent report published earlier this year, which examined the changes in skills required of human workers over the next 10-15 years. To do so, the study analyzed how the total number of hours worked in 25 different skill areas changed between 2002 and 2016 and estimated the expected change in hours worked by 2030. This was done for the US and five Western European countries. I’ll focus my discussion on the US skill changes, as the European changes were similar.

October 01, 2018

A few weeks ago I wrote about social physics, a new discipline that aims to help us better understand and predict the behavior of human groups. Social physics is based on the premise that all event-data representing human activity, - e.g., phone call records, credit card purchases, taxi rides, web activity, - contain a special set of group behavior patterns. As long as the data involves human activity, - regardless of the type of data, the demographic of the users or the size of the data sets, - similar behavioral dynamics apply. These patterns can be used to detect emerging behavioral trends before they can be observed by other data analytics techniques.

Physics, biology and other natural sciences have long relied on universal patterns or principles to detect a faint signal within a large data set, - i.e., the proverbial needle in a haystack. It’s what has enabled the discovery of very short-lived elementary particles in physics, - like the Higgs boson in 2013, - amidst the huge amounts of data generated by high energy particle accelerators. In biology, it’s given rise to DNA sequencing and its growing list of applications in medicine, biotechnology and other disciplines.

It’s not surprising that evolutionary biology and natural selection have led to similar universal patterns in the behavior of human crowds. Humans and our ancestors have evolved with the drive to learn from each other because it’s been a major part of our survival over millions of years. And if a new behavior, - whether the result of an innovative idea like the discovery of tools, or a mutation like a larger brain size, - helps a human group better adapt to a changing environment, natural selection will favor the survival of that group over others.

Social physics originated in MIT’s Human Dynamics Lab based on research by professor Alex (Sandy) Pentland, his then postdoctoral associate Yaniv Altshuler and their various collaborators. In 2014, Pentland and Altshuler co-founded Endor, an Israel-based startup that leverages social physics methods to make fast, accurate predictions by analyzing data derived from human behavior.

September 24, 2018

Bitcoin was created about a decade ago, with the release of the original design paper, Bitcoin: A Peer-to-Peer Electronic Cash System, under the pen name of Satoshi Nakamoto. Bitcoin aimed to become a decentralized cryptocurrency and digital payment system that would enable the secure exchange of digital money without the need for central government authorities or financial intermediaries. It introduced the blockchain, - which most everyone agrees is a truly brilliant architecture built on decades-old fundamental research in cryptography, distributed data, distributed computing, game theory and other advanced technologies.

How are they doing after 10 years? The September 1 issue of The Economist includes a special report on cryptocurrencies and blockchains, with nine articles on the subject. The report’s overview article doesn’t mince words in articulating its overall assessment: “Bitcoin and other cryptocurrencies are useless. For blockchains, the jury is still out.”

“Bitcoin has been a failure as a means of payment, but thrilling for speculators,” it further added. Its price has been subject to wild fluctuations over the past two years. It rose from around $600 in September, 2016, to $6,000 in September, 2017, reaching a peak of over $19,000 in December, 2017, before falling to around $7,000 earlier this year. It’s been hovering between $6,500 and $7,500 for the past few months. Other cryptocurrencies have experienced similar price fluctuations. “That has made a few people very rich (just 100 accounts own 19% of all existing bitcoin), encouraged others to play for quick gains and left some nursing substantial losses.”

September 17, 2018

A shrinking supply of workers is one of the key reasons for our stagnant economic growth. Over the coming decades, the labor force is expected to shrink in most parts of the world as fertility rates continue to decline, especially in advanced and emerging economies. The US labor force is growing very slowly while fertility rates keep hitting record lows.

Our aging economies are thus dependent on productivity gains to drive long-term economic growth, future prosperity and higher standards of living. Which is why few topics have generated as much concern among economists and policy makers as the sharp decline in productivity growth in the US and other advanced economies over the past decade, - despite accelerating technology advances.

“Labor productivity growth is near historic lows in the United States and much of Western Europe,” noted Solving the Productivity Puzzle, a report published earlier this year by the McKinsey Global Institute (MGI). The report analyzed the productivity-growth declines across a sample of seven countries, - France, Germany, Italy, Spain, Sweden, the United Kingdom, and the United States, - which represent about 65% of the GDP of advanced economies. Their average productivity growth in the five years between 2000 and 2004 was 2.4%. A decade later, - in the five-year period between 2010 and 2014, - their average productivity growth had declined to 0.5%. While starting to pick up recently, productivity growth remains at or below 1 percent in many of the countries in the study, still quite low relative to historical levels.

September 10, 2018

Social physics first emerged over 200 years ago as an attempt to understand society and human behavior using laws similar to those of the physical sciences. But, it wasn’t until the past two decades that we finally had enough data, powerful computers and sophisticated mathematical algorithms to develop quantitative theories of human social interactions. We were now able to reliably predict how large groups of people make decisions by analyzing how information and ideas flow from person to person.

“The engine that drives social physics is big data: the newly ubiquitous digital data now available about all aspects of human life,” wrote MIT professor Alex (Sandy) Pentland in his 2014 book Social Physics: How Good Ideas Spread - The Lessons from a New Science. “Social physics functions by analyzing patterns of human experience and idea exchange within the digital bread crumbs we all leave behind as we move through the world, - call records, credit card transactions, and GPS location fixes, among others. These data tell the story of everyday life by recording what each of us has chosen to do… Who we actually are is more accurately determined by where we spend our time and which things we buy, not just by what we say we do.”

The book was the result of over a decade’s worth of research in the Human Dynamics group at MIT’s Media Lab. Pentland, along with his graduate students and research associates, partnered with a variety of companies to obtain and analyze real-world data, properly anonymized to protect the privacy of the individuals whose behavior was reflected in the data. They eventually discovered that all event-data representing human activity contain a special set of social activity patterns regardless of what the data is about. These patterns are common across all human activities and demographics, and can be used to detect emerging behavioral trends before they can be observed by any other technique.

September 03, 2018

Deep learning and related machine learning advances have played a central role in AI’s recent achievements, giving computers the ability to be trained by ingesting and analyzing large amounts of data instead of being explicitly programmed. In just the past two years, Google’s deep-learning-based AlphaGo defeated the world’s top Go players, surprising most AI experts who thought that it would take another 5 to 10 years to achieve such a milestone. Similarly, when Google switched to its new deep learning AI system in late 2016, it achieved an overnight improvement in the quality of its machine translations roughly equal to the total gains that the previous program had accrued over its 10 year lifetime.

As is typically the case with major technology achievements, - e.g., the dot-com bubble, - deep learning has quickly climbed to the top of Gartner’s hype cycle, where all the excitement and publicity accompanying new, promising technologies often leads to inflated expectations, followed by disillusionment if the technology fails to deliver. AI may be particularly prone to such hype cycles, as the notion of machines achieving or surpassing human levels of intelligence leads to feelings of wonder as well as fear. Over the past several decades, AI has gone through a few such hype cycles, including the so-called AI winter in the 1980s that nearly killed the field.

August 27, 2018

Steve Jobs, Bill Gates, and Mark Zuckerberg were in their early 20s; Sergey Brin and Larry Page were 25; and Jeff Bezos was just 30 when they founded their world leading high-tech companies. Their companies are truly in a class by themselves, even having their own, somewhat frightful acronym - FAMGA. Their founders’ ages are not only evidence that young people can be highly successful entrepreneurs, but they have also led to the belief that the young are uniquely capable of coming up with big, transformational ideas.

“Young people are just smarter,” Zuckerberg told the audience of a 2007 VC conference, adding that successful start-ups should only employ young people with technical expertise. In 2010, PayPal’s co-founder Peter Thiel created the Thiel Fellowships, a program that provides $100,000 over two years to would-be entrepreneurs 23 and under so they can drop out of school to work on their ideas. “The stereotype of an entrepreneur is a college drop-out,” said a recent Economist article.

“Silicon Valley has become one of the most ageist places in America,” noted a 2014 New Republic article on The Brutal Ageism of Tech, where years of experience are considered obsolete. “People under 35 are the people who make change happen,” said VC Vinod Khosla at a 2011 conference. “People over 45 basically die in terms of new ideas” because they “keep falling back on old habits.” Entrepreneur and VC Paul Graham said in a 2013 interview that investors tend to be biased against older founders. “The cutoff in investors’ heads is 32… after 32, they start to be a little skeptical.”

These various statements remind me of the 1960s counterculture saying don’t trust anyone over 30. But, are they justified? Are young entrepreneurs more likely to produce high-growth firms? Can middle-aged founders in their 40s be successful?

August 20, 2018

AI is rapidly becoming one of the most important technologies of our era. Every day we can read about the latest AI advances from startups and large companies, including applications that not long ago were viewed as the exclusive domain of humans. Over the past few years, the necessary ingredients have finally come together to propel AI beyond the research labs into the marketplace: powerful, inexpensive computer technologies; huge amounts of data; and advanced algorithms.

Machine learning and related algorithms like deep learning have played a major role in AI’s recent achievements. Machine learning gives computers the ability to learn by ingesting and analyzing large amounts of data instead of being explicitly programmed. It’s enabled the construction of AI algorithms that can be trained with lots and lots of sample inputs, which are subsequently applied to difficult AI problems like language translation, natural language processing, and playing championship-level Go.

Over the past several decades, AI has been besieged by rounds of hype which over-promised, under-delivered, and nearly killed the field. Once more, a recent Gartner article positioned machine and deep learning at the top of their hype cycle, - when all the excitement, publicity and promising potential often leads to a peak of inflated expectations, risking falling into the trough of disillusionment if the technology fails to deliver. Now that AI is finally reaching a tipping point of market acceptance, it’s particularly important to be cautious and not repeat past mistakes.

In Artificial Intelligence - The Revolution Hasn’t Happened Yet, UC Berkeley professor Michael I. Jordan aims to inject such a note of caution. “AI has now become the mantra of our current era…The idea that our era is somehow seeing the emergence of an intelligence in silicon that rivals our own entertains all of us - enthralling us and frightening us in equal measure. And, unfortunately, it distracts us…Whether or not we come to understand intelligence any time soon, we do have a major challenge on our hands in bringing together computers and humans in ways that enhance human life.”

August 13, 2018

I recently read Better Identity in America, a report by the Better Identity Coalition, an organization launched earlier this year to focus on promoting the development and adoption of solutions for identity verification and authentication. The report outlines a policy agenda for improving the privacy and security of digital systems and helping combat identity fraud.

How bad is the problem? Data breaches, large-scale fraud, and identity theft are becoming more common. In 2017, there were 16.7 million victims of identity fraud in the US, causing a loss of $16.8 billion; there was a 44.7% increase in US data breaches between 2016 and 2017 and a 30% rise in online shopping fraud; 179 million personal information records were exposed in data breaches, 69% of which were identity theft incidents.

The report notes that “the ability to offer high-value transactions and services online is being tested more than ever, due in large part to the challenges of proving identity online. The lack of an easy, secure, reliable way for entities to verify identities or attributes of people they are dealing with online creates friction in commerce, leads to increased fraud and theft, degrades privacy, and hinders the availability of many services online.”

As the report reminds us, the extent of this challenge was famously captured back in 1993 by Peter Steiner’s New Yorker cartoon with the caption On the Internet, nobody knows you’re a dog. Twenty-five years later, the cartoon still perfectly describes the identity challenge. If anything, the challenge is even more serious in 2018, given that the volume and variety of online transaction services are greater than ever before.

August 06, 2018

I’ve long believed that customer self-service was key to the success of e-business, - the IBM Internet strategy I was closely involved with in the mid-to-late 1990s. It was quite revolutionary how easy it was to now do for yourself so many ordinary activities that previously required a phone call during office hours or a trip to a store or office.

Moreover, such e-business applications were relatively easy to develop. By integrating their existing transaction and data base applications with a web front-end, - a strategy we succinctly described as Web + IT, - any business could now be in touch with its customers, employees, suppliers and partners at any time of the day or night, no matter where they were. Companies were able to engage in their core activities in a more productive way by web-enabling their back-end systems, that is, by linking web front-ends to their existing back-end applications.

I was reminded of IBM’s e-business strategy when recently reading about Robotic Process Automation (RPA), a technology for automating business processes based on emulating the manual actions of a human at a keyboard. RPA aims to improve the operational efficiency of office and service workers by automating tedious, repetitive tasks, such as those associated with widely used horizontal processes in HR, Finance & Accounting and IT services.

RPA enables process automation at a fraction of the cost and time of classic software development. Rather than automating a process by redesigning the overall back-end application, RPA interfaces with the back-end system by performing the same actions that a human does via the application’s user interface. RPA thus creates a kind of software robot, or bot for short, that works alongside the humans.
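
As a heavily simplified sketch of the idea - using the open-source pyautogui library, with hypothetical screen coordinates and field order; a commercial RPA tool would locate interface elements far more robustly and add error handling - a bot can replay the same keyboard-and-mouse steps a clerk would perform for every record in a spreadsheet:

    import csv
    import pyautogui   # drives the keyboard and mouse the way a human operator would

    def enter_invoice(row):
        pyautogui.click(200, 150)                        # hypothetical "New invoice" button
        pyautogui.write(row["vendor"], interval=0.05)    # type into the focused field
        pyautogui.press("tab")
        pyautogui.write(row["amount"], interval=0.05)
        pyautogui.press("enter")                         # submit, just as a person would

    with open("invoices.csv", newline="") as f:
        for row in csv.DictReader(f):    # repeat the tedious steps for every record
            enter_invoice(row)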

July 30, 2018

Blockchain Revolution, published in May of 2016 by Don Tapscott and Alex Tapscott, was one of the first books that explained the promise of blockchain technologies to the general public. Its central argument is that for nearly four decades, the Internet has been great for reducing the costs of searching, collaborating, and exchanging information. But, it has serious limitations for business and economic activity. “Doing business on the Internet requires a leap of faith,” the book noted. The Internet was designed to move information, but it’s lacked the necessary trust, security and privacy safeguards to move assets of value. With blockchain, we’re now seeing the emergence of such an Internet of value. “Now for the first time ever we have a native digital medium for value, through which we can manage, store, and transfer any asset.”

An updated edition of Blockchain Revolution was published this past June. A lot has happened with blockchain in the intervening two years. The updated edition explains some of these recent developments in a new preface, including cryptoassets, permissioned networks, identity and supply chains. I’ll focus my comments on the emergence of a blockchain-based token economy, driven by the explosive growth in the value and variety of cryptoassets.

When the original book was published, the entire cryptoasset market had a value of $9 billion, - mostly dominated by the value of Bitcoin at $7 billion, while Ether, the Ethereum cryptocurrency, had reached around $1 billion. The new edition estimates that as of March, 2018, the cryptoasset market was around $400 billion in size, a value subject to rapid fluctuations. In addition, a wide variety of cryptoassets have been developed over the past two years.

July 23, 2018

Artificial intelligence is rapidly becoming one of the most important technologies of our era. Every day we can read about the latest AI advances from startups and large companies. Over the past few years, the necessary ingredients have come together to take AI across the threshold: powerful, inexpensive computer technologies; huge amounts of data; and advanced algorithms, especially machine learning. Machine learning has enabled AI to get around one of its biggest obstacles, - the so-called Polanyi’s paradox.

Explicit knowledge is formal, codified, and can be readily explained to people and captured in a computer program.But, tacit knowledge, a concept first introduced in the 1950s by scientist and philosopher Michael Polanyi, is the kind of knowledge we’re often not aware we have, and is therefore difficult to transfer to another person, let alone capture in a computer program.

“We can know more than we can tell,” said Polanyi in what’s become known as Polanyi’s paradox. This common sense phrase succinctly captures the fact that we tacitly know a lot about the way the world works, yet aren’t able to explicitly describe this knowledge. Tacit knowledge is best transmitted through personal interactions and practical experiences. Everyday examples include speaking a language, riding a bike, and easily recognizing many different people, animals and objects.

Machine learning, and related advances like deep learning, have enabled computers to acquire tacit knowledge by being trained with lots and lots of sample inputs, thus learning by analyzing large amounts of data instead of being explicitly programmed. Machine learning methods are now being applied to vision, speech recognition, language translation, and other capabilities that not long ago seemed impossible but are now approaching or surpassing human levels of performance in a number of domains.
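As a simple illustration of learning from examples rather than from explicit rules, here is a short sketch using scikit-learn’s small built-in digits dataset. Nothing in the script spells out what any digit looks like; the model infers that from labeled samples. It’s a toy example under those assumptions, not a production vision system.

```python
# Learning from examples rather than explicit programming: no rule in this
# script says what an "8" looks like; the model infers it from labeled samples.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

digits = load_digits()                          # small 8x8 grayscale digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=2000)       # a simple, non-deep learner
model.fit(X_train, y_train)                     # "training" = analyzing the samples

predictions = model.predict(X_test)
print(f"accuracy on unseen digits: {accuracy_score(y_test, predictions):.2%}")
```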

July 16, 2018

In 2013, the Houston Astros finished the season with a 51-111 record, - the worst record in the history of the franchise, having also lost over 100 games in the previous two years. In 2017, the Astros won 101 games, finished first in the American League West Division, and went on to win their first ever World Series. How did the Astros go from worst team record to first World Series victory in only four years?

The answer, in a nutshell, is analytics. In December, 2011 the Astros hired Jeff Luhnow from the St. Louis Cardinals as their General Manager. As VP of scouting and player development in St. Louis, Luhnow built a strong analytics department and enjoyed great success as one of the top talent producers in baseball, helping the Cardinals win the 2011 World Series as well as reach the playoffs from 2012 to 2015. Luhnow then brought his analytics and management skills to Houston and proceeded to transform the team into a top contender in just a few years.

Today, just about every major sports team makes extensive use of sports analytics, but none more so than Major League Baseball (MLB) teams. Baseball has long been a game of statistics, collecting and analyzing vast amounts of data going back to the earliest professional leagues founded in the years following the Civil War.

Top MLB players are constantly compared not just to other contemporaneous players, but to other top players across the long history of the game, - a history that was very nicely depicted in Baseball, Ken Burns’ excellent documentary miniseries. Babe Ruth, Willie Mays, Mickey Mantle, Hank Aaron and countless other players are never forgotten in the world of baseball statistics. For those of us who are fans, - as I am, not surprisingly given my Cuban roots, - baseball’s rich history is a major part of our enjoyment of the game.

July 09, 2018

Given the pace of technological change, we tend to think of our age as the most innovative ever. But over the past several years, a number of economists have argued that increasing R&D efforts are yielding decreasing returns.

“Has the ideas machine broken down?” asked The Economist in a January, 2012 article that examined the growing concerns that we may be in a long-term period of slow innovation despite our rapidly advancing technologies. “With the pace of technological change making heads spin, we tend to think of our age as the most innovative ever,” said The Economist. “We have smartphones and supercomputers, big data and nanotechnologies, gene therapy and stem-cell transplants. Governments, universities and firms together spend around $1.4 trillion a year on R&D, more than ever before.” But, perhaps these don’t quite compare with modern sanitation, electricity, cars, planes, the telephone, and antibiotics. These innovations, first developed in the late 19th and early 20th centuries, have long been transforming the lives of billions.

In a September, 2012 paper, Northwestern University economist Robert Gordon questioned the generally accepted assumption that economic growth is a continuous process that will persist forever. He wrote that the slow growth we’ve been experiencing in the US and other advanced economies isn’t cyclical, but rather evidence that long-term economic growth may be grinding to a halt.

The rapid growth and rising per-capita incomes we experienced at the height of the Industrial Revolution, - between 1870 and 1970, - may have been a unique episode in human history. Since the 1970s, US productivity and income growth have dipped sharply except for an Internet-driven productivity boost between 1996 and 2004. Innovation may be hitting a wall of diminishing returns. There was little growth before 1800, and there might conceivably be little growth in the future.

A similar pessimistic view was expressed by George Mason University economist Tyler Cowen in his 2011 book The Great Stagnation. According to Cowen, over the past two centuries the US economy has enjoyed lots of low-hanging fruit, including a vast, resource rich land, waves of immigrant labor, access to education and the technological advances of the Industrial Revolution. But, Cowen believes that we are at a technological plateau, and wonders whether long term growth is still possible because the supply of low-hanging economic fruit is nearly exhausted.

July 03, 2018

After decades of promise and hype, AI is now seemingly everywhere. Over the past several years, the necessary ingredients have come together to propel AI beyond the research labs into the marketplace: powerful, inexpensive computer technologies; huge amounts of data; and advanced algorithms including machine learning.

While AI is likely to become one of the most important technologies of our era, we’re still in the early stages of deployment, especially outside leading-edge technology companies. But, as AI continues its rapid progress, it’s not too early to ask a few important questions: What is AI’s overall value to the economy? What are the biggest application opportunities? And what are AI’s most serious challenges and limitations?

To address these questions, McKinsey recently published a discussion paper on the marketplace potential of AI. The paper is particularly focused on machine learning and related technologies, and is based on a detailed analysis of more than 400 use cases across 19 industries and 9 business functions.

“Two-thirds of the opportunities to use AI are in improving the performance of existing analytical use cases,” is the paper’s overriding finding. This is a very interesting insight. AI is now being successfully applied to tasks that not long ago were viewed as the exclusive domain of humans, - machine translation, natural language processing, defeating the world’s top Go players, - but only 16% of the use cases studied by McKinsey are greenfield cases, where only machine learning techniques can be used. For the vast majority of the use cases, 69%, the key value of machine learning is to improve performance beyond that provided by traditional analytic techniques. And, in the remaining 15% of cases, machine learning provided limited additional performance over existing analytical methods.

Their comprehensive report, - over 150 pages and 276 citations, - explored in great depth the changing nature of work as global competition and advanced technologies like AI, robotics, and smart IoT devices transform jobs and careers. It states right up front that “The most important challenge facing the United States - given the seismic forces of innovation, automation, and globalization that are changing the nature of work - is to create better pathways for all Americans to adapt and thrive. The country’s future as a stable, strong nation willing and able to devote the necessary resources and attention to meeting international challenges depends on rebuilding the links among work, opportunity, and economic security.”

For much of the 20th century, technology and globalization helped lead America to its leadership position on the world stage. Underpinning this US leadership was the American Dream, - the promise that anyone can get ahead and achieve success and prosperity through talent and hard work. But this promise has been eroding for decades. For many of its citizens, the American Dream has been steadily disappearing.

Technology advances have led to massive innovations and efficiencies, enabling companies to get the same work done with significantly fewer people. At the same time, many US-headquartered companies are truly global, doing business all over the world. Beyond outsourcing jobs to countries with lower labor costs, they’ve been investing and adding jobs where the demand is highest, - which has often been in faster growing emerging markets.

The US thus faces two major challenges: “creating new work opportunities, better career paths, and higher incomes for its people, while developing a workforce that will ensure U.S. competitive success in a global economy that will continue to be reshaped by technology and trade.”

June 18, 2018

Earlier this year, the Economist Intelligence Unit (EIU) published the Inclusive Internet Index 2018. Now in its second year, the Index analyzes and compares the current state of the Internet across 86 countries, ranging from advanced to developing economies and covering 91% of the world’s population. Its key objective is to help policymakers understand the key factors necessary for the wide, inclusive use of the Internet in their respective countries.

Countries are evaluated based on 54 indicators organized around four major categories:

Availability, which captures the quality and breadth of the infrastructure available for Internet access, including network availability, access points for landline and mobile connections, and the basic electricity infrastructure needed to support Internet connectivity in urban and rural areas.

Affordability, which measures the cost of Internet access relative to income, the competitive environment for wireless and broadband operators and the measures taken to decrease costs and promote access.

Relevance, which looks at the availability of Internet content in the local language(s) and the value of being connected to get access to relevant services like news, entertainment, health advice and business and financial information.

Readiness, which examines the capacity to take advantage of accessing the Internet, including level of literacy, educational attainment, cultural acceptance, privacy and security, and trust in the sources of online information.

Each country is assigned a score for each of the four categories, as well as an overall index score.
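The EIU does not spell out its weighting scheme here, so the sketch below simply assumes equal weights and made-up category scores; its only purpose is to illustrate how four category scores might roll up into a single overall index score.

```python
# Illustrative roll-up of category scores into an overall index score.
# The category names follow the EIU's four pillars; the equal weights and
# the sample scores are assumptions for illustration only.
CATEGORY_WEIGHTS = {
    "availability": 0.25,
    "affordability": 0.25,
    "relevance": 0.25,
    "readiness": 0.25,
}

def overall_index(category_scores: dict[str, float]) -> float:
    """Weighted average of the four category scores (each on a 0-100 scale)."""
    return sum(
        CATEGORY_WEIGHTS[name] * score for name, score in category_scores.items()
    )

# Hypothetical country: strong infrastructure, weaker affordability.
example = {"availability": 82.0, "affordability": 61.0,
           "relevance": 74.0, "readiness": 70.0}
print(f"overall index score: {overall_index(example):.1f}")
```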

June 11, 2018

For the past couple of centuries, general-purpose technologies (GPTs) have been the key drivers of productivity and economic growth, - thanks to their rapid improvements in price and performance, pervasive applications, and ability to spur complementary innovations across a wide variety of industries. The steam engine, electricity and the internal combustion engine are prominent examples of GPTs in the 18th and 19th centuries. More recently, semiconductors, computers and the Internet have led to the digital revolution of the past several decades.

Beyond innovations in existing sectors, the rapidly improving price/performance of GPTs has led over time to the creation of whole new applications and industries. For example, the steady declines in the price of electricity-generated power and the improvement in the efficiency of electric motors led to the radical transformation of manufacturing in the early part of the 20th century with the advent of the assembly line. It also led to the creation of the consumer appliance industry, e.g., refrigerators, dishwashers, washing machines. Similarly, as the semiconductor industry took off, it led to the historical transition from the industrial economy of the past two centuries to our ongoing digital economy.

How about artificial intelligence? Beyond its use by leading edge technology companies, we’re still in the early stages of AI deployment. It’s only been in the last few years that major advances in machine learning have taken AI from the lab to early adopters in the marketplace. While considerable innovations and investments are required for its wider deployment, AI is likely to become one of the most important GPTs in the 21st century.

June 04, 2018

For over a decade, MIT professor Thomas Malone has been conducting pioneering research on the collective intelligence of groups. Malone is the founding director of the MIT Center for Collective Intelligence, and has authored and co-authored a number of seminal articles on the subject. A few weeks ago, Malone published Superminds: The Surprising Power of People and Computers Thinking Together, - a book that nicely explains the concept of collective intelligence and how our increasingly capable machines are now complementing the intelligence of groups of people. The book illustrates these concepts with a wide variety of examples and case studies.

He defines a supermind as “a group of individuals acting in ways that seem intelligent.” A supermind has measurable properties just as individuals do. It can have a life of its own beyond those of its individual constituents. And, as Malone points out, throughout history, groups of people working together as superminds have been responsible for the vast majority of human achievements.

May 28, 2018

Autonomous vehicles (AVs) may well be the quintessential symbol of our AI/robotics age. Cars are a major part of our daily lives. A self-driving car is a concept that requires little explanation, something we can all quickly grasp. It wasn’t that long ago that the notion of an AV driving us around while we read or sleep would have felt like the stuff of science fiction. Having experimental AVs coursing through public roads in Silicon Valley, Pittsburgh and Phoenix is concrete evidence that our smart machines are achieving human-like intelligence, raising a number of important questions: how long before AVs are all around us?; how will they impact our lives?; what unintended consequences might we have to deal with?; and what should be done to ensure that they arrive as safely and smoothly as possible?

These are among the questions addressed in the March 1 issue of The Economist, which includes a comprehensive special report on autonomous vehicles with seven articles on the subject. The special report starts out with the assumption that whatever technological hurdles lie in their way will eventually be overcome. But there are wider economic, social and public policy issues to be explored, starting with: what can we learn from the transition to horseless carriages in the 20th century that can now be applied to the transition to driverless cars?

May 21, 2018

Will there be enough work in the future? What’s the likely impact of our continuing technology advances on jobs? How will they impact productivity? These are all-important questions to reflect on as our increasingly smart technologies are now being applied to activities that not long ago were viewed as the exclusive domain of humans.

While no one really knows the answer to these questions, most studies of the subject have concluded that relatively few occupations, 10% or less, will be entirely automated and disappear over the next 10-15 years. Instead, a growing percentage of occupations will significantly change as technologies automate the more routine tasks within those occupations. People will still be involved, but their jobs will be transformed by the advanced tools they now have to master. Moreover, a growing technology-based economy will likely create all kinds of new occupations, which will more than offset declines in occupations displaced by automation, - as has been the case over the past couple of centuries.

One would also expect that technology advances will increase the productivity of the workers involved in these new or transformed occupations. But, if we look at the past 10-15 years for guidance, we’ll find that productivity growth has significantly declined over this timeframe, notwithstanding huge technology advances like smartphones, cloud computing, big data and artificial intelligence. Economists have proposed competing explanations for the declining productivity growth, but have so far failed to reach consensus. Understanding this productivity puzzle may well hold the key to future productivity improvements and long-term economic growth.

May 14, 2018

Looking back upon my long career, one of the major factors shaping my views of strategy and innovation is the powerful transformational forces that I saw buffeting the IT industry over much of that time. It’s frankly sobering how many once powerful IT companies are no longer around or are shadows of their former selves. The carnage might be more pronounced in the fast-changing IT industry, but no industry has been immune. It’s all part of what Joseph Schumpeter called creative destruction over 75 years ago, - “the process of industrial mutation that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one.”

For a startup, a transformative innovation is all upside, an opportunity to take on established companies with new products that offer significantly better capabilities and/or lower costs. Startups hope that their compelling new offerings will help them establish a foothold in the marketplace and, over time, become leaders in their industry.

But change is often difficult for established companies. Over the years, they’ve amassed a number of valuable assets and extensive organizations. Already consumed with managing their existing operations, - e.g., products and services; supply chain and channel partners; sales and marketing; revenues, profits and cash, - they may see a transformative innovation as more of a threat or distraction than an opportunity.

May 07, 2018

I recently participated in a blockchain panel at a conference of senior supply chain executives from a variety of companies. I welcomed the opportunity to learn how these real-world executives feel about blockchain, as well as how to best communicate the promise and current state of this emerging and potentially transformative technology. Let me share a few of my observations.

Before our panel started, the executives were asked how they viewed blockchain. In general, they felt that there was too much hype surrounding blockchain and were skeptical about its value to supply chain systems, but expressed a wait-and-see attitude. They were then asked how many were conducting blockchain prototypes in their companies. The majority raised their hands.

I thought that their views, and their subsequent questions and comments during the panel, succinctly captured the current state of blockchain in the business world: skeptical, turned off by the hype, but hedging their bets by getting on the learning curve with prototypes and experimentation.

The moderator first asked each panelist to address the audience’s skepticism about blockchain. What, in our opinion, is the key value of blockchain? I said that I look at the value of blockchain through two complementary lenses: enhancing the security of the Internet, and reducing the inefficiencies and overheads in applications, like supply chains, involving multiple institutions.

April 30, 2018

In March of 2017, Northeastern University launched a new interdisciplinary initiative, the Global Resilience Institute (GRI). GRI is a university-wide research and educational effort to advance the resilience of individuals, communities, economies and societies around the world by strengthening their capacity to adapt to an increasingly turbulent world. Its key mission is to develop and deploy practical tools, applications and skills to help effectively respond to human-made and naturally-occurring disruptions and disasters.

GRI’s Founding Director is Northeastern professor Stephen Flynn.In his Director’s Welcome web page, Flynn writes that “Building resilience within and across multiple levels, from individual to societal, requires a comprehensive effort that matches the complexity of the increasingly interdependent systems and networks we all rely on… Our aim is to serve as both a channel and a catalyst for experts in industry, academia, and government to collaborate on solving the world’s most pressing resilience challenges. These include both slowly emerging disruptions as well as shocks and sudden disasters.”

Most everyone agrees that dealing with our increasingly turbulent, complex world requires significant enhancements to the resilience of our societies and systems. But, five critical barriers stand in the way:

Widespread risk illiteracy and a limited understanding of the new dependencies and interdependencies that pervade our more connected lives;

Inadequate designs for embedding resilience into systems, networks, and infrastructure at multiple levels;

April 23, 2018

Identity plays a major role in everyday life. Think about going to an office, getting on a plane, logging in to a website or making an online purchase. Identity is the key that determines the particular transactions in which we can rightfully participate as well as the information we’re entitled to access. But, we generally don’t pay much attention to the management of our identity credentials unless something goes seriously wrong.

For much of history, our identity systems have been based on face-to-face interactions and on physical documents and processes. But, the transition to a digital economy requires radically different identity systems. In a world that’s increasingly governed by digital transactions and data, our existing methods for managing security and privacy are proving inadequate. Data breaches, large-scale fraud, and identity theft are becoming more common. In addition, a significant portion of the world’s population lacks the credentials needed to participate in the digital economy. Our existing methods for managing digital identities are far from adequate.

As explained in A Blueprint for Digital Identity, - a 2016 report by the World Economic Forum, - identity is essentially a collection of information or attributes associated with a specific individual. These attributes fall into three main categories: inherent - attributes intrinsic to an individual, - e.g., age, height, date of birth, fingerprints, color of eyes, retinal scans; assigned - attributes attached to but not intrinsic to the individual, - e.g., e-mail address, telephone numbers, Social Security number, driver’s license, passport number; and accumulated - attributes gathered or developed over time, - e.g., health records, job history, home addresses, schools attended.

While mostly associated with individuals, identities can also be assigned to legal entities like corporations, partnerships and trusts; to physical entities like cars, buildings, smartphones and IoT devices; and to digital entities like patents, software programs and data sets.
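To make the three attribute categories a bit more concrete, here is a small, hypothetical Python sketch. The field names, sample values and the way an identity is modeled are my own illustrative choices, not the World Economic Forum’s schema.

```python
# Illustrative model of the report's three attribute categories. The field
# names and sample values are examples only, not a standard identity schema.
from dataclasses import dataclass, field
from enum import Enum

class AttributeKind(Enum):
    INHERENT = "inherent"        # intrinsic to the individual (e.g., date of birth)
    ASSIGNED = "assigned"        # attached but not intrinsic (e.g., passport number)
    ACCUMULATED = "accumulated"  # gathered over time (e.g., job history)

@dataclass
class Attribute:
    name: str
    value: str
    kind: AttributeKind

@dataclass
class Identity:
    subject: str                              # person, company, device, dataset, ...
    attributes: list[Attribute] = field(default_factory=list)

    def of_kind(self, kind: AttributeKind) -> list[Attribute]:
        """Return just the attributes in one of the three categories."""
        return [a for a in self.attributes if a.kind == kind]

alice = Identity("Alice", [
    Attribute("date_of_birth", "1990-04-12", AttributeKind.INHERENT),
    Attribute("passport_number", "X1234567", AttributeKind.ASSIGNED),
    Attribute("employer_history", "Acme Corp 2015-2020", AttributeKind.ACCUMULATED),
])
print([a.name for a in alice.of_kind(AttributeKind.ASSIGNED)])
```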

Data attributes are generally siloed within different private and public sector institutions, each using its data for its own purposes. But to reach a higher level of privacy and security, we need to establish trusted data ecosystems, which requires the exchange and sharing of data across a variety of institutions. The more data sources a trusted ecosystem has access to, the higher the probability of detecting fraud and identity theft while reducing false positives. In addition, an ecosystem with a variety of data sources can help foster economic inclusiveness by certifying the identities and creditworthiness of poor people with no banking affiliation.

However, safeguarding the data used to validate identities creates security and privacy issues of its own. It’s unsafe to gather all the needed attributes within one institution or central data location, making it a target for data breaches. But it’s also highly impractical, as few institutions will let their critical data leave their premises.

April 16, 2018

About five years ago, McKinsey conducted an online survey of over 850 CEOs and other senior executives to find out how their companies were faring in their implementation of digital technologies and strategies. The data showed that executives were generally optimistic, but they still had much to do to achieve their digital business objectives. Organizational alignment and leadership were the critical factors in the success or failure of their digital strategies.

More specifically, the survey asked them about their progress in embracing five major digital trends: big data and advanced analytics, digital engagement of customers, digital engagement of employees and external partners, automation, and digital innovation. While all five trends were important, executives said that digital customer engagement was their most important competitive differentiator and where they were expecting the largest financial returns.

This is not surprising, even more so now than at the time of the survey. A company can differentiate itself from competitors in one of two key ways: by providing a superior customer experience or by offering the lowest prices. For companies that prefer the former, digital channels are, far and away, the most cost-effective way of reaching their clients. The explosive growth of mobile devices means that you can be engaged with your customers whether they are at home, at work, in your store or anywhere else.

But, digital customers can be fickle and hard to satisfy. It’s quite difficult to keep up with their fast-changing behaviors and expectations. New products and services are hitting the market faster than ever, brand loyalty keeps decreasing, and the increased competition continues to shift power from institutions to individuals.

The book reminds us that technology has long been transforming business. Electricity and standardized processes ushered in the first wave of business transformations about 100 years ago. Manufacturing was decomposed into a set of processes and reassembled into the assembly line, leading to the age of mass production.

Several decades later, IT and process automation brought us the second wave of business transformation. Mainframes and data processing software enabled the automation of back-office processes in the 1960s and 1970s. Front-office automation followed in the 1980s with the advent of PCs and client-server systems. A decade later, advances in networking and ERP software led to business process reengineering and management.

“Now, the third wave involves adaptive processes,” write Daugherty and Wilson. “This third era, which builds on the previous two, will be more dramatic than the earlier revolutions enabled by assembly lines and digital computers, and will usher in entirely new, innovative ways of doing business… This adaptive capability is being driven by real-time data rather than by an a priori sequence of steps. The paradox is that although these processes are not standardized or routine, they can repeatedly deliver better outcomes.”