A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

December 05, 2016

A few weeks ago I discussed whether AI is finally reaching a tipping point, mostly based on a recently published report, - Artificial Intelligence and Life in 2030. The report was developed by a study panel of AI experts convened by the One Hundred Year Study of AI (AI100), an initiative launched at Stanford University in December, 2014 “to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live and play.” To better understand the future impact of AI on everyday lives, the panel focused the study on the likely influence of AI on a typical North American city by the year 2030.

The report is organized into three main sections. Section I, - What is Artificial Intelligence?, - describes how researchers and practitioners define AI, as well as the key research trends that will likely influence AI’s future. Section II looks into AI’s overall impact on various sectors of the economy, while the third section examines AI issues related to public policy.

My previous discussion was primarily focused on Section I. I’d now like to turn my attention to Section II, - AI by Domain. To help analyze where AI might be heading, the study panel narrowed its explorations to the eight domains most likely to be impacted by AI:

November 28, 2016

A few weeks ago I first learned about a relatively new concept - Digital Twin. A Digital Twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city. It’s a highly realistic, one-to-one digital model of each such specific physical entity.

Digital Twin helps bring the physical and digital worlds closer to each other. It’s intertwined with and complementary to the Internet of Things (IoT). The huge amounts of data now collected by IoT sensors on physical objects, personal devices and smart systems make it possible to represent their near real-time status in their Digital Twin alter-ego.
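To make the idea more concrete, here is a minimal sketch, - in Python, with hypothetical names and data, - of how a twin might mirror the sensor stream of its physical counterpart:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, Optional

@dataclass
class SensorReading:
    sensor: str        # e.g., "exhaust_temp_C" (hypothetical sensor name)
    value: float
    timestamp: datetime

@dataclass
class DigitalTwin:
    """A one-to-one digital mirror of a physical asset, kept in near
    real-time sync by the stream of IoT sensor readings it ingests."""
    asset_id: str
    state: Dict[str, float] = field(default_factory=dict)
    last_updated: Optional[datetime] = None

    def ingest(self, reading: SensorReading) -> None:
        # Each reading updates the twin's view of its physical counterpart.
        self.state[reading.sensor] = reading.value
        self.last_updated = reading.timestamp

# Mirror one sensor of a (hypothetical) jet engine into its twin.
twin = DigitalTwin(asset_id="engine-0042")
twin.ingest(SensorReading("exhaust_temp_C", 612.5, datetime.now(timezone.utc)))
print(twin.state)  # {'exhaust_temp_C': 612.5}
```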

“The myriad possibilities that arise from the ability to monitor and control things in the physical world electronically have inspired a surge of innovation and enthusiasm,” said a 2015 McKinsey report on the Internet of Things. Experts estimate that the number of connected things or devices will reach 50 billion by 2020, growing to hundreds of billions in the decades ahead. The economic potential of the smart solutions this makes possible is enormous, possibly reaching several trillion dollars within a decade.

“The rise of the giants is a reversal of recent history… In the 1980s and 1990s management gurus pointed to the demise of size as big companies seemed to be giving way to a much more entrepreneurial economy. Giants such as AT&T were broken up and state-owned firms were privatised. High-tech companies emerged from nowhere. Peter Drucker, a veteran management thinker, announced that ‘the Fortune 500 [list of the biggest American companies] is over.’ That chimed with the ideas of Ronald Coase, an academic who had argued in ‘The Nature of the Firm’ (1937) that companies make sense only when they can provide the services concerned more cheaply than the market can.”

Professor Coase’s views on the firm changed quite a bit over the years. In 1937 he published The Nature of the Firm, a seminal article which along with other major contributions earned him the 1991 Nobel Prize in economics. In the article, Professor Coase provided a simple answer to the question: Why do firms exist? He explained that, in principle, a firm should be able to find the cheapest, most productive goods and services by contracting them out in an efficient, open marketplace. However, markets are not perfectly fluid. Transaction costs are incurred in obtaining goods and services outside the firm, such as searching for the right people, negotiating a contract, coordinating the work, managing intellectual property and so on. Thus, firms came into being to make it easier and less costly to get work done.

November 14, 2016

I recently participated in a Treasury Identity Forum organized by the US Treasury Department in Washington, DC. The Forum focused “on the critical role of legal identity for financial inclusion, economic development, and anti-money laundering/counter financing of terrorism (AML/CFT) safeguards, and the development of new technology identification/authentication solutions to help achieve these goals.” It brought together stakeholders from governments, financial service companies, FinTech startups and technologists to better understand how emerging technologies and legal frameworks can help us develop the required digital identity systems.

I was a member of a panel on how government, business and research communities can collaborate in developing workable identity solutions. Let me summarize the points I made in my introductory remarks.

From time immemorial, our identity systems have been based on face-to-face interactions and on physical documents and processes. But, the transition to a digital economy requires radically different identity systems. As the economy and society move toward a world where interactions are primarily governed by digital data and transactions, our existing methods of managing identity and data security are proving inadequate. Large-scale fraud, identity theft and data breaches are becoming common, and a large fraction of the world’s population lacks the credentials needed to be part of the digital economy.

November 08, 2016

The September 17 issue of The Economist included a special report on the rise of so-called superstar companies. “Disruption may be the buzzword in boardrooms, but the most striking feature of business today is not the overturning of the established order,” notes The Economist. “It is the entrenchment of a group of superstar companies at the heart of the global economy.”

This trend toward consolidation and growing size is evidenced by a few worldwide statistics: 10% of public companies generate 80% of all profits; firms with over $1 billion in annual revenues are responsible for 60% of total global revenues; and the rate of mergers and acquisitions is more than twice what it was in the 1990s. The trend is particularly prominent in the US: in the 20 years from 1994 to 2013, the share of GDP generated by the 100 biggest companies rose from 33% to 46%; the five largest banks now account for 45% of all banking assets, up from 25% in 2000; and new firm formation has been going down since the late 1970s, leading to an overall decline in young- and medium-aged companies over the years.

The current rise of large companies is somewhat unexpected. Following the Great Depression and WW2, the country welcomed the stability promised by corporate capitalism. Big, multinational companies dominated most industries, - from GM, Ford and Chrysler in cars to Esso/Exxon, Mobil and Texaco in oil and gas. It was an era characterized by bureaucratic corporate cultures, focused on organizational power and orderly prosperity.

This all started to change a few decades later with the advent of a more innovative, fast moving entrepreneurial economy. The 1980s saw the rise of young, high-tech companies, - e.g., Microsoft, Apple, Oracle, Sun Microsystems; telecommunications was deregulated; AT&T was broken up; and Silicon Valley became the global hub for innovation, emulated by regions around the world.

October 31, 2016

Design thinking has become an increasingly popular topic of discussion over the past decade. It was featured in the September, 2015 issue of the Harvard Business Review with several articles on the subject. Design is no longer just for physical objects, - e.g., cars, bridges, shoes, jewelry, smartphones. Design thinking is now being applied to abstract entities, - e.g., systems, services, information and organizations, - as well as to devise strategies, manage change and solve complex problems.

The application of design thinking beyond products isn’t new. Nobel laureate Herbert Simon discussed the concept in his 1969 classic The Sciences of the Artificial. IDEO, a firm best known for pioneering this expanded view of design, traces its roots back to 1978. The School of Design in London’s Royal College of Art has long been expanding the boundaries of industrial design. Stanford’s Institute of Design, - better known as the d.school, - was launched in 2004 as a graduate program that integrates business, the social sciences, the humanities and other disciplines into more traditional engineering and product design.

The d.school’s website nicely explains its design-thinking point of view: “Students and faculty in engineering, medicine, business, law, the humanities, sciences, and education find their way here to take on the world’s messy problems together. Human values are at the heart of our collaborative approach… Along the way, our students develop a process for producing creative solutions to even the most complex challenges they tackle… Our deliberate mash-up of industry, academia and the big world beyond campus is a key to our continuing evolution.”

October 24, 2016

After many years of promise and hype, AI seems to be finally reaching a tipping point of market acceptance. “Artificial intelligence is suddenly everywhere… it is proliferating like mad.” So starts a Vanity Fair article published around two years ago by author and radio host Kurt Andersen. And, this past June, a panel of global experts convened by the World Economic Forum (WEF) named Artificial Intelligence, - Open AI Ecosystems in particular, - as one of its Top Ten Emerging Technologies for 2016 because of its potential to fundamentally change the way markets, business and governments work.

AI is now being applied to activities that not long ago were viewed as the exclusive domain of humans. “We’re now accustomed to having conversations with computers: to refill a prescription, make a cable-TV-service appointment, cancel an airline reservation - or, when driving, to silently obey the instructions of the voice from the G.P.S,” wrote Andersen. The WEF report noted that “over the past several years, several pieces of emerging technology have linked together in ways that make it easier to build far more powerful, human-like digital assistants.”

What will life be like in such an AI-based society? What impact is it likely to have on jobs, companies and industries? How might it change our everyday lives?

These questions were addressed in Artificial Intelligence and Life in 2030, a report that was recently published by Stanford University’s One Hundred Year Study of AI (AI100). AI100 was launched in December, 2014 “to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live and play.” The core activity of AI100 is to convene a Study Panel every five years to assess the then-current state of the field, review AI’s progress in the years preceding the report, and explore the potential advances that lie ahead as well as the technical and societal challenges and opportunities these advances might raise.

October 18, 2016

I recently read an article by strategy consultant and writer Ken Favaro which nicely explained how to best think about strategy in today’s business environment. “Many business leaders subscribe to the classic definition of strategy as a set of actions designed to achieve an overall aim,” wrote Favaro in The Trouble with Putting Goals Ahead of Strategy, published in strategy+business in 2015. “In other words, they believe strategy starts with a goal. But for companies that have implemented winning strategies, that’s not how it typically happens.”

Goals and strategies serve very different purposes. Goals, visions and missions are important to paint an exciting picture of the future around which everyone can rally, as well as to help set the general direction of a company. But being too high level, goals by themselves don’t give you much guidance on how to get things done and what key decisions must be made and prioritized. “[G]oals tell you very little about the fundamental choices you should make around creating customer and company value. Such choices are the very essence of your strategy.”

Most winning strategies start with an idea for an innovative new product, service or business model, followed by a plan to bring the idea to market. Only then should come a big, bold goal, as a way “to crystalize an ambition, motivate the troops, and excite investors. Unfortunately, strategic planning in most companies gets this sequence exactly reversed - and when that happens, bad strategies result.”

I totally agree with Favaro, based on my personal experiences leading emerging technology initiatives at IBM, including parallel supercomputing, the Internet, and Linux. Let me share some of what I learned when working on IBM’s Internet strategy.

October 11, 2016

Why do firms exist? Ronald Coase, - the eminent British economist and University of Chicago professor, - addressed this question in The Nature of the Firm, - a seminal paper published in 1937 which along with other major achievements earned him the 1991 Nobel Prize in economics.

Professor Coase explained that, in principle, a firm should be able to find the cheapest, most productive, highest quality goods and services by contracting them out in an efficient, open marketplace. However, markets are not perfectly fluid. Transaction costs are a kind of friction incurred in obtaining goods and services outside the firm, such as searching for the right supply chain partners, establishing a trusted relationship, negotiating a contract, coordinating the work, managing intellectual property and so on. Firms came into being to make it easier and less costly to get work done.
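Coase’s logic reduces to a simple decision rule about the boundaries of the firm. Here is a hedged sketch, - the numbers are purely illustrative:

```python
def keep_in_house(in_house_cost: float,
                  market_price: float,
                  transaction_cost: float) -> bool:
    """Coase's boundary-of-the-firm rule, reduced to arithmetic: do the
    work inside the firm only when that is cheaper than buying it on the
    market once the friction of search, negotiation, coordination and
    other transaction costs is included."""
    return in_house_cost < market_price + transaction_cost

# The market quote looks cheaper (90 vs 100), but 25 of transaction-cost
# friction tips the decision toward keeping the work inside the firm.
print(keep_in_house(in_house_cost=100, market_price=90, transaction_cost=25))  # True
```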

A recent IBM report, - Fast forward: Rethinking enterprises, ecosystems and economies with blockchains, - harks back to Coase’s paper to analyze the potential value of blockchains. The report notes that while transaction costs are lower within firms, “in recent years as enterprises have scaled, the added complexity of operations has grown exponentially while revenue growth has remained linear. The result? At a certain point, organizations are faced with diminishing returns. Blockchains have the potential to eradicate the cost of complexity and ultimately redefine the traditional boundaries of an organization.”

October 04, 2016

Last year, IBM’s Institute for Business Value conducted a C-suite study aimed at identifying the key disruptive trends that will likely impact companies around the world over the next three to five years, as well as what their senior executives are doing to better prepare their organizations for the expected disruptions. The study surveyed over 5,200 CEOs, CFOs, CIOs, CMOs and other C-suite executives across 21 industries in over 70 countries, - most of them in face-to-face interviews. After analyzing the survey data, - including the use of Watson Analytics to extract inferences from open-ended responses, - IBM published its findings in Redefining Boundaries: Insights from the Global C-suite Study.

To better understand the traits of the most successful enterprises, the IBM study also asked CxOs to rank their companies based on their innovation reputation and their financial performance over the previous three years. After analyzing their responses, it identified 5% of companies as Torchbearers, the name given to those companies enjoying both a strong innovation reputation and an excellent financial track record. At the other end of the spectrum, 34% of enterprises were identified as Market Followers or laggards in both innovation and financial performance.

The report organized its findings and recommendations into three main areas: Preparing for digital invaders; Creating a panoramic perspective; and Be first, be best or be nowhere. Let me briefly discuss each.

“Distributed ledger technology (DLT), more commonly called blockchain, has captured the imaginations, and wallets, of the financial services ecosystem,” notes the report, citing a few statistics as evidence: over 90 central banks are engaged in DLT discussions around the world; more than 24 countries have already launched blockchain-based initiatives; 80% of banks predict that they’ll launch blockchain projects by 2017; over 90 financial and technology companies have already joined blockchain consortia; more than 2,500 patents have been filed over the past three years; and $1.4 billion has been invested in blockchain-based startups over the same time span.

But, significant hurdles must be overcome before the advent of large-scale blockchain infrastructures. The WEF correctly warns that this will take time. Not only are there major technology and standards issues to be worked out, but the industry will have to collaborate with governments around the world to develop the appropriate legal frameworks and regulatory environments.

September 06, 2016

Last year, McKinsey launched a multi-year study to explore the potential impact of automation technologies on jobs, organizations and the future of work. “Can we look forward to vast improvements in productivity, freedom from boring work, and improved quality of life?” its initial article on the study asked, or “Should we fear threats to jobs, disruptions to organizations, and strains on the social fabric?”

Most jobs involve a number of different tasks or activities. Some of these activities are more amenable to automation than others. But just because some of a job’s activities have been automated does not mean that the whole job has disappeared. To the contrary, automating parts of a job will often increase the productivity and quality of workers by complementing their skills with machines and computers, as well as by enabling them to focus on those aspects of the job that most need their attention.

Given that few jobs or occupations will be entirely automated in the near- to mid-term future, the study focused instead on the kinds of activities within jobs that are more likely to be automated, as well as how those jobs and business processes will then be redefined. It did so by analyzing the extensive data in O*NET, a web-based application sponsored by the US Department of Labor which includes the most comprehensive information on US occupations.

The study analyzed around 2,000 work activities across 800 different occupations, and grouped them into 18 different capabilities, - 3 of them social in nature; 10 cognitive; and 5 physical, - and then assessed the automation potential of each. It found that 45% of work activities could be automated using existing, state-of-the-art technologies.
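The study’s actual methodology was far more elaborate, but a toy version, - invented data and an illustrative cutoff, nothing from the report itself, - conveys the basic activity-level calculation:

```python
import pandas as pd

# A miniature, made-up stand-in for the O*NET-style data: one row per
# (occupation, activity), with hours spent and a 0-1 rating of how well
# current technology performs the capabilities the activity requires.
activities = pd.DataFrame([
    ("retail salesperson", "greet customers",      10, 0.30),
    ("retail salesperson", "process transactions", 15, 0.90),
    ("bookkeeper",         "reconcile accounts",   25, 0.85),
    ("bookkeeper",         "advise clients",       10, 0.20),
], columns=["occupation", "activity", "hours", "machine_capability"])

# Call an activity automatable when machines match the needed capability.
THRESHOLD = 0.7  # illustrative cutoff, not the study's actual criterion
activities["automatable"] = activities["machine_capability"] >= THRESHOLD

# Share of work time (not of whole jobs) that could be automated.
share = (activities.loc[activities["automatable"], "hours"].sum()
         / activities["hours"].sum())
print(f"{share:.0%} of work activities automatable")  # 67% in this toy sample
```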

The book is organized into three main sections. The first explains the blockchain from two complementary points of view: as a major next step in the evolution of the Internet; and as the architecture underpinning bitcoin, the best known and most widely held digital currency. The second and longest section describes how blockchain could potentially transform financial services, companies, government, the Internet of Things, and other key areas. The last section summarizes the major challenges that must be overcome as well as the governance required to fulfill the promise of blockchain.

“It appears that once again, the technological genie has been unleashed from its bottle,” write the authors in their opening paragraph. “Summoned by an unknown person or persons with unclear motives, at an uncertain time in history, the genie is now at our service for another kick at the can - to transform the economic power grid and the old order of human affairs for the better. If we will it.”

“Horizon scanning for emerging technologies is crucial to staying abreast of developments that can radically transform our world, enabling timely expert analysis in preparation for these disruptors,” said Meyerson. “The global community needs to come together and agree on common principles if our society is to reap the benefits and hedge the risks of these technologies.”

The technologies on the list are not new. They’ve been worked on for many years. But their selection for the Top Ten List indicates that, in the opinion of the council members, each of these technologies has now reached a maturity and acceptance tipping point where its impact can be meaningfully felt.

August 02, 2016

A few weeks ago, the World Economic Forum (WEF) published its annual list of the Top Ten Emerging Technologies for 2016. The technologies on the list have been worked on for years. But their inclusion in the Top Ten List indicates that each has now reached a market acceptance tipping point where its impact can be meaningfully felt. The Blockchain is one of the technologies in this year’s list, selected by the WEF panel of global experts because of its emerging potential to fundamentally change the way markets and governments work.

What does it mean for an infrastructure technology like the blockchain to have reached such a tipping point? The WEF report compared the blockchain to the Internet, noting that “Like the Internet, the blockchain is an open, global infrastructure upon which other technologies and applications can be built. And like the Internet, it allows people to bypass traditional intermediaries in their dealings with each other, thereby lowering or even eliminating transaction costs.”

I agree with this comparison and find it useful to help us understand how blockchains might evolve over the years. So, I’d like to compare the state of the blockchain in 2016 to the state of the Internet 25 years ago or so.

A few weeks ago I had an interesting conversation on the state of service science with analysts from an IT research organization who were preparing a report on the subject for their clients. Our discussion led me to reflect on the evolution of service science over the past several years. I think that we are hearing a bit less about it these days. But is that because we’ve become tired of the subject and moved on, or because the application of science and technology to services is now so well accepted that it’s no longer a topic of debate? I very much think it’s the latter.

July 12, 2016

In September, 2014, I attended an MIT conference that explored the major progress that’s taken place in artificial intelligence, robotics and related technologies over the past several years. Autonomous vehicles were one of the main areas of discussion. With most other topics, there was considerable consensus, but not so with self-driving cars. While some thought that fully autonomous vehicles would be all around us within a decade or so, others were not quite so sure, myself included, due to the many highly complex technical, economic and societal issues that must be worked out.

I was reminded of this meeting a few weeks ago when I read that a Florida man had been killed while driving a Model S Tesla in autopilot mode. The accident is still under investigation, but it appears that the Tesla struck a tractor-trailer truck that was making a left turn in front of its path. Neither the driver nor the Tesla’s autopilot noticed that a truck was suddenly crossing its lane of traffic, perhaps because the white truck was hard to spot against a bright sky.

This accident has led to a renewed discussion of the current state-of-the-art of vehicle automation, the approaches being pursued by different companies, and the prospects for the near- and long-term future.

July 04, 2016

Urbanization is one of the major forces shaping the 21st century - right up there with the digital technology revolution and globalization. Over half of the world’s population lives in urban areas, and as the 2014 UN World Urbanization Prospects noted: “The continuing urbanization and overall growth of the world’s population is projected to add 2.5 billion people to the urban population by 2050,” with the proportion of the population living in urban areas increasing to 66 percent by 2050.

Just about every study that’s benchmarked the competitiveness of major urban areas ranks London, - along with New York, - as one of the world’s two top cities. But despite, - or perhaps because of, - their leadership positions, both cities face major challenges as they deal with economic growth and a growing population.

London has been much in the news lately. First came the election of Sadiq Khan last May, - the first Muslim mayor not only of London but of any major Western capital, - followed by the recent Brexit referendum, in which London voted to Remain in the EU by an overwhelming 60% of its vote while the UK as a whole voted 52% to Leave.

At this point, it’s very hard to predict where Brexit is heading over the next few months, let alone what its long-term consequences might be. But, given its many strengths, I have little doubt that London will remain among the world’s very top cities. So, let me put Brexit aside for the moment and discuss instead what London has been doing to address its major population and economic challenges.

“The whole point of virtual reality is to create a fantasy divorced from the physical world,” wrote Farhad Manjoo. “You’re escaping the dreary mortal coil for a completely simulated experience: There you are, climbing the side of a mountain, exploring a faraway museum, flying through space or getting in bed with someone way out of your league… There are some great games on these systems… There are also several useful experiences, like designing your Ikea kitchen in V.R.”

“But if you’re not a gamer and you’re not looking for a new kitchen, V.R. is, at this point, just too immersive for most media. A few minutes after donning my goggles, I came to regard my virtual surroundings as a kind of prison… I suspect that V.R. will be used by the masses one day, but not anytime soon. I’m not sure we’re ready to fit virtual reality into our lives, no matter how excited Silicon Valley is about it.”

Manjoo’s reaction brought to mind the work of Karen Sobel-Lojeski, who’s long been studying technology-mediated interactions. She’s on the faculty of Stony Brook University as well as the founder of the workplace consulting firm Virtual Distance International. To be effective, she argues, our technology-based interactions must take place within the proper context, defined as “everything around us that helps us to understand who we are, where we are, and what our role is.” The way VR works, - at least for now, - is by cutting off all real-world context to better immerse us in its simulations, - an experience that, as Farhad Manjoo discovered, can be very unsettling.

June 21, 2016

In early June I spent a few days in Mexico City. The main purpose of the trip was to participate and give a keynote at the 2016 Cumbre de Directores (Director’s Summit), a conference sponsored by Endeavor Mexico in collaboration with the IPADE Business School. Endeavor is a global not-for-profit organization dedicated to long-term economic growth by supporting high-impact entrepreneurship in over 25 cities around the world, including Mexico. The Director’s Summit is Endeavor Mexico’s major annual meeting, bringing together over 400 entrepreneurs and senior executives.

My talk was focused on the changing nature of innovation in the digital economy. A major part of the talk included a discussion of innovations in the digital payments and financial services ecosystem. My keynote was followed by two separate FinTech panels, where different entrepreneurs discussed the importance of FinTech in Mexico as well as their individual companies’ efforts in the area. That same evening I attended and spoke at a reception sponsored by Angel Ventures Mexico, an early stage VC firm, where I also heard quite a bit about FinTech startups. My overall impression is that there’s considerable FinTech activity in Mexico, a large portion of which is aimed at financial inclusion.

Every year, the World Bank publishes the Global Findex database, a measure of financial inclusion around the world, including how individuals save, borrow, make payments and manage risks in over 140 countries. According to their latest report, - Global Findex 2014, - 62 percent of adults worldwide have an account at a bank, at another type of financial institution, or with a mobile money provider, - up from 51 percent in 2011. The 2014 report also noted that 2 billion adults remain without an account around the world, a 20 percent decrease from the 2.5 billion unbanked adults in 2011. Technology advances, particularly the rapid growth in mobile devices and digital financial services, are the major reasons for these dramatic improvements.

The Global Findex database includes data for each individual country. Their 2014 data for Mexico showed that about 39 percent of adults had accounts with financial institutions, - a significant improvement over the 27.4 percent of adults with accounts in 2011. But, Mexico still lags many of its peers in Latin America. Over half of all adults have financial accounts in Latin America as a whole, significantly above Mexico’s figure.

Van Alstyne started out his presentation by noting that back in 2007, seven firms controlled 99% of handset profits: Nokia, Samsung, Ericsson, Motorola, LG, RIM and HTC. That same year, Apple’s iPhone was born and began gobbling up market share. By 2015, only one of the former incumbents had any profit at all, while Apple now generated 92% of the industry’s global profits.

What happened? “Is it likely all 7 incumbents had failed strategies, run by clueless management, lacking execution capabilities?” he asked. “Or was something more fundamental happening?… Nokia and the others had classic strategic advantages that should have protected them: strong product differentiation, trusted brands, leading operating systems, excellent logistics, protective regulation, huge R&D budgets, and massive scale. For the most part, those firms looked stable, profitable, and well entrenched.” How can we explain their rapid free fall?

We all know the answer to Van Alstyne’s rhetorical questions. “Apple (along with Google’s competing Android system) overran the incumbents by exploiting the power of platforms and leveraging the new rules of strategy they give rise to. Platform businesses bring together producers and consumers in high-value exchanges. Their chief assets are information and interactions, which together are also the source of value they create and their competitive advantage.”

Davenport started his talk by noting that over the past two centuries we’ve seen three distinct stages of automation, based on the kinds of jobs that were replaced by machines. The machines of the first automation era “relieved humans of work that was manually exhausting,” making up for our physical limitations, - steam engines and electric motors enhanced our physical power while railroads, cars and airplanes helped us go faster and farther.

Next came the automation of jobs involving routine tasks that could be well described by a set of rules and were thus prime candidates for IT substitution. “Era Two automation doesn’t only affect office workers. It washes across the entire services-based economy that arose after massive productivity gains wiped out jobs in agriculture, then manufacturing.” It threatened many transactional service jobs that “are so routinized that they are simple to translate into code,” from bank tellers to airline reservations clerks.

We’ve now entered the third era of automation. Our increasingly smart machines are “now breathing down our necks… This time the potential victims are not tellers and tollbooth collectors, much less farmers and factory workers, but rather all those knowledge workers who assumed they were immune from job displacement by machines…,” including, - as Davenport and Kirby poignantly remind us, - “People like the writers and readers of this book.”

May 23, 2016

Last February, President Obama issued an Executive Order establishing the Commission on Enhancing National Cybersecurity within the Department of Commerce. The Commission is charged with “recommending bold, actionable steps that the government, private sector, and the nation as a whole can take to bolster cybersecurity in today’s digital world, and reporting back by the beginning of December.”

To gather the necessary information for its short- and long-term recommendations, the Commission is holding public meetings around the country, each focused on a different sector of the economy. On May 16, it met in New York City to discuss the challenges and opportunities facing the financial sector. The meeting included three panels, one on finance, one on insurance, and the third on research and development.

I was a member of the R&D panel, along with MIT professor Sandy Pentland, IBM Fellow Jerry Cuomo, and Greg Baxter, head of digital strategy at Citigroup. During our 90-minute panel, we each made introductory remarks based on our previously submitted briefing statements and then answered the commissioners’ questions.

May 17, 2016

After many years of promise and hype, artificial intelligence is now being applied to activities that not long ago were viewed as the exclusive domain of humans. It wasn’t all that long ago that we were wowed by Watson, Siri, and self-driving cars. But it’s getting harder for our smart machines to truly impress us. Earlier this year Google’s AlphaGo won a match against one of the world’s top Go players. Go is a very complex game, for which there are more possible board positions than there are particles in the universe. Yet, we seem to be taking AlphaGo’s impressive achievement in stride.

“Any sufficiently advanced technology is indistinguishable from magic,” is one of the most memorable quotes of science fiction writer Arthur C. Clarke. But, as we better understand its promise and limitations, technology becomes just another tool we rely on in our work and daily life. With familiarity, the romance begins to fade, - as was the case with electricity, cars, airplanes and TV in the early decades of the 20th century, and as has been the case more recently with computers, the Internet, and, - increasingly now, - with AI.

Over time, our feelings turn from wonderment and admiration for the seemingly magical achievements of the technology in its childhood years, to the far more practical questions of what the technology can actually achieve when it grows up. This has been a particular issue with information technologies in general, including the Internet and AI.

The IT industry has long been associated with what’s been called the Solow productivity paradox, in reference to Robert Solow’s 1987 quip: “You can see the computer age everywhere but in the productivity statistics.” We all thought that the Solow paradox was finally behind us when IT-driven productivity surged between 1996 and 2003. But despite the continuing advances in technology, productivity is now back to its slow pre-1995 levels, for reasons that are still not well understood.

Digital technologies are all around us. But, are they a major source of competitive differentiation? Are they of strategic value to business? Can they help increase innovation and productivity and drive long-term growth? These questions have no easy answers, as we have learned over the years.

Throughout the 20th century, polls have consistently indicated strong public support for science and technology, especially during the Cold War decades following World War 2. But recent polling data, - and in particular, a recent study conducted by the Pew Research Center, - reveals a more complex relationship between the public and science. While scientific achievements are still recognized and valued, there is a large opinion gap between the general public and scientists on a number of scientific issues.

The Pew findings were released in a January, 2015 report based on two surveys of science-related issues conducted in collaboration with the American Association for the Advancement of Science (AAAS). The first survey is based on a representative sample of 2,002 adults from the general public, conducted by landline and mobile phone. The second survey was conducted online, and is based on a representative sample of 3,748 US-based scientists who are members of AAAS.

May 02, 2016

This past January, the World Bank issued the Digital Dividends report, a comprehensive study of the state of digital developments around the world. All in all, it’s a mixed picture. Digital technologies have been rapidly spreading in just about all nations, but their expected digital dividends, - i.e., their broad development benefits, - have lagged behind and are unevenly distributed.

“We find ourselves in the midst of the greatest information and communications revolution in human history,” notes the report in its Foreword. “More than 40 percent of the world’s population has access to the internet, with new users coming online every day. Among the poorest 20 percent of households, nearly 7 out of 10 have a mobile phone…” But, while this is great progress, much remains to be done. Many are still left behind due to their limited connectivity, and are thus unable to fully benefit from the global digital revolution.

While universal connectivity is necessary, it’s far from sufficient, the report adds. “[T]raditional development challenges are preventing the digital revolution from fulfilling its transformative potential… the full benefits of the information and communications transformation will not be realized unless countries continue to improve their business climate, invest in people’s education and health, and promote good governance.”

April 18, 2016

Over the past several decades, information technologies (IT) have been fundamentally transforming companies, industries and the economy in general. In its early years, - ’60s, ’70s, ’80s - companies deployed IT primarily to automate their existing processes, - leaving the underlying structure of the business in place. It wasn’t until the 1990s, - with the pioneering work of Michael Hammer and others on business process reengineering, - that companies realized that just automating existing processes wasn’t enough. Rather, to achieve the promise of IT, it was necessary to fundamentally redesign their operations, examine closely the flow of work across the organization, and eliminate legacy processes that no longer added value to the business.

Organizational transformation was then taken beyond the boundaries of the company with the explosive growth of the Internet. The Internet made it significantly easier to obtain goods and services outside the firm, enabling companies to rely on business partners for many of the functions once done in-house. To compete effectively in an increasingly interconnected global economy, companies now had to optimize not only the flow of work within their own organizations but across their supply chain ecosystems. Over the past 20 years, such ecosystem-wide transformations have been disrupting the business models of industry after industry, - from retail and manufacturing to media and entertainment.

The banking industry has long been one of the major users of IT, - among the first to automate its back-end and front-office processes and to later embrace the Internet and smartphones. However, banking has been relatively less disrupted by digital transformations than other industries. In particular, change has come rather slowly to the world’s banking infrastructure.

April 11, 2016

For the past few decades, digital technologies have been systematically transforming one industry after another. The transformations have generally proceeded along three different stages. First comes the use of IT to improve the productivity and quality of production-oriented, back-end processes. Distribution comes next, leveraging the universal reach and connectivity of the Internet over the past 20 years. The transformation then reaches a tipping point when technology radically changes the user experience, - as has happened with the rise of smartphones over the past decade, - leading to a fundamental disruption of the industry and its business models.

While this digital disruption journey is ultimately inevitable, the pace varies widely across industries. The IT industry has been the most disrupted, - often by its own digital creations in a kind of sorcerer’s apprentice scenario. Over my long career, I’ve seen many once powerful IT companies done in by technology and market changes, and either disappear altogether or become shadows of their former selves.

Beyond IT, few industries have felt the impact of digital forces like media. Everything seems to be changing at once, from the way content is produced and delivered, to the sources of revenue and profits. In less than two decades, the global recorded music industry has lost over half its revenues, while the drop in newspaper advertising revenue in the US has been even steeper. Retail has also been undergoing major changes with the rise of e-commerce, as has telecommunications with the transition to mobile phones and wireless data.

How about the banking industry, which has long been a major user of information technologies, - including back- and front-office automation, ATMs, Internet banking, data-driven risk management, fraud detection, and mobile financial apps?

April 05, 2016

The March 12 issue of The Economist includes a special report on the future of computing after the very impressive 50-year run of Moore’s Law.

In his now legendary 1965 paper, Intel co-founder Gordon Moore first made the empirical observation that the number of components in integrated circuits had doubled every year since their invention in 1958, and predicted that the trend would continue for at least ten years, a prediction he subsequently changed to a doubling every two years. The semi-log graphs associated with Moore’s Law have since become a visual metaphor for the technology revolution unleashed by the exponential improvements of just about all digital components, from processing speeds and storage capacity to networking bandwidth and pixels.

The 4004, Intel’s first commercial microprocessor, was launched in November, 1971. The 4-bit chip contained 2,300 transistors. The Intel Skylake, launched in August, 2015, contains 1.75 billion transistors, which collectively deliver about 400,000 times the computing power of the 4004. Moore’s Law has had quite a run, but like all good things, especially those based on exponential improvements, it must eventually slow down and flatten out.
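Those two data points let us check the cadence ourselves with a quick back-of-the-envelope calculation:

```python
from math import log2

transistors_4004    = 2_300          # Intel 4004, November 1971
transistors_skylake = 1_750_000_000  # Intel Skylake, August 2015
years = 2015.6 - 1971.9              # roughly 43.7 years between launches

doublings = log2(transistors_skylake / transistors_4004)
print(f"{doublings:.1f} doublings")                         # ~19.5
print(f"one doubling every {years / doublings:.1f} years")  # ~2.2 years
```

Remarkably close to Moore’s revised prediction of a doubling every two years.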

In its overview article, The Economist reminds us that Moore’s Law was never meant to be a physical law like Newton’s Laws of Motion, but rather “a self-fulfilling prophecy - a triumph of central planning by which the technology industry co-ordinated and synchronised its actions.” It also reminds us that its demise has been long anticipated: for a while now, the number of people predicting the death of Moore’s Law has also been doubling every two years.

March 22, 2016

How can we best anticipate the future of a complex, fast-changing industry like IT? Which hot technology innovations, - e.g., artificial intelligence, the blockchain, cloud computing, - will end up having a big impact, and which are destined to fizzle? What can we learn from the IT industry’s 60-year history that might help us better prepare for whatever lies ahead?

A major way of anticipating the future of any economic or social entity, - be it a company, industry, university, government agency or city, - is to explore and learn from its history. While there’s no guarantee that historical patterns will continue to apply going forward, they might well be our most important guides as we peer into an otherwise unpredictable future.

I’ve been involved with computers since the early 1960s, first as a student at the University of Chicago, then in my long career at IBM, and subsequently through my relationship with a number of companies and universities.I’ve thus had a ringside seat from which to observe the journey the IT industry’s been on since those early days.

Let me share some of my personal impressions of this journey through the lens of three key areas, each of which has played a major role throughout IT’s history, and will continue to do so well into its future: data centers, transaction processing, and data analysis.

March 14, 2016

About three years ago, MIT launched the Initiative on the Digital Economy (IDE), a major effort focused on the broad changes brought about by the relentless advances of digital technologies. As its website explains:

“While digital technologies are rapidly transforming both business practices and societies and are integral to the innovation-driven economies of the future, they are also the core driver of the great economic paradox of our time. On one hand, productivity, wealth, and profits are each at record highs; on the other hand, the median worker in America is poorer than in 1997, and fewer people have jobs. Rapid advances in technology are creating unprecedented benefits and efficiencies, but there is no economic law that says everyone, or even a majority of people, will share in these gains.”

The future of work and jobs is one of the major areas being addressed by IDE. What will the workforce of the future look like? Where will jobs come from in the coming years? Will the nature of work be significantly different in the digital economy? How can we accelerate the transformation of institutions, organizations, and human skills to keep up with the quickening pace of digital innovation?

To help come up with breakthrough answers to these very challenging questions, IDE just launched its first annual Inclusive Innovation Competition. The competition aims to identify, celebrate and award prizes to “organizations that are inventing a more sustainable, productive, and inclusive future for all by focusing on improving economic opportunity for middle- and base-level income earners.”

As its introductory article reminds us, “new technologies have been revolutionizing the world for centuries, transforming life and labor and enabling an extraordinary flourishing of human development. Now some argue that advances in automation and artificial intelligence are causing us to take yet another world-historical leap into the unknown.”

Speculations about a future radically transformed by technology are nothing new. But, with AI and robots seemingly everywhere, the concerns surrounding their long term impact may well be in a class by themselves. Like no other technologies, AI and robots force us to explore the boundaries between machines and humans. Will they turn out like other major innovations, e.g., steam power, electricity, cars, - highly disruptive in the near term, but ultimately beneficial to society? Or, will we see a more dystopian future, as smart machines increasingly encroach on activities requiring intelligence and cognitive capabilities that not long ago were the exclusive domain of humans? Opinions abound, but in the end, we don’t really know.

March 01, 2016

A few years ago, I came across a very interesting article about the efforts of Roger Martin to transform business education. At the time, Martin was the Dean of the Rotman School of Management at the University of Toronto. He had long been advocating “that students needed to learn how to think critically and creatively every bit as much as they needed to learn finance or accounting. More specifically, they needed to learn how to approach problems from many perspectives and to combine various approaches to find innovative solutions.”

“Learning how to think critically - how to imaginatively frame questions and consider multiple perspectives - has historically been associated with a liberal arts education, not a business school curriculum, so this change represents something of a tectonic shift for business school leaders.” Achieving this goal would require business schools to move into territory “more traditionally associated with the liberal arts: multidisciplinary approaches, an understanding of global and historical context and perspectives, a greater focus on leadership and social responsibility and, yes, learning how to think critically.”

Similarly, in a 2006 report, the National Academy of Engineering called for reforming engineering education. “New graduates were technically well prepared but lacked the professional skills for success in a competitive, innovative, global marketplace. Employers complained that new hires had poor communication and teamwork skills and did not appreciate the social and nontechnical influences on engineering solutions and quality processes.”

Dr. Schwab positions the Fourth Industrial Revolution within the historical context of three previous industrial revolutions. The First, - in the last third of the 18th century, - introduced new tools and manufacturing processes based on steam and water power, ushering in the transition from hand-made goods to mechanized, machine-based production. The Second, - a century later, - revolved around steel, railroads, cars, chemicals, petroleum, electricity, the telephone and radio, leading to the age of mass production. The Third, - starting in the 1960s, - saw the advent of digital technologies, computers, the IT industry, and the automation of processes in just about all industries.

“Now a Fourth Industrial Revolution is building on the Third, the digital revolution that has been occurring since the middle of the last century,” he noted. “It is characterized by a fusion of technologies that is blurring the lines between the physical, digital, and biological spheres.”

February 16, 2016

The Winter 2016 issue of Dædalus, the Journal of the American Academy of Arts and Sciences, is devoted to the Internet. Its 7 articles explore some of the major issues shaping the Internet’s future. The issue was curated by Harvard professor Yochai Benkler and MIT research scientist David Clark, whose joint introduction and individual papers frame the overall discussion.

“The Internet was born in 1983,” write Benkler and Clark in the Introduction. That year saw the adoption of TCP/IP, - the communications protocol that defines the Internet to this day. Its technical directions and overall culture have been largely determined by the academic and research communities, who were also its early users. With strong government support, they developed a system that “only a researcher could love: general, abstract, optimized for nothing, and open to exploration of more or less anything imaginable using connected computers.”

The Internet has since evolved “from a network that primarily delivered email among academics and government employees, to a network over which the World Wide Web arose, to the video and mobile platform it has become - and the control network for embedded computing that it is fast becoming.” From its niche research beginnings, it’s become the universal platform for collaborative innovation, enabling startups, large institutions, and everyone in between to quickly develop and bring to market many new digital products and services.

What design choices have led to the Internet as we know it? Could it still evolve into a fundamentally different platform than the one we use today? What decisions could significantly shape its future? These are among the very important questions explored in the Dædalus issue. In their Introduction, Benkler and Clark summarized some of the key themes that emerged from the various papers, - a few of which I’d now like to discuss.

February 09, 2016

What do we mean by platform? I particularly like this definition by MIT Professor Michael Cusumano: “A platform or complement strategy differs from a product strategy in that it requires an external ecosystem to generate complementary product or service innovations and build positive feedback between the complements and the platform. The effect is much greater potential for innovation and growth than a single product-oriented firm can generate alone.”

The importance of platforms is closely linked to the concept of network effects: the more products or services a platform offers, the more users it will attract. Scale increases the platform’s value, helping it attract more complementary offerings, which in turn bring in more users, which then makes the platform even more valuable… and on and on and on.
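A deliberately crude simulation, - the coefficients are invented, not estimates of any real platform, - shows how this feedback loop compounds:

```python
# Producers follow the audience; users follow the catalogue. Each pass
# through the loop makes the platform more attractive to both sides.
users, offerings = 1_000, 50
for year in range(1, 6):
    offerings += int(0.01 * users)    # new complements drawn by the user base
    users     += int(10 * offerings)  # new users drawn by the richer catalogue
    print(f"year {year}: {users:>6,} users, {offerings:>4,} offerings")
```

Even with these toy numbers, the user base roughly doubles every two iterations, - the essence of the winner-take-most dynamics platforms exhibit.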

Platforms have long played a key role in the IT industry. IBM’s System 360 family of mainframes, announced in 1964, featured a common hardware architecture and operating system, enabling customers to upgrade their systems with no need to rewrite their applications. The ecosystem of add-on hardware, software and services that developed around System 360 helped it become the premier platform for commercial computing over the next 25 years.

In the 1980s, the explosive growth of personal computers was largely driven by the emergence of the Wintel platform based on Microsoft’s operating systems and Intel’s microprocessors, which attracted a large ecosystem of hardware and software developers.

The 1990s saw the commercial success of the Internet and World Wide Web, driving platforms to a whole new level. Internet-based platforms connected large numbers of PC users to a wide variety of web sites and online applications. The power of platforms has grown even more dramatically over the past decade, with billions of users now connecting via smart mobile devices to all kinds of cloud-based applications and services.

What’s the current state and growth potential of platform companies? How many large platforms are currently operating around the world? What’s their impact on established enterprises? These are among the questions addressed in a recent report, The Rise of the Platform Enterprise: A Global Survey, led by Peter Evans and Annabelle Gawer and sponsored by the Center for Global Enterprise. The report is based on a comprehensive survey of the 176 platform companies around the world with an individual valuation exceeding $1 billion. Their aggregate market value was over $4.3 trillion.

January 26, 2016

Having lived through IBM’s near-death experience in the early 1990s, respect for the forces of the marketplace is etched deep in my psyche. It’s frankly sobering how many once powerful IT companies are no longer around or are shadows of their former selves. The carnage might be more pronounced in the fast-changing IT industry, but no industry is immune. It would seem as if Darwinian principles apply in business almost as much as they do in biology.

A few weeks ago I discussed a paper published last year, The Mortality of Companies, by physicist Geoffrey West and his collaborators at the Santa Fe Institute. Based on their extensive analysis of data about publicly traded US companies, they were surprised to discover that a typical firm lasts about ten years before it gets merged, acquired or liquidated, and that a firm’s mortality rate is independent of its age, how well established it is or what it does. While beyond the scope of their study, the authors speculated that biological ecosystems are likely to shed valuable light on their findings.
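An age-independent mortality rate has a clean mathematical consequence: company lifetimes follow an exponential distribution, much like radioactive decay. A small sketch, - reading the ten-year figure as a half-life is my assumption, purely for illustration:

```python
from math import exp, log

# A constant, age-independent hazard implies survival S(t) = exp(-lam * t).
half_life = 10.0          # years; illustrative reading of the study's figure
lam = log(2) / half_life  # ~0.069 exits per firm per year

for t in (5, 10, 20, 30):
    print(f"after {t:>2} years: {exp(-lam * t):.0%} of a cohort survives")
# after  5 years: 71% ... 10 years: 50% ... 20 years: 25% ... 30 years: 12%
```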

“Some business thinkers have argued that companies are like biological species and have tried to extract business lessons from biology, with uneven success,” note the authors. “We stress that companies are identical to biological species in an important respect: Both are what’s known as complex adaptive systems. Therefore, the principles that confer robustness in these systems, whether natural or manmade, are directly applicable to business.”

January 19, 2016

Digital technologies are all around us, - increasingly ubiquitous and commoditized. But, are they a major source of competitive differentiation? Are they still of strategic value to business? Can digital innovation drive long-term economic growth?

Several weeks ago, the McKinsey Global Institute (MGI) published a report addressing these questions. Digital America: A tale of the haves and have-mores aims to quantify the state of digitization of the US economy. The report introduces the MGI Industry Digitization Index, a methodology for exploring the various ways US companies are going about their digital journey, - based on 27 indicators that measure how they’re building digital assets, expanding digital usage, and creating a digital workforce.
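In spirit, - though with invented indicators and weights, not MGI’s actual 27, - a composite digitization index is essentially a normalize-and-average exercise:

```python
import pandas as pd

# Toy sector-level indicators standing in for measures of digital assets,
# usage and workforce. Sectors and values are made up for illustration.
raw = pd.DataFrame({
    "it_spend_per_worker":  [9.5, 2.1, 4.0],
    "digital_transactions": [0.80, 0.20, 0.50],
    "digital_job_share":    [0.40, 0.05, 0.15],
}, index=["tech/media", "agriculture", "retail"])

# Normalize each indicator to 0-1 across sectors, then average with equal
# weights (purely illustrative; a real index would weight more carefully).
normalized = (raw - raw.min()) / (raw.max() - raw.min())
index = normalized.mean(axis=1)
print(index.sort_values(ascending=False).round(2))
```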

“Digital innovation, adoption, and usage are evolving at a supercharged pace across the US economy,” notes the report in its opening paragraph. “As successive waves of innovation expand the definition of what is possible, the most sophisticated users have pulled far ahead of everyone else in the race to keep up with technology and devise the most effective business uses for it.”

January 12, 2016

Transformational innovations don’t always play out as originally envisioned. Once in the marketplace, they seem to acquire a life of their own. Lest we forget, the Internet started out as a DARPA-sponsored project aimed at developing a robust, fault-tolerant computer network. ARPANET was launched in 1969, and by the mid-1980s, it had grown and evolved into NSFNET, a network widely used in the academic and research communities. And the World Wide Web was first developed by Tim Berners-Lee at CERN in the late 1980s to facilitate the sharing of information among researchers around the world. They’ve both gone on to change the world, - to say the least.

The blockchain first came to light around 2008 as the architecture underpinning bitcoin, the best known and most widely held digital currency. But, as with the Internet, the Web and other major technologies, the blockchain has now transcended its original objective. It has the potential to revolutionize the finance industry and transform many aspects of the digital economy.

Two press announcements released in mid-December are serious milestones in its evolution. Let me explain.

About 20 years ago, West got interested in whether some of the techniques and principles from the world of physics could be applied to complex biological and social systems. In particular, he wondered if we could apply empirical, quantifiable and predictive scientific methods to help us better understand complex biological organisms and social organizations like cities and companies.

In the 1990s, his attention first turned to biology. There are enormous variations in the characteristics of living creatures, - their life spans, pulse rates, metabolism, and so on. How do these characteristics change with body size? Why do human beings live roughly 80 to 100 years, while mice live only two to three years? Are there some common principles that apply to all living creatures regardless of size? Can we find empirical mathematical models that might allow scientists to ask big questions about life, aging and death?
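One celebrated answer is Kleiber’s law: across species, metabolic rate scales with body mass as roughly a 3/4-power law, a regularity West and his colleagues famously explained from first principles. A minimal sketch of the underlying empirical exercise, fitting the exponent on log-log data, follows; the mass/metabolism pairs are synthetic points generated around a 3/4-power curve, not real measurements.

```python
# A minimal sketch of the empirical scaling analysis behind this work.
# Kleiber's law: metabolic rate B scales with body mass M roughly as
# B = b0 * M**0.75; fitting a straight line in log-log space recovers
# the exponent. The data points below are illustrative, not real.
import math

# (body mass in kg, metabolic rate in watts), synthetic points placed
# near a 3/4-power law for mouse-to-elephant scale organisms
animals = [(0.02, 0.19), (0.5, 2.1), (70.0, 85.0), (4000.0, 1900.0)]

xs = [math.log(m) for m, _ in animals]
ys = [math.log(b) for _, b in animals]

# Ordinary least-squares slope in log-log coordinates gives the exponent.
n = len(xs)
x_mean, y_mean = sum(xs) / n, sum(ys) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
print(f"Fitted scaling exponent: {slope:.2f}  (Kleiber's law predicts ~0.75)")
```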

December 29, 2015

This past semester I was involved in an interesting course at MIT’s Sloan School of Management, - Analytics Labs (A-Lab). A-Lab’s objective is to teach students how to use data sets and analytics to address real-world business problems. Companies submit project proposals prior to the start of the class, including the business problem to be addressed and the data on which the project will be based. Students are then matched with the project they’re most interested in and grouped into teams of 3-4 students.

A-Lab received over 20 project proposals from different companies, of which 13 were selected by the students. Each project team was assigned a research mentor to provide guidance as appropriate. I mentored a 3-student team that worked on a project sponsored by MasterCard. The students explored the possibility of improving predictions of the economic performance of emerging markets by coupling existing economic-indicator data with consumer behavior derived from MasterCard’s transaction data. This is a particularly interesting project because economic data in emerging markets is often not as reliable as the data in more advanced markets.
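The team’s actual data and model are confidential, but the general approach, augmenting standard macro indicators with aggregated transaction features and checking whether predictions improve, can be sketched with synthetic data. Everything below, from the feature names to the coefficients, is hypothetical.

```python
# A hypothetical sketch of the general approach (not the team's actual model):
# predict a market's GDP growth from macro indicators alone, then test whether
# adding aggregated card-transaction features improves out-of-sample fit.
# Requires scikit-learn; all features and data here are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # synthetic country-quarter observations

macro = rng.normal(size=(n, 3))   # e.g., inflation, exports, FDI (made up)
spend = rng.normal(size=(n, 2))   # e.g., transaction volume and growth (made up)
gdp_growth = macro @ [0.5, 0.3, 0.2] + spend @ [0.4, 0.3] + rng.normal(0, 0.5, n)

# Cross-validated R^2 with and without the transaction-derived features.
baseline = cross_val_score(LinearRegression(), macro, gdp_growth, cv=5).mean()
augmented = cross_val_score(LinearRegression(),
                            np.hstack([macro, spend]), gdp_growth, cv=5).mean()
print(f"R^2 with macro indicators only: {baseline:.2f}")
print(f"R^2 with transaction features:  {augmented:.2f}")
```

The interesting question in practice is whether the second score beats the first on real data, which is exactly where noisy emerging-market statistics make transaction data a promising complement.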

But more important for the students, the various A-Lab projects served as a concrete learning experience on what data science is all about, - how to leverage messy, incomplete, real-world data to shed light on a complex and not-so-well-defined problem.

December 22, 2015

The intrinsic structure of companies has long been a subject of study, most famously by Ronald Coase, the eminent British economist and recipient of the 1991 Nobel Prize in economics. In 1937, Coase published a seminal paper, The Nature of the Firm, in which he explained that, in principle, a firm should be able to find the cheapest, most productive goods and services by contracting them out in an efficient, open marketplace.

However, markets are not perfectly fluid. Transaction costs are incurred in obtaining goods and services outside the firm, such as searching for the right people, negotiating a contract, coordinating the work, managing intellectual property and so on. Firms thus came into being to make it easier to get work done. A well managed company tries to achieve a good balance between the work that gets done within and outside its boundaries.

Over the past few decades, the firm has been going through dramatic changes, driven by both advances in information technologies and the heightened competitive pressures brought about by globalization. Fundamental changes have taken place in the structure of firms and in the overall flow of goods and services in the economy. As this 2005 IBM study noted:

“The nature of competition - increasingly intense, global and unpredictable - requires strength across the board. So the objective is to decompose the enterprise into its component parts, understand with great precision what is truly differentiating - where the enterprise has strengths and weaknesses - and then make decisions about how to build, buy or partner for world-class capability.”

“In this model, companies can focus their energies on their true point of differentiation, instead of trying to master many domains and ultimately squander competitive advantage by dispersing focus and investment. Rather than existing as static and fixed organizations, more enterprises could essentially become an aggregation of specialized entities with complementary interests - expanding, contracting and reconfiguring themselves in a way that best adapts to or even anticipates market dynamics.”

December 15, 2015

The concept of T-shaped skills was first introduced over 20 years ago, but its importance, - to both individuals and organizations, - has continued to rise. A growing number of articles extol the value of T-shaped professionals, that is, individuals who combine deep cognitive, analytical and/or technical skills in a specific discipline with broad, multidisciplinary social skills.

As described by IDEO CEO Tim Brown: “The vertical stroke of the T is a depth of skill that allows them to contribute to the creative process. That can be from any number of different fields: an industrial designer, an architect, a social scientist, a business specialist or a mechanical engineer. The horizontal stroke of the T is the disposition for collaboration across disciplines.”

T-shaped skills are increasingly valued in the marketplace. For example, a recent paper by Harvard professor David Deming showed that labor markets have been rewarding individuals with strong social skills, that is, with interpersonal skills that facilitate interactions and communications with others. Deming’s research showed that since 1980, social-skill-intensive occupations have enjoyed most of the employment growth across the whole wage spectrum. Employment and wage growth have been particularly strong in jobs requiring both high cognitive and high social skills. But since 2000, they have fallen in occupations with high cognitive but low social skill requirements, - “suggesting that cognitive skills are increasingly a necessary but not sufficient condition for obtaining a high-paying job.”

But as often happens with serious ideas once they become popular, disruptive innovation has joined the pantheon of trendy, overused business buzzwords, - stretched and applied way beyond its intended meaning. It’s even led to a rather strong backlash in a 2014 New Yorker article by Harvard history professor Jill Lepore.

“The theory of disruptive innovation… has proved to be a powerful way of thinking about innovation-driven growth…,” they write. “Unfortunately, disruption theory is in danger of becoming a victim of its own success. Despite broad dissemination, the theory’s core concepts have been widely misunderstood and its basic tenets frequently misapplied. Furthermore, essential refinements in the theory over the past 20 years appear to have been overshadowed by the popularity of the initial formulation. As a result, the theory is sometimes criticized for shortcomings that have already been addressed.”

December 01, 2015

I’ve been closely following cloud computing for a number of years, having posted my first cloud blog in March of 2008. A few months later I gave a presentation at a conference on cloud computing. The overall sense at that conference was that something big and profound was emerging in the IT industry. Cloud computing could well be The Next Big Thing - one of those massive changes that the IT industry goes through from time to time that really shake things up, - like the advent of personal computers in the 1980s and the Internet in the 1990s.

Cloud has fulfilled those early expectations, having made much progress in a relatively short time. As I look back at its evolution over the past few years, I think of cloud as having gone through three main, overlapping phases.

Early adopters viewed cloud as computing-on-demand. Next, cloud introduced a much more disciplined approach to the architecture and management of large IT infrastructures. And by now, cloud has become accepted as a major transformational force in IT, in business, and in the economy in general.

November 24, 2015

I recently read a very interesting paper, The Growing Importance of Social Skills in the Labor Market, by Harvard professor David Deming. Deming’s paper shows that over the past several decades, labor markets have been increasingly rewarding social skills, that is, interpersonal skills that facilitate interactions and communications with others. He presents evidence that since 1980, social-skill intensive occupations have enjoyed most of the employment growth across the whole wage spectrum, and that employment and wage growth have been particularly strong in jobs that require both high cognitive and high social skills.

Deming’s paper builds on the work of MIT economist David Autor, a leading authority on the impact of technological change on the US labor market. A few weeks ago I wrote about a paper he published this past summer on the history and future of workplace automation. In that paper, Autor summarized the sharp polarization of job opportunities that’s taken place in the US over the past two decades, noting that job opportunities have significantly expanded in both high-skill and low-skill occupations, while they have significantly contracted for mid-skill jobs.

The reason for this polarization is that automation has been most successful when applied to mid-skill routine tasks, that is, tasks or processes that follow precise, well-understood procedures that can be described by a set of rules. The occupations most susceptible to automation have included blue-collar physical activities such as manufacturing and other forms of production, as well as white-collar, information-based activities like accounting, record keeping, and many kinds of administrative tasks. As a result, these occupations have experienced the biggest declines in employment opportunities and earnings.

On the other hand, non-routine tasks are much harder to describe by a set of rules that a machine can follow. These include manual jobs in fast-food restaurants, janitorial services and health-care aides that require relatively low skills and education, as well as high-skill jobs that involve expert problem solving and complex communications requiring strong cognitive skills and a college education.

November 17, 2015

The October 31 issue of The Economist featured the blockchain on its cover: “The Trust Machine: How the technology behind Bitcoin could change the world.” Its two articles on the subject explain what blockchains are about, as well as why we should care about an exotic technology that involves concepts from cryptography, game theory and distributed computing.

Right up front, The Economist distinguishes between three different notions that are often muddled up when discussing blockchains:

Bitcoin, the best known and most widely held digital currency, whose users can transact directly with each other with no need for a central authority, - be it a bank or government agency, - to certify the validity of the transactions.

The blockchain architecture underpinning bitcoin, whose protocols were specifically designed to control the creation and transfer of bitcoins.

The general concept of blockchain, a distributed database architecture with the ability to handle trust-less transactions, where no parties need to know or trust each other for transactions to complete.

Bitcoin has had a mixed reputation, due to its wild fluctuations in value and past links to illicit activities. And, bitcoin-specific blockchain concepts like mining might well be perceived as too wasteful of computing power and energy by all but the most libertarian of bitcoin supporters. But the advanced technologies and architectures underlying blockchains are being increasingly accepted as having important implications far beyond bitcoin and other cryptocurrencies.

The blockchain holds the promise to revolutionize the finance industry by bringing one of its most important and oldest concepts, the ledger, to the Internet age. Beyond finance, the blockchain “offers a way for people who do not know or trust each other to create a record of who owns what that will compel the assent of everyone concerned. It is a way of making and preserving truths.”
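To ground the ledger metaphor, here is a minimal sketch of the core data structure: blocks chained by cryptographic hashes, so that altering any historical entry breaks every subsequent link. It deliberately omits the distributed consensus machinery (mining, proof-of-work) that real blockchains layer on top, and all the names and transactions are made up.

```python
# A minimal sketch of a hash-chained ledger, the data structure at the heart
# of a blockchain. Consensus (mining, proof-of-work) is omitted; this only
# shows why tampering with history is detectable.
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    """Add a block that commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain):
    """Check every block's back-pointer against its predecessor's hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
print("Ledger valid?", verify(ledger))        # True

ledger[0]["transactions"][0]["amount"] = 500  # tamper with history...
print("After tampering?", verify(ledger))     # ...and the chain breaks
```

Because each block’s hash covers the previous block’s hash, rewriting any past entry would require rewriting every block after it, which is what makes the shared record “compel the assent of everyone concerned.”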

November 10, 2015

I recently read an interesting McKinsey article, Organizing for digital acceleration: Making a two-speed IT operating model work. The article argues that born-digital Internet companies, - e.g., Amazon, Google, Facebook, - are setting the standards in the quest to understand and optimize the customer experience, challenging traditional companies to keep up. These older companies have to leverage the back-end systems, processes and overall capabilities that have served them well over the years, while adopting innovative consumer-facing technologies to enhance their customers’ experiences.

The article starts out by summarizing the best practices of born digital companies:

Provide the appropriate information and platforms so customers can find what they need quickly and reliably.

Align technology and production systems with business goals.

Limit the size of application-development teams.

Focus on agile product development to enhance customer engagements.

Encourage product managers to think about digital experiences rather than discrete applications or components.

November 03, 2015

Few topics are as important, - and as challenging to anticipate, - as the future of work, given our justifiable fears of rising technological unemployment. How are job markets likely to evolve in our 21st-century digital economy? This question has been widely discussed for years, but, - at the end of the day, - we don’t really have good answers.

October 27, 2015

Design has long played a major role in product innovation. But in the last few years, a shift has been underway bringing design to the very core of the business. “The Evolution of Design Thinking: It’s no longer just for products. Executives are using this approach to devise strategy and manage change,” read the cover of the Harvard Business Review’s September issue, which featured several articles on the subject.

Leading-edge companies are leveraging design thinking to translate technological advances into compelling customer experiences in order to seize market share from more traditional competitors. As noted in one of the HBR articles, design-centric organizations are adamantly focused on their customers’ needs, rather than on their internal operational efficiencies.

A second HBR article explored a different kind of application. To help overcome the stiff resistance often encountered by disruptive innovations, - both within one’s own organization and in the marketplace, - the article proposed that design thinking should be applied to their actual introduction, - a process it calls intervention design.

I would now like to turn to another recent article, Management by Design, by professors Mark Gruber, Nick de Leon, Gerry George and Paul Thomson. As the authors argue, design principles should be applied within the management domain itself, to help rethink business processes, workflows and the overall structure of the organization, and thus create what they call a New Workplace Experience (NWX).