A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

December 05, 2016

A few weeks ago I discussed whether AI is finally reaching a tipping point, mostly based on a recently published report, Artificial Intelligence and Life in 2030. The report was developed by a study panel of AI experts convened by the One Hundred Year Study of AI (AI100), an initiative launched at Stanford University in December, 2014 “to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live and play.” To better understand the future impact of AI on everyday lives, the panel focused the study on the likely influence of AI on a typical North American city by the year 2030.

The report is organized into three main sections. Section I, What is Artificial Intelligence?, describes how researchers and practitioners define AI, as well as the key research trends that will likely influence AI’s future. Section II looks into AI’s overall impact on various sectors of the economy, while the third section examines AI issues related to public policy.

My previous discussion was primarily focused on Section I. I’d now like to turn my attention to Section II, AI by Domain. To help analyze where AI might be heading, the study panel narrowed its explorations to the eight domains most likely to be impacted by AI:

November 28, 2016

A few weeks ago I first learned about a relatively new concept - the Digital Twin. A Digital Twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city. It’s a highly realistic, one-to-one digital model of each such specific physical entity.

The Digital Twin helps bring the physical and digital worlds closer to each other. It’s intertwined with and complementary to the Internet of Things (IoT). The huge amounts of data now collected by IoT sensors on physical objects, personal devices and smart systems make it possible to represent their near real-time status in their Digital Twin alter egos.
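
To make the idea a bit more concrete, here is a minimal Python sketch of a digital twin object that mirrors the latest readings from a physical asset. Everything in it (the asset, the sensor fields, the alert threshold) is hypothetical and purely illustrative; real digital twin platforms are far richer, often including physics-based simulation models.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DigitalTwin:
    """A minimal digital twin: mirrors the last known state of one physical asset."""
    asset_id: str
    state: dict = field(default_factory=dict)
    last_updated: Optional[datetime] = None

    def ingest(self, sensor_reading: dict) -> None:
        # Update the twin's state from an incoming IoT sensor message.
        self.state.update(sensor_reading)
        self.last_updated = datetime.now(timezone.utc)

    def needs_attention(self) -> bool:
        # Illustrative rule only: flag the asset when temperature exceeds a threshold.
        return self.state.get("temperature_c", 0) > 700

# A twin of a hypothetical jet engine, refreshed with a new sensor reading.
engine_twin = DigitalTwin(asset_id="engine-0042")
engine_twin.ingest({"temperature_c": 714.2, "vibration_mm_s": 3.1})
print(engine_twin.state, engine_twin.needs_attention())
```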

“The myriad possibilities that arise from the ability to monitor and control things in the physical world electronically have inspired a surge of innovation and enthusiasm,” said a 2015 McKinsey report on the Internet of Things. Experts estimate that the number of connected things or devices will reach 50 billion by 2020, growing to 100s of billions in the decades ahead. The economic potential of the smart solutions this makes possible is enormous, possibly reaching several trillion dollars within a decade.

October 24, 2016

After many years of promise and hype, AI seems to be finally reaching a tipping point of market acceptance. “Artificial intelligence is suddenly everywhere… it is proliferating like mad.” So starts a Vanity Fair article published around two years ago by author and radio host Kurt Andersen. And, this past June, a panel of global experts convened by the World Economic Forum (WEF) named Artificial Intelligence - Open AI Ecosystems in particular - as one of its Top Ten Emerging Technologies for 2016 because of its potential to fundamentally change the way markets, business and governments work.

AI is now being applied to activities that not long ago were viewed as the exclusive domain of humans. “We’re now accustomed to having conversations with computers: to refill a prescription, make a cable-TV-service appointment, cancel an airline reservation - or, when driving, to silently obey the instructions of the voice from the G.P.S.,” wrote Andersen. The WEF report noted that “over the past several years, several pieces of emerging technology have linked together in ways that make it easier to build far more powerful, human-like digital assistants.”

What will life be like in such an AI-based society? What impact is it likely to have on jobs, companies and industries? How might it change our everyday lives?

These questions were addressed in Artificial Intelligence and Life in 2030, a report that was recently published by Stanford University’s One Hundred Year Study of AI (AI100). AI100 was launched in December, 2014 “to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live and play.” The core activity of AI100 is to convene a Study Panel every five years to assess the then current state of the field, review AI’s progress in the years preceding the report, and explore the potential advances that lie ahead as well as the technical and societal challenges and opportunities these advances might raise.

October 11, 2016

Why do firms exist? Ronald Coase, the eminent British economist and University of Chicago professor, addressed this question in The Nature of the Firm, a seminal paper published in 1937 which, along with other major achievements, earned him the 1991 Nobel Prize in economics.

Professor Coase explained that, in principle, a firm should be able to find the cheapest, most productive, highest quality goods and services by contracting them out in an efficient, open marketplace. However, markets are not perfectly fluid. Transaction costs are a kind of friction incurred in obtaining goods and services outside the firm, such as searching for the right supply chain partners, establishing a trusted relationship, negotiating a contract, coordinating the work, managing intellectual property and so on. Firms came into being to make it easier and less costly to get work done.

A recent IBM report, Fast forward: Rethinking enterprises, ecosystems and economies with blockchains, harks back to Coase’s paper to analyze the potential value of blockchains. The report notes that while transaction costs are lower within firms, “in recent years as enterprises have scaled, the added complexity of operations has grown exponentially while revenue growth has remained linear. The result? At a certain point, organizations are faced with diminishing returns. Blockchains have the potential to eradicate the cost of complexity and ultimately redefine the traditional boundaries of an organization.”

The book is organized into three main sections. The first explains the blockchain from two complementary points of view: as a major next step in the evolution of the Internet; and as the architecture underpinning bitcoin, the best known and most widely held digital currency. The second and longest section describes how blockchain could potentially transform financial services, companies, government, the Internet of Things, and other key areas. The last section summarizes the major challenges that must be overcome as well as the governance required to fulfill the promise of blockchain.

“It appears that once again, the technological genie has been unleashed from its bottle,” write the authors in their opening paragraph. “Summoned by an unknown person or persons with unclear motives, at an uncertain time in history, the genie is now at our service for another kick of the can - to transform the economic power grid and the old order of human affairs for the better. If we will it.”

“Horizon scanning for emerging technologies is crucial to staying abreast of developments that can radically transform our world, enabling timely expert analysis in preparation for these disruptors,” said Meyerson. “The global community needs to come together and agree on common principles if our society is to reap the benefits and hedge the risks of these technologies.”

The technologies on the list are not new. They’ve been worked on for many years. But their selection to the Top Ten List indicates that, in the opinion of the council members, each of these technologies has now reached a maturity and acceptance tipping point where its impact can be meaningfully felt.

August 02, 2016

A few weeks ago, the World Economic Forum (WEF) published its annual list of the Top Ten Emerging Technologies for 2016. The technologies on the list have been worked on for years. But their inclusion in the Top Ten List indicates that each has now reached a market acceptance tipping point where its impact can be meaningfully felt. The Blockchain is one of the technologies in this year’s list, selected by the WEF panel of global experts because of its emerging potential to fundamentally change the way markets and governments work.

What does it mean for an infrastructure technology like the blockchain to have reached such a tipping point? The WEF report compared the blockchain to the Internet, noting that “Like the Internet, the blockchain is an open, global infrastructure upon which other technologies and applications can be built. And like the Internet, it allows people to bypass traditional intermediaries in their dealings with each other, thereby lowering or even eliminating transaction costs.”

I agree with this comparison and find it useful to help us understand how blockchains might evolve over the years. So, I’d like to compare the state of the blockchain in 2016 to the state of the Internet 25 years ago or so.

A few weeks ago I had an interesting conversation on the state of service science with analysts from an IT research organization who were preparing a report on the subject for their clients. Our discussion led me to reflect on the evolution of service science over the past several years. I think that we are hearing a bit less about it these days. But is that because we’ve become tired of the subject and moved on, or because the application of science and technology to services is now so well accepted that it’s no longer a topic of debate? I very much think it’s the latter.

July 12, 2016

In September, 2014, I attended an MIT conference that explored the major progress that’s taken place in artificial intelligence, robotics and related technologies over the past several years. Autonomous vehicles were one of the main areas of discussion. With most other topics, there was considerable consensus, but not so with self-driving cars. While some thought that fully autonomous vehicles would be all around us within a decade or so, others were not quite so sure, myself included, due to the many highly complex technical, economic and societal issues that must be worked out.

I was reminded of this meeting a few weeks ago when I read that a Florida man had been killed while driving a Model S Tesla in autopilot mode. The accident is still under investigation, but it appears that the Tesla struck a tractor-trailer truck that was making a left turn in front of its path. Neither the driver nor the Tesla’s autopilot noticed that the truck was suddenly crossing its lane of traffic, perhaps because the white truck was hard to spot against a bright sky.

This accident has led to a renewed discussion of the current state-of-the-art of vehicle automation, the approaches being pursued by different companies, and the prospects for the near- and long-term future.

July 04, 2016

Urbanization is one of the major forces shaping the 21st century - right up there with the digital technology revolution and globalization. Over half of the world’s population lives in urban areas, and as the UN’s 2014 World Urbanization Prospects report noted: “The continuing urbanization and overall growth of the world’s population is projected to add 2.5 billion people to the urban population by 2050,” with the proportion of the population living in urban areas increasing to 66 percent by 2050.

Just about every study that’s benchmarked the competitiveness of major urban areas ranks London and New York as the world’s two top cities. But despite - or perhaps because of - their leadership positions, both cities face major challenges as they deal with economic growth and a growing population.

London has been much in the news lately. First came the election of Sadiq Khan last May - the first Muslim mayor not only of London but of any major Western capital - followed by the recent Brexit referendum, in which London voted to Remain in the EU by an overwhelming 60% of its vote while the UK as a whole voted 52% to Leave.

At this point, it’s very hard to predict where Brexit is heading over the next few months, let alone what its long-term consequences might be. But, given its many strengths, I have little doubt that London will remain among the world’s very top cities. So, let me put Brexit aside for the moment and discuss instead what London has been doing to address its major population and economic challenges.

Van Alstyne started out his presentation by noting that back in 2007, seven firms controlled 99% of handset profits: Nokia, Samsung, Ericsson, Motorola, LG, RIM and HTC. That same year, Apple’s iPhone was born and began gobbling up market share. By 2015, only one of the former incumbents had any profit at all, while Apple now generated 92% of the industry’s global profits.

What happened? “Is it likely all 7 incumbents had failed strategies, run by clueless management, lacking execution capabilities?” he asked. “Or was something more fundamental happening?… Nokia and the others had classic strategic advantages that should have protected them: strong product differentiation, trusted brands, leading operating systems, excellent logistics, protective regulation, huge R&D budgets, and massive scale. For the most part, those firms looked stable, profitable, and well entrenched.” How can we explain their rapid free fall?

We all know the answer to Van Alstyne’s rhetorical questions. “Apple (along with Google’s competing Android system) overran the incumbents by exploiting the power of platforms and leveraging the new rules of strategy they give rise to. Platform businesses bring together producers and consumers in high-value exchanges. Their chief assets are information and interactions, which together are also the source of value they create and their competitive advantage.”

Davenport started his talk by noting that over the past two centuries we’ve seen three distinct stages of automation, based on the kinds of jobs that were replaced by machines. The machines of the first automation era “relieved humans of work that was manually exhausting,” making up for our physical limitations: steam engines and electric motors enhanced our physical power, while railroads, cars and airplanes helped us go faster and farther.

Next came the automation of jobs involving routine tasks that could be well described by a set of rules and were thus prime candidates for IT substitution. “Era Two automation doesn’t only affect office workers. It washes across the entire services-based economy that arose after massive productivity gains wiped out jobs in agriculture, then manufacturing.” It threatened many transactional service jobs that “are so routinized that they are simple to translate into code,” from bank tellers to airline reservations clerks.

We’ve now entered the third era of automation. Our increasingly smart machines are “now breathing down our necks… This time the potential victims are not tellers and tollbooth collectors, much less farmers and factory workers, but rather all those knowledge workers who assumed they were immune from job displacement by machines,…” including, as Davenport and Kirby poignantly remind us, “People like the writers and readers of this book.”

April 18, 2016

Over the past several decades, information technologies (IT) have been fundamentally transforming companies, industries and the economy in general. In its early years - the ’60s, ’70s and ’80s - companies deployed IT primarily to automate their existing processes, leaving the underlying structure of the business in place. It wasn’t until the 1990s - with the pioneering work of Michael Hammer and others on business process reengineering - that companies realized that just automating existing processes wasn’t enough. Rather, to achieve the promise of IT, it was necessary to fundamentally redesign their operations, examine closely the flow of work across the organization, and eliminate legacy processes that no longer added value to the business.

Organizational transformation was then taken beyond the boundaries of the company with the explosive growth of the Internet. The Internet made it significantly easier to obtain goods and services outside the firm, enabling companies to rely on business partners for many of the functions once done in-house. To compete effectively in an increasingly interconnected global economy, companies now had to optimize not only the flow of work within their own organizations but across their supply chain ecosystems. Over the past 20 years, such ecosystem-wide transformations have been disrupting the business models of industry after industry, from retail and manufacturing to media and entertainment.

The banking industry has long been one of the major users of IT - among the first to automate its back-end and front-office processes and to later embrace the Internet and smartphones. However, banking has been relatively less disrupted by digital transformations than other industries. In particular, change has come rather slowly to the world’s banking infrastructure.

April 05, 2016

The March 12 issue of The Economist includes a special report on the future of computing after the very impressive 50-year run of Moore’s Law.

In his now legendary 1965 paper, Intel co-founder Gordon Moore first made the empirical observation that the number of components in integrated circuits had doubled every year since their invention in 1958, and predicted that the trend would continue for at least ten years, a prediction he subsequently changed to a doubling every two years. The semi-log graphs associated with Moore’s Law have since become a visual metaphor for the technology revolution unleashed by the exponential improvements of just about all digital components, from processing speeds and storage capacity to networking bandwidth and pixels.

The 4004, Intel’s first commercial microprocessor, was launched in November, 1971. The 4-bit chip contained 2,300 transistors. The Intel Skylake, launched in August, 2015, contains 1.75 billion transistors, which collectively deliver about 400,000 times more computing power than the 4004. Moore’s Law has had quite a run, but like all good things, especially those based on exponential improvements, it must eventually slow down and flatten out.
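
As a rough back-of-the-envelope check on these figures, one can compute the doubling period implied by the two transistor counts; note that the roughly 400,000x figure above refers to computing power, which is a different metric from the raw transistor count. A short sketch:

```python
import math

t_4004, year_4004 = 2_300, 1971                 # Intel 4004 transistor count
t_skylake, year_skylake = 1_750_000_000, 2015   # Intel Skylake transistor count

ratio = t_skylake / t_4004                      # ~760,000x more transistors
doublings = math.log2(ratio)                    # ~19.5 doublings
period = (year_skylake - year_4004) / doublings # implied doubling period

print(f"transistor ratio: {ratio:,.0f}x")
print(f"implied doubling period: {period:.1f} years")   # roughly 2.3 years
```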

In its overview article, The Economist reminds us that Moore’s Law was never meant to be a physical law like Newton’s Laws of Motion, but rather “a self-fulfilling prophecy - a triumph of central planning by which the technology industry co-ordinated and synchronised its actions.” It also reminds us that its demise has been long anticipated: for a while now, the number of people predicting the death of Moore’s Law has also been doubling every two years.

March 22, 2016

How can we best anticipate the future of a complex, fast changing industry like IT? Which hot technology innovations - e.g., artificial intelligence, the blockchain, cloud computing - will end up having a big impact, and which are destined to fizzle? What can we learn from the IT industry’s 60-year history that might help us better prepare for whatever lies ahead?

A major way of anticipating the future of any economic or social entity, - be it a company, industry, university, government agency or city, - is to explore and learn from its history.While there’s no guarantee that historical patterns will continue to apply going forward, they might well be our most important guides as we peer into an otherwise unpredictable future.

I’ve been involved with computers since the early 1960s, first as a student at the University of Chicago, then in my long career at IBM, and subsequently through my relationship with a number of companies and universities.I’ve thus had a ringside seat from which to observe the journey the IT industry’s been on since those early days.

Let me share some of my personal impressions of this journey through the lens of three key areas, each of which has played a major role throughout IT’s history, and will continue to do so well into its future: data centers, transaction processing, and data analysis.

January 26, 2016

Having lived through IBM’s near-death experience in the early 1990s, respect for the forces of the marketplace is etched deep in my psyche. It’s frankly sobering how many once powerful IT companies are no longer around or are shadows of their former selves. The carnage might be more pronounced in the fast-changing IT industry, but no industry is immune. It would seem as if Darwinian principles apply in business almost as much as they do in biology.

A few weeks ago I discussed a paper published last year, The Mortality of Companies, by physicist Geoffrey West and his collaborators at the Santa Fe Institute. Based on their extensive analysis of data about publicly traded US companies, they were surprised to discover that a typical firm lasts about ten years before it gets merged, acquired or liquidated, and that a firm’s mortality rate is independent of its age, how well established it is or what it does. While beyond the scope of their study, the authors speculated that biological ecosystems are likely to yield valuable insights into their findings.

“Some business thinkers have argued that companies are like biological species and have tried to extract business lessons from biology, with uneven success,” note the authors. “We stress that companies are identical to biological species in an important respect: Both are what’s known as complex adaptive systems. Therefore, the principles that confer robustness in these systems, whether natural or manmade, are directly applicable to business.”

January 19, 2016

Digital technologies are all around us - increasingly ubiquitous and commoditized. But are they a major source of competitive differentiation? Are they still of strategic value to business? Can digital innovation drive long term economic growth?

Several weeks ago, the McKinsey Global Institute (MGI) published a report addressing these questions. Digital America: A tale of the haves and have-mores aims to quantify the state of digitization of the US economy. The report introduces the MGI Industry Digitization Index, a methodology for exploring the various ways US companies are going about their digital journey, based on 27 indicators that measure how they’re building digital assets, expanding digital usage, and creating a digital workforce.
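
The report doesn’t spell out its formula, but the general mechanics of a composite index like this can be sketched in a few lines: normalize each indicator against a benchmark, then combine the normalized scores. The category names, values and equal weighting below are invented for illustration and are not MGI’s actual methodology.

```python
def digitization_index(indicators, benchmarks):
    """Toy composite index: normalize each indicator to [0, 1] against a
    best-in-class benchmark, then average with equal weights, scaled to 100."""
    scores = [min(indicators[name] / benchmarks[name], 1.0) for name in indicators]
    return 100 * sum(scores) / len(scores)

# Hypothetical values for one sector, grouped into three broad categories.
sector = {"digital_assets": 42.0, "digital_usage": 55.0, "digital_workforce": 18.0}
best_in_class = {"digital_assets": 80.0, "digital_usage": 90.0, "digital_workforce": 60.0}

print(f"index score: {digitization_index(sector, best_in_class):.1f} / 100")
```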

“Digital innovation, adoption, and usage are evolving at a supercharged pace across the US economy,” notes the report in its opening paragraph. “As successive waves of innovation expand the definition of what is possible, the most sophisticated users have pulled far ahead of everyone else in the race to keep up with technology and devise the most effective business uses for it.”

January 12, 2016

Transformational innovations don’t always play out as originally envisioned. Once in the marketplace, they seem to acquire a life of their own. Lest we forget, the Internet started out as a DARPA sponsored project aimed at developing a robust, fault-tolerant computer network. ARPANET was launched in 1969, and by the mid-1980s it had grown and evolved into NSFNET, a network widely used in the academic and research communities. And the World Wide Web was first developed by Tim Berners-Lee at CERN in the late 1980s to facilitate the sharing of information among researchers around the world. They’ve both gone on to change the world, to say the least.

The blockchain first came to light around 2008 as the architecture underpinning bitcoin, the best known and most widely held digital currency. But, as with the Internet, the Web and other major technologies, the blockchain has now transcended its original objective. It has the potential to revolutionize the finance industry and transform many aspects of the digital economy.
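
To make the underlying mechanism a little more tangible, here is a minimal Python sketch of the hash-chaining at the heart of a blockchain: each block commits to the hash of its predecessor, so altering an earlier record invalidates every block that follows. It deliberately omits everything else a real system like bitcoin adds, such as proof-of-work mining, digital signatures and peer-to-peer consensus.

```python
import hashlib, json, time

def make_block(transactions, prev_hash):
    """Build a block that commits to its transactions and to the previous block's hash."""
    block = {"timestamp": time.time(), "transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block(["genesis"], prev_hash="0" * 64)
block_1 = make_block(["alice pays bob 5"], prev_hash=genesis["hash"])
block_2 = make_block(["bob pays carol 2"], prev_hash=block_1["hash"])

# Tampering with an earlier block changes its hash and breaks every later link.
print(block_2["prev_hash"] == block_1["hash"])   # True for the intact chain
```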

Two press announcements released in mid-December are serious milestones in its evolution. Let me explain.

About 20 years ago, West got interested in whether some of the techniques and principles from the world of physics could be applied to complex biological and social systems. In particular, he wondered if we could apply empirical, quantifiable and predictive scientific methods to help us better understand complex biological organisms and social organizations like cities and companies.

In the 1990s, his attention first turned to biology. There are enormous variations in the characteristics of living creatures - their life spans, pulse rates, metabolism, and so on. How do these characteristics change with body size? Why do human beings live roughly 80 to 100 years, while mice live only two to three years? Are there some common principles that apply to all living creatures regardless of size? Can we find empirical mathematical models that might allow scientists to ask big questions about life, aging and death?

September 29, 2015

The September issue of the Harvard Business Review features a spotlight on The Evolution of Design Thinking. With four articles on the subject, HBR’s overriding message is that design is no longer just for physical products, being increasingly applied to customer experiences, innovation, business strategy, and complex problem solving.

Last week I discussed this expanded view of design thinking based on one of the articles, - Design Thinking Comes of Age. I now want to turn my attention to a second article in the HBR issue, Design for Action, which applies design thinking not to the actual artifact being designed, - whether product, service, business strategy, or complex system, - but to a very different kind of problem: the introduction of the designed artifact into the world.

When first introduced, disruptive innovations are likely to encounter stiff resistance, both within one’s own organization and in the marketplace, - otherwise we wouldn’t call them disruptive. The article argues that we should apply design thinking to the launch of the disruptive innovation itself - a process they call intervention design.

September 22, 2015

Design Thinking is the featured topic in the September issue of the Harvard Business Review, with four articles on the subject. “It’s no longer just for products. Executives are using this approach to devise strategy and manage change,” reads the tagline on its cover.

It’s not surprising that many companies don’t understand what is meant by design thinking, let alone its potential value to their business. It’s much easier to appreciate the role of design when it comes to physical objects: cars, bridges, buildings, dresses, shoes, jewelry, smartphones, laptops, and so on. But, it’s considerably harder to appreciate its importance when it comes to more abstract entities like systems, services, information and organizations. Their very nature is vague. You can’t touch them. Yet, they account for the bulk of the growing complexity in our daily lives.

The report introduces the Digital Evolution Index, which was created to quantify the unique digital journey being pursued by each of 50 advanced and developing countries, to measure the rate of change of their digital evolution, and to provide information-based insights to companies, investors and governments. The Index is based on four key underlying drivers:

In The Next Wave, Markoff talks about a wide variety of topics, but I’d like to focus my attention on what I think is the main thread throughout the report, the state of Moore’s Law, - the observation that the number of transistors in an integrated circuit has been doubling approximately every two years.

He believes that Silicon Valley, - where he grew up and has long lived, - has been fundamentally about Moore’s Law. And Moore’s Law has played a central role in his career since becoming a technology reporter in 1977. But then, “I suddenly discovered it was over.”

As a result of Moore’s Law, the costs of computing have been falling at an exponentially accelerating rate for almost five decades. But for the past two years, we seem to be at a plateau as the price per transistor has stopped falling. “I see evidence of that slowdown everywhere. The belief system of Silicon Valley doesn’t take that into account.”

July 28, 2015

I like to tell people that the key to being and/or sounding smart is to hang out with smart people. And one of the names that would quickly come to mind if I were asked to recommend whom to hang out with is John Seely Brown, aka JSB. JSB, who’s been a friend for over 25 years, was chief scientist at Xerox and director of its Palo Alto Research Center (PARC) until June of 2000. He is now the independent co-chairman of the Deloitte Center for the Edge and a visiting scholar at USC. But, as noted in his personal website, his chief occupation is Chief of Confusion, “helping people ask the right questions, trying to make a difference through my work - speaking, writing, teaching.”

Why is it so important to foster a more entrepreneurial style of learning? Our educational system has long been based on the assumption that learning is mostly about acquiring knowledge as well as critical skills to enable us to apply whatever we learned in school throughout our careers, say 30 - 40 years.

July 14, 2015

Like many technological advances, the Internet of Things (IoT) has been long in coming. Ubiquitous or Pervasive Computing dates back to the 1990s, when neither the necessary low-cost devices nor ubiquitous wireless networking were anywhere near ready. But IoT has now reached an inflection point, with over 10 billion interconnected smart devices already out there, a number that’s expected to rapidly expand to 10s of billions by 2025 and to 100s of billions in the decades ahead.

“The myriad possibilities that arise from the ability to monitor and control things in the physical world electronically have inspired a surge of innovation and enthusiasm,” writes McKinsey in a June, 2015 report - The Internet of Things: Mapping the Value Beyond the Hype. The report analyzes the long term economic potential of IoT, and uses this analysis to estimate the economic value of several IoT-based solutions by 2025. Given the breadth and complex nature of IoT-based solutions, quantifying their potential value is very difficult indeed, especially since such smart solutions, - e.g., smart cities, smart homes, smart healthcare, - are still in the very early stages of development.

The McKinsey study introduces an innovative approach for estimating IoT’s future impact and economic value. In addition, it identifies the key enablers required to realize this value and generates a broad set of findings on the likely evolution of IoT over time. Details of the study can be found in the comprehensive 144-page report, whose key points I will summarize below.

Settings-based Economic Value

The study is based on a careful analysis of over 150 concrete IoT applications across the global economy. It applied a bottom-up modeling methodology to estimate the potential economic value of these various applications, including productivity improvements, time savings, better asset utilization and reduced accidents.

After reviewing the results of their analysis, the study concluded that looking at IoT through the lens of individual applications and industries was inadequate. Instead, IoT should be viewed through the lens of so-called settings - the physical environments in which these various systems and applications are deployed - which better capture the overall value created for all parties in each setting, e.g., companies, government, consumers and workers. Nine such distinct settings were defined, along with the major applications that apply in each.
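
A bottom-up, settings-based estimate of this kind boils down to summing the value of individual applications and then regrouping the totals by the setting in which they are deployed. A toy Python sketch of that regrouping follows; the settings, applications and dollar figures are invented purely for illustration and are not the study’s numbers.

```python
from collections import defaultdict

# Hypothetical (setting, application, annual value in $ billions) estimates.
applications = [
    ("factories", "predictive maintenance", 1200),
    ("factories", "inventory optimization", 600),
    ("cities", "adaptive traffic control", 900),
    ("cities", "public safety monitoring", 300),
    ("human", "health and fitness monitoring", 500),
]

# Aggregate application-level value up to the setting level.
value_by_setting = defaultdict(float)
for setting, _app, value in applications:
    value_by_setting[setting] += value

total = sum(value_by_setting.values())
for setting, value in sorted(value_by_setting.items(), key=lambda kv: -kv[1]):
    print(f"{setting:10s} ${value:5,.0f}B  ({value / total:.0%} of total)")
```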

People have long worried about the impact of technology on society, whether discussing railroads, electricity, and cars in the Industrial Age, or the Internet, mobile devices and smart connected products now permeating just about all aspects of our lives. But the concerns surrounding AI may well be in a class by themselves. Like no other technology, AI forces us to explore the very boundaries between machines and humans.

Some experts fear that at some future time, sentient, superintelligent AI machines might pose an “existential risk” that “could spell the end of the human race.” Others are dismissive of such dire concerns while agreeing that we must work hard to ensure that our complex AI systems do what we want them to do.

Whether right or wrong, these long term worries are still decades into the future. Much more immediate is the impact of our smart machines on jobs and the economy. Will AI turn out like other major innovations, - e.g., steam power, electricity, cars, - highly disruptive in the near term, but ultimately beneficial to society? Or, will our smart machines take over not just low-skilled tasks but high-skilled ones too? What will life be like in such an AI future, when highly intelligent machines, - many far surpassing human cognitive capabilities, - are all around us?

Money was one of the themes in the exhibit, represented by four different objects: one of the world’s first gold coins produced over 2500 years ago; a 1375 banknote from the Ming Dynasty; a silver coin minted in Bolivia in the late 16th Century; and a plastic credit card exemplifying the changing role of money in the modern world. Why include money in such an exhibit? Because, as one of the podcasts noted, money, - along with sex and war, - has been one of the great constants in human affairs.

Transaction records actually pre-dated the advent of money. By analyzing the earliest recorded transactions, researchers believe that writing evolved in ancient Mesopotamia thousands of years ago, as an innovation to keep track of financial records. Money was later invented as a store of value and a medium of exchange to make commerce more efficient. For a long time, money was embodied in precious metals like gold and silver, but with the introduction of banknotes, money started to decouple from physical objects with intrinsic value.

“Today… we remain more or less content with paper money - not to mention coins that are literally made from junk,” wrote Harvard historian Niall Ferguson in The Ascent of Money: a Financial History of the World. “[W]e are happy with money we cannot even see. Today’s electronic money can be moved from our employer, to our bank account, to our favorite retail outlets without ever physically materializing. It is this virtual money that now dominates what economists call the money supply… The intangible character of most money is perhaps the best evidence of its true nature.”

Another bestseller, The Second Machine Age, published last year by Brynjolfsson and McAfee, is now helping to explain our new era of data science, AI and advanced automation. These second age machines are being increasingly applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans, while enabling us to process vast amounts of information and tackle ever more complex problems.

These advanced machines are ushering in a new kind of digital capital economy, quite different from the flat-world economy of only a decade ago. The winners are no longer those able to compete solely based on cheap labor or ordinary capital, both of which are being squeezed by automation. “Fortune will instead favor a third group: those who can innovate and create new products, services, and business models,” says the article. “So in the future, ideas will be the real scarce inputs in the world - scarcer than both labor and capital - and the few who provide good ideas will reap huge rewards. Assuring an acceptable standard of living for the rest and building inclusive economies and societies will become increasingly important challenges in the years to come.”

April 07, 2015

Software-intensive systems are generally quite flexible, - able to evolve and adapt to changing product and market requirements. However, their very flexibility makes it difficult to adequately anticipate and test all the interactions between the various components of the system. Even if all the components are highly reliable, problems can still occur if a rare set of interactions arise that compromise the overall behavior and safety of the system.

According to Carlson and Doyle, you can find very simple biological organisms in nature, and you can design very simple objects. The key ingredient you give up is not their basic functionality, but their robustness - that is, the ability to survive, for biological organisms, or to perform well, for engineered objects - under lots of different conditions, including the failures of individual components or dealing with new, unanticipated events. Robustness implies the ability to adapt and keep going in spite of a rapidly changing environment.

There is a continuing struggle between complexity and robustness in both evolution and human design. A kind of survival imperative, whether in biology or engineering, requires that simple, fragile systems become more robust. But the mechanisms to increase robustness will in turn make the system considerably more complex. Furthermore, that additional complexity brings with it its own unanticipated failure modes, which are corrected over time with additional robust mechanisms, which then further add to the complexity of the system, and on and on and on. This balancing act between complexity and robustness is never done.

This is also the case in the automobile industry, one of the largest in the world. “Not since the first automotive revolution has there been such stunning innovation in the industry,” notes an excellent recent KPMG report, - Me, my car, my life… in the ultraconnected age. “Autonomous vehicles are only part of the story. The convergence of consumer and automotive technologies and the rise of mobility services are transforming the automotive industry and the way we live our lives.”

Self-driving cars have commanded our attention in the last few years. The advent of self-driving vehicles like the Google car is not only a truly dramatic milestone in AI, but concrete evidence that digital technologies are having a huge impact on the future of the automobile. Opinions vary as to the commercial prospects for such truly autonomous vehicles. Some feel that they will be all around us within a decade, navigating our present roads right alongside human-driven cars. Others are not quite so sure due to the highly complex technical and societal issues that remain to be worked out. Time will tell.

In any event, car companies are working hard to try to keep pace with the speed of innovation in consumer technologies, a huge challenge to the auto industry, notes the KPMG report: “Melding the two worlds - consumer electronics, with its rapid new product launch cadences and willingness to accept iterative software releases, and automotive engineering, with its mass customization, millions of product configurations, and critical safety, durability, and reliability requirements - is not an easy prospect.”

March 17, 2015

On February 24 I attended a workshop at MIT on the Future of Health Analytics. The event was sponsored by MIT Connection Science, a recently organized research initiative aimed at leveraging data science to quantify and analyze human behaviors, and to apply the new insights thus obtained in key societal applications, including healthcare, transportation and finance. Connection Science, with which I’m affiliated as a Fellow, was founded by Media Lab professor Alex “Sandy” Pentland. He’s the author of several books, including the recently published Social Physics: How Good Ideas Spread.

March 03, 2015

People have long argued about the future impact of technology. But, as AI is now seemingly everywhere, the concerns surrounding its long term impact may well be in a class by themselves. Like no other technology, AI forces us to explore the boundaries between machines and humans. What will life be like in such an AI future?

Not surprisingly, considerable speculation surrounds this question. At one end we find books and articles exploring AI’s impact on jobs and the economy. Will AI turn out like other major innovations, e.g., steam power, electricity, cars, - highly disruptive in the near term, but ultimately beneficial to society? Or, as our smart machines are being increasingly applied to cognitive activities, will we see more radical economic and societal transformations? We don’t really know.

These concerns are not new. In a 1930 essay, for example, English economist John Maynard Keynes warned about the coming technological unemployment, a new societal disease whereby automation would outrun our ability to create new jobs.

February 17, 2015

On January 27, Citi and Imperial College held their third annual Digital Money Symposium in London. As in previous years, the Symposium convened a group of leaders in their field to explore the state of adoption of digital money and its economic and societal impacts around the world.

February 10, 2015

“Any sufficiently advanced technology is indistinguishable from magic,” is one of the most memorable quotes from Arthur C. Clarke, author of 2001: A Space Odyssey. I still remember a Monday in the summer of 1996, when around 4 am I was in my Tokyo hotel room doing e-mail on my laptop while listening over the Internet to a live baseball game being played in New York, where it was Sunday afternoon. Today this would be no big deal, but at the time it felt like one of those magical moments Clarke had in mind, perhaps the moment when I truly understood the transformative power of the rapidly growing Internet.

Artificial Intelligence may now be going through such an Internet moment. “Artificial intelligence is suddenly everywhere. It’s still what the experts call soft A.I., but it is proliferating like mad.” So starts an excellent Vanity Fair article I recently wrote about. “Everything that we formerly electrified we will now cognitize,” observed another great article. “Experts envision automation and intelligent digital agents permeating vast areas of our work and digital lives by 2025, but they are divided on whether these advances will displace more jobs than they create,” was the overriding finding of a report published by the Pew Research Center this past August.

February 03, 2015

In mid-January, the Transportation Research Board (TRB) of the National Academies held its 94th Annual Meeting in Washington DC. The conference attracted around 12,000 transportation researchers and practitioners from around the world. During the meeting, the TRB Executive Committee held a policy session on the impact of big data on transportation systems. I was one of three panelists invited to the session. Each of us made a short presentation which was then followed by an extensive discussion between the panel and the Committee.

Big data and related information-based disciplines, - e.g., data science, artificial intelligence, - are everywhere. Why are we so excited about them? Is it mostly hype, or is there something truly profound going on? In my presentation, I tried to briefly address these questions based on three key observations.

January 27, 2015

Few subjects are as important - or as challenging to predict - as the future of jobs in our emerging digital economy. While the US unemployment rate continues to improve, finishing 2014 at 5.6% or 8.7 million people, almost one third of the unemployed have been jobless for over 27 weeks. The total number unemployed or underemployed - what economists call the U6 unemployment rate - stands at 11.2% or 17.4 million people. And, the employment to population ratio remains at under 60%, the lowest such percentage since the 1970s. “The economic challenge of the future will not be producing enough. It will be providing enough good jobs,” wrote Harvard professor and former Treasury Secretary Larry Summers in a recent WSJ article.

Technological revolutions are highly disruptive to economies and societies. This was the case for much of the Industrial Revolution, as is the case today. “The digital revolution has yet to fulfil its promise of higher productivity and better jobs,” said The Economist in a special report on Technology and the World Economy in its October 4th issue. “The modern digital revolution - with its hallmarks of computer power, connectivity and data ubiquity - has brought iPhones and the internet.” But, “it is disrupting and dividing the world of work on a scale not seen for more than a century. Vast wealth is being created without many workers; and for all but an elite few, work no longer guarantees a rising income.”

Middle class jobs have been in decline for the past few decades in the US and other advanced economies, particularly since 2000. And the livelihood of many additional workers is potentially threatened, as our increasingly smart machines continue to be applied to activities requiring cognitive capabilities that not long ago were viewed as the exclusive domain of humans. How will these relentless advances in technology and automation affect the balance between humans and machines in the workplace, and the skill composition of future jobs?

This critical question was addressed in a recent paper, Racing With and Against the Machine: Changes in Occupational Skill Composition in an Era of Rapid Technological Advance, by MIT’s Frank MacCrory, George Westerman and Erik Brynjolfsson, along with Yousef Alhammadi from the Masdar Institute in Abu Dhabi. The paper analyzed the changes between 2006 and 2014 in the skill composition of 674 occupations using the US Government’s O*NET database, the most comprehensive data set of occupational skill requirements. The period from 2006 to 2014 saw the advent of several major digital innovations, including smart mobile devices, social media, big data and analytics, cloud computing and the Internet of Things.
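
At its core, an analysis like this compares, occupation by occupation, how the rated importance of each skill changed between two snapshots of the O*NET data. Here is a minimal sketch of that comparison for a single occupation; the skill names and ratings are made up for illustration and are not the paper’s data.

```python
# Hypothetical O*NET-style importance ratings (1-5 scale) for one occupation.
skills_2006 = {"routine manual work": 3.8, "mathematics": 3.0, "social perceptiveness": 2.9}
skills_2014 = {"routine manual work": 3.1, "mathematics": 3.2, "social perceptiveness": 3.6}

# Change in each skill's rated importance over the 2006-2014 period.
for skill in skills_2006:
    delta = skills_2014[skill] - skills_2006[skill]
    print(f"{skill:22s} {delta:+.1f}")
```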

January 21, 2015

Big data, powerful analytics and AI are everywhere. After years of promise and hype, technology is now being applied to activities that not long ago were viewed as the exclusive domain of humans. Our digital revolution has led to amazing applications, but also to considerable pain for many workers who’ve been experiencing declining employment and wages. Mid-skill jobs have been particularly threatened. Many of these jobs, which include blue-collar production activities as well as information-based white-collar ones, are based on well understood procedures that can be described by a set of rules that machines can then follow.

But, what will be the impact of our increasingly intelligent machines on senior management positions? In principle, such jobs deal with non-routine, cognitive tasks requiring high human skills, including expert problem solving, complex decision-making and sophisticated communications for which there are no rule-based solutions. “As artificial intelligence takes hold, what will it take to be an effective executive?” asks a recent McKinsey article - Manager and Machine: The new leadership equation. “What would it take for algorithms to take over the C-suite? And what will be senior leaders’ most important contributions if they do?”

After asking these questions to senior managers across a broad range of industries, McKinsey concluded that two key things need to happen for technology to more deeply transform their jobs. First, much still needs to be done to create the proper data sets that would enable intelligent computers to assist in decision-making. Garbage in, garbage out applies as much to data analysis today as it has to computing in general since its early years. Organizations must have a data-analytics strategy that cuts across internal informational silos and properly incorporates external information sources like social media.

And most important, senior managers must learn to let go, something which is quite difficult because it runs counter to decades of organizational practices. Given our rapidly rising oceans of data, the command-and-control approach to management, where information flows up the organization and decisions are made at high levels, would sink the senior executive teams. As data science and AI permeate the organization, it’s important to delegate more autonomy to the business units that hopefully have the proper skills, the advanced tools and the necessary information to make better decisions on their own.

January 13, 2015

“Artificial intelligence is suddenly everywhere. It’s still what the experts call soft A.I., but it is proliferating like mad.” So starts an excellent Vanity Fair article, - Enthusiasts and Skeptics Debate Artificial Intelligence, - by author and radio host Kurt Andersen. Artificial intelligence is indeed everywhere, but these days, the term is used in so many different ways that it’s almost like saying that computers are now everywhere. It’s true, but so general a statement that we must probe a bit deeper to understand its implications, - starting with what is meant by soft AI, versus its counterpart, strong AI.

Soft, weak or narrow AI is inspired by, but doesn’t aim to mimic, the human brain. These are generally statistically oriented, computational intelligence methods for addressing complex problems based on the analysis of vast amounts of information using powerful computers and sophisticated algorithms, whose results exhibit qualities we tend to associate with human intelligence.

Soft AI was behind Deep Blue, IBM’s chess playing supercomputer which in 1997 won a celebrated chess match against then reigning champion Garry Kasparov, as well as Watson, IBM’s question-answering system which in 2011 won the Jeopardy! Challenge against the two best human Jeopardy! players. And, as Andersen notes in his article, it’s why “We’re now accustomed to having conversations with computers: to refill a prescription, make a cable-TV-service appointment, cancel an airline reservation - or, when driving, to silently obey the instructions of the voice from the G.P.S.”

The Lab conducts research projects in a number of areas at the intersection of media and technology. From time to time it hosts what it calls Think & Do events that bring together students, faculty members, research staff, industry executives, artists, entrepreneurs and policy makers to collaboratively explore new research ideas. The topic for this particular workshop was Leveraging Engagement. It was aimed at exploring new frameworks for fan engagement in a variety of media, including entertainment, music and sports.

Like few others, the media industries have been severely disrupted by the digital revolution and the forces of creative destruction. Everything seems to be changing at once, from the way content is produced and delivered, to the sources of revenue and profits. One of the major changes is the relationship between the creators and distributors of media content and their audience, especially their most committed audience members or fans.

What do we mean by fan? The briefing book prepared for the workshop draws a distinction between an audience of relatively passive listeners/spectators and fans. Fans are “enthusiastic followers or admirers… have a passion, are emotionally connected to the object of their passion, and experience their passion through their own subjective lens.”

The Internet, smartphones and related technologies are enabling fans to play a more central and active role in the evolving media ecosystem. They are active participants in social networks. They are critics, co-creators, and brand influencers. They are also potential consumers of all kinds of goods and services related to their passion.

November 26, 2014

For the past few years, some have justifiably questioned whether innovation has been going through a period of stagnation, especially when compared to major 19th and 20th century innovations like electricity, cars and airplanes. Have we pretty much stopped solving big problems? “We wanted flying cars - instead we got 140 characters,” is how PayPal cofounder Peter Thiel succinctly expressed this sentiment in a 2011 New Yorker article.

Others argue that while the nature of innovation is definitely changing as we evolve and adapt to an information-based digital economy, its impact is no less transformative. Last week, for example, I discussed a recent article in Wired by Kevin Kelly on the future of AI. Kelly fully expects that AI will transform the global economy and civilization in general, much as electricity did more than a century ago. “Everything that we formerly electrified we will now cognitize,” he said. “There is almost nothing we can think of that cannot be made new, different, or interesting by infusing it with some extra IQ… This is a big deal, and now it’s here.” I totally agree.

What should we expect from this new generation of AI machines and applications? Are they basically the next generation of sophisticated tools enhancing our human capabilities, as was previously the case with electricity, cars, airplanes, computers and the Internet? Or are they radically different from our previous tools because they embody something as fundamentally human as intelligence?

According to Kelly, the AI future that’s coming into view is nothing like the “potentially homicidal” HAL 9000 from 2001: A Space Odyssey or “a Singularitan rapture of superintelligence.” The AI he foresees is more like a kind of “cheap, reliable, industrial-grade digital smartness running behind everything, and almost invisible except when it blinks off. This common utility will serve you as much IQ as you want but no more than you need. Like all utilities, AI will be supremely boring, even as it transforms the Internet, the global economy, and civilization.”

October 29, 2014

A few weeks ago I attended MIT’s Second Machine Age Conference, where I heard a number of very interesting presentations on the evolution of AI, robotics, and other advanced technologies. The prospects for truly autonomous vehicles was one of the main topics of discussion. With most other topics, there was considerable audience consensus, but not so with self-driving cars. While many thought that fully autonomous vehicles will be all around us within a decade, others, myself included, were not quite so sure due to the many technical and societal issues involved.

What do we really mean by self-driving cars? There seems to be no precise definition. Are we talking about a human driver assisted by all kinds of advanced technologies, or is there no driver whatsoever? Will such vehicles operate amidst regular human-driven ones, or will they be confined to special lanes equipped with sophisticated navigational technologies? And, is self-driving per se the actual objective, or is it a metaphor for the development of near-crashless cars regardless of whether human drivers are still in the picture?

These questions are not surprising given the very early stages of such a complex area. When exciting new initiatives are first launched, we sometimes describe them using an attention-grabbing phrase that, while potentially unattainable in practice, should be taken more as a marketing pointer to a general direction rather than as a realistic near-term objective.

These devices generate massive amounts of data, a lot of which requires real-time actions. It’s impractical to move all that data to a central cloud for analysis and actions. Computing and intelligence thus have to move closer to the edge, to both ameliorate the data transport challenges and enable real-time actions as required. This all leads to what has come to be known as edge computing, - an architectural bridge between the clouds at the center and the IoT devices all around the edges of the Internet. Cisco calls it fog computing, an allusion to what happens to clouds as they get closer to the ground.

Cisco defines fog computing as “a highly virtualized platform that provides compute, storage, and networking services between end devices and traditional Cloud Computing Data Centers, typically, but not exclusively located at the edge of network.” Fog complements cloud with a number of important capabilities, including edge location awareness, low latency in highly constrained connectivity environments, bandwidth and energy optimizations, and nearly unlimited scale.

Enescu notes that cloud and fog represent two very different IoT paradigms for dealing with huge amounts of data. In cloud, the data generated by the smart sensors at the edges is transferred and stored in the center, where it’s analyzed, and the appropriate actions then flow back across the network. With fog on the other hand, the analysis is done in real time at or near the edge devices, from which actions then flow across the network, with only the data to be stored being transferred to the central cloud.
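
The difference between the two paradigms can be sketched as two data paths: in the cloud model every reading crosses the network before anything happens, while in the fog model the decision is made at or near the device and only the data worth keeping travels to the center. A toy illustration follows; the readings and threshold are invented, and real fog deployments are of course far more elaborate.

```python
def cloud_path(readings, threshold=90):
    """Cloud model: ship every reading to the center, analyze there, send actions back."""
    sent_to_center = list(readings)                          # all data crosses the network
    actions = [r for r in sent_to_center if r > threshold]   # analysis happens centrally
    return len(sent_to_center), len(actions)

def fog_path(readings, threshold=90):
    """Fog model: analyze at the edge, act in real time, forward only what must be stored."""
    actions = [r for r in readings if r > threshold]         # real-time action at the edge
    sent_to_center = actions                                 # only noteworthy data moves
    return len(sent_to_center), len(actions)

temperature_readings = [72, 75, 95, 74, 91]
print("cloud:", cloud_path(temperature_readings))   # (5, 2): everything crosses the network
print("fog:  ", fog_path(temperature_readings))     # (2, 2): only two readings leave the edge
```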

September 24, 2014

A couple of weeks ago I attended MIT’s Second Machine Age Conference, an event inspired by the best-selling book of the same title published earlier this year by MIT’s Erik Brynjolfsson and Andy McAfee. The conference presented some of the leading-edge research that’s ushering in the emerging second machine age, and explored its impact on the economy and society. It was quite an interesting event. Let me discuss a few of the presentations as well as my overall impressions.

In his opening keynote, Brynjolfsson explained what the second machine age is all about. “Like steam power and electricity before it, the explosion of digitally enabled technologies is radically transforming the landscape of human endeavor. Astonishing progress in robotics, automation, and access to information presents major challenges for institutions from small businesses and communities to large corporations and governments, but it also creates opportunities to rethink how we live and work in profoundly positive ways.”

The machines of the industrial economy, - the first age, - made up for our physical limitations, - steam engines enhanced our physical power, railroads and cars helped us go faster, and airplanes gave us the ability to fly. For the most part, they complemented, rather than replaced, humans. The second age machines are now enhancing our cognitive powers, giving us the ability to process vast amounts of information and make ever more complex decisions. They’re being increasingly applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans. Will these second age machines complement or replace humans?

The impact of technology on jobs is a very important subject. “The economic challenge of the future will not be producing enough. It will be providing enough good jobs,” wrote Harvard professor and former Treasury Secretary Larry Summers in a recent WSJ article. These concerns are not new. In a 1930 essay, English economist John Maynard Keynes wrote: “We are being afflicted with a new disease of which some readers may not yet have heard the name, but of which they will hear a great deal in the years to come - namely, technological unemployment. This means unemployment due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour.”

As it turned out, the 20th century saw the creation of many new jobs and industries. But, fears of technological unemployment have been rising in the emerging digital economy, as our increasingly smart machines are being applied to activities requiring intelligence and cognitive capabilities that not long ago were viewed as the exclusive domain of humans.

The Pew Research report explored the impact of AI and robotics on the future of employment based on the responses of nearly 1,900 experts to a few open-ended questions, including: “Will networked, automated, artificial intelligence (AI) applications and robotic devices have displaced more jobs than they have created by 2025?” and “To what degree will AI and robotics be parts of the ordinary landscape of the general population by 2025?”

August 06, 2014

Lean principles is the name given to a group of production techniques developed by Japanese manufacturing companies in the 1970s and 1980s to maximize customer value while minimizing waste. Lean production methods have also been described as aiming “to combine the flexibility and quality of craftsmanship with the low costs of mass production.” Such methods include quality control of the processes involved in production; just-in-time production to reduce the costs associated with excess inventory; and continuous improvement, involving everyone in the organization in the quest for new, easy-to-implement ideas.

As companies transition from the industrial to the digital economy, lean philosophies are influencing just about every aspect of their operations, strategy, organization and culture. Lean’s core idea, - maximize customer value while minimizing waste, - feels particularly applicable to our times, as organizations must better understand what customers truly value; organize their work activities to efficiently develop and deliver the appropriate products and services; and continuously improve customer value and efficiency based on real marketplace feedback. Such a pull approach to business is quite different from the push approaches of the past.

June 25, 2014

By now, most will agree that cloud computing is a major transformational force in the world of IT. But cloud has not been the easiest concept to grasp. Not that long ago, even people who agreed that something big and profound was going on were not quite sure what it was they were excited about. “There is a clear consensus that there is no real consensus on what cloud computing is,” was the overall conclusion of a June, 2008 conference on The Promise and Reality of Cloud Computing. Cloud has continued to evolve and advance over the ensuing years. People are no longer starting their sessions by saying “let’s define cloud computing,” noted an article about a 2013 cloud conference.

I find it helpful to look at cloud along two key dimensions: as a technology to improve IT productivity, and as a platform for enabling business innovation. A 2012 survey of business and technology executives found that two thirds of respondents viewed cloud as a leading priority for their IT organizations, while one third said that it was a company-wide business priority. Only one company in six viewed cloud as a way of fostering business innovation.

An increasing number of companies are now deploying cloud-based solutions. Most are focused on improving the economics of IT. They are looking to cloud to help them expand their current offerings without major investments in additional IT infrastructure. Cloud offers financial flexibility, reducing fixed IT costs by shifting from capital to operational, pay-per-use expenses, as well as the ability to easily and economically scale business operations by provisioning IT resources on an as-needed basis. In addition, cloud can help business users become more agile and keep up with the fast pace of technological and market changes. Turning to an external cloud service provider is often a faster and less costly way of prototyping and deploying a new application than relying on the internal IT organization.
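As a back-of-the-envelope illustration of that financial flexibility argument, the sketch below compares a fixed, up-front capital purchase with pay-per-use pricing at different utilization levels. All of the figures are hypothetical assumptions of my own, not data from the surveys cited above.

```python
# Purely illustrative comparison of owning hardware (fixed capital cost)
# versus pay-per-use cloud pricing. All figures are hypothetical assumptions.
SERVER_CAPEX = 10_000   # assumed cost of buying and running a server for 3 years
CLOUD_RATE = 0.50       # assumed hourly rate for an equivalent cloud instance
HOURS_3YR = 24 * 365 * 3

for utilization in (0.10, 0.50, 1.00):
    cloud_cost = CLOUD_RATE * HOURS_3YR * utilization
    cheaper = "pay-per-use cloud" if cloud_cost < SERVER_CAPEX else "owned hardware"
    print(f"{utilization:.0%} utilization: cloud = ${cloud_cost:,.0f} "
          f"vs ${SERVER_CAPEX:,} capex -> {cheaper} is cheaper")
```

The specific numbers matter less than the shape of the trade-off: pay-per-use favors variable, uncertain or lightly utilized workloads, which is exactly where business agility and fast prototyping matter most.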

A few years later, Friedman was a keynote speaker at a 2011 conference commemorating IBM’s Centennial. To illustrate the incredibly fast pace of change, Friedman noted that in the short time since The World is Flat was published, we were already transitioning from a connected to an increasingly hyperconnected world. Many of the companies and technologies that are now part of our everyday conversations, - Facebook, Twitter, Cloud, Smartphones, Big Data, LinkedIn, 4G or Skype, - were not mentioned in the connected global economy he wrote about only six years earlier, because they had either not yet been born or were still in their infancy.

What do we mean by a hyperconnected economy? I recently found a really good answer in an excellent study by the McKinsey Global Institute (MGI), Global Flows in the Digital Age. The study takes an in-depth look at the expansion of cross-border flows in the economy. It carefully analyzed these flows across five categories: goods, services, finance, people, and data and communications. It then developed the MGI Connectedness Index, which measures these five flows in 131 countries, examines how they have changed over the past 10 to 20 years, and predicts how they are likely to evolve over the next decade.

May 07, 2014

A 2013 Deloitte report, Success or Struggle?, observed that despite high profits and stock valuations, the reality for many companies is far from upbeat. Strong competitors keep appearing from all corners of the world. Advances in technology are eroding the prevailing business models in industry after industry. Companies are under intense pressure, as they struggle to defend their vulnerable revenues and market share while pursuing elusive new opportunities and profitable growth.

“Companies are broken and many don’t know,” reads the provocative opening line of the Deloitte report. It highlights the paradox that “Many companies are reporting record profits, but longer-term trends suggest they are struggling.” As an indicator of such struggles the report cites the topple rate, which measures how rapidly companies lose their leadership position. The topple rate has increased by almost 40 percent since 1965. The tenure of companies on the S&P 500 was 61 years in 1958; it’s now 18 years. If these trends continue, 75 percent of the S&P 500 companies will have changed over the next 15 years.

It’s been getting harder for even successful companies to maintain their leadership positions. I remember reading In Search of Excellence: Lessons from America’s Best Run Companies when it first came out in 1982, - one of the most influential business books of all time. The book examined 43 of the Fortune 500’s top-performing US companies, highlighting what made them great and what management lessons could be learned from them. Similarly, Built to Last: Successful Habits of Visionary Companies became an influential business best-seller in 1994, as it took an in-depth look at the 18 companies it identified as visionary.

April 15, 2014

On April 7, 1964, IBM announced the Systems/360 family of mainframes. “System/360 represents a sharp departure from concepts of the past in designing and building computers,” said IBM’s then chairman and CEO Thomas J. Watson Jr. “It is the product of an international effort in IBM’s laboratories and plants and is the first time IBM has redesigned the basic internal architecture of its computers in a decade. The result will be more computer productivity at lower cost than ever before. This is the beginning of a new generation - not only of computers - but of their application in business, science and government.”

In April of 1964 I was a second-year college student at the University of Chicago, working part time at the university’s computation center, which used IBM computers. I still remember attending a presentation on the announcement given by a visiting IBM executive. Over the next several years I used high-end models of the S/360 for the physics calculations I was doing as part of my doctoral studies. My thesis sponsor, - Chicago professor Clemens Roothaan, one of the early leaders in computational sciences, - was consulting with IBM on the design of future versions of the S/360, and I was also involved in some of this work. This relationship with IBM researchers led to my joining the computer sciences department at IBM’s Watson Research Center once I finished my studies in 1970.

I was closely associated with mainframes through most of my 37 years in IBM. I was involved in a number of research initiatives on the future of large systems. After moving to the large systems products divisions in the mid 1980s, I worked on the evolution of mainframes to CMOS-based microprocessors and parallel architectures. Later in the 1990s and 2000s I worked closely with the mainframe teams as they supported the Internet, Linux and other emerging initiatives I was involved in.

Having lived through the near demise of mainframes in the early 1990s, - which would have inevitably led to IBM’s own demise, - their survival after all these years is truly impressive. A look at the IT industry over the past several decades reveals the large number of once-great products and companies that are no longer around. Few computer families can trace their vintage to the 1980s, let alone the 1960s. There is something pretty unique about the mainframe being not only alive but doing so well.

Why is the mainframe still around, celebrating its 50th birthday last week? What enables it to keep reinventing itself while embracing the latest technologies, including the Mobile Internet, Cloud Computing and Big Data? In a world where product life-cycles are measured in web years, what can we learn from the mainframe’s rather unique longevity?