After Slow Start, Artificial Intelligence Is Beginning to Move; Progress of Technology Unfolds in Small Steps, Industry Veterans Say


Silicon Valley veterans argue that people routinely misjudge the
timing of new technology. A.I., they add, is no exception.

When IBM's Watson computer triumphed over human champions on the
quiz show "Jeopardy!," it was a stunning achievement that suggested
limitless horizons for artificial intelligence.

Soon after, IBM's leaders moved to convert Watson from a
celebrated science project into a moneymaking business, starting
with health care.

Yet the years after its game-show win proved humbling for Watson.
Today, IBM executives candidly admit that medicine proved far more
difficult than they had anticipated. Costs and frustration mounted
on Watson's early projects, which were scaled back, refocused and
occasionally shelved.

IBM's early struggles with Watson point to the sobering fact that
commercializing new technology, however promising, typically comes
in short steps rather than giant leaps.

Despite IBM's own challenges, Watson's TV victory -- five years
ago this month -- has helped fuel interest in A.I. from the public
and the rest of the tech industry. Venture capital investors have
poured money into A.I. start-ups, and large corporations like
Google, Facebook, Microsoft and Apple have been buying fledgling
A.I. companies. That investment reached $8.5 billion last year, more
than three and a half times the level in 2010, according to Quid, a
data analysis firm.

And software engineers with A.I. skills are treated like star
athletes, with bidding wars for their services.

"We're definitely at a peak of excitement now," said Jerry
Kaplan, a computer scientist, entrepreneur and author, who was a co-
founder of a long-forgotten A.I. start-up in the 1980s.
"Expectations are way ahead of reality."

The term A.I. has long been a staple of science fiction -- machines
that think for themselves, either helping humankind or, as
ungrateful creations, trying to wipe us out. Or so the thinking at
the movies goes.

The reality, however, is a little less dramatic. The automated
voice on your smartphone that tries to answer your questions? That's
a type of A.I. So are features of Google's search engine. The
technology is also being applied to complex business problems like
finding trends in cancer research.

The field of artificial intelligence goes back to the beginning
of the computer age, and it has rolled through cycles of optimism
and disillusion ever since, encouraged by a few movie robots and one
very successful game show contestant.

The history of tech tells A.I. backers to hang in there. Silicon
Valley veterans argue that people routinely overestimate what can be
done with new technology in three years, yet underestimate what can
be done in 10 years.

Predictions made in the '90s that the new World Wide Web would shake
the foundations of the media, advertising and retailing industries
did prove true, for example. But the upheaval came a decade later,
years after the dot-com bust.

Today's A.I., even optimists say, is early in that cycle.

"I think future generations are going to look back on the A.I.
revolution and compare its impact to the steam engine or
electricity," said Erik Brynjolfsson, director of the Initiative on
the Digital Economy at Massachusetts Institute of Technology's Sloan
School of Management. "But, of course, it is going to take decades
for this technology to really come to fruition."

There are reasons for enthusiasm. Computers continue to get
cheaper even as they get more powerful, making it easier than ever
to crunch vast amounts of data in an instant. Also, sensors,
smartphones and other tech devices are all over the place, feeding
more and more information into computers that are learning more and
more about us.

Just in the last year or two, researchers have made rapid gains
using a machine-learning technique called deep learning to improve
the performance of software that recognizes images, translates
languages and understands speech. …
