Tuesday, June 28, 2011

Reading List: The Age of Spiritual Machines

Ray Kurzweil is one of the most vocal advocates of the view
that the exponential growth in computing power (and allied
technologies such as storage capacity and communication bandwidth)
at constant cost which we have experienced for the last half
century, notwithstanding a multitude of well-grounded arguments
that fundamental physical limits on the underlying substrates
will bring it to an end (all of which have proven to be
wrong), will continue for the foreseeable future: in all likelihood
for the entire twenty-first century. Continued exponential
growth in a technology over so long a period is unprecedented in human experience,
and its consequences become difficult to foresee once the
exponential truly begins to “kick in”: although an exponential
curve is self-similar, its consequences as perceived by observers
whose own criteria for evaluation remain more or less constant
appear to reach a “knee,” after which they essentially go vertical
and defy prediction. In The Singularity Is Near
(October 2005), Kurzweil argues that once the point is reached
where computers exceed the capability of the human brain and begin
to design their own successors, an almost instantaneous (in terms of
human perception) blow-off will occur, with computers rapidly
converging on the ultimate physical limits on computation, with
capabilities so far beyond those of humans (or even human society
as a whole) that attempting to envision their capabilities or intentions
is as hopeless as a microorganism's trying to master quantum field
theory. You might want to review my notes on 2005's
The Singularity Is Near before reading the
balance of these comments: they provide context as to the extreme
events Kurzweil envisions as occurring in the coming decades, and
there are no “spoilers” for the present book.
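The illusion of a “knee” can be illustrated with a quick sketch (the numbers here are purely illustrative, not from the book): a steadily doubling quantity spends almost its entire history far below any fixed observer threshold, then crosses it within a few periods.

```python
# Illustrative sketch: an exponential is self-similar, but measured against a
# fixed observer threshold it appears to "go vertical" only in the final few
# doubling periods.
def doublings_to_exceed(start, threshold):
    """Number of doubling periods for a quantity to first exceed a threshold."""
    n = 0
    value = start
    while value <= threshold:
        value *= 2
        n += 1
    return n

# Starting from 1 unit, a fixed threshold of 1,000,000 units is first
# exceeded on the 20th doubling; fully half of the final value was added
# in that last period alone.
print(doublings_to_exceed(1, 1_000_000))  # → 20
```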

When assessing the reliability of predictions, it can be enlightening
to examine earlier forecasts from the same source, especially if
they cover a period of time which has come and gone in the interim.
This book, published in 1999 near the very peak of the
dot-com bubble,
provides such an opportunity, and it serves as a useful calibration
for the plausibility of Kurzweil's more recent speculations
on the future of computing and humanity. The author's view
of the likely course of the 21st century evolved substantially
between this book and Singularity—in
particular this book envisions no singularity beyond which the
course of events becomes incomprehensible to present-day human
intellects. In the present volume, which employs the curious
literary device of “trans-temporal chat” between
the author, a MOSH (Mostly Original Substrate Human), and a
reader, Molly, who reports from various points in the century
her personal experiences living through it, we encounter
a future which, however foreign, can at least be understood in
terms of our own experience.

This view of the human prospect is very odd indeed, and to
this reader more disturbing (verging on creepy) than the
approach of a technological singularity. What we encounter
here are beings, whether augmented humans or software
intelligences with no human ancestry whatsoever, that despite
having at hand, by the end of the century, mental capacity
per individual on the order of 10²⁴
times that of the human brain (and maybe hundreds of orders of
magnitude more if quantum computing pans out),
still have identities, motivations, and goals which remain
comprehensible to humans today. This seems dubious in the
extreme to me, and my impression from Singularity
is that the author has rethought this as well.

Starting from the publication date of 1999, the book serves
up surveys of the scene in that year, 2009, 2019, 2029,
and 2099.
The chapter describing the state of computing in 2009 makes
many specific predictions. The following are those which
the author lists in the “Time Line” on
pp. 277–278. Many of the predictions in the
main text seem to me to be more ambitious than these, but
I shall go with those the author chose as most important
for the summary.
I have reformatted these as
a numbered list to make them easier to cite.

1. A $1,000 personal computer can perform about a
trillion calculations per second.

2. Personal computers with high-resolution visual
displays come in a range of sizes, from those
small enough to be embedded in clothing and
jewelry up to the size of a thin book.

3. Cables are disappearing. Communication between
components uses short-distance wireless technology.

4. The majority of text is created using continuous
speech recognition. Also ubiquitous are language
user interfaces (LUIs).

5. Most routine business transactions (purchases, travel,
reservations) take place between a human and a virtual
personality. Often, the virtual personality includes an
animated visual presence that looks like a human face.

6. Although traditional classroom organization is still
common, intelligent courseware has emerged as a common
means of learning.

7. Pocket-sized reading machines for the blind and visually
impaired, “listening machines” (speech-to-text
conversion) for the deaf, and computer-controlled orthotic
devices for paraplegic individuals result in a growing
perception that primary disabilities do not necessarily
impart handicaps.

8. Translating telephones (speech-to-speech language translation)
are commonly used for many language pairs.

9. Accelerating returns from the advance of computer technology
have resulted in continued economic expansion. Price deflation,
which has been a reality in the computer field during the twentieth
century, is now occurring outside the computer field. The
reason for this is that virtually all economic sectors are
deeply affected by the accelerating improvements in the
price performance of computing.

10. Human musicians routinely jam with cybernetic musicians.

11. Bioengineered treatments for cancer and heart disease have
greatly reduced the mortality from these diseases.

12. The neo-Luddite movement is growing.

I'm not going to score these in detail, as that would be both
tedious and an invitation to endless quibbling over particulars, but
I think most readers will agree that this picture of
computing in 2009 substantially overestimates the actual state
of affairs in the decade since 1999. Only item (3) seems
to me to be arguably on the way to achievement, and yet I do not
have a single wireless peripheral connected to any of my computers,
and Wi-Fi coverage remains spotty even in 2011. Things get
substantially more weird the further out you go, and of course
any shortfall in exponential growth lowers the baseline for further
extrapolation, shifting subsequent milestones further out.
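To see how a shortfall compounds, consider a hypothetical sketch (the growth rates are invented purely for illustration): if a forecast assumed 100x growth per decade but reality delivered only 10x, a milestone predicted three decades out slips to six.

```python
import math

# Hypothetical rates, chosen only for illustration.
predicted_per_decade = 100.0
actual_per_decade = 10.0

# A milestone requiring 100**3 = 1,000,000x growth was forecast 3 decades out.
required = predicted_per_decade ** 3

# At the actual rate, the same capability takes log(required)/log(actual) decades.
decades_needed = math.log(required) / math.log(actual_per_decade)
print(decades_needed)  # ~6 decades: the original 3-decade forecast doubles
```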

I find the author's acceptance of continued exponential growth as
dogma rather off-putting. Granted, few people expected the
trend we've lived through to continue for so long, but eventually
you begin to run into physical constraints which seem to have
little wiggle room for cleverness: the finite size of atoms,
the electron's charge, and the speed of light. There's nothing
wrong with taking unbounded exponential growth as a premise
and then exploring what its implications would be, but it seems to
me any forecast which is presented as a plausible future needs to
spend more time describing how we'll actually get there: arm-waving
about three-dimensional circuitry, carbon nanotubes, and
quantum computing doesn't close the sale for me. The author entirely
lost me with note 3 to chapter 12 (p. 342), which concludes:

If engineering at the nanometer scale (nanotechnology) is practical in the year
2032, then engineering at the picometer scale should be practical about forty years
later (because 5.6⁴ = approximately 1,000), or in the year 2072. Engineering at the
femtometer (one thousandth of a trillionth of a meter, also referred to as a quadrillionth
of a meter) scale should be feasible, therefore, by around the year 2112. Thus I
am being a bit conservative to say that femtoengineering is controversial in 2099.

This is just so breathtakingly wrong I am at a loss for where to
begin, and it was just as completely wrong when the book was published
a dozen years ago as it is today; nothing relevant to these statements
has changed. My guess is that Kurzweil was thinking of “intricate
mechanisms” within hadrons (baryons and mesons), particles made up of quarks
and gluons, and not within quarks themselves, which then and now are
believed to be point particles with no internal structure whatsoever and
are, in any case, impossible to isolate from the particles they compose.
When Richard Feynman
envisioned
molecular nanotechnology in 1959, he based his
argument on the well-understood behaviour of atoms known from chemistry
and physics, not a leap of faith based on drawing a straight line on
a sheet of semi-log graph paper. I doubt one could find a single current
practitioner of subatomic physics, as well versed in the subject as
Feynman was in atomic physics, who would argue that engineering at the level
of subatomic particles would be remotely feasible. (For atoms, biology
provides an existence proof that complex self-replicating systems of atoms
are possible. Despite the multitude of environments in the universe since
the big bang, there is precisely zero evidence subatomic particles have
ever formed structures more complicated than those we observe
today.)
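Whatever one makes of the physics, the arithmetic of the quoted note is at least internally consistent and easy to verify (a quick sketch; the 5.6-per-decade scale factor is the one implied by the note itself):

```python
# Kurzweil's note assumes the feasible engineering scale shrinks by a factor
# of 5.6 per decade, so four decades shrink it by 5.6**4, which is roughly
# the factor of 1,000 separating nanometers from picometers (and picometers
# from femtometers).
scale_factor = 5.6 ** 4
print(scale_factor)  # ≈ 983.45, i.e. "approximately 1,000"

# Hence the note's timeline: nanometers in 2032, picometers about forty
# years later, femtometers about forty years after that.
milestones = {2032: "nanometer", 2032 + 40: "picometer", 2032 + 80: "femtometer"}
print(milestones)  # → {2032: 'nanometer', 2072: 'picometer', 2112: 'femtometer'}
```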

I will not further belabour the arguments in this vintage book. It
is an entertaining read and will certainly expand your horizons as
to what is possible and introduce you to visions of the future you
almost certainly have never contemplated. But for a view of the future
which is simultaneously more ambitious and plausible, I recommend
The Singularity Is Near.