Devlin's Angle

November 2002

The inaccessibility of modern mathematics

In late October, my new book The
Millennium Problems: The Seven Greatest Unsolved
Mathematical Puzzles of Our Time
went on sale across the country, and this month
sees me doing the usual round of public lectures,
bookstore talks, and magazine, radio and TV
interviews that these days accompany the publication
of any new book the publisher thinks has even the
ghost of a chance of becoming the next popular
science bestseller.

Of all the books I have written for a general
audience, this latest one presented by far the
greatest challenge in trying to make it as accessible
as possible to non-mathematicians. The seven unsolved
problems I discuss -- the Clay Millennium Problems
-- were chosen by a small, stellar, international
committee of leading mathematicians appointed by the
Clay Mathematics
Institute, which offers a cash prize of $1 million
to the first person to solve any one of the problems.
The committee's mission was to select the most
difficult and most significant unsolved problems at
the end of the second millennium, problems that had
for many years resisted the efforts of some of the
world's greatest mathematicians to find a solution.

No one who is at all familiar with modern mathematics
will be surprised to find that none of the seven
problems chosen is likely to be solved by elementary
methods, and even the statement of most of the
problems cannot be fully understood by anyone who has
not completed a mathematics major at a university.

In writing the book, I had to ignore the oft-repeated
assertion that every mathematical formula you put in
a book decreases the sales by 50%. (Personally, I don't
think this is literally true, but I do believe that
having pages of formulas does put off a lot of potential
readers.) Although my book is mostly prose, there are
formulas, and some chapters have technical appendices
consisting of little but formulas.

Now, as I gear up for the promotional campaign, I face
the same challenge again. With the book, I think I
found a way to present the story of the Millennium
Problems in 250 pages of text. But what can I say
about the book's contents in a twenty-minute talk in
a bookstore or a ten-minute interview on a radio talk
show? Thinking about this made me reflect once more
about the nature of modern mathematics. Put simply:
Why are the Millennium Problems so hard to understand?

Imagine for a moment that Landon Clay -- the wealthy
mutual fund magnate who founded the Clay Institute and
provided the $7 million of prize money for the seven
problems -- had chosen to establish his prize competition
not for mathematics but for some other science, say
physics, or chemistry, or biology. It surely would not
have taken an entire book to explain to an interested
lay audience the seven major problems in one of those
disciplines. A three- or four-page expository article
in Scientific American or 1,500 words in
New Scientist would probably suffice. Indeed,
when the Nobel Prizes are awarded each year, newspapers
and magazines frequently manage to convey the gist
of the prize-winning research in a few paragraphs.
In general you can't do that with mathematics.
Mathematics is different. But how?

Part of the answer can be found in an observation first
made (I believe) by the American mathematician Ronald
Graham, who for most of his career was the head of
mathematical research at AT&T Bell Laboratories.
According to Graham, a mathematician is the only
scientist who can legitimately claim: "I lie down on
the couch, close my eyes, and work."

Mathematics is almost entirely cerebral -- the actual
work is done not in a laboratory or an office or a
factory, but in the head. Of course, that head is
attached to a body which might well be in an office
-- or on a couch -- but the mathematics itself goes
on in the brain, without any direct connection to
something in the physical world. This is not to
imply that other scientists don't do mental work. But
in physics or chemistry or biology, the object of the
scientist's thought is generally some phenomenon in
the physical world. Although you and I cannot get
inside the scientist's mind and experience her thoughts,
we do live in the same world, and that provides the
key connection, an initial basis for the scientist to
explain her thoughts to us. Even in the case of
physicists trying to understand quarks or biologists
grappling with DNA, although we have no everyday
experience of those objects, even a nonscientifically
trained mind has no trouble thinking about them. In
a deep sense, the typical artist's renderings of
quarks as clusters of colored billiard balls and DNA
as a spiral staircase might well be (in fact are)
"wrong," but as mental pictures that enable us to
visualize the science they work just fine.

Mathematics does not have this. Even when it is
possible to draw a picture, more often than not the
illustration is likely to mislead as much as it
helps, which leaves the expositor having to make up
with words what is lacking or misleading in the
picture. But how can the nonmathematical reader
understand those words, when they in turn don't
link to anything in everyday experience?

Even for the committed spectator of mathematics,
this task is getting harder as the subject grows more
and more abstract and the objects the mathematician
discusses become further and further removed from
the everyday world. Indeed, for some contemporary
problems, such as the Hodge Conjecture -- one of the
seven Millennium Problems -- we may have already
reached the point where the outsider simply can't
make the connection. It's not simply that the human
mind requires time to come to terms with new levels of
abstraction; that has always been the case. Rather,
the degree and the pace of abstraction may have
finally reached a stage where only the expert can
keep up.

Two and a half thousand years ago, a young follower
of Pythagoras proved that the square root of 2 is
not a rational number, that is, cannot be expressed
as a fraction. This meant that what they took to be
the numbers (the whole numbers and the fractions)
were not adequate to measure the length of the
hypotenuse of a right triangle whose two legs are
both equal to 1 unit (which Pythagoras' theorem says
will have length the square root of 2). This discovery
came as such a shock to the Pythagoreans that their
progress in mathematics came to a virtual halt.
Eventually, mathematicians found a way out of the
dilemma, by changing their conception of what a
number is to what we nowadays call the real numbers.
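
Nothing beyond whole-number arithmetic is needed for that
discovery. A standard reconstruction of the Pythagorean argument
(the original proof has not survived, so the details here are the
traditional ones) runs as follows:

```latex
Suppose $\sqrt{2} = p/q$ for whole numbers $p, q$ with no common
factor. Squaring and clearing denominators gives
\[
  p^2 = 2q^2 .
\]
Then $p^2$ is even, so $p$ is even, say $p = 2k$. Substituting,
\[
  4k^2 = 2q^2 \quad\text{and so}\quad q^2 = 2k^2 ,
\]
which makes $q$ even as well. Both $p$ and $q$ are then divisible
by $2$, contradicting the assumption that the fraction was in
lowest terms. Hence no such fraction exists.
```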

To the Greeks, numbers began with counting (the
natural numbers) and in order to measure lengths
you extended them to a richer system (the rational
numbers) by declaring that the result of dividing
one natural number by another was itself a number.
The discovery that the rational numbers were not in
fact adequate for measuring lengths led later
mathematicians to abandon this picture, and instead
declare that numbers simply are the points on
a line! This was a major change, and it took two
thousand years for all the details to be worked out.
Only toward the end of the nineteenth century did
mathematicians finally work out a rigorous theory
of the real numbers. Even today, despite the simple
picture of the real numbers as the points on a line,
university students of mathematics always have
trouble grasping the formal (and highly abstract)
development of the real numbers.
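
One standard version of that formal development, published by
Dedekind in 1872, defines a real number as a "cut" in the
rationals; the following sketch gives a sense of how abstract the
definition is compared with the simple picture of points on a line:

```latex
A \emph{Dedekind cut} is a set $A \subset \mathbb{Q}$ of rational
numbers that is nonempty, is not all of $\mathbb{Q}$, contains
every rational below any of its members, and has no largest
element. A \emph{real number} is, by definition, such a set; for
instance,
\[
  \sqrt{2} \;=\; \{\, q \in \mathbb{Q} \;:\; q \le 0
      \ \text{or}\ q^2 < 2 \,\}.
\]
```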

Numbers less than zero presented another struggle.
These days we think of negative numbers as simply the
points on the number line that lie to the left of 0,
but mathematicians resisted their introduction until
the end of the seventeenth century. Similarly, most
people have difficulty coming to terms with complex
numbers -- numbers that involve the square root of
negative quantities -- even though there is a simple
intuitive picture of the complex numbers as the
points in a two-dimensional plane.
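
That two-dimensional picture is easy to make concrete in code.
Here is a short illustrative snippet (my own, not from the book)
using Python's built-in complex type, which represents a complex
number as exactly such a point in the plane:

```python
import cmath

# The "impossible" square root of -1 is just the point (0, 1) in the plane.
i = complex(0, 1)
print(i * i)                  # (-1+0j)

# cmath.sqrt happily takes the square root of a negative quantity.
print(cmath.sqrt(-4))         # 2j

# Adding complex numbers is ordinary vector addition of points.
z = complex(3, 4)
w = complex(1, -2)
print(z + w)                  # (4+2j)

# The absolute value is the point's distance from the origin.
print(abs(z))                 # 5.0
```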

These days, even many nonmathematicians feel
comfortable using real numbers, complex numbers, and
negative numbers. That is despite the fact that
these are highly abstract concepts that bear little
relationship to counting, the process with which
numbers began some ten thousand years ago, and even
though, in our everyday lives, we never encounter a
concrete example of an irrational real number or a
number involving the square root of -1.

Similarly in geometry, the discovery in the early
nineteenth century that there were other geometries
besides the one that Euclid had described in his
famous book Elements caused both the experts
and the nonmathematicians enormous conceptual problems.
Only later in the nineteenth century did the idea of
"non-Euclidean geometries" gain widespread acceptance.
That acceptance came even though the world of our
immediate, everyday experience is entirely Euclidean.

With each new conceptual leap, even mathematicians
need time to come to terms with the new ideas, to
accept them as part of the overall background against
which they do their work. Until recently, the pace
of progress in mathematics was such that, by and
large, the interested observer could catch up with
one new advance before the next one came along. But
it has been getting steadily harder. To understand
what the Riemann Hypothesis says, the first problem
on the Millennium list, you need to have understood,
and feel comfortable with, not only complex numbers
(and their arithmetic) but also advanced calculus,
and what it means to add together infinitely many
(complex) numbers and to multiply together infinitely
many (complex) numbers.
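
To give a flavor of what adding together infinitely many complex
numbers involves, here is a small Python sketch (my illustration,
not anything from the book) of partial sums of the series behind
the Riemann Hypothesis, zeta(s) = 1/1^s + 1/2^s + 1/3^s + ...,
which converges whenever the real part of s is greater than 1:

```python
import math

def zeta_partial(s, terms=100_000):
    # Partial sum of 1/1**s + 1/2**s + ...; the infinite series
    # converges only when the real part of s exceeds 1.
    return sum(1 / n**s for n in range(1, terms + 1))

# Euler showed the full infinite sum at s = 2 equals pi**2 / 6.
print(zeta_partial(2))              # close to 1.6449...
print(math.pi ** 2 / 6)             # 1.6449340668...

# The same recipe makes sense for a complex exponent.
print(zeta_partial(complex(2, 1)))
```

The Riemann Hypothesis itself concerns the zeros of the function
obtained by extending this series to the whole complex plane, a
step (analytic continuation) that no amount of naive summation can
reach; that extension is precisely the advanced-calculus background
the statement presupposes.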

Now that kind of knowledge is restricted almost
entirely to people who have majored in mathematics
at university. Only they are in a position to see
the Riemann Hypothesis as a simple statement, not
significantly different from the way an average
person views Pythagoras' theorem. My task in
writing my book, then, was not only to explain what
the Riemann Hypothesis says but to provide all of
the preliminary material as well. Clearly, I
cannot do that in a ten-minute radio interview!

The root of the problem is that, in most cases, the
preparatory material cannot be explained in terms
of everyday phenomena, the way that physicists, for
example, can explain the latest, deepest, cutting-edge
theory of the universe -- Superstring Theory -- in
terms of the intuitively simple picture of tiny,
vibrating loops of energy (the "strings" of the theory).

Most mathematical concepts are built up not from
everyday phenomena but from earlier mathematical
concepts. That means that the only route to getting
even a superficial understanding of those concepts
is to follow the entire chain of abstractions that
leads to them. My readers will decide how well I
succeed in the book. But that avenue is not
available to me in a short talk.

Perhaps, then, instead of trying to describe the
Millennium Problems themselves, I'll tell my
audiences why they are so hard to understand.
I'll explain that the concepts involved in the
Millennium Problems are not so much inherently
difficult -- for they are not -- as they are very,
very unfamiliar, much as the idea of complex numbers
or non-Euclidean geometries would have seemed
incomprehensibly strange to the ancient Greeks.
Today, having grown familiar with these ideas,
we can see how they grow naturally out of concepts
the Greeks knew as commonplace mathematics.

Perhaps the best way to approach the Millennium
Problems, I will say, is to think of the seven
problems as the commonplace mathematics of the
25th century.