SCIENCE fiction is about revolutionary ideas and amazing inventions. In this column, we're going to talk about something that is both. It's a very simple thing—so simple that most people don't think about it at all. Over the centuries, people have come to take this invention for granted. But at the Exploratorium we are in the business of paying attention to things that other people ignore, and we've decided it's time to call attention to this remarkable invention.

This idea was the brainchild of a group of astronomers in the Indus Valley about fifteen hundred years ago (give or take a century). Back in the fifteenth century, the impact of this invention on the mathematics of Europe could be compared to the social changes resulting from computers today. What is it?

Why, it's nothing.

That is, it's the zero, a little bit of nothing bounded by a line in the shape of a goose egg. The invention of the zero revolutionized the practice of mathematics and, in the process, made modern science possible. And yet we have noticed that lots of science fiction deals with the concept of infinity, but very little deals with nothing. In an effort to correct this oversight, we will tell you a bit about the weird history of zero, about a century-long battle between the Abacists and the Algorists, about the year zero, and about the difficulty of counting. With any luck, some science fiction writers reading these pages will be inspired.

COUNT LIKE A ROMAN

To understand how the zero transformed mathematics, we need to take a close look at a numerical system that doesn't use a zero. For simplicity's sake, let's consider Roman numerals, one of the most familiar of the numbering systems that preceded the zero's invention. In Roman numerals, different symbols stand for different numbers. As you probably learned in elementary school, I stands for one, V stands for five, and X for ten. If you want to count higher, you also need to know that L stands for fifty, C for 100, D for 500, and M for 1000.

To write a number, you combine these signs according to certain rules. If you place a sign for a smaller unit to the right of a larger unit, you add the numbers together: VI stands for five plus one, or six. But if you place a smaller unit to the left of a larger unit, you subtract the small one from the big one: IV translates to five minus one, or four. MCMLVI, for example, represents 1956. (Moving from left to right, M equals 1000; CM equals 900; L equals 50; VI equals 6. Add them all together and you get 1956.)
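These additive and subtractive rules are simple enough to capture in a few lines of Python. Here's a minimal sketch that converts a Roman numeral to an integer; it assumes well-formed input and makes no attempt to reject malformed numerals like "IIII" or "IC":

```python
# Symbol values for the seven basic Roman numerals.
VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(numeral: str) -> int:
    total = 0
    for i, ch in enumerate(numeral):
        value = VALUES[ch]
        # Subtract when a smaller unit sits to the LEFT of a larger one
        # (as in IV or CM); otherwise add.
        if i + 1 < len(numeral) and value < VALUES[numeral[i + 1]]:
            total -= value
        else:
            total += value
    return total

print(roman_to_int("VI"))      # 6
print(roman_to_int("IV"))      # 4
print(roman_to_int("MCMLVI"))  # 1956
```

Note that the function has to peek one symbol ahead to decide whether to add or subtract, which is exactly the left-of/right-of rule described above.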

This system has one obvious drawback. You've got seven symbols so far and you can write numbers in the thousands. But if you want to count higher, you need more symbols. A bar over a Roman numeral indicates thousands. Symbols exist for ten thousand, fifty thousand, and so on. But additional symbols don't solve the basic problem: the higher you go, the more symbols you need. Without the zero, there's no upper limit to the number of symbols you need to write numbers.

Now compare Roman numerals to our present system. Using a zero and nine other symbols, we can write very large numbers. That's because we use a place-value system, in which the position of a symbol is just as important as the symbol itself. The symbol 8 may mean eight, eighty, eight hundred, or eight thousand. It all depends on its position or place. The 8 in 893 stands for eight hundred, but the 8 in 983 stands for eighty. The symbol's value depends on which column it's in.
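The place-value idea can be made concrete with a couple of lines of Python that pull a number apart into its columns (a simple base-ten sketch):

```python
# Decompose a number into place values: the same digit 8 contributes
# 800 in one position and only 80 in another.
def place_values(n: int) -> list[int]:
    digits = str(n)
    # Each digit's contribution is digit * 10^(its position from the right).
    return [int(d) * 10 ** (len(digits) - i - 1) for i, d in enumerate(digits)]

print(place_values(893))  # [800, 90, 3]
print(place_values(983))  # [900, 80, 3]
```

The same three symbols, shuffled into different columns, produce different numbers, and that is the whole trick of positional notation.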

The ancient Babylonians came up with the idea of a place-value system, but they didn't have the zero—and that led to problems. Suppose one of the places is empty. You might have eight hundreds, for example, but no tens and no ones. The Babylonians left a blank space when a position was empty. But a scribe in a hurry could easily omit this space, changing the value of a number. Eventually, the Babylonians started using a dot as a zero, but they omitted this placeholder when it was on the right side of a number. The resulting notations were ambiguous. Without the zero on the right, 11 looks just like 110.

WRITING REALLY BIG NUMBERS

Finally, early in the sixth century A.D., the zero came along. A group of Indian astronomers were looking for a numerical system that allowed them to represent large numbers easily. They knew of the Babylonian place-value system. They had symbols—the Brahmi numerals—for the numbers one to nine. And they used the counting board, a calculating device that lent itself to thinking about empty places.

The counting board took many forms, but generally all the versions used parallel lines or columns and some system of counters. A common form was a board marked with parallel columns. Flat disks or counters known as calculi (which means "pebbles" in Latin) were placed in the columns. The column in which a counter rested determined its value. Going from right to left, each counter in the first column stood for a single unit; each counter in the second column stood for ten units; each counter in the third column stood for one hundred units.

Sound familiar? The columns of the counting board match the positions we use to write numbers: ones, tens, hundreds, thousands, and so on.

Those Indian astronomers realized that they could record the number of counters in each column of the counting board using their Brahmi numerals. But their stroke of genius relates to the columns in which there were no counters. To note an empty column, they made a zero symbol: originally a small dot, later on a circle or cross.

This new system let them write large numbers with ease, using the nine Brahmi numerals and the zero (which wasn't regarded as a numeral). Zero was just a mark to put in any empty place. The notion that zero is a number like any other is a modern idea, not part of the original concept.

The new system also made arithmetic calculations easier. It's tough to multiply and divide using Roman numerals. It can be done, but it isn't easy. When the Romans wanted to multiply or divide or do any other complicated arithmetic problem, they used a counting board, like the one described above, or an abacus.

THE MAGIC CIPHER

This system of writing numbers invented by the Indian astronomers spread to Europe by way of the Arab culture, following the rise of the Islamic empire. (In 732 A.D., the Islamic Empire reached from the borders of China to Spain.) Before the end of the tenth century, the Arabs of Spain had begun using the Hindu system of reckoning.

In the Middle Ages, when the Hindu-Arabic numerals that we now use were first introduced in Europe, people regarded them with great distrust. The most magical and powerful of those symbols was, of course, the zero, a mysterious and bewildering sign. Sometimes a zero was nothing at all. But if you put it to the right of another numeral, it multiplied the value of the number by ten. How could this be?

The transition from Roman numerals to the new Hindu-Arabic system took centuries, lasting from the twelfth until the fifteenth century. Abacists, who defended the use of the counting table, battled the Algorists, who preferred the new numbers. Distrust of the foreign numerals gave the Abacists an edge. In 1299, the City of Florence required the use of Roman numerals in account books, outlawing the new numerals in an effort to prevent fraud. The government feared that the new numerals could easily be falsified, since a zero could be transformed into a 6 or an 8 or a 9. For this reason, documents written with Roman numerals carried more weight in court.

Despite the suspicion and confusion, the new numerals became increasingly important in commerce. In fifteenth-century Italy and Germany, merchants learned the arts of bookkeeping, computation, and calculation with the foreign numerals. Toward the end of the century, mercantile houses and offices were using the new numerals in their everyday calculations. By the beginning of the sixteenth century, the Algorists had won. In fact, by the eighteenth century, Europeans had completely forgotten the counting board and the abacus, tools that had proven useful for centuries. In the nineteenth century, one of Napoleon's generals who had been captured by the Russians returned to France with a Russian abacus, which was regarded as a curiosity. The adoption of zero had rendered it obsolete.

THE BEGINNING OR THE END

Today, the zero is part of our mathematics—but there are still indications that suggest we are not entirely comfortable with the goose egg that represents a little bit of nothing. Many tall buildings lack a 13th floor, skipping from 12 to 14 to avoid that dreaded number. Most buildings—at least in the U.S.A.—also lack a zeroth floor. (You'll find a zeroth floor in a few buildings housing math departments and in lots of buildings in the Spanish-speaking world.)

You can make people uneasy by asking them whether zero is a number or not. And if it's a number, is it even or odd?

According to Paul, zero is indeed a number. It's an integer, one of the group of numbers that includes all the counting numbers (1, 2, 3, and so on), their negatives, and zero. Zero is considered to be even, but neither negative nor positive. (One interesting way to include zero in a list of integers is to say "all non-negative integers," thus describing all the positive integers plus zero.)

But even though zero is a number, it's a tricky one that comes with its own set of rules.

Add zero to any number and you get that number: 0 + 2 = 2

Subtract zero from any number and you get that number: 2 – 0 = 2

Add zero to itself and you get zero: 0 + 0 = 0

Multiply any number by zero and you get zero: 0 x 2 = 0

Multiply zero by itself and you get zero: 0 x 0 = 0

The Indian mathematicians who invented zero knew all this. But they slipped up (according to modern mathematicians) when it came to the value of 1/0. Mahavira, an Indian mathematician writing around 830 A.D., claimed that 1/0 = 0, which is certainly wrong.

But before we take issue with Mahavira, what do you think the value of 1/0 is? Infinity, perhaps?

Mathematicians say that 1/0 is meaningless. Here's how they think about it. Consider that division is the inverse of multiplication. If 10/5 = 2, then you can multiply both sides by 5 to get 10 = 5 x 2.

Follow that logic through with 1/0. If 1/0 = N, then you can multiply both sides by 0 to get 1 = 0 x N. But any number multiplied by 0 equals 0. So there is no number N that will multiply zero to produce 1. There is no answer! What you get when you divide 1 by 0 is no number. The operation 1/0 is meaningless.

(This reminds Paul of Odysseus, who told the Cyclops that his name was "no man" and then poked out the Cyclops's eye. The Cyclops then complained that "no man" had poked out his eye. In the story of 1/0, the culprit is "no number.")

Things get even more interesting when you divide zero by zero: 0/0 = ? What do you think the answer is? Many people think the answer is 1 since any number (other than 0) divided by itself is 1.

But think about that for a moment. Could the answer be 0? After all, any number with 0 in the numerator is zero. Or could the answer be no number at all, since any number divided by zero is no number?

Mathematicians say that 0/0 is indeterminate, since the result can be any number. They get to that answer using the same logic as above. If 0/0 = N, you can multiply both sides by zero and get 0 = 0 x N. This is true for all numbers N. So the result can be any number at all. This is quite different from 1/0, where the result is no number at all.
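This distinction even shows up in programming languages. Here's a small Python sketch: Python's integer division refuses to produce a number for either 1/0 or 0/0, raising an error instead, while IEEE floating-point arithmetic (whose special values Python exposes through the math module) reserves nan, "not a number," for indeterminate forms:

```python
import math

# Python agrees with the mathematicians: dividing by zero yields
# "no number" -- the operation raises an error instead of returning.
for numerator in (1, 0):
    try:
        numerator / 0
    except ZeroDivisionError as err:
        print(f"{numerator}/0 ->", err)

# IEEE 754 floating point defines special values for these cases:
# inf for unbounded results and nan ("not a number") for
# indeterminate forms like 0/0.
print(math.inf > 10 ** 100)       # True: inf exceeds any finite number
print(math.nan == math.nan)       # False: nan isn't equal even to itself
```

The fact that nan compares unequal to everything, including itself, is the floating-point standard's way of saying "this result could have been anything," which is a fair summary of "indeterminate."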

Even with addition, zero is a special case. Every number N has an additive inverse (–N). When you add a number to its additive inverse, the result is 0. For example, 2 plus –2 equals zero. Zero is special in that it is the only number that is its own additive inverse. 0 + 0 = 0.

OFF BY ONE

Most people start counting from one—but sometimes it can be useful to start with zero. Suppose you're trying to find the period of a pendulum, the time it takes the pendulum to swing back and forth and return to the same spot.

You could start the pendulum swinging and use a stopwatch to time how long it takes to go from one end of its swing to the other and return. But to get a more accurate measurement, you'd be better off timing ten swings of the pendulum and then dividing the total time for ten swings by ten.

If you do this you should count "zero" as you start the stopwatch, count "one" the first time the pendulum returns to its starting point, and count again each time the pendulum returns until you reach "ten" and stop the watch. If you count "one" when you start the watch, count again each time the pendulum returns, and stop at "ten," you will time only nine complete swings, not ten. This counting error occurs so often it has its own name—the "off by one" error.
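As a rough sketch of the two counting schemes (assuming a hypothetical pendulum with a period of exactly 2 seconds), here is how the error creeps in:

```python
# Hypothetical pendulum period, in seconds, for illustration.
PERIOD = 2.0

# Correct scheme: say "zero" when the watch starts, "ten" at the
# tenth return. Ten counts after zero = ten full swings timed.
correct_elapsed = 10 * PERIOD
print(correct_elapsed / 10)  # 2.0 -- the true period

# Off-by-one scheme: say "one" when the watch starts. By the count
# of "ten," the pendulum has completed only nine swings.
wrong_elapsed = 9 * PERIOD
print(wrong_elapsed / 10)    # 1.8 -- a 10% error in the measured period
```

Counting the start as "one" silently converts a fencepost (the moment you start the watch) into a swing, and the ten-percent error propagates into every calculation that uses the period.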

THE YEAR ZERO

Zero—by its absence—is also responsible for all that annoying discussion back on January 1, 2000, when lots of people celebrated the turn of the century. Folks at the U.S. Naval Observatory pointed out that the correct time to celebrate the start of the next 1000 years was January 1, 2001. Why the discrepancy?

Our current Gregorian Calendar was adapted from the Julian Calendar in 1582, before zero entered into common use. As a result, there is no year 0. We go straight from 1 B.C. to 1 A.D.

This makes counting difficult. There is only one year between January First in 1 B.C. and January First in 1 A.D. On a number line, there are two units between –1 and 1.
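One way to see the mismatch is to map calendar years onto a number line. The helper below is purely illustrative: it treats 1 B.C. as −1 and shifts every B.C. year by one to account for the calendar's missing zero:

```python
# Map a calendar year onto a number line. Because the calendar jumps
# from 1 B.C. straight to 1 A.D., every B.C. year must be shifted by
# one before elapsed years can be counted correctly.
def to_number_line(year: int) -> int:
    # Convention: negative years are B.C. (-1 means 1 B.C.),
    # positive years are A.D.
    if year == 0:
        raise ValueError("there is no year 0 in this calendar")
    return year + 1 if year < 0 else year

# Years elapsed from January 1, 1 B.C. to January 1, 1 A.D.:
print(to_number_line(1) - to_number_line(-1))  # 1, not 2
```

Subtracting the raw calendar labels (1 minus −1) would give two years; the shift corrects for the label that the calendar skipped.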

Starting the count at the Year One is a bit like counting "one" when you start your stopwatch, rather than waiting for the first swing of the pendulum. You end up with an "off by one" error.

This antipathy toward zero is not true of all calendars. The Maya started every month with day 0. In one of their calendars they had 18 months of 20 days numbered 0 to 19, and one month of 5 days numbered 0 to 4.

IT ALL COMES TO NOTHING

Incidentally, the zero isn't the only case where nothing ends up being very important. In music, the rest is just as important as the notes. In graphic design, white space is just as important as (or perhaps even more important than) the space that's filled with pictures and type.

Today, it's hard not to take the zero for granted. It's a goose egg, a whole lot of nothing. But without the zero—and the higher mathematics that its invention made possible—the progress of modern science, industry, and commerce would have been unlikely, if not impossible.

The invention of something which represents nothing changed the world. And if that's not science fiction, we don't know what is.

--------

The Exploratorium is San Francisco's museum of science, art, and human perception—where science and science fiction meet. Pat Murphy and Paul Doherty both work there. To learn more about Pat Murphy's science fiction writing, visit her web site at www.brazenhussies.net/murphy. For more on Paul Doherty's work and his latest adventures, visit www.exo.net/~pauld.