JavaScript: Playing the Numbers Game

Built-in type-conversion in JavaScript makes using numbers so easy that developers rarely even think about using them. Surely that simplicity isn't as perfect as it seems! Just how good is the number support in JavaScript anyway? In this article you'll explore the edges of the world of numbers in JavaScript.

by Nigel McFarlane

Sep 5, 2003


If you're a Web developer, you may well have written something like the following input tag, which accepts a number that a user types in. Unhappily, you've probably also experienced the failure that can result:

<input type="text" name="age"
       onchange="return (this.value > 0)">

The problem is that user-typed strings don't all convert to numbers: the user can enter "XXY" as easily as "23.1". A correct solution converts and validates the string explicitly before comparing, perhaps with extra processing.
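One way to write such a check (a minimal sketch; the function name is illustrative, while parseFloat and isNaN are standard JavaScript):

```javascript
// Convert the typed string explicitly, then test the result.
// parseFloat returns NaN for strings such as "XXY" that do not
// begin with a number, and isNaN detects that failure.
function isPositiveNumber(value) {
  var num = parseFloat(value);
  return !isNaN(num) && num > 0;
}
```

Wired into the input tag, it would look something like:

```javascript
// <input type="text" name="age"
//        onchange="return isPositiveNumber(this.value)">
```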

Mostly, however, JavaScript's string and number handling is flexible and invisible. But how flexible? Are there traps? Can you have an array of 3.5 elements? Can you add 1 to 1,000,000 and be sure to get 1,000,001, not 1.000E6? Will it display correctly? In the dim past, browsers had horrible numeric bugs. Can you relax yet, or are there still pitfalls?
This article explores the limits of numeric processing by the script engine in modern browsers: version 5.x onwards, plus Netscape 4.7x and Mozilla 1.x. At the end you'll know what's safe and what's not safe about numbers in browsers.
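Two of those questions can be settled right away at a script console (a quick sketch; the behavior shown is standard ECMAScript):

```javascript
// Adding 1 to one million gives exactly 1000001, and its default
// string form is plain decimal, not exponential notation.
var million = 1000000;
var result = million + 1;

// An array of 3.5 elements is not allowed: a fractional array
// length makes the Array constructor throw a RangeError.
var fractionalOk;
try {
  var arr = new Array(3.5);
  fractionalOk = true;
} catch (e) {
  fractionalOk = false;  // RangeError: invalid array length
}
```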

One Case of Science: Modeling the Real World
A number is a very handy concept. The most popular kinds are integers and reals. In your head you probably know what these numbers are. Trivial cases like 2 and 23.45 are easy to write down. Some, though, are not trivial. How do you write down the integer recently found by the Great Internet Mersenne Prime Search? It has four million-plus digits. How do you write down the decimal value of pi (just under 3.141592654)? Its expansion has infinitely many digits. You can't.

To write down numbers on paper or in a computer, you must use an encoding which represents the number. It's important to realize that the encoded representation isn't actually the number itself. All encodings have shortcomings, whether that's a size limit or some other complexity. Is "2" an integer OR a real, or is it an integer AND a real? From your school days you may remember that you can represent a repeating digit with a dot, for example, by putting a dot over the 9 in 1.9, rather than writing 1.99999999 forever. It turns out that one-point-nine-recurring is exactly the same number as 2. It's the convergent geometric sum 1 + (0.9 + 0.09 + 0.009 + ...), which equals 2. So there are two ways to write 2. Three if you include 2.0000. Not all number concepts are written uniquely, even on paper! That's highly inconvenient, and barely believable.
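JavaScript's own number encoding makes the same point concretely (a sketch; the first literal simply rounds to the nearest IEEE 754 double, which is 2):

```javascript
// A 17-digit literal of nines is closer to 2.0 than to any other
// representable double, so the engine stores it as exactly 2.
var nearlyTwo = 1.9999999999999999;

// Summing the geometric series 1 + 0.9 + 0.09 + 0.009 + ...
// converges on 2, within floating-point rounding error.
var sum = 1;
for (var k = 0; k < 20; k++) {
  sum += 0.9 * Math.pow(0.1, k);
}
```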