Speedometer Scandal!

Can you trust your most frequently consulted gauge?

Regular readers have probably noticed that when we describe a vehicle that really gets our juices flowing, we tend to hyperbolize about the accuracy and precision with which the steering wheel and pedals communicate exactly what is happening down where the rubber meets the road. It has recently come to our attention, however, that many of the cars we like best are surprisingly inaccurate about reporting the velocity with which the road is passing beneath the tires. Or, to put it another way, speedometers lie.

Yes, ladies and germs, we are scooping 20/20 and 60 Minutes with this scandal: Speedometers Lie! Okay, "exaggerate" may state it more aptly, if less provocatively.

When traveling at a true 70 mph, as indicated by our highly precise Datron optical fifth-wheel equipment, the average speedometer (based on more than 200 road-tested vehicles) reads 71.37 mph. Wait, wait! Before you roll your eyes and turn the page, let us dig just a bit deeper and reveal some dirt.

Sorted by price, luxury cars are the least accurate, and cars costing less than $20,000 are the most accurate. By category, sports cars indicate higher speeds than sedans or trucks. Cars built in Europe exaggerate more than Japanese cars, which in turn fib more than North American ones. And by manufacturer, GM's domestic products are the most accurate, and BMW's are the least accurate by far. One other trend: Only 13 of our 200 test speedos registered below true 70 mph, and only three of those were below 69 mph, while 90 vehicles indicated higher than 71 mph. Are our cars trying to keep us out of traffic court?

To understand, let's first study the speedometer. In the good old days, plastic gears in the transmission spun a cable that turned a magnet, which imparted a rotational force to a metal cup attached to the needle. A return spring countered this force. Worn gears, kinked or improperly lubed cables, tired springs, vibrations, and countless other variables could affect these mechanical units.

But today, nearly all speedometers are controlled electronically. Typically, they are driven by either the vehicle's wheel-speed sensors or, more commonly, by a "variable reluctance magnetic sensor" reading the speed of the passing teeth on a gear in the transmission. The sine-wave signal generated is converted to speed by a computer, and a stepper motor moves the needle with digital accuracy.
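The pulses-to-speed conversion described above can be sketched in a few lines. Everything here is illustrative: the tooth count, axle ratio, and tire circumference are assumed values, not figures from any particular car.

```python
# Hypothetical sketch of the electronic speedometer math: the sensor emits
# one pulse per passing gear tooth, and the computer turns pulse frequency
# into road speed. All constants below are assumptions for illustration.

TEETH_PER_REV = 40            # teeth on the transmission output gear (assumed)
AXLE_RATIO = 3.42             # final-drive ratio (assumed)
TIRE_CIRCUMFERENCE_FT = 6.5   # rolling circumference of the tire (assumed)

def speed_mph(pulses_per_second: float) -> float:
    """Convert sensor pulse frequency to indicated road speed."""
    output_shaft_rps = pulses_per_second / TEETH_PER_REV  # gear revolutions/sec
    wheel_rps = output_shaft_rps / AXLE_RATIO             # wheel revolutions/sec
    feet_per_second = wheel_rps * TIRE_CIRCUMFERENCE_FT   # road speed in ft/sec
    return feet_per_second * 3600 / 5280                  # ft/sec to mph
```

With these made-up constants, roughly 2,160 pulses per second works out to about 70 mph; in a real car the computer's calibration constants play exactly this role.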

Variations in tire size and inflation levels are the sources of error these days. Normal wear and underinflation reduce the diameter of the tire, causing it to spin faster and produce an artificially high reading. From full tread depth to baldness, speeds can vary by up to about two percent, or 1.4 mph at 70 mph. Lowering tire pressure 5 psi, or carrying a heavy load on the drive axle, can result in about half that difference. Overinflation or oversize tires slow down the speedometer. All our speed measurements were made on cars with new stock tires correctly inflated, but one might expect a manufacturer to account for wear and to bias the speed a bit low; results suggest that not to be the case.
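The tire-wear arithmetic above is easy to check: if the rolling circumference shrinks by some fraction, the wheel must spin proportionally faster to cover the same ground, and the speedometer reads high by that proportion. A minimal sketch, assuming the sensor counts wheel (or gear) revolutions as described earlier:

```python
# Back-of-envelope check of the tire-wear numbers: a worn tire's smaller
# rolling circumference makes it spin faster at a given road speed, so the
# indicated speed rises by the same proportion.

def indicated_speed(true_mph: float, wear_fraction: float) -> float:
    """Indicated speed when the circumference has shrunk by wear_fraction."""
    return true_mph / (1.0 - wear_fraction)

# Full tread depth to baldness is roughly a two-percent change in
# circumference, which at a true 70 mph is about a 1.4-mph error:
#   indicated_speed(70, 0.02) -> approximately 71.4
```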

So we sought out the rule book to find out just how much accuracy is mandated. In the U.S., manufacturers voluntarily follow the standard set by the Society of Automotive Engineers, J1226, which is pretty lax. To begin with, manufacturers are afforded the latitude either to aim for plus-or-minus two percent of absolute accuracy or to introduce a bias to read high, on a sliding scale from minus-one to plus-three percent at low speeds to zero to plus-four percent above 55 mph. And those percentages are not of actual speed but of the total speed range indicated on the dial. So the four-percent allowable range on an 85-mph speedometer is 3.4 mph, and on a 150-mph speedometer it's 6.0 mph.
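That percentage-of-full-scale wrinkle is worth making concrete, since it means a faster dial buys the manufacturer a wider error band. A quick check of the article's two examples (the four-percent figure is the upper band above 55 mph):

```python
# J1226's tolerance is a percentage of the dial's full-scale reading,
# not of the speed being measured.

def allowable_range_mph(full_scale_mph: float, tolerance_pct: float = 4.0) -> float:
    """Allowable error band, computed against the dial's full-scale value."""
    return full_scale_mph * tolerance_pct / 100.0

# 85-mph dial:   85 * 4% = 3.4 mph of allowable error
# 150-mph dial: 150 * 4% = 6.0 mph of allowable error
```

So at an identical true 70 mph, the car with the 150-mph speedometer is permitted nearly twice the error of the car with the 85-mph dial.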