American Speed Limits Are Based on 1950s Science

Speed limits might make you feel safe, or incredibly frustrated, or both. But either way there’s a bigger issue at hand: they’re based on outdated data and science from the mid-20th century.

In the US, our speed limits are derived from old studies, like this one from 1964 by traffic systems researcher David Solomon that looked only at rural roads in the 1950s. In line with conventional thinking, Solomon’s study fuels the premise that speed limits should be based on the speed that 85 percent of drivers on a road do not exceed, known as the 85th percentile speed. That means if 85 percent of the cars on a highway are going 60 mph or slower, that figure determines the speed limit.
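In code, the 85th-percentile rule amounts to a single percentile calculation over observed speeds. Here is a minimal sketch; the speed readings and the round-to-nearest-5-mph posting convention are illustrative assumptions, not any agency's actual procedure.

```python
import statistics

# Hypothetical free-flow speeds (mph) observed on one stretch of road.
observed_speeds = [52, 55, 57, 58, 58, 60, 60, 61, 62, 63,
                   63, 64, 65, 65, 66, 67, 68, 70, 72, 78]

# statistics.quantiles(n=100) returns the 1st through 99th percentiles;
# index 84 is the 85th percentile speed.
p85 = statistics.quantiles(observed_speeds, n=100)[84]

# Posted limits are conventionally rounded to the nearest 5 mph.
posted_limit = 5 * round(p85 / 5)
print(p85, posted_limit)
```

Under this rule, the handful of fastest drivers pull the percentile, and therefore the limit, upward.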

But with around 40,000 people dying in car accidents on American roads every year, something isn’t working, John Lower, a transportation engineer in California, told me. That includes the 85th percentile formula, which traffic safety advocates have called to repeal. They want instead a data-driven system that uses sensor technology to reflect actual traffic conditions. In many cases, this will force us to drive more slowly.

Lower has spent decades as a city transportation manager, and now works at Iteris, an analytics company. He believes it’s time to reinvent the way we implement speed limits. “The way it works now, there are higher-than-expected crash rates along the system,” he said.

Lower’s solution is in line with Vision Zero, a network of traffic safety advocates that he is part of, which wants to use more recent data and technology to inform our speed limits. (The network is funded by entities including Kaiser Permanente, a health insurance company.)

In an ideal scenario, Lower said, we would use smart sensors to collect information from vehicles, bicycles, and pedestrians to understand traffic flows. (A quick spin around the internet reveals that multiple such sensors are already on the market, including one from Urbiotica and another from SMATS.) This data would then be analyzed to set speed limits based on the traffic flow and the presence of the most vulnerable road users: bicyclists and pedestrians.
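A rough sketch of what such a system might do with sensor counts: start from the measured traffic speed, then cap the limit when vulnerable road users are detected. The function name, thresholds, and caps below are illustrative assumptions, not any real agency's formula.

```python
def recommend_limit(free_flow_mph, bike_count, pedestrian_count):
    """Hypothetical data-driven limit: begin with the observed speed,
    then lower it when bicyclists or pedestrians share the road."""
    limit = 5 * round(free_flow_mph / 5)   # round to nearest 5 mph
    if pedestrian_count > 0:
        limit = min(limit, 25)             # assumed pedestrian-safe cap
    elif bike_count > 0:
        limit = min(limit, 35)             # assumed bicycle-safe cap
    return limit

print(recommend_limit(42, bike_count=0, pedestrian_count=0))  # 40
print(recommend_limit(42, bike_count=3, pedestrian_count=0))  # 35
print(recommend_limit(42, bike_count=3, pedestrian_count=5))  # 25
```

The key shift from the 85th-percentile rule is that the limit responds to who is on the road, not just how fast cars are already going.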

“Every traffic signal has to have some form of detection,” Lower added.

Other possible changes tend to be more controversial, Lower said. For example, we could have variable speed limits that change with the traffic flow. Or photo enforcement with speed cameras, which would test the limits of how much people want to be surveilled.
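A variable speed limit can be as simple as a lookup keyed on measured traffic density: as the road fills up, the posted limit drops to smooth the flow. The thresholds and limits below are illustrative assumptions.

```python
def variable_limit(vehicles_per_mile, base_limit=65):
    """Toy variable speed limit: lower the posted speed as
    measured traffic density rises (thresholds are hypothetical)."""
    if vehicles_per_mile > 60:    # heavy congestion
        return 45
    if vehicles_per_mile > 40:    # moderate congestion
        return 55
    return base_limit             # free-flowing traffic

print(variable_limit(20))   # 65
print(variable_limit(50))   # 55
print(variable_limit(75))   # 45
```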

And then there’s the consideration of autonomous vehicles, which could become more adept than humans at maintaining safe speeds, though the technology right now falls short. “I’m looking forward to the day of autonomous vehicle where the driver is removed from the equation of safe travel,” Lower said. He’s hoping this will be especially helpful in safeguarding bicyclists and pedestrians.

As someone who’s still dealing with the aftermath of a taxi-bike crash, I hope so too.