Rear-End Crashes Go Up After Red-Light Cameras Go In

When the nation's No. 1 cheerleader for red-light cameras admits there might be one teensy-weensy downside to the program, you just know it's going to be a lulu so large it couldn't be crammed under the carpet without making a bulge the size of a circus tent.

The Insurance Institute for Highway Safety (IIHS) recently enthused over traffic-tickets-by-mail schemes for an entire issue of its Status Report. On red-light cameras, however, it did allow that "most studies also reported increases in rear-end crashes."

It went on to say, "This isn't surprising. The more people stop on red, the more rear-end collisions there will be."

Duh!

Not to worry, however, because "photo enforcement leads to significant overall reductions in crashes," Susan Ferguson, the institute's senior vice president for research, assures us.

Well, that depends on who's telling the story. The institute itself did two studies, both in Oxnard, California, the most recent one published in 2001. Other studies have been done, but the IIHS roundly pooh-poohs them. Why? Because they don't follow a curious methodology the IIHS invented especially for Oxnard.

IIHS insists that all red-light-camera studies must account for "regression to the mean" and for "spillover effects."

Regression to the mean is a fact of life: in any one year there could be an extraordinarily large number of crashes at a particular intersection, but over several years the count will revert to the average (the mean).
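The mechanism is easy to demonstrate with a toy simulation (all numbers here are invented for illustration, not Oxnard's data): give 125 identical intersections the same true crash rate, "install cameras" at the 11 worst performers from year one, do nothing at all, and watch their counts fall in year two anyway.

```python
import random

def simulate_year(rng, n=125, mean=6.0):
    """Yearly crash counts for n intersections that all share the same
    true long-run rate -- no intersection is genuinely worse than another."""
    return [sum(rng.random() < mean / 365 for _ in range(365))
            for _ in range(n)]

def regression_demo(seed=1, n=125, picked=11):
    rng = random.Random(seed)
    year1 = simulate_year(rng, n)
    year2 = simulate_year(rng, n)
    # "install cameras" at the worst sites from year one -- exactly
    # how camera sites tend to be chosen in the real world
    worst = sorted(range(n), key=lambda i: year1[i], reverse=True)[:picked]
    before = sum(year1[i] for i in worst) / picked
    after = sum(year2[i] for i in worst) / picked
    return before, after

before, after = regression_demo()
print(f"worst-11 average, year 1: {before:.1f}")  # high, by selection
print(f"worst-11 average, year 2: {after:.1f}")   # falls back toward the mean
```

The "improvement" at the chosen sites is pure selection effect; any study that picks its camera sites this way and then compares before to after will manufacture a benefit out of thin air.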

Funny that IIHS insists regression be accounted for in studies at stoplights when it never considers the same factor in its studies of speed limits.

Spillover effect is IIHS's trick for giving the cameras credit for reducing fatalities even where they aren't. It assumes that red-light cameras at a few intersections will cause drivers to stop promptly all over town, or all over the county, or maybe all over the state, so improvements outside the cameras' ZIP Codes are credited to them nonetheless. As statistical acrobatics go, this one is breathtaking.

But you ain't seen nothin' yet. The obvious way to gauge the payoff of red-light cameras is to compare intersections with cameras to those without, then zoom in on crashes actually caused by drivers running red lights. Instead, IIHS considered all crashes at all 125 signalized intersections in Oxnard and concluded that injury crashes dropped by 29 percent due to the cameras, even though they were installed at only 11 intersections.

Spillover effect, don't you know.

Skeptics will notice that crashes went down rather randomly all over town, and some ordinary intersections outperformed those with the gotcha equipment. The cameras look remarkably ineffectual until, just in time, spillover effect arrives to snatch victory from the jaws of ho-hum.
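The skeptic's check is plain arithmetic: compare the change at camera intersections against the change at intersections without them. A sketch with made-up counts (nothing here is Oxnard's actual data):

```python
# Hypothetical before/after injury-crash counts, purely illustrative
camera_sites = {"before": 110, "after": 80}    # 11 intersections with cameras
other_sites = {"before": 1140, "after": 820}   # 114 intersections without

def pct_change(d):
    """Percent change from before to after (negative = fewer crashes)."""
    return (d["after"] - d["before"]) / d["before"] * 100

print(f"camera sites: {pct_change(camera_sites):+.0f}%")
print(f"other sites:  {pct_change(other_sites):+.0f}%")
# If crashes fell about as much where there are no cameras, the citywide
# drop cannot be credited to the cameras -- unless "spillover" is invoked
# to claim the decline at the other 114 intersections for them too.
```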

Skeptics will also notice that these IIHS studies, which pretend to be about red-light running, never bother to isolate those crashes specifically caused by running red lights. Why? The institute says, "The crash data did not contain sufficient detail to identify crashes that were specifically red-light-running events."

This is believable only to those who've never heard of police reports. Oxnard, like most California jurisdictions, reports crashes according to the California Highway Patrol protocol, which includes a "primary collision factor," i.e., the cause of the crash. Those reports are collected into a CHP database (SWITRS). Running red lights falls under the category of "stop signals and signs." According to Steve Kohler of the CHP, it includes stop signals and stop signs. Nothing else.

Since all signalized intersections in Oxnard are, by definition, controlled by signals and not stop signs, red-light running should be neatly isolated as a "primary collision factor." When IIHS finds numbers that support the story it wants to tell, it jumps on them like a trampoline. When it hides from the numbers, as it did here, you can bet they go the wrong way.