Too Much Safety Could Make Drivers Less Safe

Back in 1908, when Mercedes was a young automaker and drivers had to navigate around horse-drawn carriages, signaling turns by rolling down an isinglass window and sticking out an arm, psychologists Robert Yerkes and John Dodson first described a paradox that continues to plague would-be automotive automators.

The Yerkes-Dodson law, as it became known, describes the relationship between arousal and performance. Give people too little to pay attention to and they’ll become complacent. Give them too much and they’ll become overwhelmed. For the best performance, the two researchers said, humans must work in the sweet spot where manageable tasks keep them interested.
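The law itself is qualitative, but the inverted-U relationship it describes can be illustrated with a simple curve. A minimal sketch, assuming a Gaussian shape and hypothetical parameter values (the function name and the numbers are illustrative, not taken from the original research):

```python
import math

def performance(arousal, optimum=0.5, width=0.2):
    """Toy model of the Yerkes-Dodson inverted-U: performance peaks
    at a moderate arousal level and falls off at either extreme.
    The Gaussian shape and parameters are illustrative assumptions."""
    return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

# Understimulated driver (empty straightaway): low performance.
bored = performance(0.1)
# Moderate workload -- the "sweet spot" the researchers described.
engaged = performance(0.5)
# Overloaded driver (rush-hour merge, lost, scorpion): also low.
swamped = performance(0.95)
```

The point the toy model captures is that performance is not monotonic in workload: taking tasks away from a driver can degrade performance just as surely as piling tasks on.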

In automotive terms, this means drivers are at their best when they’re paying attention to their surroundings but they aren’t flummoxed. You’re just as likely to do a bad job driving down some lonely treeless straightaway in Kansas as you are merging onto the New Jersey Turnpike at rush hour, lost, with a scorpion on your shoulder.

This is important because, as automakers pack their cars with more and more semiautonomous safety technology like adaptive cruise control and automatic braking, driving a car becomes easier and easier. We are, essentially, given less to pay attention to while we’re taught that our cars are watching out for us.

Just how the rapid adoption of semiautonomous vehicle systems will affect overall motor vehicle safety remains to be seen, but it may fundamentally affect the kinds of crashes we see and whether active safety systems gain widespread acceptance.

These systems, in practice if not by design, allow drivers to pay less attention to the road ahead. The impact on performance and safety depends upon how big a workload drivers have. In stressful situations where drivers are easily overwhelmed, such as stop-and-go traffic or searching for an address in an unfamiliar neighborhood, electronic nannies can be a big help to a driver whose cognitive load is maxed out.

But put that same driver on an arrow-straight road in Kansas in the dead of the night and it could be a problem. The driver isn’t paying attention and may not see trouble coming until it’s too late.

In that situation, “we want to increase demand, we don’t want to decrease demand,” said Bryan Reimer, a researcher at the MIT AgeLab and associate director of the New England University Transportation Center. “You want to have enough workload that you maintain an adequate level of performance.”

Distraction isn’t a matter of choice, either. “It’s not whether you want to attend or you don’t want to attend,” Reimer said. “It’s fundamentally in the back of the brain that you need a certain amount of demand to sustain attention.”

Clifford Nass, a communications professor at Stanford University who studies multitasking, put it more bluntly.

“People are always happy to be lazy, and it’s sort of a rule of safety design,” he said. “So if you give people the slightest opportunity to be lazy, they’ll take to it with great gusto and joy.”

This is especially true for frequent multitaskers — and most apparent with young people, whose brains have developed to crave new information.

“Younger people — the majority of multitaskers — like new things, new information rather than old information,” said Nass. “As a result, if you can say, ‘You don’t have to keep on staring at this boring road,’ that makes their brains extremely happy. It doesn’t make them safe, but it makes them happy.”

Almost Doesn’t Count

The big problem with semiautonomous vehicles is summed up by the adjective used to describe them. The “semi” in semiautonomous means the technology still requires human interaction. That can throw a wrench in the works.

“The intersection of autonomy and human behavior is a difficult problem, and we are the problem,” said Reimer. “We’re not predictable, nor are we completely rational.”

Your car can’t tell if you’re tired, daydreaming or listening to a conference call on speakerphone. Therefore, it doesn’t know whether it’s appropriate to engage active safety features. In theory, perfectly programmed fully autonomous cars would be safer than human drivers, but that technology is still years away.

The automakers offering semiautonomous active safety systems primarily have been concerned with developing reliable technology. And rightly so: We wouldn’t want new Benzes and Lexuses steering into traffic or hitting the brakes at random. But these systems fall short in creating a seamless transition between human and machine.

“The point the automakers are making, which is true, is that they go to extreme lengths to make these systems work and extremely reliable,” Nass said. “The reliability on these systems is very high. If you have automatic cruise control, it’s not extremely often you have to jump into the fray.”

Therein lies the problem. We come to count on our cars to keep us out of trouble, even in situations the technology wasn’t designed to handle.

“Road hazards other than the car in front of you are so rare, especially on the highway where these adaptive cruise control systems would be in play, that they would, over time, encourage a complacency that undermines safety,” said Erik Blaser, a psychology professor at the University of Massachusetts, Boston, who studies vision and perception. “You stop paying attention to the driving.”

In a controlled environment such as a lab, Blaser said, subjects without distractions may be more “tuned” to particular visual stimuli, such as jumping deer and flashing brake lights. But driving isn’t a controlled environment. Friends send texts, lousy songs come on the radio and interesting scenery passes by — and important visual information goes unnoticed. Active safety systems can exacerbate this.

“I wouldn’t be surprised if in the long term you actually wind up missing more because you learn that you don’t have to pay attention to the driving as much,” Blaser said.

The Learning Curve

Reimer said semiautonomous vehicles work best with drivers who trust the technology and are adequately trained how and when to use it. Gaining trust in new technology isn’t a problem — you don’t see many people demanding cars without antilock brakes or airbags — but teaching drivers how to use it poses unique challenges.

“The functionality of the technology is very good at this point, but how do you teach people how to use it appropriately?” Reimer said. “Reading the owner’s manual is not going to provide the information that you need.”

Instead, he suggests ongoing, lifetime driver training and an end to the American tradition of driver’s education only for new drivers. Auto dealerships should spend more time working with customers to fully explain the limits of automotive safety technology before letting them drive home. Looking further ahead, cars could one day actively detect a driver’s state — whether tired or distracted, for instance — and engage semiautonomous safety technologies only when appropriate.

The limitations of active safety systems must be second nature to drivers, said Nass. Drivers must know what the technology can and can’t do so they don’t rely upon it in situations where it won’t work.

“It’s always a problem with partially autonomous systems,” he said. “You’ll always have the issue of remembering what it does and what it doesn’t do, and in real time we don’t want people pondering that.”

Nass says the best safety systems go unnoticed. He calls them “secret” safety systems, those unseen fail-safe systems like anti-lock brakes and electronic stability control (ESC). Such systems step in only when a driver is in trouble. Because drivers aren’t constantly made aware of their presence, they tend not to change their behavior. In other words, it doesn’t make them lazy.

“When it’s obvious [safety systems] are doing something, we say, ‘Oh I don’t have to do as much,'” Nass said.

Anti-lock brakes and stability control, by contrast, gained widespread acceptance precisely because drivers hardly know they’re there.

“And it’s because people don’t alter their driving because they have ESC, because we don’t advertise that,” Nass said. “It’s a very different psychology.”

Time for a Tradeoff

Consider the example of driving a lonely road with adaptive cruise control engaged and your brain disengaged, convinced the car is looking out for you. A car suddenly cuts you off. You don’t notice it, but your car’s radar does and hits the brakes, avoiding a collision. Score one for technology.

Twenty miles down the road, a deer darts across your path, requiring evasive action. But you’ve been zoned out for the past 15 miles, convinced your electronic nanny will protect you. Trouble is, the system wasn’t designed to detect a deer, and you plow into the animal.

Therein lies the paradox of semiautonomous vehicles: They’re very good at avoiding some problems but may exacerbate — or even create — others. We may not even know what those problems are until we see a lot more vehicles with the technology.

Collision mitigation systems won’t let us collide in the ways we’re used to, Reimer said, but they may let us collide in ways we aren’t. Automatic braking has the potential to nearly eliminate rear-end collisions, and lane-departure warning could drastically reduce the incidence of sideswipe and merging collisions. But how human drivers react to, and rely upon, these technologies could create new, unforeseen problems.

“I’m really a believer that the roads are going to get a little more difficult and dangerous with autonomous systems in the vehicle before they get safer,” Reimer said. “When the driver’s in the loop but yet has control to take over, it’s tough.”

How the roads change remains to be seen.

“I won’t predict what it is — but the likelihood is that there is an effect,” Reimer said. “The effect may be smaller than the problem itself, but there is an effect.”