This site may earn affiliate commissions from the links on this page. Terms of use.

Tesla has now recorded one Autopilot fatal crash in 130 million miles of driving, in the wake of a May 7 accident in Williston, Florida. Joshua Brown, 40, of Canton, OH, was in self-driving (Autopilot) mode when his Tesla Model S failed to brake as a tractor-trailer turned left in front of it. Tesla said it promptly notified the National Highway Traffic Safety Administration (NHTSA), which is opening an investigation.

Now come reports that Brown, identified as a technology guru and former Navy SEAL, may have been distracted. Brown may have been watching a Harry Potter video at the time, according to a report by the Associated Press. The AP quoted the truck driver, Frank Baressi, saying that the Tesla driver was “playing a Harry Potter on the TV screen … it was still playing when he died and snapped a telephone pole a quarter mile down the road.” Baressi said he heard, not saw, the video playing. Tesla says it’s not possible to play videos on the 17-inch center stack LCD when the car is moving.

First autonomous car fatality ever?

It is believed this is the first fatality in a self-driving car operating in autonomous mode. That mode is Level 2 self-driving: a vehicle using multiple sensors and driver assists — typically adaptive cruise control (ACC), lane centering assist (which keeps the car centered in its lane), and blind spot detection (with return to lane). That’s on NHTSA’s scale of 0 (no automated features) to 4 (fully self-driving, no driver required, and there may not even be a steering wheel).

The police accident report, a public document, makes no mention of any distraction such as a video playing. Tesla said, “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied … Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert.” Visual sensors might miss a light color against a light background, but ACC radar should be able to lock onto a truck body or trailer if it’s moving. Some adaptive cruise control algorithms don’t respond to a solid mass that isn’t moving, such as a car stopped at a traffic light, as opposed to a car slowing for the light. Tesla uses both optical and radar systems for autonomous driving.

In the US, there’s one auto fatality per 94 million miles driven, Tesla says. NHTSA expresses this as fatalities per 100 million miles driven, or 1.06 based on Tesla’s figure. That’s close to NHTSA’s reported 1.08 per 100 million miles driven for 2014, the last year for which NHTSA has final data. It takes NHTSA six months to produce even a preliminary report from its Fatality Analysis Reporting System (FARS) database.
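The unit conversion behind these rates is simple to check. A quick sketch, using only the mileage figures quoted above:

```python
# Convert "one fatality per X million miles" into NHTSA's convention
# of "fatalities per 100 million miles driven."
def per_100m_miles(miles_per_fatality_in_millions: float) -> float:
    return 100 / miles_per_fatality_in_millions

us_rate = per_100m_miles(94)         # Tesla's US figure: 1 per 94M miles
autopilot_rate = per_100m_miles(130)  # 1 Autopilot fatality per 130M miles

print(round(us_rate, 2))         # 1.06, close to NHTSA's 1.08 for 2014
print(round(autopilot_rate, 2))  # 0.77
```

Note the caveat implicit in these numbers: a single fatality over 130 million miles is a very small sample from which to draw rate comparisons.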

An NHTSA analysis dated today says preliminary data shows fatalities for 2015 will be about 1.12 per 100 million miles driven, with roughly 35,200 deaths versus 32,675 the previous year, an increase of about 8% in deaths, or 4% per 100 million miles driven. Look for dire-consequences warnings from the feds as this data goes final, likely with distracted driving cited as a cause of the increase. Never in history has NHTSA added to a report, “But if you look at traffic fatalities over longer periods, the trend line is always down, with occasional hiccups.” That’s not scary enough for an agency strapped for funding. Fifty years ago (1966), there were 5.50 deaths per 100 million miles traveled, five times today’s rate. In 1989-90, deaths per 100 million miles driven were twice today’s numbers.
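The two percentage figures measure different things: deaths rose faster than the per-mile rate because miles driven also rose. A quick check using only the per-mile rates quoted above:

```python
# Sanity-check the per-mile rate increase: 2015 preliminary estimate of
# 1.12 fatalities per 100M miles vs. NHTSA's 1.08 for 2014.
rate_2014 = 1.08
rate_2015 = 1.12
pct_change = (rate_2015 / rate_2014 - 1) * 100

print(round(pct_change, 1))  # 3.7, i.e. "about 4%" per 100 million miles
```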

Tesla was quick to note (see below) that its Autopilot software is still in beta, even though it’s widely used.

Tesla: “Be prepared to take over at any time”

We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.

What can we learn from the Tesla crash?

Look for lots of second-guessing about the safety of self-driving cars, especially when filtered through the lens of writers without a solid background in auto safety reporting or basic statistics. Nobody compares safety today with a generation ago. Cars get safer every year, although as noted above, there are blips.

It’s a misnomer to call Tesla Autopilot self-driving in the sense we’ll mean in five or 10 years. This is assisted driving that centers the car in the lane, keeps pace with the car ahead, and pulls you back if you try to change lanes when there’s a car coming up in your blind spot. It’s amazing cars do this much today. But: You still have to watch the road and keep your hands on the wheel.

There’s a video online of Brown behind the wheel of “Tessy,” his name for the car, demonstrating hands-off how well his Tesla drives on Autopilot. He credited Tessy with avoiding an accident. It’s possible Level 2 autonomous driving builds a false sense of security. It’s unclear whether the report of a video playing will pan out. Friends recall Brown as a talented geek with the skills of an electrical engineer, so there will be questions about whether the dashboard display could have been modded to do what Tesla says it can’t.

Don’t lose sight of how safe semi-self-driving cars have become — especially the accidents avoided when lane departure warning or blind spot detection keeps the car in its lane, or when adaptive cruise control slows you down because the car ahead brakes while you’re tuning the radio or sneaking a peek at your phone. Nobody tells NHTSA about the fatal accidents they almost got into.
