Tesla Autopilot limitations played role in deadly crash, NTSB says

The National Transportation Safety Board has determined the probable cause of a May 2016 crash involving a semitruck and a Tesla Model S, in which the electric sedan drove under the truck's trailer, killing the driver. According to the agency, the probable cause was the "truck driver's failure to yield the right of way and a car driver's inattention due to overreliance on vehicle automation."

Joshua Brown, 40, was killed near Williston, Florida, on May 7, 2016, when the sedan he was driving collided with a semitruck that was crossing a divided highway. The impact ripped the roof off the Tesla, which continued to travel several hundred feet; Brown was the sole victim of the crash. Early on, Tesla's semi-autonomous Autopilot driver-assist system was viewed as a possible contributing factor, with industry observers speculating that the engaged system may have misinterpreted the trailer several hundred feet ahead of it, mistaking it for a highway overhead sign. This first fatal accident involving Autopilot cast suspicion on the limitations and operation of the system, which uses radar and cameras to interpret the environment around it but requires drivers to keep their hands on the steering wheel most of the time and to pay attention to the road. The crash sparked an NTSB investigation as well as public debate about the latest autonomous technology.

"The NTSB also determined the operational design of the Tesla’s vehicle automation permitted the car driver’s overreliance on the automation, noting its design allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings," the agency said in a statement.

Perhaps the most jarring finding by the NTSB, and one suspected early on, was that the Autopilot system could not identify the truck crossing the road directly in front of it.

"The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash," the NTSB said in a statement. "Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate."

Autopilot invited additional criticism following several other crashes in which driver inattention was suspected, including at least one crash captured on a dashcam in China in which a Tesla Model S drove into the back of a truck at full speed, seemingly without any attempt by the driver to intervene in the seconds before the crash. Several Tesla owners in China reported that the car was advertised as "self-driving," with Tesla store employees reported to have given demo drives with their hands off the steering wheel. Tesla subsequently updated the Autopilot system to require greater driver attention during its operation.

The NTSB statement singled out this particular aspect of the system, citing the potential for and ease of Autopilot's misuse by drivers.

"The way in which the Tesla 'Autopilot' system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement," the agency determined.

The published NTSB findings also faulted the Tesla driver's pattern of use of the system, which suggested overreliance on Autopilot in addition to a lack of understanding of its limitations. The agency noted that Tesla subsequently updated the system to require greater attention, but did not speculate on whether the remedies had significantly decreased the risk of system misuse by drivers.

The NTSB ruled out mechanical failure and road design, two factors that were raised early on by industry observers. The agency also noted that the truck driver had used marijuana sometime before the crash, citing a post-crash drug test, but could not determine his level of impairment at the time of the crash, if any.

Tesla's initial response, which came some weeks after the crash itself when news of the incident first became public, focused on reiterating standard warnings about the system. The automaker's response drew some criticism at the time for essentially relying on drivers' willingness to adhere to safety precautions 100 percent of the time, and for absolving itself of blame in the event of operator misuse while neglecting to address very real public misconceptions about Autopilot's actual limitations.

"While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways were lacking, and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened."

Sumwalt noted in his comments that Brown and the truck driver both had at least 10 seconds to see and respond to each other's movements, a comment that highlighted the avoidable nature of the accident while also seemingly downplaying the effect of the truck driver's failure to yield to oncoming traffic.

The agency has made seven new safety recommendations as a result of its investigation of the accident, including the need for event data recorders, more stringent safeguards that limit an automated system's operation to the conditions for which it was designed, and new systems to monitor driver attention and operation of the system.