Bloomberg


Tesla Draws Rebuke for Blaming Autopilot Death on Model X Driver

(Bloomberg) -- Consumer-safety advocates and autonomous-vehicle experts criticized Tesla Inc. for issuing another statement about the death of a customer that pinned the blame on driver inattentiveness.

Days after publishing a second blog post about the crash involving Walter Huang, a 38-year-old who died last month in his Model X, Tesla issued a statement in response to his family speaking with San Francisco television station ABC7. The company said the “only” explanation for the crash was “if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

“I find it shocking,” Cathy Chase, president of the group Advocates for Highway and Auto Safety, said by phone. “They’re claiming that the only way for this accident to have occurred is for Mr. Huang to be not paying attention. Where do I start? That’s not the only way.”

Groups including Advocates for Highway and Auto Safety and Consumer Reports have criticized Tesla for years for naming its driver-assistance system Autopilot, with the latter calling on the company to choose a different moniker back in July 2016. The two organizations share the view of the National Transportation Safety Board, which has urged carmakers to do more to ensure drivers using partially autonomous systems like Autopilot remain engaged with the task of driving. The U.S. agency is in the midst of two active investigations into Autopilot-related crashes.

It’s Tesla’s responsibility to provide adequate safeguards against driver misuse of Autopilot, including visual and audible warnings when the system needs a human to retake control, Chase said. “If they’re not effective in getting someone to re-engage -- as they say that their drivers have to -- then they’re not doing their job.”

High Stakes

The stakes for Tesla’s bid to defend Autopilot are significant. The NTSB’s investigation of the March 23 crash involving Huang contributed to a major selloff in the company’s shares late last month. Chief Executive Officer Elon Musk claimed almost 18 months ago that the system would eventually render Tesla vehicles capable of full self-driving, and much of the value of the $51 billion company is linked to views that it could be an autonomous-car pioneer.

Tesla has declined to say how long drivers can now use Autopilot between visual or audible warnings to have a hand on the wheel. It’s also refused to comment on how many alerts can be ignored before the system disengages, what version of Autopilot software was in Huang’s Model X, or when the car was built.

“Just because a driver does something stupid doesn’t mean they -- or others who are truly blameless -- should be condemned to an otherwise preventable death,” said Bryant Walker Smith, a professor at the University of South Carolina’s School of Law, who studies driverless-car regulations. “One might consider whether there are better ways to prevent drivers from hurting themselves or, worse, others.”

Under Investigation

The NTSB is looking into the crash that killed Huang, as well as a collision in January involving a Tesla Model S that rear-ended a fire truck parked on a freeway near Los Angeles with Autopilot engaged. The agency said after Tesla’s second blog post about the Huang incident that it was unhappy with the company for disclosing details during its investigation.

In its latest statement, Tesla said it is “extremely clear” that Autopilot requires drivers to be alert and have hands on the steering wheel. The system reminds the driver of this every time it’s engaged, according to the company.

“Tesla’s response is reflective of its ongoing strategy of doubling down on the explicit warnings it has given to drivers on how to use, and not use, the system,” said Mike Ramsey, an analyst at Gartner Inc. “It’s not the first time Tesla has taken this stance.”

Huang’s wife told ABC7 he had complained before the fatal crash that his Model X had steered toward the same highway barrier he collided with on March 23. The family has hired Minami Tamaki LLP, which said in a statement Wednesday that it believes Tesla’s Autopilot is defective and likely caused Huang’s death. The San Francisco-based law firm declined to comment on Tesla’s statement.

Crash-Rate Claim

The National Highway Traffic Safety Administration, which has the power to order recalls and fine auto manufacturers, found no defect after investigating the May 2016 crash involving a Tesla Model S driven on Autopilot by Josh Brown, a former Navy SEAL. The agency closed its probe in January 2017.

According to data Tesla gave NHTSA investigators before the agency’s decision not to order a recall, Autopilot’s steering system may reduce the rate of crashes per million miles driven by about 40 percent, a figure the company cited in its latest statement.

“We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road,” Tesla said. “The reason that other families are not on TV is because their loved ones are still alive.”

Neither Tesla nor NHTSA has released the underlying data to support the crash-rate reduction claim.

“Tesla explicitly uses data gathered from its vehicles to protect itself, even if it means going after its own customers,” said Ramsey, the Gartner analyst.