Facing increased scrutiny recently about the safety of the Autopilot feature on his company's cars, Tesla Motors CEO Elon Musk over the weekend hinted at a coming announcement about a "Top Secret Tesla Masterplan, Part 2."

Tesla Motors' original master plan was detailed by Musk in a 2006 blog post describing his goal to "help expedite the move from a mine-and-burn hydrocarbon economy towards a solar electric economy." In a tweet yesterday morning, Musk said he hoped to publish an update to the plan "later this week."

Earlier this month, news emerged that a 2015 Tesla Model S was linked to the first known fatal crash involving self-driving car technology, prompting some calls for Tesla to disable the feature. And a Tesla proposal announced last month to acquire the solar energy company SolarCity, co-founded by two of Musk's cousins and chaired by Musk, has also been criticized in light of Tesla's and SolarCity's huge debts.

Musk, who also leads the space transport company SpaceX, has a reputation as a technology visionary. However, he has also been criticized for overpromising and under-delivering, especially on his pledge to produce an affordable, mass-market electric car. Tesla has yet to do so, and its role in the May 7 crash that killed an Ohio man has raised new questions about the safety of its technology.

Customers as 'Guinea Pigs'

Following news of the May 7 death of Tesla owner Joshua D. Brown, 40, in a collision with a truck in Florida, the advocacy organization Consumer Watchdog published an open letter to Musk calling on his company to disable the self-driving Autopilot feature on its cars immediately until it can be shown to meet highway safety standards.

"One of the most troubling aspects of Tesla's deployment of autopilot is the decision to make the feature available in so-called Beta mode," according to the letter. "That's an admission it's not ready for prime time. Beta installations can make sense with software such as an email service; they are unconscionable with systems that can prove fatal when something goes wrong. You are in effect using your customers as human guinea pigs."

The National Highway Traffic Safety Administration (NHTSA) is investigating the accident involving Brown's 2015 Tesla Model S, which was reportedly using the Autopilot feature when it drove under an 18-wheel semi-trailer truck that had passed in front of it. The agency is also looking into a July 1 incident in which a Tesla Model X drove into a concrete median barrier in Pennsylvania.

A spokesperson for the NHTSA told us today that the agency's Office of Defects Investigation is opening up a preliminary evaluation of the design and performance of automated driving systems in the Tesla Model S. She said the agency is also collecting information from the Pennsylvania State Police, Tesla and the driver of the vehicle involved in the July 1 crash to determine whether the Autopilot feature was activated at the time.

"The opening of the Preliminary Evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles," the spokesperson added.

Autopilot Finger-Pointing

Released Thursday, Consumer Watchdog's letter also took Tesla to task for an "emerging pattern of blaming victims" involved in crashes of its vehicles. It pointed to a June 30 blog post on Tesla's Web site that described Autopilot as "an assist feature that requires you to keep your hands on the steering wheel at all times."

The Consumer Watchdog letter also pointed to Tesla's response to the non-fatal July 1 crash that stated: "Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident." Such statements show that Tesla is not willing to assume responsibility when Autopilot fails, according to the watchdog group.

In addition to its investigations of the two recent Tesla crashes, the NHTSA is also looking into 20 complaints from owners of 2016 Tesla Model S and 2016 Tesla Model X vehicles. They include several reports expressing concern about the safety of the Autopilot feature, along with complaints about stalling, unintended acceleration and battery problems.

Tell Us What You Think

Former Tesla Scientist:

Posted: 2016-08-10 @ 7:45am PT

I worked at Tesla and left because of Elon's inability to understand that ANDROID is not something you want running mission-critical applications. They had the opportunity to use an operating system from a company in Santa Barbara that in 30 years has never had a driving system or autopilot fail on jets, both fighter and commercial, along with powering car systems that don't have current issues. Instead Elon mandated Android because it was FREE.

The system we had running actually would have allowed us to view exactly what happened in the accident. I predict that Tesla and the 'teams' that seem to change often will be at the root of many issues related to security and the integrity of the systems in the cars.