Despite Tesla's claims to the contrary, safety advocates believe the company's use of the name Autopilot is misleading.

It’s called Autopilot but, at least for now, Tesla’s semi-autonomous technology is a less-than-reliable co-pilot: it can require rapid human intervention to avert a crash. And that, two consumer groups argue, makes the use of the Autopilot name “deceptive and misleading.”

The Center for Auto Safety, or CAS, and Consumer Watchdog aren’t the first to raise concerns about the Tesla technology. German regulators briefly considered banning the Autopilot name. But the American non-profit safety groups have taken their concerns to the Federal Trade Commission, asking it to investigate not only how Tesla has named its semi-autonomous system, but also how the company promotes it.


In a letter to the federal agency, they claim Tesla has taken steps to “mislead and deceive customers into believing that Autopilot is safer and more capable than it is known to be.”

First introduced in 2015, Autopilot was one of the first driver assistance systems capable of briefly taking control of a vehicle under optimum conditions – usually on limited access highways. Nissan introduced its ProPilot Assist system for 2018, as did Cadillac with its new Super Cruise. Mercedes-Benz and Audi are among the other manufacturers launching what are technically known as Level 2 systems.

Tesla's Elon Musk has repeatedly defended using the name Autopilot, claiming users well understand the system's limitations.

But controversy has swirled around Tesla’s Autopilot almost from the moment it was activated on 60,000 Model X SUVs and Model S sedans – with the system now built into all new Tesla vehicles, including the compact Model 3.

Early owners frequently posted about the system on social media, one owner going so far as to record himself on video climbing into the backseat while his vehicle operated hands-free. Tesla CEO Elon Musk himself was photographed driving with his then-wife, hands out the window of his Model S, apparently to show off its capabilities.

But Tesla toned down its promotion of Autopilot’s capabilities after a May 2016 crash in Florida that took the life of 40-year-old Joshua Brown. The former Navy SEAL’s vehicle crashed into the side of a truck that had turned in front of it. The National Highway Traffic Safety Administration initially put the blame on Brown for failing to oversee the vehicle’s operation. But the National Transportation Safety Board’s separate investigation also faulted Autopilot for failing to distinguish between the white truck and a bright Florida sky.

Tesla subsequently modified both the hardware and software used to control the semi-autonomous system while also putting more emphasis on the need for a motorist to remain vigilant and be ready to regain control at a moment’s notice.

A Tesla spokesperson told TheDetroitBureau.com that the company “makes clear that although every vehicle has the hardware necessary for Full Self-Driving, actual ‘self-driving’ functionality is dependent on extensive software validation and regulatory approval.” But Musk continues to talk up the technology, recently telling analysts and reporters participating in a teleconference covering Tesla’s first-quarter earnings that a fully autonomous version should be ready to activate next year.

Tesla's own semi-autonomous Autopilot tech came under suspicion after this January crash.

(Click Here for details about Uber shutting down its autonomous testing program in Arizona.)

In the meantime, the automaker insists that owners know Autopilot’s capabilities are far more limited. “The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of,” the company said in a statement reacting to the CAS/Consumer Watchdog complaint.

Critics question whether owners really do have a “clear understanding” of Autopilot’s limitations, however. They point not only to those early videos but to a series of crashes, including the one that killed Joshua Brown and another earlier this year in which a Model X slammed into a freeway barrier in California, killing the driver. Tesla subsequently confirmed the vehicle was operating in Autopilot mode at the time of the crash, but CEO Musk also asserted that onboard data showed the driver repeatedly ignored alerts to retake control of the electric SUV in the moments leading up to the crash.

At least one other serious incident involving a vehicle operating in Autopilot mode has been reported in recent weeks, in this case a Model S slamming into the back of a stopped fire truck at around 60 mph.

There are now at least two NTSB investigations involving Tesla vehicles underway, though one focuses on the battery fire that followed a crash in Florida. Two teens were killed, a third seriously injured.

Whether Tesla has pushed too far in promoting Autopilot is a debate that has crossed borders. Germany considered banning the name in 2016, but ultimately relented when Tesla provided data indicating motorists were aware of the system’s limits.

Meanwhile, the Silicon Valley carmaker isn’t the only manufacturer that may be causing confusion among the public. The AAA this week called for automakers, in general, to adopt common and clearer names for the semi-autonomous technologies coming to market in rapidly increasing numbers “to help prevent (their) accidental misuse.”

(To see more about Elon Musk striking back after tough reviews of the Model 3, Click Here.)

That call came at the same time the road and travel service released the results of a new survey of more than 1,000 randomly selected U.S. motorists, which found 73% would be “afraid” to ride in a fully self-driving vehicle. Even while walking or riding a bike, 63% of those surveyed said they would feel less safe knowing they must share the road with self-driving vehicles.

4 Responses to “Safety Groups Want Tesla Autopilot Name Banned”

“…CEO Musk also asserted that onboard data showed the driver repeatedly ignored alerts to retake control of the electric SUV in the moments leading up to the crash…”. It does not mention whether the driver had been receiving alerts for the entire time they had been driving, not just the moments before the accident.

One of the problems may be that “Autopilot” is aviation related. It essentially keeps the plane level and on course. In cars, it might be referred to as a cruise control. However, pilots, (who are trained to accept the responsibility for their actions, including use of any flight assistance features), know the limitations of the aircraft systems. Additionally, the two vehicles operate in an entirely different environment: the aircraft (usually) has a tremendous distance between itself and anything else in front, behind, left, right, above and below. You can actually lose control of the aircraft, (deliberately or otherwise), and still have room to recover (which I know from experience; ask my (ex) flight instructor). Autos are stuck to the ground and can easily hit anything in any direction (including up if you include trucks going under low bridges/tunnels). There is little, if any, room for error. Many (most?) drivers may associate “Autopilot” with assumptions they have from too many movies that refer to something they don’t fully understand (and which the movies don’t explain or depict). That said, I am not sure that the problem isn’t with the term, but with the operator.

Bingo, and now you are back to the issue of Elon Musk getting so bent out of shape about the media. Musk refuses to question his own judgement and, like someone else he knows, thus feels that a critical media can only be “fake news” deserving attack.