Consumer Reports says Tesla should drop Autopilot name

July 14, 2016 by Tom Krisher and Dee-Ann Durbin

In this Sept. 15, 2015, file photo, a Tesla Model S is on display on the first press day of the Frankfurt Auto Show IAA in Frankfurt, Germany. Consumer Reports magazine is calling on electric car maker Tesla Motors to change the name of its Autopilot semi-autonomous driving system and to disconnect the automatic steering feature after a fatal crash in Florida. The magazine says in a statement that calling the system Autopilot promotes a dangerous assumption that Teslas can drive themselves. (AP Photo/Michael Probst, File)

Consumer Reports said Thursday that Tesla Motors is misleading car owners by calling its semi-autonomous driving system "Autopilot," potentially giving them too much trust in their car's ability to drive itself.

The influential magazine said Tesla should drop the Autopilot name and disconnect the automatic steering system until it's updated to make sure a driver's hands stay on the wheel at all times. The system currently warns drivers after a few minutes of their hands being off the wheel.

In an e-mail, a Tesla spokeswoman said the company has no plans to change the name, and that data it collects show drivers who use Autopilot are safer than those who don't.

With its statement, Consumer Reports joined a debate over autonomous driving technology that escalated after authorities revealed that Joshua Brown, 40, of Canton, Ohio, died in a May crash in Florida with the Autopilot on in his 2015 Model S. The system didn't detect a tractor-trailer that had turned in front of the car in bright sunshine, and Brown also failed to react.

The National Highway Traffic Safety Administration is investigating the wreck and the functioning of the Autopilot system. After the Brown crash, critics accused Tesla of giving drivers access to a system that wasn't ready, while supporters contended the company was improving automotive safety.

Tesla's Autopilot system uses cameras, radar and computers to detect objects and automatically brake if the car is about to hit something. It also can steer the car to keep it centered in its lane. The company says that before Autopilot can be used, drivers must acknowledge that it's an "assist feature" that requires both hands on the wheel at all times. Drivers also must be prepared to take over at any time, Tesla has said.

Yet Laura MacCleery, Consumer Reports' vice president of consumer policy, said naming the system Autopilot gives drivers a false sense of security. Autopilot, she wrote, can't actually drive the car, but it lets consumers keep their hands off the steering wheel for minutes at a time.

"We're deeply concerned that consumers are being sold a pile of promises about unproven technology," she said in a statement.

Earlier this week Tesla disclosed that a Model X SUV crashed early Saturday in Montana while the driver was using the autosteer feature on a two-lane road, which is not recommended by the company. Tesla, which gets information from its cars over the internet, said the car warned the driver at least once to place his hands on the steering wheel before it crashed.

MacCleery called on the Palo Alto, California, company to disable automatic steering until it updates the computer program to ensure a driver's hands are on the wheel.

Consumer Reports also said Tesla should issue clearer guidance on how Autopilot is used and what its limitations are. Tesla CEO Elon Musk has said he'll provide more thorough guidance in a blog posting, and the spokeswoman said that was coming.

Tesla released Autopilot last fall and says the system is still in a "public beta," or testing phase. Critics have complained that Tesla is using drivers as "guinea pigs"—a sentiment echoed by Consumer Reports.

Tesla said Autopilot underwent millions of miles of internal testing and is updated constantly. "We will continue to develop, validate, and release those enhancements as the technology grows," the spokeswoman said.

Not all magazines that test cars are critical of Tesla and Autopilot. Road and Track said on its website this week that Autopilot is a technological achievement that should make America proud. Autopilot is at least as safe as human drivers on the highway, in a car that doesn't use gasoline and performs like a sports car, the magazine said.

Consumer Reports has expressed concerns about Autopilot before. During a November podcast, Jake Fisher, auto testing editor, said the system provided an added layer of confidence. But he was surprised that he could take his hands off the wheel for 2 ½ minutes at a time and browse the web on its dashboard screen while driving.

In February, Consumer Reports urged Tesla to change a feature within Autopilot known as Summon, which lets owners start cars and move them out of a garage or parking spot automatically using a key fob or a smartphone. The magazine found that users couldn't stop the cars right away if they pressed the wrong button on the key fob. It also found that the cars kept moving when the smartphone app was closed. Tesla responded with a software update that limited the Summon feature to smartphones and required the user to keep a finger on the phone screen when the car was being summoned.



7 comments

Drivers also must be prepared to take over at any time, Tesla has said.

Which they can't, according to studies. It takes too long to switch over if the driver is in any way distracted, such as fiddling with their phone.

Autopilot is at least as safe as human drivers on the highway, in a car that doesn't use gasoline and performs like a sports car, the magazine said.

That sort of statement is highly irresponsible, considering how stupid the autopilot actually is.

Again, the failure in this case wasn't that the car failed to -see- the obstacle, because it had a multitude of sensors that all reported it, but a failure of the computer to -understand- the obstacle for what it was. The AI just has zero situational awareness, zero memory, and it's simply reacting to rudimentary pre-programmed cues, which led to the failure.

It's patently unsafe because it's practically impossible to make an all-covering set of rules that would take care of every detail.

Programming a car to drive safely is actually the same problem as computer vision in general. Given a picture of an umbrella, the computer "sees" just ones and zeroes, and it has to do some pattern matching trick to perceive it.

So the traditional way to do it is simply to load the computer up with two billion pictures of all possible umbrellas, opened and closed, partially closed, partially broken, upside down and filled with water... red, green, blue umbrellas, multicolored umbrellas etc. until the computer can find at least one example that matches with high confidence to say "that's an umbrella".

Immediately you see what the problem is when the computer has to identify more than umbrellas. Likewise, it's very much impossible to program a car with such an exhaustive amount of data to identify its surroundings, however they might be.
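The "load it up with examples and match against them" approach the comment describes can be sketched as a toy. Everything here is invented for illustration: each "image" is a tiny hand-made feature vector, and classification is nearest-neighbor matching with a confidence threshold. Real vision systems are vastly more complex, but the failure mode is the same shape: anything the example database never covered falls below the threshold.

```python
# Toy sketch of classification by matching against stored examples.
# Feature vectors and thresholds are made up for illustration only.
import math

# Hypothetical stored examples of "umbrella" in various states.
KNOWN_UMBRELLAS = {
    "open red umbrella":     (0.9, 0.8, 0.1),
    "closed black umbrella": (0.2, 0.9, 0.1),
    "broken umbrella":       (0.5, 0.6, 0.3),
}

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def classify(features, threshold=0.95):
    """Return the best-matching stored example, or None if no stored
    example matches with high enough confidence."""
    best_name, best_score = None, 0.0
    for name, example in KNOWN_UMBRELLAS.items():
        score = similarity(features, example)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A near-duplicate of a stored example matches with high confidence...
print(classify((0.88, 0.82, 0.1)))   # "open red umbrella"
# ...but a novel object the database never covered does not.
print(classify((0.1, 0.1, 0.95)))    # None
```

The point is that the system only ever answers "which stored example is this closest to?" — it has no notion of what an umbrella is, so coverage gaps become misses.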

The AI needs to develop an actual understanding, but that's a hard problem computer scientists have been wrestling with since the 1970s.

So the problem is this: you have algorithms that are maybe 90% accurate at identifying particular pre-programmed things like "a billboard, a car, a road sign" from the noisy sensor data, and you make them drive a car.

99.999% of the time it's going to drive just fine because most of the time nothing unusual happens or the driver takes over before anything gets to happen, and the simple rules are perfectly sufficient: stay on the road between lane markers, keep to the speed limit and don't crash into (what look like) obstacles.

Fine. Then a tractor-trailer turns in front of you and the computer thinks it's a hanging road sign, attempts to limbo under the trailer and decapitates the driver.

So the company programmers add a new rule to detect low-hanging obstacles, and the system works fine for a while, but then another driver gets killed for a different reason - and there are millions of possible exceptions and anomalies.
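The "fixed rules, then patch after each failure" pattern the comment describes can also be sketched as a toy. The obstacle properties, thresholds, and rules below are entirely invented for illustration and bear no relation to any real autonomy stack:

```python
# Toy sketch of rule-based obstacle handling, patched after a failure.
# All objects, heights, and rules are hypothetical.

def react(obstacle, rules):
    """Apply the first rule whose test matches; by default, drive on."""
    for test, action in rules:
        if test(obstacle):
            return action
    return "ignore"

# Original rule set: brake only for obstacles near road level.
rules_v1 = [
    (lambda o: o["height_m"] < 1.0, "brake"),
]

# A trailer's side sits well above road level.
trailer = {"kind": "tractor-trailer", "height_m": 1.2}

# v1 treats the trailer like an overhead sign and drives on.
print(react(trailer, rules_v1))  # "ignore"

# After the failure, a new rule is bolted on for low-hanging obstacles.
rules_v2 = [(lambda o: o["height_m"] < 2.5, "brake")] + rules_v1
print(react(trailer, rules_v2))  # "brake"
# ...which holds until the next anomaly the rules never anticipated.
```

Each patch closes one hole while leaving the open-ended space of unanticipated situations untouched, which is exactly the commenter's objection.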

Very well said. This is part of a culture of aggressively insisting that the future is here now, when it really isn't. I fully expect fully autonomous cars will be a reality... sometime in the mid-2030s. Trying to make a car fully autonomous... no steering wheel, no controls... with weak AI is not credible. They are moving toward a fully autonomous car the way a monkey climbing a tree is moving toward the moon. It will take strong AI to drive a car, and it won't be Tesla or any other car maker who develops strong AI.

I'm also concerned that this "the future is here now" culture completely ignores human psychology. They try to reprogram the person to fit their technology, for no other reason than the gosh-wowness of it all.

Case in point: Their operational definition of driver attention is having hands on the wheel. This is not valid. The human brain is a mechanism and works as such. Attention is produced in a mechanical way. Look up "salience." Attention goes away when there are no salient inputs. Eyes open, hands on the wheel - attention goes away. You can't reprogram a human away from this reliably. One human at certain times, alright. But not a large group consistently.

Critics are focusing on drivers being distracted or getting too comfortable and putting their attention elsewhere. But even if you have a mechanism that forces the driver to keep hands on the wheel, or even eyes open looking forward, attention is still going to go away without the stimulus that comes from physically and continuously driving the car.

Tesla and others are completely ignoring the science of cognitive psychology.

Your writer says in the article, "With its statement, Consumer Reports joined a debate over autonomous driving technology ..."

But it is NOT autonomous. That is Consumer Reports' point about using Autopilot. People are misled by the word. In aviation, where we mostly get the concept, flight crews are trained to know that it doesn't mean autonomous, but that the equipment **assists** the crew by taking care of some aspects of flying.

Once the writer uses autonomous without 'semi' in front of it, CR's point is confirmed.
