
On Friday, Sen. Edward Markey (D-Mass.) called on Tesla to adopt "common sense recommendations" for its Autopilot driver-assistance system to "guarantee the safety of its technology." Specifically, he asks the automaker to stop implying that the system is capable of self-driving and to fit a proper driver-monitoring system. The senator began his investigation into the company's driver-assist package following multiple reports of drivers circumventing the cars' rudimentary safety controls.

Autopilot is a flawed system, but I believe its dangers can be overcome... I have been proud to work with Tesla on advancing cleaner, more sustainable transportation technologies. But these achievements should not come at the expense of safety. That's why I'm calling on Tesla to use its resources and expertise to better protect drivers, passengers, pedestrians, and all other users of the road. I urge Tesla to adopt my common sense recommendations for fixing Autopilot, which include rebranding and remarketing the system to reduce misuse, as well as building backup driver monitoring tools that will make sure no one falls asleep at the wheel. Tesla can and must do more to guarantee the safety of its technology.


This is not the first time that the name Autopilot has come under fire. In 2016, the German transport minister told the company "to no longer use the misleading term for the driver-assistance system of the car." In 2018, two US consumer safety groups asked the Federal Trade Commission to address Autopilot's "deceptive and misleading" branding. In 2019, we discovered that the National Highway Traffic Safety Administration told the company to stop making "misleading statements" when it comes to safety, and the company repeatedly made claims about the safety of Autopilot that were not supported by fact. (The data showed that Autosteer—a component of the Autopilot suite of assists—actually increased crashes by 59 percent.)

Markey's second safety request is for Tesla to fit its vehicles with an effective driver-monitoring system (DMS). He's not alone—in 2019, in response to yet another video of someone asleep in the driver's seat of a moving Tesla, my colleague Tim Lee laid out a case for why a steering wheel torque sensor is inadequate compared to a proper DMS like the kind used by Cadillac or Subaru, which use eye-tracking cameras to ensure the driver has their attention on the road ahead.


This is not the first time that Markey has turned his attention to new car technology. In 2015, his office released a report on the lack of security measures to prevent connected cars from being hacked.

Tesla did not respond when contacted by Ars for comment on Sen. Markey's requests.

For those who are trying to equate the use of the term "autopilot" in a commercial airliner with the use of the term in a Tesla:

1. Is the autopilot's functionality clearly defined in a way that is unambiguous and possible to rigorously implement? Tesla: No. Airliner: Yes.

2. Is the implementation certified to function correctly against the specification defined in 1? Tesla: No. Airliner: Yes (mostly, right Boeing?).

3. Are the modes of failure well understood? Tesla: No. Airliner: Yes.

4. Are the operators of the "autopilot" properly trained for when failure is encountered? Tesla: No.

Hi Tesla owner here...

Regarding (1), the sales folks and website make it clear how you are expected to operate the vehicle when using "autopilot." The car itself is clear when you first enable the Autopilot feature in the vehicle's settings, and again every time you activate it for use. The car actively reminds you if it believes you are not ready to take control of the vehicle, and after a few warnings it will disable Autopilot for the remainder of the trip. (They likely can improve the driver monitoring; in my case I get false warnings, while others like to attempt to trick it, again against the clear statements of how to use Autopilot.)

Also, the car will degrade its functionality with warnings if it believes it cannot operate given current road conditions (weather, construction, etc.) or if aspects of the vision system are impaired (weather, condensation, etc.).

Regarding (4), all of us are "trained" to drive a vehicle, and that is the only training needed when operating a Tesla with or without Autopilot engaged. If you brake or steer sufficiently (it doesn't take much), the autopilot will disengage with clear notification (you can also pull a paddle on the steering column to disengage). You just need to be ready to drive your vehicle.

Regarding (2) and (3), a range of regulations and certifications exist at the federal and state level, but they likely need to be much better implemented. The failure mode in general is that the driver takes control, since they are expected to be ready to do so on short notice when using Autopilot (just like with an autopilot in a boat or aircraft).

I personally don't have any issue with the name Autopilot, since I have also been a small-aircraft pilot and, much more often, a boat captain, so I likely have a clearer understanding of what autopilot means (not all that automagic). I would agree changing the name may help in some regard, but let's be clear: you have to go out of your way to ignore the information you are presented with well before you first enable Autopilot in a Tesla.

Oh, one last thing: the autopilot in a Tesla and other vehicles with similar navigation capabilities is actually more capable than just about any other system that we call an "autopilot" on a ship or aircraft.

FTA: "A survey conducted in 2019 showed that nearly 50 percent of drivers thought Autopilot was safe to use hands-free."

FWIW, the linked Ars article was about a survey of drivers' thoughts on the meaning of "autopilot" and other similar terms. Most, probably all, of the respondents weren't Tesla owners. (The Ars article says something similar several paragraphs after the subtitle: "Six percent of drivers say it's safe to take a nap when Autopilot is engaged.") The few Tesla owners who may or may not have been in the survey probably answered it the same way this Tesla owner did.

Although I'm pretty much in agreement that the analogous definition is fairly accurate, the misunderstanding of the general public that doesn't use it isn't the real problem, though it is an issue Tesla could address if it wants to deal with public relations. My problem is with Tesla announcing in the same breath that Tesla has Autopilot and that Tesla will be fully autonomous soon.

The FSD5 part referenced in one of the two ideas appears to be a contradiction and may help confuse the general public into mistakenly thinking autopilot and self-driving are one and the same.

[Edit: Moved a sentence into another paragraph; shortened last paragraph.]

On the contrary, understanding the capabilities of the system is what leads to over-reliance and risk-taking. Even deep technical knowledge of how the feature works doesn't stop you from becoming over-reliant. Being unfamiliar or a new driver means you're less likely to take dumb risks. Many accidents on Autopilot have happened to experienced users. If you've used Autopilot a lot, you start to realize you can get away with things: sending a text, checking on the dog, etc.

I would argue that doing something that you shouldn't be doing because you think you'll get away with it is a very strong sign of overconfidence and not understanding the capabilities and limitations of a system.

I've thought for a while they need to rename it. I'm an aviation buff, so I understand that autopilots on aircraft rarely do more than hold altitude, heading, attitude, and/or throttle, but to the average person the term means "full automation," which makes Tesla's naming choice questionable.

Of course, Tesla Autopilot is, in actuality, closer to a ground-based version of the fairly limited features of real world aviation autopilot, but that's not how most people view it and to a large degree that's not how they've sold it.

I'm afraid you're excusing stupidity of the average person, and renaming it won't help:

It doesn't matter what an Autopilot actually is, what it's typically perceived to be by the general public, or what Tesla names their assisted driving system.

It only matters how it works and how well they communicate to their customers how it works. You can change the name to "Extremely Limited Lane and Speed Holding Service" and people will still abuse it. If they can put customers through training that clearly demonstrates its limitations and safe operation, whether on-screen or on delivery, that's going to be far more useful than any name change.

Is there anyone clamoring for autopilot in airplanes to be called something different?

A key difference is that pilots (a very small group compared to car drivers) are specifically trained to know the limitations of "autopilot" on planes. Tesla drivers don't go through intensive training courses about the functionality and limitations of this touted feature.

I predict that changing the name will have no discernable effect on anything, other than people complaining about the name.

They should probably change it, just in case someone does misinterpret it to mean something that autopilot has never actually meant in practice.

But let's be real. The name itself is just a red herring. Tesla have repeatedly implied and stated that its vehicles are, or will soon be, capable of fully autonomous driving. More importantly, though, their vehicles can actually accomplish this fairly well. Just not safely. It is only natural that customers will push the boundaries of what it can do--despite probably knowing it is not "allowed" or recommended.

As far as I can tell, this discussion is entirely about changing the name because some users are idiots. I'll hazard a guess that you could call it the Interactive Macadam Scanning Device and you'd still have idiots doing stupid things with it.

I went through some flight training decades back toward a private pilot's license. You do not need a special test or qualifications to operate the autopilot in a Cessna 172. Most 172s don't even have an autopilot, and the ones that do are usually wing-levelers, maybe with heading hold. Trim is for maintaining altitude.

I would also point out that the usual training was 30 hours of classroom and 20 hours of dual time before your solo. A good driver's education program has a similar number of hours, so drivers should be as skilled in their environment as a private pilot is in theirs.

Lastly, does it not behoove the government to pass a set of rules for all companies to meet? I didn't drill down on the senator's request (I'm not from the USA), but are they pushing for a baseline standard of testing that all manufacturers can build toward?

Please, as the Ars automotive editor, assign any future story involving Tesla to any other writer. As it is, it doesn't seem possible for Ars to publish a story by you involving Tesla that is believed to be unbiased (by a not-insignificant ratio of readers). You may even believe yourself to be totally fair, but what matters is the irrelevant and unnecessary distraction, rancor, and discord generated every time you post a story where you mention Tesla, whether, like now, they are the subject of the story or for any other reason.

My Model 3 is already equipped with an interior camera. It’s not currently being used, but Tesla does seem ready to roll out camera-based driver monitoring if necessary.

I’ve found that Autopilot (as referred to in the car’s interface) is really just a good adaptive cruise, not radically different than that found in a VW Golf. It maintains a follow distance and reacts pretty well to shenanigans - even to the point of slowing down in areas where other Teslas usually slow down (tight turns on the Interstate) and backing off if there’s a big speed differential between your lane and the adjacent one. Both are useful. But that’s all it does, speed control. You still have to steer. And you still have to pay attention. You only relinquish control of the accelerator, and we’ve been doing that for 50 years.

Autosteer, on the other hand, is the bit that takes care of, well, steering. It feels like an exceptionally nervous 16 year old on their first time behind the wheel, and nobody wants that chauffeur. I tried it a couple of times on the highway then disabled it again. It did not feel ready and it was just as much work to ride herd on the Autosteer as it was to look where I was going and have my hands follow my eyes. I have a hard time believing anyone who’s tried it feels differently and is willing to do other things while the car is driving.


The following steps are required to use an autopilot function:

1. Specify desired track as defined by heading, course, series of waypoints, altitude, airspeed, and/or vertical speed.

2. Engage the desired autopilot function(s) and verify that, in fact, the selected modes are engaged by monitoring the annunciator panel.

3. Verify that desired track is being followed by the aircraft.

4. Verify that the correct navigation source is selected to guide the autopilot’s track.

5. Be ready to fly the aircraft manually to ensure proper course/clearance tracking in case of autopilot failure or misprogramming.

6. Allow the FD/autopilot to accomplish the modes selected and programmed without interference, or disengage the unit. Do not attempt to “help” the autopilot perform a task. In some instances this has caused the autopilot to falsely sense adverse conditions and trim to the limit to accomplish its tasking. In more than a few events, this has resulted in a total loss of control and a crash.

Needs to be renamed for sure. Calling something Autopilot that *isn't* is misleading and dangerous.

If you want to change it, then you have to change the name for autopilot on planes and ships, because it's doing exactly the same and more right now. You would be surprised how much worse those are, as they use GPS/radar for navigation only. Tesla has several more sensors, including a neural network that learns and a shadow mode that simulates situations in the background.

Your lack of knowledge is another thing. What you are thinking of is FSD Level 5, otherwise known as Full Self-Driving. And we are not there yet.

I have set it to navigate from Los Angeles to Las Vegas, and it has no problem figuring out the route through LA's complex freeway system, making all the right choices and even figuring out when to change freeways and correctly merging into traffic on a different freeway. It makes a long trip less arduous; monitoring the car is less tiring than driving it.

On the other hand, it cannot handle anything out of the ordinary, and I would never trust it. It gets confused in some construction zones, especially when lanes are closed or narrowed. Obviously it can't handle hand signals from a police officer, and it can't stop the car when directed to by a following police car. Bad weather with reduced visibility, which can occur suddenly, is completely beyond its ability. City streets are out of the question; for me, the current system is freeway-only.

I recently came across an accident on a freeway at night, seconds after it happened and with wreckage strewn over the road. The car didn't handle it at all; it gave no warning. If I had not taken over, I've no idea what it would have done. It does not have the ability to detect an object in front of it and stop safely from freeway speed.

Tesla needs to put more and better sensors in their vehicles. My Model S has one forward-facing radar with limited range, and I just don't trust a single sensor that could be blocked or simply fail. That sensor is fixed to point forward, which is great until the car takes a sharp curve and the navigation computer becomes confused. Glare situations (like driving into the sun around dawn or dusk) are even worse for cameras than for humans; the car can't squint, put on sunglasses, or pull down a visor. It doesn't handle fog or heavy snow at all, and it needs clear lane markings on the road.

Tesla believes LIDAR is useless, but I think they are living in the past: LIDAR from 10, or even five, years ago is primitive compared to the latest technology. More sensors providing data about the car's environment, properly integrated into the navigation software, would enhance the car's safety.

All in all, I love Autopilot, but I would never, ever trust it without constant monitoring.

Needs to be renamed for sure. Calling something Autopilot that *isn't* is misleading and dangerous.

But it basically is that. Autopilot in an aircraft is an aid to, not a replacement for, the pilot. You can't set the autopilot in your A320 to autoland and then go to the lavatory. You still have to fly the plane; likewise, in a Tesla it's an assistant, and you still need to drive the car, not watch a movie on your iPad.

Now. Having said that, most people haven't a clue what autopilot is or does, thanks primarily to movies and TV getting it wrong for decades.

The name isn't the issue. Humans are the issue. Call it whatever you want, people will abuse it.

Abusing Autopilot in a Tesla is the exact same human response as texting and driving - I did it 100 times before and everything was OK, therefore it must be safe. Which is a completely inaccurate analysis.

Many newer commercial airliners DO have autoland capability, including the Boeing 737 and MD-11, and Garmin makes an Autoland system for a couple of private airplanes.

But it basically is that. Autopilot in an aircraft is an aid to, not a replacement for, the pilot. You can't set your autopilot in your A320 to auto land and then go to the lavatory.

Actually, the A320 has autoland, so you can do exactly that. You shouldn't, but you can.

The Hawker Siddeley Trident had autoland in the early '60s. I didn't know that; I thought the Lockheed L-1011 was first and went looking for that.

Is there anyone clamoring for autopilot in airplanes to be called something different?

I personally find this line of argument disingenuous. Airplane autopilots are used by extremely highly trained people who know exactly what they can and can't do, and the context in which they are used, high in the air, is a much less cluttered environment, so they can do a lot less and still be relatively safe.

But when the "pilots" are complete randos, people with bad vision, people reading their cellphones and putting on their makeup, in the middle of a swarm of closely maneuvering vehicles in close proximity to the cold, hard earth... no, "autopilot" no longer has the context to be a reasonable term. Especially when placed alongside tons of actual, directly misleading statements by Musk indicating that the current Autopilot is "almost" fully autonomous and soon will be. There is plenty of evidence that actual Tesla drivers do place too much faith in the term and in Musk's misleading statements, and that's what ultimately matters.

Change the name, require better driver monitoring, and demand that Musk stop claiming full autonomy anytime soon, until he can back it up to the satisfaction of experts. I'm totally down with that.

You don't need to be a rocket scientist (or a trained pilot) to understand what Autopilot can and cannot do. That you regard Tesla drivers as so stupid says more about you than about them. Following your line of reasoning, no human should be allowed to drive; after all, they haven't gone through pilot training.

Quote:

A survey conducted in 2019 showed that nearly 50 percent of drivers thought Autopilot was safe to use hands-free.

I believe he has some numbers to back him up. These weren't necessarily Tesla drivers, but too many Tesla drivers think it's safe to keep their hands off the wheel, and that's not something that can be hand-waved away as a gross-generalization fallacy. I expect that most Tesla drivers keep their hands on the wheel.

That ANY don't is a product of Tesla marketing and driver stupidity. I don't think there's any room for argument there.

You really just don’t know what you are talking about because you don’t drive a Tesla.

If you leave your hands off the wheel, you get a white warning within 10-15 seconds. A few seconds after that you get a red warning and Autopilot is disabled for the rest of the trip: no cruise control, no auto-steering, no nothing. So no, you can't drive around with no hands on the steering wheel. Hell, it is so good now on the highway that I don't even correct it anymore, so I get white warnings even with both hands on the wheel, which I then don't notice because I am looking way down the road. Which gets me red-carded.

It is not marketing or the name that makes people trust autopilot more than they should. It is the fact that it is pretty good and seems reliable that makes people trust it.

If you actually drove a Tesla, then you would experience the complete garbage job it does on regular roads. Guess what: nobody trusts it on those. It is a recent feature and completely unreliable.

Your whole thesis is wrong because you are just making stuff up with no actual relevant experience.

I've thought for a while they need to rename it. I'm an aviation buff so I understand that autopilot on aircraft rarely do more than hold altitude, heading, attitude and/or throttle but to the average person it means "full automation" which makes the naming choice by Tesla questionable.

Of course, Tesla Autopilot is, in actuality, closer to a ground-based version of the fairly limited features of real world aviation autopilot, but that's not how most people view it and to a large degree that's not how they've sold it.

As an aviation buff, you do know that an autopilot can land a plane in zero visibility, I'm sure.