Tesla needs to fix Autopilot safety flaws, demands Senator Markey

Is there anyone clamoring for autopilot in airplanes to be called something different?

False equivalence is false.

So the answer is no. Got it.

From what I understand, autopilot in an airplane and autopilot without Full Self-Driving in a Tesla do the same thing. Could be wrong, I don't know.

But it's Tesla so of course it's different. It always is.

Sometimes you have to 'panda' to the public's lack of technical knowledge. Someone with your name must surely understand that 😉.

The fact is, the vast majority of the public misunderstand what autopilot means. Therefore, even if Tesla is using the term correctly, it can still lead the public to think the car is more autonomous than it is, and therefore lead to more accidents.

Context like your failure to address the "that would be the crux of the whole Tesla discussion here" part of my comment?

Airline pilots face far more barriers to abusing it than Tesla drivers -- far more rigorous licensing, and the end of their careers, come to mind -- so analogies between airline autopilot and Tesla autopilot, as to the incentives to avoid abusing it, are categorically false.

I agree. I was refuting the claim that it is possible to perform an autoland without supervision. Aircraft pilots cannot perform an autoland unattended without repercussions. Even if they do throw caution to the wind, they have to disregard the training you mentioned.

Therefore one shouldn't drive a Tesla in 'autopilot' distracted.

This whole comparison between aircraft autopilot and Tesla autopilot is asinine. They are completely different beasts. We're in the same boat here.

I don't think the brand or marketing has ever been found to be a safety factor in any real accident -- as in, NHTSA investigated and found the driver was confused about the capabilities. I get the theory that it could happen and people could be confused, but there's zero evidence of it happening to drivers getting into accidents. Maybe it's confusing or misleading at purchase time, but not for owners/drivers.

Using it recklessly doesn't mean he's confused about the capabilities. You can fully understand what Autopilot does and doesn't do and still decide to check on your dog. You will get away with it 99.99% of the time too. A stopped vehicle on the highway is going to be at high risk of being hit even by normal drivers.

In my dictionary "possible to perform an autoland without supervision" means that it is technically possible. Not legally or other imaginary ways. I believe someone in this thread mentioned the autoland capability of A320 as a proof that it is indeed technically possible.

Aren't you just arguing about definitions? That would be a silly thing.

With enough excuses you might even overcome the whole "Over the last few years, Tesla had a problem where people would crash their cars and blame it on Autopilot..." part. Does "overly reliant upon the system" not qualify as misunderstanding the capabilities of the system in your universe?

I was giving the original post in this conversation the benefit of the doubt when he used the word "can't", and read his post considering his argument as a whole. Obviously you aren't.

Most people use that word colloquially as meaning they are unable to do something for various reasons, not just physically unable. I'm sure you know this, and have even done it.

For those that are trying to equate the use of the term "autopilot" in a commercial airliner to the use of the term in a Tesla:

1. Is the autopilot's functionality clearly defined, in a way that is unambiguous and possible to rigorously implement? Tesla: No. Airliner: Yes.

2. Is the implementation certified to function correctly against the specification defined in 1? Tesla: No. Airliner: Yes (mostly, right Boeing?).

3. Are the modes of failure well understood? Tesla: No. Airliner: Yes.

4. Are the operators of the "autopilot" properly trained for when failure is encountered? Tesla: No. Airliner: Yes.

5. Will the vehicle be pulled from service when an incident causing a fatality or fatalities is encountered? Tesla: No. Airliner: Yes.

Airliner autopilots are certified to work the way they do; they are able to operate an aircraft from takeoff to touchdown with zero input from the pilot to the control surfaces, and they are certified to do so. Autopilot for Teslas is a marketing term that sounds far more amazing than "driver assist" does; it REMAINS, however, a driver assist and should be called such.

Needs to be renamed for sure. Calling something Autopilot that *isn't* is misleading and dangerous.

The fact is, the public doesn't really know what an aviation autopilot does except from what they see in film/TV.

Autopilots vary by aircraft model and by how much the owner wants to augment original equipment.

1. Basic A/P will simply hold a level heading, no course or elevation changes.
2. The next level will be able to make turns to a new heading, no elevation changes.
3. Then be able to follow a course with multiple turns, no elevation changes.
4. A/P which can change altitudes usually requires what's called "auto-throttle" or FADEC (Full-Authority Digital Engine Control) -- until recently reserved for jets, but now available on smaller, piston general aviation aircraft.
5. Auto-land is a different animal, and there are multiple functional levels there.
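The tiers described above could be sketched roughly as follows. This is purely an illustrative taxonomy of the commenter's five levels, not any avionics vendor's actual API; the names and the helper function are made up for the example.

```python
from enum import IntEnum

class AutopilotTier(IntEnum):
    """Illustrative aircraft autopilot capability tiers (hypothetical names)."""
    HEADING_HOLD = 1     # holds a level heading; no course or elevation changes
    HEADING_SELECT = 2   # can turn to a new heading; no elevation changes
    COURSE_FOLLOW = 3    # follows a multi-turn course; no elevation changes
    ALTITUDE_CHANGE = 4  # altitude changes; usually needs auto-throttle/FADEC
    AUTOLAND = 5         # auto-land; itself has multiple functional levels

def can_change_altitude(tier: AutopilotTier) -> bool:
    # Per the list above, altitude changes only enter the picture at tier 4.
    return tier >= AutopilotTier.ALTITUDE_CHANGE

print(can_change_altitude(AutopilotTier.COURSE_FOLLOW))  # False
print(can_change_altitude(AutopilotTier.AUTOLAND))       # True
```

The point of ordering the tiers is that each level strictly contains the capabilities of the ones below it, which is exactly the cost-and-training ladder the comment describes.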

Collision avoidance via TCAS (Traffic Collision Avoidance System, which is usually only advisory and doesn't take evasive action) or ADS-B (Automatic Dependent Surveillance-Broadcast) data is available on only a very few complex jet aircraft; it is not a common feature, as it presumes a pilot will take action based upon the alerts -- which are often incorrect.

This is an over-simplification, as there are cost deltas in there which involve an order (or two) of magnitude of cost and training requirements.

So "autopilot" by its fuzzy definition was a bad choice by the automakers no matter how you look at it as it set a false sense of functionality in the lay public's mind.

Because when you take away 95% of the driver's task responsibilities, they get bored. So you're not only fighting irresponsible driving behavior in general, but the added fact that the driver literally has nothing to do for long stretches of time in uneventful circumstances and a biological imperative to pay attention to something else.

On the contrary, understanding the capabilities of the system is what leads to over-reliance and risk taking. Even having a deep technical knowledge of how the feature works doesn't stop you from becoming over-reliant. Being unfamiliar, or a new driver, means you're less likely to take dumb risks. Many accidents on Autopilot have happened to experienced users. If you've used Autopilot a lot, you start to realize you can get away with things: sending a text, checking on the dog, etc.

Jesus Christ, you're just demanding, by fiat, a default assumption that everyone understands the system and is intentionally flouting it at all times -- that it's never a matter of overestimating capabilities.

Now. Having said that, most people haven't a clue what autopilot is or does, thanks primarily to movies and TV getting it wrong for decades.

The name isn't the issue. Humans are the issue. Call it whatever you want, people will abuse it.

This is exactly it. The comparison to airplane autopilot is flawed because the average person is a) not a pilot and b) doesn't know the limitations of airplane autopilot. It's a poorly-chosen name by Tesla.

Hi Tesla owner here...

Regarding (1): the sales folks and website make clear how you are expected to operate the vehicle when using "autopilot". The car itself is clear when you first enable the autopilot feature in the vehicle's settings. The car is clear every time you activate it for use. The car actively reminds you if it believes you are not ready to take control of the vehicle, and after a few warnings it will disable autopilot for the remainder of the trip. (They likely can improve the driver monitoring; in my case I get false warnings, while others like to attempt to trick it -- again, against the clear statements of how to use autopilot.)

Also, the car will degrade its functionality, with warnings, if it believes it cannot operate given current road conditions (weather, construction, etc.) or if aspects of the vision system are impaired (weather, condensation, etc.).

Regarding (4): all of us are "trained" to drive a vehicle, and that is the only training needed when operating a Tesla with or without autopilot engaged. If you brake or steer sufficiently (doesn't take much), the autopilot will disengage with clear notification (you can also pull a paddle on the steering column to disengage). You just need to be ready to drive your vehicle.

Regarding (2) and (3): a range of regulations and certifications exist at the federal and state level, but likely they need to be much better implemented. The failure mode, in general, is that the driver takes control, since they are expected to be ready to do so on short notice when using autopilot (just like with an autopilot in a boat or aircraft).

I personally don't have any issue with the name autopilot; since I have also been a small-aircraft pilot and, much more often, a boat captain, I likely have a clearer understanding of what autopilot means (not all that automagic). I would agree changing the name may help in some regard, but let's be clear: you have to go out of your way to ignore the information you are presented with well before you first enable autopilot in a Tesla.

Oh, one last thing: the autopilot in a Tesla, and in other vehicles with similar navigation capabilities, is actually more capable than just about any other system that we call an "autopilot" on a ship or aircraft.

Prove it.

Nope, just looking for any evidence to the contrary.

So you admit it. How about looking for any evidence in support, rather than assuming an unproved default "truth."

People get bored watching the road all the time, especially on freeways. It's not just an Autopilot thing.

That boredom kills people all the time. We need driver attention monitoring everywhere.

I personally find this line of argument disingenuous. Plane autopilots are used by extremely highly trained people who know exactly what they can and can't do, and the context they are used in, high in the air, is a much less cluttered environment, so they can do a lot less and still be relatively safe.

But when the "pilots" are complete randos, people with bad vision, people reading their cellphones and putting on their makeup, in the middle of a swarm of closely maneuvering vehicles in close proximity to the cold, hard earth... no, autopilot no longer has the context to be a reasonable term. Especially when placed in conjunction with tons of actual, directly misleading statements by Musk indicating that the current autopilot is "almost" fully autonomous and will be soon. There is plenty of evidence that actual Tesla drivers do place too much faith in the term and the misleading statments by Musk, and that's what ultimately matters.

Change the name, require better driver monitoring, and demand that Musk stop claiming full autonomy anytime soon, until he can back it up to the satisfaction of experts. I'm totally down with that.

I'd go one step further. It's often stated that airline pilots are (usually) required to be checked out on what their aircraft's systems can do -- what, when, how, where, etc. That's mandated by the government regulatory bodies. Yes, I'm ignoring the fiasco with Boeing's 737 MAX MCAS as an outlier.

As driver-assistance systems become more and more common, and they change the way the vehicles equipped with them actually handle, it should be an absolute requirement, with no exceptions, that drivers of these vehicles receive similar instruction in their use and in what they can and can't do. These classes should be funded by the carmakers at no cost to the purchaser. If carmakers, and presumably governments at some point -- in the name of safety, of course -- are going to force this down our throats, they should foot the cost of informing the driving public of these systems' capabilities and limitations. A passing grade on a final test would be mandatory. Violation of the guidelines, similar to FAA rules, would require retaking certification at driver expense, or forfeiture of license.

Close, but to complete your understanding, realise that cars frolic exclusively on the ground whereas airplanes fly in the sky. Source: third grade.

Yep, he started with middle-school false equivalence, and then when shown his obvious error he doubled down and went full third grade: "Mean meanies pick on Tesla! WAAAAAAAAAA!" Sad.

Edit- The separate universes where aircraft and automobiles operate have no relation to each other. A commercial/private aircraft can veer a few feet laterally and not be in danger of coming in contact with another aircraft, thereby careening wildly after Newtonian physics take over. An automobile doesn't always have that option.

The trick here is that people don't realize generally what the autopilot in an airplane actually does. And mostly, it's much less than they think.

No, not really. Most people think that autopilots fly the plane. For the vast majority of flights, aside from a short time during takeoff and landing, that's exactly what they do.

Pilots are actively involved in flying for maybe 10 percent of the average flight time in modern aircraft. Planes absolutely do "fly themselves" in most situations.

That's why the proverb "Flying is easy. It's the landings that are tough." is quite profound. Autopilots of aircraft don't so much fly them as keep them *in flight*. If the basic parameters are followed, flight is maintained. It's when those conditions require constant adjustment that the tricky part -- the actual *flying* -- comes in.

To me, it seems both scenarios can, and have, come into play. Both can contribute, depending on the operator/driver in question.

I 100% agree we need active driver monitoring with a camera (which, I should note, should not be hard for Tesla to roll out, as an interior camera is already present in every Model 3 sold, but for some reason not the X/S). That applies to Tesla, but also to the dozens of other cars that are sold with Level 2 systems these days.

That said, studies have routinely found that actual owners of Teslas are not unfamiliar with Autopilot's capabilities. What people on the street think Tesla's Autopilot is capable of is not nearly as important as what the actual users of the product think it is capable of.

The lowest for comparable systems was 3% of people thinking you could safely take a nap while using the feature, vs. 6% for Tesla. Yes, that's worse, but clearly 3% (or 6%) of people with these systems are not actually using them while sleeping, because when you buy one of these products you are told (and clearly aware) of the limitations. 6% of Autopilot-capable Tesla owners is about 50,000 cars.
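As a sanity check on those figures: taking the 6% share and the ~50,000-car count quoted above at face value, the implied size of the Autopilot-capable fleet works out as:

```python
# Back-of-the-envelope check of the figures quoted above.
nap_share = 0.06   # share of Tesla owners who think napping is safe (quoted)
nap_cars = 50_000  # number of cars that 6% is said to represent (quoted)

# implied fleet comes out to roughly 833,000 cars
implied_fleet = nap_cars / nap_share
print(f"Implied Autopilot-capable fleet: ~{implied_fleet:,.0f} cars")
```

The two quoted numbers are mutually consistent only if the Autopilot-capable fleet is on that order, which was roughly the size of Tesla's fleet at the time of these surveys.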

The issue is not that people see the name Autopilot and think "guess I should buy a defeat device so Autopilot doesn't nag me and strap a DVD player to the dash"; it's people being negligent. The same type of behavior that causes people to text while driving, or drive drunk. Roughly 1/3 of drivers admit to texting while driving, despite the dangers being obvious and well publicized.

However, while the name "controversy" is overdone, I do support driver monitoring via camera. Not because consumers genuinely believe they can safely fall asleep in the driver's seat, but because people will fall asleep in their car, or take their eyes off the road for 5+ seconds, just like they do in any car. While currently I don't see much evidence that Level 2 systems cause that temptation in most owners, over time I expect they will (once the system disengages every 600 miles, for example, vs. every ~30, which is standard today). This is just another form of negligence, and since driver monitoring is relatively cheap, I support requiring new cars with Level 2 driver assist to complain when people take their eyes off the road for extended periods of time.

Do you feel that you are fixating on a meaningless superficial part of all this: the name?

Let's say you rename it and nothing else changes, do all the problems magically go away? Would anything at all change other than resources wasted on a name change?

I assure you as a Tesla driver that they will not. It seems so good on the highway that it is easy to rely on it to "hold your steering wheel" while you do something like eat a Big Mac or futz with the phone etc. But unlike a human passenger doing that for you, this one is happy to steer right into an edge case. You ask it to do this not because of the name, but because of what it actually does. If it sucked at it you would not. Like for instance its performance off the highway on local roads is horrendously bad. You would never ever have it hold the wheel while you avert your gaze from the road.

A meaningful change is the reduced warning time Tesla now has. Instead of 2 minutes, it's more like 15 seconds before the white warning comes on. If you hit the red warning, Autopilot is disabled until you park and restart.

But this sentiment that Tesla owners are being misled into thinking that their cars can drive completely by themselves seems weird. Literally every time you turn the system on, it tells you to pay attention and be ready to take over at any time. I believe all of the "how to" videos do as well. The system starts demanding user input to ensure you're paying attention if it doesn't receive any after a short time (I believe the interval is partially dependent on speed).

So there are really three components to owner education:

1) What the corporate marketing says,
2) What the car's dashboard says, and
3) What the dealer says!

That last one is where Tesla as a corporation is really doing dangerous shit. My boss just got a Tesla and described the sales environment- just horrifying to me (he happily bought the car.)

The salesperson took him for about a half-hour ride around town, with extended periods with his hands off the wheel doing other things, to demonstrate how good Autopilot is... He told my boss that he routinely has the car drive him on long trips while he reads on his phone and sometimes sleeps.

Yesterday my boss bragged about how his new Tesla drove him to a work meeting while he reviewed the documents he was presenting.

It feels to me like the dashboard warnings and the corporate high-level marketing are essentially Musk winking at his customers while the regulators aren't looking. His actions, and those of his team on the ground, tell us how he really feels about Autopilot. He thinks it's ready for prime time, and so do his dealers.

False and uninformed…

Flying is not easy. Planes do not fly themselves.

Landings are not tough. The greatest danger for aircraft and passengers is take-off.

Pilots have very little room to recover a take-off malfunction or catastrophic failure. Loss of control is immediate and irrecoverable.

Landings are simply a controlled crash, with recovery modes, until under the suicide curve for a given aircraft approach.

A crashed plane has 'landed'. Just not as smoothly and precisely as the people involved would have liked. Thus, 'tough' comes into play. I apologize if my use of irony with a little snark was difficult to detect.

Edit- Whereas 'flying' (soaring through the air) is relatively easy. It's the controlled aspect of it (not landing when and how one doesn't want to) that comes into play.

Was this with the Autopilot or the “Full Self Driving” option? They’re not the same. The Autopilot doesn’t drive you anywhere, it just tries to keep you on the road. Whatever.

I think this discussion is laughable. You can't regulate people not to daydream, pick their noses, or just generally drive "on autopilot" (figuratively, not literally) either.