Dutch police pull over Tesla with apparently sleeping, drunk driver

Which raises the question: what happens when more of those cars hit the road? It seems police would need special access to be able to stop the cars.

In the case of the EU, a set of safety devices will be mandatory on new cars by 2021. The ones relevant to this case are: advanced emergency braking, alcohol interlock installation facilitation, drowsiness and attention detection, event (accident) data recorders, emergency stop signals, intelligent speed assistance, and lane keeping assist.

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot

Less IQ, and more common sense. I've met too many highly-intelligent dumbasses to trust smart people just because they're smart.

To echo what Voltaire allegedly said: "Common sense is not so common".

I used to think that my Mom's "Common sense, the rarest of all senses" was the coolest Mom statement, until a friend's Mom said, "Never let your mouth write a check that your ass can't cover." Between those two statements there might even be a bit of a message for Elon.

I don't know what Tesla does when the driver doesn't pay attention, but I've got the latest VW system, and it's pretty aggressive... Making it possible for a driver to stay asleep at the wheel is irresponsible. I'm not sure how Tesla is getting away with it.

The Tesla uses a torque sensor on the steering wheel to detect the driver's hands on the wheel. If it doesn't detect them, it issues a series of escalating alerts before stopping the car and calling emergency services. The issue is that the driver fell asleep with his hands on the wheel; the same thing would have happened in your VW, which I believe also uses a torque sensor.

Let's not get carried away into trusting plain old logic with statistical probabilities of death and injury. Emotional response is clearly superior in this case /s

The car is either safer than the driver or it is not. In the first case we could pass a law to mandate the capability, just as we did for lights, seat belts, air bags, and a few other things that, in the grand scheme of things, save lives and hundreds of billions in economic losses from people left unable to work.

I don't know what Tesla does when the driver doesn't pay attention, but I've got the latest VW system, and it's pretty aggressive. I once did a test to see what would happen if I didn't take the wheel when prompted. It goes like this:

1. After 10 seconds: a soft beep, and the dash shows "Please take over steering."
2. 5 seconds later: another soft beep, and the dash changes to "Take over steering!" (Note the lack of "please" and the addition of an exclamation mark. I found that little touch funny.)
3. Another 5 seconds later: a really loud, long, angry beep.
4. Another 5 seconds later: another angry beep, plus the car jerking itself by quickly applying and releasing the brakes, plus the seatbelt being very abruptly tightened.
5. I have no clue what happens then, because I didn't dare continue the experiment. The last step is freaking scary and would definitely wake me up if I were sleeping (and give me an extra adrenaline boost to keep me awake). (According to the manual, it's supposed to stop the car on the shoulder and call emergency services, but I'm not going to test that.)
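The escalation sequence above can be sketched as a simple timer-driven cascade. This is a hypothetical illustration based only on the stages described in the comment, not VW's actual implementation; the timings and action strings are invented from that description:

```python
# Hypothetical sketch of a driver-takeover escalation cascade,
# modeled on the stages described above. Not VW's actual code.

ESCALATION_STAGES = [
    # (seconds of no driver response before this stage fires, action)
    (10, "soft beep + display 'Please take over steering.'"),
    (15, "soft beep + display 'Take over steering!'"),
    (20, "loud, long warning beep"),
    (25, "warning beep + brake jerk + tighten seatbelt"),
    (30, "stop on the shoulder + call emergency services"),
]

def escalation_actions(seconds_hands_off, driver_responded=False):
    """Return every warning issued after the driver has ignored
    takeover prompts for `seconds_hands_off` seconds."""
    if driver_responded:
        return []  # any driver input resets the cascade
    return [action for t, action in ESCALATION_STAGES
            if seconds_hands_off >= t]
```

The key property is that any driver input resets the cascade, which is exactly why a sleeping driver whose hands still register on the sensor never triggers it.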

Making it possible for a driver to stay asleep at the wheel is irresponsible. I'm not sure how Tesla is getting away with it.

How is this even a valid comparison?

You are comparing taking your hands off the wheel of your VW with temporary autopilot to people who fell asleep still gripping the wheel of a Tesla on Autopilot. Their hands are on the wheel, and that's why it continues to drive down the road without nagging them to wake up and grab the wheel.

How about the next time you do your little experiment, you close your eyes for 5 minutes and keep holding the wheel. Be sure to report back if you're still alive.

PS: There are plenty of YouTube videos that show what a Tesla does when you don't grab the wheel while Autopilot is engaged. So instead of making an invalid comparison, spend a few minutes and watch one.

All auto manufacturers would be wise to ensure that the driver has to do something at intervals to register consciousness when the driver assist is engaged. Otherwise, public pressure and the auto insurance lobby will inevitably compel the pols to "Do something about it, quickly!" We know that usually doesn't end well.

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot

Less IQ, and more common sense. I've met too many highly-intelligent dumbasses to trust smart people just because they're smart.

To echo what Voltaire allegedly said: "Common sense is not so common".

I used to think that my Mom's "Common sense, the rarest of all senses" was the coolest Mom statement, until a friend's Mom said, "Never let your mouth write a check that your ass can't cover." Between those two statements there might even be a bit of a message for Elon.

Something I'm curious about - does Tesla use microphone data for their autopilot?

I know Musk is anti-lidar, since people primarily drive by sight, but I definitely use my hearing when driving - sirens, especially coming from an odd direction, or even the noise of a semi truck revving its engine. Heck, I usually drive with my window cracked unless it's raining hard, so I can hear better.

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot

Less IQ, and more common sense. I've met too many highly-intelligent dumbasses to trust smart people just because they're smart.

Musk claims to be smart, and he claims - and gives the impression - that Autopilot is a driverless system (AKA 'self-driving' or 'autonomous driving'). That's probably where these dumbasses get the idea that sleeping and/or being drunk while driving a Tesla with Autopilot engaged is somehow OK: they believed the false hype.

You literally can't enable it without it telling you it's a beta and you need to be monitoring it for safety.

Some people behave stupidly with AP because some people behave stupidly with anything.

Except that humans are not capable of the kind of monitoring AP demands, so it doesn't matter whether it warns you or not. Would you accept Boeing shipping the 737 MAX with a disclaimer that, in case of oopsies, the passengers are expected to sprout wings and fly on their own?

Tesla is the main reason these kinds of demands are put on drivers, despite all the science clearly showing that we are utterly incapable of meeting them. Sooner or later someone is going to get smacked by the courts for that bullshit. Hopefully before too many people die, or at least with no victims beyond the drivers falling for Musk's propaganda.

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot

Less IQ, and more common sense. I've met too many highly-intelligent dumbasses to trust smart people just because they're smart.

Musk claims to be smart, and he claims - and gives the impression - that Autopilot is a driverless system (AKA 'self-driving' or 'autonomous driving'). That's probably where these dumbasses get the idea that sleeping and/or being drunk while driving a Tesla with Autopilot engaged is somehow OK: they believed the false hype.

You literally can't enable it without it telling you it's a beta and you need to be monitoring it for safety.

Some people behave stupidly with AP because some people behave stupidly with anything.

Except that humans are not capable of the kind of monitoring AP demands, so it doesn't matter whether it warns you or not. Would you accept Boeing shipping the 737 MAX with a disclaimer that, in case of oopsies, the passengers are expected to sprout wings and fly on their own?

Tesla is the main reason these kinds of demands are put on drivers, despite all the science clearly showing that we are utterly incapable of meeting them. Sooner or later someone is going to get smacked by the courts for that bullshit. Hopefully before too many people die, or at least with no victims beyond the drivers falling for Musk's propaganda.

Is it that it is just too tempting to do other tasks while the AP is engaged? I wonder if most of the "failed to take over" issues (exempting this one of course) are more from getting bored by being asked to act like you are driving without the actual engagement of driving keeping your mind alert and on task.

Uh oh, don't comment on arse Technica about Tesla/Elon Musk. It's full of Tesla fangirls. Any mention that Elon is culpable for Tesla-related incidents sets off his fangirls. The simple truth is that marketing Autopilot makes people too complacent with the technology. Either have fully autonomous systems or have people drive themselves.

There is a big gray area of having driver assist technology and marketing it as that. Tesla overhypes its autopilot, starting with the name.

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot

Less IQ, and more common sense. I've met too many highly-intelligent dumbasses to trust smart people just because they're smart.

Musk claims to be smart, and he claims - and gives the impression - that Autopilot is a driverless system (AKA 'self-driving' or 'autonomous driving'). That's probably where these dumbasses get the idea that sleeping and/or being drunk while driving a Tesla with Autopilot engaged is somehow OK: they believed the false hype.

You literally can't enable it without it telling you it's a beta and you need to be monitoring it for safety.

Some people behave stupidly with AP because some people behave stupidly with anything.

Except that humans are not capable of the kind of monitoring AP demands, so it doesn't matter whether it warns you or not. Would you accept Boeing shipping the 737 MAX with a disclaimer that, in case of oopsies, the passengers are expected to sprout wings and fly on their own?

Except humans have been doing that kind of monitoring in aircraft for many years.

Uh oh, don't comment on arse Technica about Tesla/Elon Musk. It's full of Tesla fangirls. Any mention that Elon is culpable for Tesla-related incidents sets off his fangirls. The simple truth is that marketing Autopilot makes people too complacent with the technology. Either have fully autonomous systems or have people drive themselves.

There is a big gray area of having driver assist technology and marketing it as that. Tesla overhypes its autopilot, starting with the name.

The name is only overhype if you're in the unfortunate position of believing that aircraft autopilot systems do not require a trained flight crew to feed the system data and monitor the flight.

Uh oh, don't comment on arse Technica about Tesla/Elon Musk. It's full of Tesla fangirls. Any mention that Elon is culpable for Tesla-related incidents sets off his fangirls. The simple truth is that marketing Autopilot makes people too complacent with the technology. Either have fully autonomous systems or have people drive themselves.

There is a big gray area of having driver assist technology and marketing it as that. Tesla overhypes its autopilot, starting with the name.

The name is only overhype if you're in the unfortunate position of believing that aircraft autopilot systems do not require a trained flight crew to feed the system data and monitor the flight.

Which is probably 99% of the population. Many people are under the misapprehension that today's aircraft take off, fly, and land by themselves, and that the pilots are there to provide sexual services to the hostesses during layovers.

All auto manufacturers would be wise to ensure that the driver has to do something at intervals to register consciousness when the driver assist is engaged. Otherwise, public pressure and the auto insurance lobby will inevitably compel the pols to "Do something about it, quickly!" We know that usually doesn't end well.

Uh oh, don't comment on arse Technica about Tesla/Elon Musk. It's full of Tesla fangirls. Any mention that Elon is culpable for Tesla-related incidents sets off his fangirls. The simple truth is that marketing Autopilot makes people too complacent with the technology. Either have fully autonomous systems or have people drive themselves.

There is a big gray area of having driver assist technology and marketing it as that. Tesla overhypes its autopilot, starting with the name.

The name is only overhype if you're in the unfortunate position of believing that aircraft autopilot systems do not require a trained flight crew to feed the system data and monitor the flight.

Very true, but as a former military flight engineer I can say this takes training and lots of simulator time.

I'm thinking it might be a good idea if AP systems occasionally (and they might already do this; I have no personal experience with any ground-vehicle AP system) made you take over for a bit, to help keep your mind on task.

I wish Tesla and all mfrs of L2 assist systems would release their actual data. That would let us see what difference if any the systems make wrt safety. It's clear that people like the L2 systems, as demonstrated in these comment threads, but it's not at all clear whether they improve or decrease safety.

What we get instead is comparisons between different scenarios: when Tesla claims there are fewer accidents with AP enabled than without, they're conveniently leaving out the driving context. It seems reasonable to expect that AP is enabled in general in safer situations (e.g. freeway driving) and not used in situations where accidents are more likely. So, yeah, of course, there are fewer accidents with AP enabled than when it's not enabled.

GM's Super Cruise can only be used in those safe situations. So, if GM said that there were fewer accidents with Super Cruise than without, we'd all say, gosh, isn't that amazing? In the safe (well mapped, controlled access) contexts where you let Super Cruise be used, there are fewer accidents than in all the less-safe contexts where you don't let it be used.

The relevant comparison is like-to-like. Is it safer on a controlled access, multi-lane highway with or without AP or Super Cruise? Does AP/SC do better or worse when traffic is heavier? Is it safer on a 2 lane highway without controlled access with or without AP? Is it safer on a city street with or without AP? Do drivers with AP enabled drive faster or slower than drivers with AP disengaged?

Tesla has this data and has either done the analysis and not shared it or hasn't done the analysis. The fact that they don't share this but instead hype meaningless comparisons strongly suggests that AP is not significantly safer to use. That doesn't mean it doesn't improve the driving experience, but it does suggest that such improvements don't result in real safety improvements.
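The confound described above can be demonstrated with a toy calculation (all numbers invented): even if Autopilot were exactly as safe as a human within each driving context, the aggregate rates would still favor it whenever it is engaged mostly in low-risk contexts.

```python
# Toy illustration (all numbers invented) of why aggregate accident
# rates can mislead: if AP is engaged mostly in low-risk contexts,
# it can look safer overall even when it is no safer within any
# single context.

# context: (AP miles, AP rate, non-AP miles, non-AP rate)
# miles in millions; rates are accidents per million miles
contexts = {
    "freeway": (90, 0.5, 30, 0.5),  # AP heavily used, low risk
    "city":    (10, 5.0, 70, 5.0),  # AP rarely used, high risk
}

def aggregate_rate(mode):
    """Accidents per million miles, pooled across all contexts."""
    if mode == "ap":
        miles = sum(m for m, _, _, _ in contexts.values())
        accidents = sum(m * r for m, r, _, _ in contexts.values())
    else:
        miles = sum(m for _, _, m, _ in contexts.values())
        accidents = sum(m * r for _, _, m, r in contexts.values())
    return accidents / miles

# Within each context the AP and non-AP rates are identical, yet:
print(aggregate_rate("ap"))     # 0.95
print(aggregate_rate("no_ap"))  # 3.65
```

This is why the like-for-like comparisons listed above (same road type, same traffic) are the only ones that would actually tell us anything.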

In looking at this data, I'd be happy to filter out the suicidal idiots and just look at "responsible" use of driver assist tech. The suicidal idiots will hopefully continue to just have single-car and single-person fatalities.

This isn't just for Tesla: I'd really like to see comparable info from any car mfr that releases an L2 driver assist package.

The Instagram post cites a BAC level of 340 µg/L, which, if my math is right, is a BAC of 0.34 percent in American units.

It could be that 340 µg/L is actually the breath alcohol content (BrAC), which would yield a BAC (blood) of 0.07%.

This is undoubtedly the case here.

BAC is not usually reported in µg/L, and if it were, 340 µg/L equates to a BAC of 0.00%.

BrAC is often reported in µg/L and a BrAC of 340 µg/L equates to BAC of 0.07%, as pointed out by plantagenet. This is below the legal limit in every US state other than Utah (although above the legal limit for everywhere in Europe other than England, Wales, and Northern Ireland).
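The conversion can be sanity-checked with a quick calculation, assuming the widely cited 2100:1 blood-to-breath partition ratio (actual forensic ratios vary by jurisdiction; this is only an illustration):

```python
# Convert breath alcohol concentration (BrAC) to an approximate
# blood alcohol concentration (BAC), using the widely cited
# 2100:1 blood-to-breath partition ratio. Illustration only;
# forensic ratios vary by jurisdiction (roughly 2000:1 to 2300:1).

PARTITION_RATIO = 2100  # volume of breath per equal volume of blood

def brac_to_bac_percent(brac_ug_per_l):
    """BrAC in µg per L of breath -> BAC in g per 100 mL of blood (%)."""
    g_per_l_blood = brac_ug_per_l * 1e-6 * PARTITION_RATIO  # g/L of blood
    return g_per_l_blood / 10  # g/L -> g per 100 mL (i.e. percent)

print(round(brac_to_bac_percent(340), 3))  # 0.071, i.e. about 0.07%
```

This matches the 0.07% figure above, and also shows why reading 340 µg/L directly as blood alcohol gives an absurdly low 0.000034%.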

If this is the case, then the problem is greater than that of drunks who are capable of putting themselves and others at risk even without access to a vehicle. It is not necessary to have been drinking to drop off. If the Tesla could not detect that he was conscious (or even alive) then this was quite literally an accident waiting to happen. Trains are typically equipped with a Dead Man's Handle which requires more than passive touch. Irritating for drivers, but at the current state of development further safeguards may be necessary.

A Dead Man's Switch on a train applies the emergency brakes. This is typically not an issue because consecutive trains are usually spaced several kilometers apart - that spacing is itself a safety precaution, as a fully-loaded intercity train travelling at top speed has a stopping distance of more than a kilometer.

In a car, slamming on the brakes as hard as possible might be doable in a 30 km/h residential zone without wreaking havoc, but the same cannot be said for highways.
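The train/car contrast follows directly from the braking kinematics, d = v^2 / (2a); the deceleration figures below are rough illustrative values, not specs for any particular vehicle:

```python
# Stopping distance d = v^2 / (2a). The deceleration values are
# rough illustrative figures, not specifications for any real
# train or car.

def stopping_distance_m(speed_kmh, decel_ms2):
    """Distance in meters to brake to a stop from speed_kmh."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v * v / (2 * decel_ms2)

# Intercity train: ~200 km/h, ~0.7 m/s^2 emergency braking
print(round(stopping_distance_m(200, 0.7)))  # ~2205 m: well over a km
# Car on a highway: ~120 km/h, ~7 m/s^2 hard braking
print(round(stopping_distance_m(120, 7)))    # ~79 m
```

The quadratic dependence on speed is why an emergency stop that is routine for a car in a residential zone is a multi-kilometer affair for a train.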

Any Dead Man's Switch for cars would first have to find a suitable place to stop, drive to it - safely - and only then apply the brakes. This is a seriously hard engineering problem, and I'm skeptical as to whether it ever can be realized.

I think this is approaching the problem from the wrong end. The point of the Dead Man's Handle is to recognise if the driver is inattentive or unconscious. The appropriate action for a train will be different from that for a road vehicle, but it does not affect the principle.

Teslas cannot currently cope with urban conditions in particular. They do not understand traffic lights, junction priority, cyclist and pedestrian priority etc. So a Tesla continuing to drive without human intervention is a hazard to everyone. There are solutions which would bring it slowly to a halt in reasonable safety.

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot

Less IQ, and more common sense. I've met too many highly-intelligent dumbasses to trust smart people just because they're smart.

Musk claims to be smart, and he claims - and gives the impression - that Autopilot is a driverless system (AKA 'self-driving' or 'autonomous driving'). That's probably where these dumbasses get the idea that sleeping and/or being drunk while driving a Tesla with Autopilot engaged is somehow OK: they believed the false hype.

You literally can't enable it without it telling you it's a beta and you need to be monitoring it for safety.

Some people behave stupidly with AP because some people behave stupidly with anything.

Yes, but... it's a beta now, and not everyone has the beta.

And you literally can't get Musk to shut up about touting (in some way, even if only by implication) Tesla's Autopilot as a driverless system (AKA 'autonomous' or 'self-driving'), as if it were OK to do something else instead of paying attention and being a responsible driver while going down the road with Autopilot engaged.

Here is something incontrovertible: 100% of drunks who fall asleep behind the wheel of a moving car for a sufficient amount of time will stop violently - until now. The difference here is that Autopilot gave this man, and the people around him, a fighting chance of survival for long enough for the police to intervene.

Should autopilot have stopped this guy earlier? Yes. Is it designed to? Yes.

Tesla uses a torque sensor to detect hands on the wheel. If it doesn't detect them, it issues a series of escalating alerts before stopping the vehicle and calling emergency services. The fact that the editor has chosen to promote a comment that questions this, comparing it negatively to VW, which also 1) uses torque sensors to detect hands on the wheel and 2) issues a series of escalating alerts before stopping and calling, is ridiculous. The editor either knows, or reasonably should have known, that this is EXACTLY what the Tesla does. Why didn't it stop sooner? The guy's hands were still on the wheel when he fell asleep, defeating the abort cascade. Exactly the same failure would have occurred in the VW. GM's Super Cruise, as far as I know, is the only similar system that would have aborted, based on eye tracking instead - point GM.

‘So what, autopilot probably encouraged the guy to drive drunk.’

Based on what, exactly? People have been driving drunk since cars were invented; they have never needed the excuse that ‘the car can handle it’. They are filled with rationalizations, and even if ‘my car has autopilot, so there’ were one of them, there is absolutely zero evidence that it makes any significant contribution to the decision. ‘But the guy on the bridge told the police his car had autopilot...’ Well, add that to the long list of excuses drunk drivers have offered the police to (unsuccessfully) justify their behavior.

Here is the nail in the coffin for the argument ‘if he hadn’t had autopilot, he wouldn’t have made the decision to drive drunk’:

Here’s a dirty little secret: even if you ignore all the warnings on the internet, the warnings when you take delivery, the warnings in the manual, and click past all the screen warnings when you engage Autopilot - the ones that say it’s not self-driving and that you need to stay alert - absolutely no one who has driven on Autopilot for more than 10 or 20 miles could reasonably be under the impression that, in its current form, it can drive itself. Sure, Elon has promised it will someday soon, but it is immediately obvious to anyone who has ever used it: soon is not today. It disengages all the time, makes stupid mistakes, and drives like an insecure teenager. No one who has used it could possibly believe it can drive you home from the bar.

‘But what about the Harry Potter guy, or the .... guy.’ Yes, there are people who knowingly test the system’s limits outside of its design domain. There are idiots doing stupid things in all walks of society, putting themselves and others at risk. Here’s the thing: the responsibility for that antisocial behavior is on them, not the system. The reality is that in the more than 1 billion miles driven on Autopilot, the incidence of stupid people doing stupid things is on par with the incidence of stupid people doing stupid things in all walks of life; you just don’t hear about the billion-plus miles of people not doing stupid things on news sites and blogs.

There are plenty of things to criticize about autopilot, but in this instance, it probably saved the guy’s life.

Is it that it is just too tempting to do other tasks while the AP is engaged? I wonder if most of the "failed to take over" issues (exempting this one of course) are more from getting bored by being asked to act like you are driving without the actual engagement of driving keeping your mind alert and on task.

It's not temptation; it's the fact that you have to stay intensely concentrated without any feedback or action, and our brains just don't like that. This is what Google realised when they started their project, and they quickly backed off before killing someone. Tesla didn't, and in fact Tesla's bullshit has pushed the entire industry into this kind of approach. Before that, manufacturers were all saying it had to be a jump from Level 1 straight to full Level 5, because the levels in between depend on a human doing something humans are not able to do, so attempting them is stupid. Sadly, the media jumped on Tesla's PR, and now we have the problem of humans not monitoring the systems, and manufacturers hiding behind "the manual says they have to sprout wings and fly, so we are not doing anything wrong." Oh yeah, and someone gets killed from time to time thanks to that. It's hard to call it an accident when we already know that what Tesla is demanding is not possible. No amount of monitoring systems will make humans suddenly become superhuman.

PS: For all the Tesla fanboys talking about pilots and planes: reaction times in planes are far longer than in cars, which means pilots have far more time to regain situational awareness when things go wrong. Plus there are two people there, with extensive training, constant checks, and, most importantly, copious checklists for all kinds of things, because you can't trust humans to remember that many things. Planes and cockpits are designed around human limitations (experience bought with many lives), instead of Tesla's approach of ignoring human limitations and hiding behind legalese.

Is it that it is just too tempting to do other tasks while the AP is engaged? I wonder if most of the "failed to take over" issues (exempting this one of course) are more from getting bored by being asked to act like you are driving without the actual engagement of driving keeping your mind alert and on task.

It's not temptation; it's the fact that you have to stay intensely concentrated without any feedback or action, and our brains just don't like that. This is what Google realised when they started their project, and they quickly backed off before killing someone. Tesla didn't, and in fact Tesla's bullshit has pushed the entire industry into this kind of approach. Before that, manufacturers were all saying it had to be a jump from Level 1 straight to full Level 5, because the levels in between depend on a human doing something humans are not able to do, so attempting them is stupid. Sadly, the media jumped on Tesla's PR, and now we have the problem of humans not monitoring the systems, and manufacturers hiding behind "the manual says they have to sprout wings and fly, so we are not doing anything wrong." Oh yeah, and someone gets killed from time to time thanks to that. It's hard to call it an accident when we already know that what Tesla is demanding is not possible. No amount of monitoring systems will make humans suddenly become superhuman.

PS: For all the Tesla fanboys talking about pilots and planes: reaction times in planes are far longer than in cars, which means pilots have far more time to regain situational awareness when things go wrong. Plus there are two people there, with extensive training, constant checks, and, most importantly, copious checklists for all kinds of things, because you can't trust humans to remember that many things. Planes and cockpits are designed around human limitations (experience bought with many lives), instead of Tesla's approach of ignoring human limitations and hiding behind legalese.

That does make sense. I like what Tesla is trying to do, but I agree that a human's ability to monitor something in real time with no feedback may not be fully viable.

And yes, I can attest to the checklists and two-person crews; good lord, the checklists are insanely long and very detailed. When I left the service (I was an FE on a CH-47D - yes, that long ago) we were using paper pre-flight, in-flight, pre-landing, and post-landing checklists. Now they use touch-screen checklists, which require you to do them on time or you get in big trouble with the feds.

Making it possible for a driver to stay asleep at the wheel is irresponsible.

Wait . . . What?

The biggest value in self-driving cars is that it takes control away from a person and eliminates the biggest causes of accidents and deaths -- like drunk driving and falling asleep at the wheel.

Arresting someone for being drunk or asleep completely defeats the purpose of a self-driving car.

And this exposes the real issue here. Tesla and other companies are fraudulently selling cars with "self-driving" technology that doesn't actually work (reliably) and requires a person to be constantly vigilant -- which, again, completely defeats the whole point of a self-driving car.

Self-driving cars will be tremendously beneficial. Someday. But the technology isn't there yet. It's not even close.

Until cars can safely and reliably handle people who are drunk/asleep/whatever, these companies should not be allowed to sell any vehicles with "autopilot" technology to the general public.

Here is something incontrovertible: 100% of drunks who fall asleep behind the wheel of a moving car for a sufficient amount of time will stop violently - until now. The difference here is that Autopilot gave this man, and the people around him, a fighting chance of survival for long enough for the police to intervene.

Should autopilot have stopped this guy earlier? Yes. Is it designed to? Yes.

Tesla uses a torque sensor to detect hands on the wheel. If it doesn't detect them, it issues a series of escalating alerts before stopping the vehicle and calling emergency services. The fact that the editor has chosen to promote a comment that questions this, comparing it negatively to VW, which also 1) uses torque sensors to detect hands on the wheel and 2) issues a series of escalating alerts before stopping and calling, is ridiculous. The editor either knows, or reasonably should have known, that this is EXACTLY what the Tesla does. Why didn't it stop sooner? The guy's hands were still on the wheel when he fell asleep, defeating the abort cascade. Exactly the same failure would have occurred in the VW. GM's Super Cruise, as far as I know, is the only similar system that would have aborted, based on eye tracking instead - point GM.

‘So what, autopilot probably encouraged the guy to drive drunk.’

Based on what, exactly? People have been driving drunk since cars were invented; they have never needed the excuse that ‘the car can handle it’. They are filled with rationalizations, and even if ‘my car has autopilot, so there’ were one of them, there is absolutely zero evidence that it makes any significant contribution to the decision. ‘But the guy on the bridge told the police his car had autopilot...’ Well, add that to the long list of excuses drunk drivers have offered the police to (unsuccessfully) justify their behavior.

Here is the nail in the coffin for the argument ‘if he hadn’t had autopilot, he wouldn’t have made the decision to drive drunk’:

Here’s a dirty little secret: even if you ignore all the warnings on the internet, the warnings when you take delivery, the warnings in the manual, and click past all the screen warnings when you engage Autopilot - the ones that say it’s not self-driving and that you need to stay alert - absolutely no one who has driven on Autopilot for more than 10 or 20 miles could reasonably be under the impression that, in its current form, it can drive itself. Sure, Elon has promised it will someday soon, but it is immediately obvious to anyone who has ever used it: soon is not today. It disengages all the time, makes stupid mistakes, and drives like an insecure teenager. No one who has used it could possibly believe it can drive you home from the bar.

‘But what about the Harry Potter guy, or the .... guy.’ Yes, there are people who knowingly test the system’s limits outside of its design domain. There are idiots doing stupid things in all walks of society, putting themselves and others at risk. Here’s the thing: the responsibility for that antisocial behavior is on them, not the system. The reality is that in the more than 1 billion miles driven on Autopilot, the incidence of stupid people doing stupid things is on par with the incidence of stupid people doing stupid things in all walks of life; you just don’t hear about the billion-plus miles of people not doing stupid things on news sites and blogs.

There are plenty of things to criticize about autopilot, but in this instance, it probably saved the guy’s life.

Except that's not what happened...

The Tesla uses a torque sensor, and the autopilot needs more than just resting hands: it also needs very slight resistance to its steering inputs, not enough to disable it, just a very slight, light turn of the wheel to the left or right.

Rotational torque sensors on the steering shaft look for the weight of your hand resting on the steering wheel, causing a slight rotational force, or for other slight rotational inputs, to indicate you are there. Also needed is constant pressure that provides very slight resistance to the constant movement of the wheel by autopilot.

It's impossible to supply both of these while asleep. Plus, the hands are likely to come off the wheel when asleep, especially if the driver has been drinking enough to reach this guy's BAC: at some point it's an involuntary response to draw the hands and arms closer to the body when asleep while that intoxicated, so the hands would come off the wheel. It takes an aware, coordinated, timed, conscious, and purposeful act of muscle movement to supply that torque and pressure together in a manner that satisfies autopilot. In other words, you have to be awake. Tesla calls this act of convincing autopilot you are still there 'active driver supervision'. These acts require intentional fine motor coordination that a person simply cannot perform while intoxicated and asleep.

Without these 'inputs' from the person, the warnings are supposed to start; if there is still no driver interaction, the car slows down, and so on. But that did not happen. Instead, the autopilot continued down the road with a sleeping, intoxicated driver behind the wheel. The car was following a truck, almost tailgating it, as the autopilot would not change lanes like it should have done.
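The escalation behavior both posters describe (torque sensed → timer resets; no torque → escalating warnings, then a controlled slowdown) can be sketched as a small state machine. This is a minimal, hypothetical illustration: the torque threshold, stage timings, and stage names are my own illustrative assumptions, not Tesla's actual values or code.

```python
import dataclasses
from typing import Optional

# Illustrative assumptions, NOT Tesla's real parameters.
TORQUE_THRESHOLD_NM = 0.3  # assumed minimum torque counted as "hands on"
ALERT_STAGES = [
    (10.0, "visual warning"),
    (20.0, "audible chime"),
    (30.0, "loud alarm"),
    (45.0, "slow to a stop with hazards"),
]

@dataclasses.dataclass
class HandsOnMonitor:
    """Tracks how long the driver has gone without a detectable torque input."""
    hands_off_seconds: float = 0.0

    def update(self, torque_nm: float, dt: float) -> Optional[str]:
        """Advance by dt seconds; return the escalation action due, if any."""
        if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
            self.hands_off_seconds = 0.0  # driver input detected: reset
            return None
        self.hands_off_seconds += dt
        action = None
        for deadline, stage_action in ALERT_STAGES:
            if self.hands_off_seconds >= deadline:
                action = stage_action  # keep the most severe stage reached
        return action
```

The key point of the flaw described above, in these terms: a sleeping driver whose hands happen to keep supplying torque never lets `hands_off_seconds` accumulate, so no stage ever fires.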

It's a flaw in the autopilot system that's been present since at least 2015. Apparently Tesla has not been able to fix it, because it happened to this guy, and to many others. A Google search turns up many reports where this flaw showed up, and it has probably caused or contributed to accidents.

It was not a case where autopilot probably saved the guy’s life. Autopilot was actually endangering his life and that of other drivers on the road.

You are applauding a serious, dangerous, and potentially life-threatening flaw in the autopilot system. It's only luck that the car had not encountered a crash condition before the police arrived. It's the police who saved the guy's life; they saved it from autopilot.

You drank the Musk Kool-Aid; you believed the hype so completely that you were willing to comment on it without realizing that autopilot was endangering the man. Now you know why people use autopilot as if it were a driverless (AKA autonomous or self-driving) system: they drank the Musk Kool-Aid too.

This guy was dumb enough to get that drunk and get in his car. He'd probably have done it in a car with no self-driving features -- people do all the time.

"Probably" is doing a lot of heavy lifting here. How probable? How have you evaluated this probability? Anything other than waving your hand to observe that some other people have done something similar at some previous time?

I'm enjoying a porch beer right now, in fact. Therefore I'm "probably" going to pass out in the grass tonight! People do it all the time.

"Every day, 29 people in the United States die in motor vehicle crashes that involve an alcohol-impaired driver."

So, to answer my question, you haven't. You know some things happen sometimes at some rate, and then you decided "probably" was the right word to use. I hope you're arguing from a position of dishonesty and you don't actually think what you've cited in any way justifies the claim you made. The alternative explanation is kind of unfortunate.

I use “probably” because I know how hard it is to reason a drunk out of their stated desire to drive home, even when their vehicle has no self-driving features at all (in one case where I succeeded, it was an ATV).

I also know that every time some intervention comes along that promises to enhance safety, naysayers pull out the moral hazard argument, which is what you’re implicitly doing here. And yet (as posted above), in retrospect, the moral hazard, even if it materializes (which it often doesn’t), doesn’t add up to a net negative.

Except humans have been doing that kind of monitoring in aircraft for many years.

Must be something different about that, then. From the NY Times:

But Google decided to play down the vigilant-human approach after an experiment in 2013, when the company let some of its employees sit behind the wheel of the self-driving cars on their daily commutes. Engineers using onboard video cameras to remotely monitor the results were alarmed by what they observed — a range of distracted-driving behavior that included falling asleep. “We saw stuff that made us a little nervous,” Christopher Urmson, a former Carnegie Mellon University roboticist who directs the car project at Google, said at the time. The experiment convinced the engineers that it might not be possible to have a human driver quickly snap back to “situational awareness,” the reflexive response required for a person to handle a split-second crisis.

Uh oh, don't comment on arse Technica about Tesla/Elon Musk. It's full of Tesla fangirls. Any mention that Elon is culpable for Tesla-related incidents sets off many triggers for his fangirls. The simple truth is that marketing "autopilot" makes people too complacent with technology. Either have fully autonomous systems or have people drive themselves.

There is a big gray area of having driver assist technology and marketing it as that. Tesla overhypes its autopilot, starting with the name.

The name is only an overhype if you're in the unfortunate position of believing that Aircraft Autopilot systems do not require a trained flight crew to feed the system data and monitor the flight.

On this I call bullshit. Most folks assume autopilot to be a thing other than what it really is, more so in a car than a plane. People talk about the ToS that must be agreed upon with the touch-screen in a Tesla. Yet how many folks just hit accept without reading the actual terms in any EULA/ToS? The name was derived for a reason: the implied notion of autonomous, hands-free driving. Musk's tweets reinforce that notion, wrong-headed though they are.

Ideally, airplane autopilots function toward the other end of the zone: there are systems to warn the pilot of ground proximity, or if the autopilot is unable to maintain the course, or of the presence of other aircraft. Now, it's not that simple in practice, but at least the goal is that the pilot has to remain oriented and able to take control but not constantly watchful.

In addition, in an airplane, the reaction times required are much slower. If the cruise control in a car is wrong, you may have only a split second to respond. If one of the systems in an aircraft alerts you to a problem, you usually have several seconds to respond.

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot.

Plus something that clamps their hands to the wheel and shakes them every few seconds to make sure they are awake while autopilot is engaged.

It is becoming not uncommon to see people sleeping in their Teslas with autopilot engaged, going down the road. Unfortunately, the cops don't usually witness them doing it like they did the guy in the article.

Maybe there should be an IQ test before people can comment on articles. It's very rare for this to happen, but not so rare for people to drive drunk or fall asleep while driving. Nobody intentionally falls asleep while driving, with or without Autopilot. There's no evidence that people are more likely to drive drunk if they have Autopilot but there's plenty of evidence that if a person falls asleep, it can prevent an accident. People will drive drunk with or without Autopilot. It's not something that anyone wants. But people do want safety features in case a bad driver does something reckless.

Who remembers these gems?:

"If cars have lane departure warning, people will stop watching the road, and there will be more accidents!"

"If seat belts are required, it will give people a false sense of security and they will drive more recklessly!"

There have always been technophobes, but to say that it's bad to have safety features because it will cause people to be reckless is absurd. Chances are that more people will choose cars with those features because they are overcautious than the other way around.

Not such a bad thing. Aren't the automated drivers already better drivers than sober humans? I'd rather have the robot at the steering wheel. I hope one day we'll get to sleep/read/knit/be wasted while the car drives itself without it being against the law.

The name is only an overhype if you're in the unfortunate position of believing that Aircraft Autopilot systems do not require a trained flight crew to feed the system data and monitor the flight.

Which is probably 99% of the population. Many people are under the misapprehension that today's aircraft take off, fly, and land by themselves, and that the pilots are there to provide sexual services to the hostesses during layovers.

Not such a bad thing. Aren't the automated drivers already better drivers than sober humans? I'd rather have the robot at the steering wheel. I hope one day we'll get to sleep/read/knit/be wasted while the car drives itself without it being against the law.

That is in the future. We do not know where this car was going, but probably to his home. Teslas are definitely not capable of making such a journey safely at present, at least in a residential neighbourhood in somewhere like the Netherlands. So the challenge is to ensure that it does not attempt to do so unless there is a capable human in control. That does not negate the use of appropriate driver aids.

On this I call bullshit. Most folks assume autopilot to be a thing other than what it really is. More so in a car than a plane. People talk about the ToS that must be agreed upon with the touch-screen in a Tesla. Yet, how many folks just hit accept without reading the actual terms in any EULA/ToS? The name was derived for a reason.

You can call bullshit all you like, but pretty much everything you've said is in full agreement.

You can call bullshit all you like, but pretty much everything you've said is in full agreement.

It's my fault that I didn't make it clearer that I was calling bullshit on autopilot, not on the other poster's statements. I agree with them on how important naming is. That's on me for any misunderstanding. The general sentiment about autopilot, the EULA/ToS, and the ramifications stands.

"If seat belts are required, it will give people a false sense of security and they will drive more recklessly!"

I've been around a while and heard plenty of complainers, but never that one. The common refrain was, "I'm not wearing no seat belt. It's my life and you don't tell me how to live it."