Dutch police pull over Tesla with apparently sleeping, drunk driver

In January, police arrested a man whose car was stopped on the San Francisco–Oakland Bay Bridge. He assured the officers that everything was OK because his car had been "on Autopilot."

There should be an IQ test before a person can engage autopilot.

Plus something that clamps their hands to the wheel and shakes them every several seconds to make sure they are awake if autopilot is engaged.

It is becoming not uncommon to see people sleeping in their Teslas going down the road with Autopilot engaged. Unfortunately, the cops don't usually witness them doing it, unlike the guy in the article.

Maybe there should be an IQ test before people can comment on articles. It's very rare for this to happen, but not so rare for people to drive drunk or fall asleep while driving. Nobody intentionally falls asleep while driving, with or without Autopilot. There's no evidence that people are more likely to drive drunk if they have Autopilot but there's plenty of evidence that if a person falls asleep, it can prevent an accident. People will drive drunk with or without Autopilot. It's not something that anyone wants. But people do want safety features in case a bad driver does something reckless.

Who remembers these gems?:

"If cars have lane departure warning, people will stop watching the road, and there will be more accidents!"

"If seat belts are required, it will give people a false sense of security and they will drive more recklessly!"

There have always been technophobes, but to say that it's bad to have safety features because it will cause people to be reckless is absurd. Chances are that more people will choose cars with those features because they are overcautious than the other way around.

Quote:

Autopilot....but there's plenty of evidence that if a person falls asleep, it can prevent an accident.

No, there isn't. There is zero evidence showing that Autopilot prevents accidents, sleeping or not, but there is plenty of evidence showing that Autopilot was engaged at the time of various Tesla accidents.

I dunno; in a situation like this I think it kind of speaks for itself. How long, realistically, can a sleeping person maintain a straight driving line in a traditional (i.e., non-assisted) car?

I mean, in this specific scenario the driver shouldn't be allowed near a car for years, if ever allowed to drive again... And more generally we need to ask whether people who fall asleep at the wheel should be allowed to drive at all...

...But do we need evidence that a car that will maintain a lane on its own is less likely to be in an accident than one that cannot, when the driver is incapacitated or asleep?

The real test is how autopilot stacks up against alert human drivers. We already know humans fail at times. So, how does autopilot compare to the most alert drivers given the same amount of road time?

The really important factor is whether automation is better than meatbags. If that holds true, we'll be better off with some degree of automation in transport. If not, back to the drawing board.

All auto manufacturers would be wise to ensure that the driver has to do something at intervals to register consciousness when the driver assist is engaged. Otherwise, public pressure and the auto insurance lobby will inevitably compel the pols to "Do something about it, quickly!" We know that usually doesn't end well.

Tesla's system does, but any usable system can be defeated.

NEVER underestimate the ingenuity, or the laziness and stupidity, of fools.
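The interval-based "register consciousness" idea suggested above can be sketched as a simple watchdog: the driver must provide input (e.g. steering-wheel torque) within a deadline, with escalating warnings before the assist disengages. The thresholds and stage names below are invented for illustration; they are not Tesla's actual values.

```python
import time

class AttentionWatchdog:
    """Hypothetical driver-attention watchdog with escalating warnings."""

    WARN_AFTER = 15.0       # seconds without input before a visual warning
    CHIME_AFTER = 25.0      # seconds before an audible chime
    DISENGAGE_AFTER = 35.0  # seconds before assist disengages / car slows

    def __init__(self, now=time.monotonic):
        # `now` is injectable so the logic can be tested with a fake clock.
        self._now = now
        self._last_input = now()

    def register_input(self):
        """Called whenever the driver applies torque to the wheel."""
        self._last_input = self._now()

    def status(self):
        """Return the current escalation stage based on idle time."""
        idle = self._now() - self._last_input
        if idle >= self.DISENGAGE_AFTER:
            return "disengage"
        if idle >= self.CHIME_AFTER:
            return "chime"
        if idle >= self.WARN_AFTER:
            return "warn"
        return "ok"
```

As the comments above note, any such scheme can be defeated by dead weight or a wedged object on the wheel; a timeout only proves that *something* is applying input, not that a conscious driver is.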

The real test is how autopilot stacks up against alert human drivers. We already know humans fail at times. So, how does autopilot compare to the most alert drivers given the same amount of road time?

The really important factor is whether automation is better than meatbags. If that holds true, we'll be better off with some degree of automation in transport. If not, back to the drawing board.

A fairer comparison would be between augmented meatbags (human+advanced driver assistance) and the coming L4 and L5 vehicles.

Plus, isn't the Netherlands known for difficult roads? Or am I confusing it with Finland?

The test in this specific case is how autopilot stacks up against a drunk and sleeping human driver. That's not a very high bar to clear.

There's the confounding factor that Autopilot might have caused a behaviour change: a drunk and tired human might not have gotten behind the wheel at all otherwise. But it appears the majority here believes that humans do not drive drunk unless they have Tesla Autopilot.

The Tesla uses a torque sensor, and Autopilot needs more than just resting hands: it also needs very slight resistance against the wheel, a light turn to the left or right that isn't enough to disengage it.

... These acts require intentional, finely coordinated motor skills that a person simply cannot perform while intoxicated and asleep.

Sorry, but that is just factually incorrect. People have defeated the Tesla torque sensor by wedging a grapefruit into the steering wheel. The dead weight of an unconscious driver’s hands could absolutely defeat the abort sequence.

Uh oh, don't comment on arse Technica about Tesla/Elon Musk. It's full of Tesla fangirls. Any mention that Elon is culpable for Tesla-related incidents causes many triggers to go off for his fangirls. The simple truth is that marketing "Autopilot" makes people too complacent with technology. Either have fully autonomous systems or have people drive themselves.

I'd say the articles are pretty much anti-Tesla 90% of the time on here. The wording of this & the last article, with some comments on software engineering in the last one, makes these the Sean Hannity articles of the tech world.

I would hardly fault Tesla for this incident, as the driver was drunk & the system probably saved his life. & there is something to be said for personal responsibility. The same story was reported on Electrek, followed by one on Rivian & some other stuff.

Bias aside, it would be nice to see some positive news. This is from a Tesla-centric website, but I was surprised to see an EV compete in this category. I didn't actually think an EV would make it through the course.

I would have expected Ars to be reporting on that, & on Rivian & the new Mercedes SQ electric. Of course you will always have people who will either like the car or hate it, & that's fine; it's just personal preference.

I understand that negative articles garner page views & a boatload of comments & put food on the table. However, the same can also be said for in-depth, high-quality articles, which used to be Ars' forte.

"If seat belts are required, it will give people a false sense of security and they will drive more recklessly!"

I've been around awhile and heard plenty of complainers, but never that one. The common refrain was "I'm not wearing no seat belt. It's my life and you don't tell me how to live it."

This story demonstrates why anti-Tesla stories are bunk. The only error Tesla made is calling their feature 'Autopilot'. It's not. You have to be engaged in the process of driving your car. If you are not you can die. Seriously.

There was another recent Tesla Autopilot story about a porn video that was shot while driving a Tesla on Autopilot.

If you're an idiot and decide to drive while not paying attention to what you're doing you deserve your Darwin Award.

Doesn't Tesla Autopilot recognize police cars? It is an all-visual system, isn't it?

Musk promised it will, after exactly the same thing happened in California and police drove for 7 miles before they could stop a Tesla with a drunk driver. Musk promised police recognition would be added in 3 weeks, and the Tesla would stop in such situations. Well, that was half a year ago, so he's a bit late, like with all the things he promised about "autopilot".

No. Teslas are not safe cars to take somebody home who is not alert and in control. There was a significant risk that the car would, for example, go through a red light or fail to yield to a cyclist. The fact the driver was drunk, as opposed to, say, unconscious due to a medical condition is for this purpose irrelevant.

Of course a car with conventional controls would likely have crashed. This one was under the control of software, which meant it had not, but that does not mean it was operating safely. We do not know what would have happened if the police had not intervened.

There is a big gray area of having driver assist technology and marketing it as that. Tesla overhypes its autopilot, starting with the name.

The name is only an overhype if you're in the unfortunate position of believing that aircraft autopilot systems do not require a trained flight crew to feed the system data and monitor the flight.

On this I call bullshit. Most folks assume autopilot to be something other than what it really is, more so in a car than in a plane. People talk about the ToS that must be agreed to on the touch-screen in a Tesla. Yet how many folks just hit accept without reading the actual terms in any EULA/ToS? The name was chosen for a reason: the implied notion of autonomous, hands-free driving. Musk's tweets reinforce that notion, wrong-headed though they are.

I'd say very few. It's not just disclaimers but instructions. There's no "autopilot" button in a Tesla. There's not a single label on any button saying that it's related to Autopilot or how to use it. It's not like another car where the Cruise Control button literally says Cruise Control right on it. Without reading the instructions, people wouldn't even know how to turn it on or change the settings. You can't just click a button to accept something. You need to enable certain settings, and know what each one does.

Then if you move the seat, it will stop working. That's because it will assume that it's a different driver. When each driver sets up a new profile, Autopilot will be disabled. A person has to go through the screens and enable the settings. So even somebody who thinks he knows what it does won't be able to use it. Or only parts of it might work if the person didn't read all the instructions on how to enable other parts. More likely, a person who had been shown how to turn it on during a test drive would find that turning it on doesn't work.

Even if a person did manage to get through all of it, intentionally skipping through the part that describes how to use it, the person would get an immediate notice when it gets turned on, and another one within minutes if the driver isn't holding the wheel. Nobody could go 10 minutes without knowing that it's not a hands off feature.

This isn't just opinion. A study of 675 Tesla owners found that 98% of them knew that they needed to pay attention at all times, and considering that Autopilot wasn't a chargeable feature, that's probably a lot higher than the percentage who actually used it.

No, there isn't. There is zero evidence showing that Autopilot prevents accidents, sleeping or not, but there is plenty of evidence showing that Autopilot was engaged at the time of various Tesla accidents.

Then you haven't been paying attention. This article is an example of Autopilot keeping a car from crashing when the driver fell asleep. There are others.

Out of over a billion miles driven on Autopilot, there have been very few significant accidents with Autopilot on, which is why people keep bringing up the same ones from years ago. They also leave out every accident involving other cars whose driver-assistance technology didn't work as well when tested head to head with Tesla's. Tesla publishes the statistics regularly, and the rate of accidents is far lower with Autopilot on.

The real test is how autopilot stacks up against alert human drivers. We already know humans fail at times. So, how does autopilot compare to the most alert drivers given the same amount of road time?

The really important factor is whether automation is better than meatbags. If that holds true, we'll be better off with some degree of automation in transport. If not, back to the drawing board.

You are asking the wrong question. It should be how Autopilot, with an alert driver, compares to a typical car with an equally alert driver. MIT studied it and found that Autopilot does not cause drivers to be less vigilant: https://hcai.mit.edu/tesla-autopilot-human-side.pdf The driver has full control of the steering, accelerator, and brakes, and is monitoring the car in addition to Autopilot doing it. You will notice people post that drivers turn it on without reading the instructions or understanding it, when actual studies refute that. People claim that drivers don't pay attention, and studies refute that too. You can go by what people put in random comments, or you can believe teams of scientists from institutions such as MIT.

But to answer the question, cars with Autopilot do significantly better.

This story demonstrates why anti-Tesla stories are bunk. The only error Tesla made is calling their feature 'Autopilot'. It's not. You have to be engaged in the process of driving your car. If you are not you can die. Seriously.

Although I mostly agree with you, I think the problem isn't the name but the marketing. Tesla markets the cars as having “full self driving” when they do not.

Sorry, but that is just factually incorrect. People have defeated the Tesla torque sensor by wedging a grapefruit in the steering wheel. The dead weight of an unconscious driver’s hands could absolutely defeat the abort sequence.

Or going over a bump could disable it. There was one video of somebody doing that with an orange, but acting as if it's commonplace is unfounded. Even people who do something to get around the sensor are most likely just as vigilant, but would rather watch the road than watch out for messages.

No. Teslas are not safe cars to take somebody home who is not alert and in control. There was a significant risk that the car would, for example, go through a red light or fail to yield to a cyclist. The fact the driver was drunk, as opposed to, say, unconscious due to a medical condition is for this purpose irrelevant.

Of course a car with conventional controls would likely have crashed. This one was under the control of software, which meant it had not, but that does not mean it was operating safely. We do not know what would have happened if the police had not intervened.

That's correct on all counts. A car without these features is much more likely to get into an accident, but that does not mean that people should drive drunk or fall asleep. Drivers know that, but unfortunately people drive drunk and fall asleep in all kinds of cars. They aren't deluded into thinking that the cars can handle it; they are deluded into thinking that they can handle it. Occasionally they are deluded into thinking that a police officer won't know they are drunk and might believe them if they say the car can drive itself, but they know that it can't.

Although I mostly agree with you, I think the problem isn't the name but the marketing. Tesla markets the cars as having “full self driving” when they do not.

That, and they sell the cars to morons with more money than sense.

No, they don't. They market the car as having the hardware for full self driving. A person would have to buy the package, and it lists the current capabilities and what they are expected to be in the future, subject to regulatory approval. Not a single person bought the car expecting it to be self driving. If anybody had, that person would have put an address into the navigator and the car never would have left the driveway. Tesla would have told them that it's within the return period, and if they don't want it, they can take it back. It would be literally impossible to think that the self driving software is enabled.

Uh oh, don't comment on arse Technica about Tesla/ Elon musk. It's full of Tesla fangirls. Any mention that Elon is culpable for Tesla related incidents causes many triggers to go off for his fangirls. The simple truth is marketing autopilot makes people too complacent with technology. Either have fully autonomous systems or have people drive themselves.

I'd say the articles are pretty much anti-tesla 90% of the time on here. The wordings on this & the last article with some comments on software engineering on the last make this the Sean Hannity articles of the tech world

I would hardly fault Tesla for this incident as the driver was drunk & the system probably saved his life. & there is something to be said for personal responsibilityThe same story was reported on Electrek, followed by one on Rivian & some other stuff.

Bias aside, it would be nice to see some positive news. This is from a tesla centric website, but I was surprised to see an EV compete in this category I didn't actually think an ev would make it through the course

I would have expected Ars to be reporting on that & Rivian & the new Mercedes SQ electricOf course you will always have people that will either like the car or hate it & thats fine, its just personal preference.

I understand that negative articles garner page views & a boatload of comments & put food on the table. However, the same can also be said for in-depth, high-quality articles, which used to be Ars' forte.

No. Teslas are not safe cars to take somebody home who is not alert and in control. There was a significant risk that the car would, for example, go through a red light or fail to yield to a cyclist. The fact the driver was drunk, as opposed to, say, unconscious due to a medical condition is for this purpose irrelevant.

Of course a car with conventional controls would likely have crashed. This car was under the control of software, which meant it had not, but that does not mean it was operating safely. We do not know what would have happened if the police had not intervened.

That's correct on all counts. A car without these features is much more likely to get into an accident, but that does not mean that people should drive drunk or fall asleep. Drivers know that, but unfortunately people drive drunk or fall asleep in all kinds of cars. They aren't deluded into thinking that the cars can handle it. They are deluded into thinking that they can handle it. Occasionally they are deluded into thinking that a police officer won't know that they are drunk and might believe them if they say that the car can drive itself, but they know that it can't.

Unfortunately for Tesla, and other manufacturers, it is their software that will be controlling the vehicle. It will not be enough to say that the car would have crashed anyway when the driver had the heart attack, and that therefore they have no responsibility for an accident several miles down the road. Nor will they be able to blame the driver. They will have to convince regulators, and courts, that they could not have programmed the software to stop safely.

Except humans have been doing that kind of monitoring in aircraft for many years.

Must be something different about that, then. From the NY Times:

Quote:

But Google decided to play down the vigilant-human approach after an experiment in 2013, when the company let some of its employees sit behind the wheel of the self-driving cars on their daily commutes. Engineers using onboard video cameras to remotely monitor the results were alarmed by what they observed — a range of distracted-driving behavior that included falling asleep. “We saw stuff that made us a little nervous,” Christopher Urmson, a former Carnegie Mellon University roboticist who directs the car project at Google, said at the time. The experiment convinced the engineers that it might not be possible to have a human driver quickly snap back to “situational awareness,” the reflexive response required for a person to handle a split-second crisis.

It appears the majority here believes that humans do not drive drunk unless they have Tesla Autopilot.

And this, ladies and gentlemen, is why you all should have listened to me when I said we should go back to horses. They make excellent designated drivers.

Plus they don't mind it if you can't get further than their stall.

Absolutely. The HP (Horse Pilot) is guaranteed to get you home safe & sound should you imbibe a wee bit too much on the way home.

My experience with horses is that if you tried that, you’d wake up in the carriage at the side of the road somewhere, with the horse happily munching on some grass. He would then look back at you with an expression of complete indifference, as if to say “hey, I’m a horse. I eat grass. What did you expect?”

All of these driver aids are good if used properly. I like the ACC I have, most of the time, but I know that if the driver ahead of me pulls into a turn lane, clearing the road for me, my ACC will still track that vehicle and, just as the roadway opens up, will brake quite hard as the turning car decelerates. Annoying, but the guy behind me sees my lane clear and expects me to accelerate, and he may well reflexively accelerate in turn. I have also noticed the ACC is completely oblivious to cross traffic. So I have to be alert for both of the above occurrences, but I still find ACC useful overall. And I expect the same is true of Autopilot.

As to the posters who think that Autopilot will tempt lushes who would not otherwise think of doing so to drink and drive, I have to ask, are you, or have you ever met, a human being? Especially one under the affluence of incohol? I have watched in an ice-glazed parking lot as a drunk friend tried to fight a football lineman for the car keys, as he wished to drive us home. Fortunately, around the third time he fell, he dropped the keys, and we lived to tell the tale, which two of us could remember.


The real test is how autopilot stacks up against alert human drivers. We already know humans fail at times. So, how does autopilot compare to the most alert drivers given the same amount of road time?

The really important factor is whether automation is better than meatbags. If that holds true, we'll be better off with some degree of automation in transport. If not, back to the drawing board.

A fairer comparison would be between augmented meatbags (human+advanced driver assistance) and the coming L4 and L5 vehicles.

Nope. It's all about human vs AI in terms of safety. We have so much data on humans. What we don't have is the same amount on AI (or assisted features).

All auto manufacturers would be wise to ensure that the driver has to do something at intervals to register consciousness when the driver assist is engaged. Otherwise, public pressure and the auto insurance lobby will inevitably compel the pols to "Do something about it, quickly!" We know that usually doesn't end well.

Tesla's system does, but any usable system can be defeated.

NEVER underestimate the ingenuity, or the laziness and stupidity, of fools.

And that's exactly Tesla's problem. Cars have to be designed for the worst-case scenario, not for the driver who does everything by the book.

No, that's why they are doing everything possible to make sure the driver signs away liability whenever they use AP.

Not possible in Europe. Not only is it generally impossible for companies to transfer legal responsibility to customers for things over which they have no control or it is unreasonable that they should do so (eg how the car behaves if the driver loses consciousness for whatever reason), but regulators will simply step in and refuse to permit cars perceived as dangerous. For the same reason that owners are not free to modify their cars in whichever way they choose.

This is a reality that many US companies, such as Uber, face. Practices that may be culturally or legally acceptable in the US may not be elsewhere. Yet it is only by becoming big players in the global market that they can achieve their ambitions.

Not paying attention to the road is dangerous in any vehicle, including those with AP/driver assist technology. Just because it is less dangerous doesn't change the fact that you should still be held responsible if you use it as an excuse not to pay attention to the road.

No, there isn't. There is zero evidence showing that Autopilot prevents accidents, sleeping or not, but there is plenty of evidence showing that Autopilot was engaged at the time of Tesla accidents.

I dunno, in a situation like this I think it kinda speaks for itself. How long, realistically, can an asleep person maintain a straight driving line in a traditional (i.e., non-assisted) car?

I mean, in this specific scenario the driver shouldn't be allowed near a car for years, if ever allowed to drive again... And more generally, we need to ask whether people who fall asleep at the wheel should be allowed to drive at all...

...But do we need evidence that a car that will maintain a lane on its own is less likely to be in an accident than one that cannot, when the driver is incapacitated/asleep?

There is a big difference between "Autopilot prevents accidents" and not having an accident while Autopilot is engaged and the driver is asleep, simply because no accident situation was encountered.

We already know, and there is plenty of evidence, that there are situations where Autopilot cannot function properly. For example, it has run into stationary fire trucks and police cars at accident scenes, it cannot detect stationary barriers, and there have been injuries and deaths in other Autopilot-related incidents. So where was this "Autopilot prevents accidents" then?

There has never been a test, study, or anything else which substantiates that Tesla "Autopilot prevents accidents."

Tesla has come up with statistics basically saying that, over so many miles driven with Autopilot engaged, accidents did not happen. But guess what: for every Tesla with Autopilot on the road worldwide, there are literally thousands (if not millions) of other cars worldwide without Autopilot that did not have accidents either, so can we then say that "non-Autopilot prevents accidents"?

We visited some relatives this past weekend. They live about 70 miles away. During the drive I used cruise control and did not have an accident, so I am gonna say that "cruise control prevents accidents."
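The exposure argument above can be made concrete: raw counts of accident-free driving say nothing until they are normalized by miles driven and compared against a baseline rate. A minimal sketch, with entirely made-up numbers (none of these figures are real crash statistics):

```python
# Illustration of the exposure problem: raw accident counts mean nothing
# without normalizing by miles driven. All numbers below are invented.

def accidents_per_million_miles(accidents: int, miles: float) -> float:
    """Exposure-adjusted accident rate."""
    return accidents / (miles / 1_000_000)

# Hypothetical small fleet (e.g., cars with a driver-assist feature):
assisted_rate = accidents_per_million_miles(accidents=10, miles=20_000_000)

# Hypothetical large baseline fleet (everyone else):
baseline_rate = accidents_per_million_miles(accidents=1_000, miles=4_000_000_000)

print(f"assisted fleet: {assisted_rate:.2f} accidents per million miles")  # 0.50
print(f"baseline fleet: {baseline_rate:.2f} accidents per million miles")  # 0.25

# The small fleet has 100x fewer raw accidents (10 vs 1,000) yet twice
# the per-mile rate -- raw counts alone point the wrong way.
```

With these invented inputs, the raw counts and the per-mile rates point in opposite directions, which is exactly why an exposure-adjusted comparison is the only one that means anything.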

In countries like the UK, the law is actually more demanding than in parts of the US as regards driver responsibility in relation to drinking, since it is not the act of driving but being in charge of the motor vehicle that is the offence. So even being found slumped drunk in a stationary car at the side of the road is sufficient.

Nevertheless, the manufacturer cannot simply escape responsibility for things within their control - after all, it might have been a medical condition rather than drink or drugs that rendered a driver unconscious, and if the misuse of the vehicle is foreseeable the decisions made which might have forestalled it or mitigated its consequences are open to examination. So nags, warnings, other reminders about driver responsibility are part of the safety regime which the manufacturer is expected to implement, but clever T&Cs cannot be relied upon to transfer responsibility completely. If a Tesla kills a cyclist by failing to yield on a Dutch road because the driver is asleep, the driver will not escape the penalties of the law, but it still leaves the question of why the car failed to stop or avoid them.

The Tesla uses a torque sensor, and Autopilot needs more than just resting hands. It also needs very slight resistance to Autopilot's steering, not enough to disable it, from a very slight, light turn of the wheel to the left or right.

... These acts require intentional, finely coordinated motor skills that a person simply cannot perform while intoxicated and asleep.

Sorry, but that is just factually incorrect. People have defeated the Tesla torque sensor by wedging a grapefruit in the steering wheel. The dead weight of an unconscious driver’s hands could absolutely defeat the abort sequence.

That's not true. The grapefruit trick, as well as the orange trick, has been tested and shown not to work, and is suspected to be a hoax in some cases. Most of the videos involving the orange trick (the same as the grapefruit trick but with an orange) were removed because it's not true.

If you read the descriptions carefully you will find out why it does not work. For example...

In reference to the article linked above, here is why it does not work. The article says this:

Quote:

Suffice to say, Tesla owners, do not try this at home. The check-in feature is a safety feature and uses pressure sensors in the wheel to determine whether a driver is physically controlling the car. If the sensors can’t detect pressure after two minutes, the car sends out a warning until the driver puts their hands back on the wheel. With the orange in place, the man who uploaded the video could trick the pressure sensors into thinking he was driving, and ultimately sit back, relax, and not have to manually operate his expensive sports car.

Tesla does not use "pressure sensors in the wheel"; it uses torque sensors in the steering column assembly. The torque sensors detect torque on the wheel when it is moved very slightly. Neither an orange nor a grapefruit can move the wheel enough to trigger the torque sensor in a way that satisfies Autopilot, if Autopilot is working correctly. The grapefruit or orange just holds the wheel slightly to one side, but Autopilot needs very slight movement over a period of time, provided by resistance to Autopilot's control as the wheel moves left and right, and an orange or grapefruit cannot do that. If the torque sensors detect torque in one direction only for a period of time (as with the orange or grapefruit), Autopilot is supposed to turn off. If it continues to operate, it is malfunctioning and will at some point fail by suddenly turning the car in the direction the orange/grapefruit was torquing the wheel. In other words, the orange/grapefruit trick would be exposing a malfunctioning Autopilot that was placing the driver and passengers in potential danger.

No, the dead weight of an unconscious driver's hands cannot defeat the Autopilot abort sequence, or at least it's not supposed to. If it does, it means Autopilot is malfunctioning and should not be used. Autopilot is supposed to detect this and shut down, and this is where the flaw enters, because in some cases it will not shut down.

Autopilot needs what I stated; that's the way it's designed. A grapefruit cannot defeat both the constantly changing counter-steer hand pressure and the torque sensor requirement. Grapefruit, the last time I looked, do not have hands.
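The mechanism described above, alternating resistance rather than a constant one-sided pull, can be sketched as a tiny check over torque-sensor samples. To be clear, this is an illustrative sketch of the commenter's description, not Tesla's actual algorithm; the function name, threshold, and sample values are all invented:

```python
# Illustrative sketch (NOT Tesla's real algorithm): distinguish a hand
# that actively resists the wheel (torque alternates sign over time)
# from a wedged object like an orange (constant one-sided torque).

def looks_like_active_hands(torque_samples: list[float],
                            min_torque: float = 0.05) -> bool:
    """Return True if torque both exceeds a small threshold and changes
    direction within the window, suggesting a live driver's hands."""
    significant = [t for t in torque_samples if abs(t) >= min_torque]
    if not significant:
        return False  # hands off: no meaningful torque at all
    # A wedged weight applies torque in one direction only;
    # a resisting hand produces torque in both directions.
    return any(t > 0 for t in significant) and any(t < 0 for t in significant)

# A hand gently countering the autopilot's corrections:
print(looks_like_active_hands([0.1, -0.08, 0.12, -0.06]))  # True
# An orange wedged against one side of the wheel:
print(looks_like_active_hands([0.2, 0.21, 0.19, 0.2]))     # False
```

A wedged fruit can hold the wheel against the motor in one direction, but it cannot produce the sign changes that come from a hand resisting corrections in both directions, which is the distinction the comment is drawing.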

Let’s be honest. Drunk people who think it’s a good idea to drive (probably because they’re drunk?) are going to do it regardless. It’s the one case where autopilot is currently making the roads safer.

No. This man was lucky to be stopped by the police. Who knows what would have happened if they hadn't stopped him?

Agreed.

When I had a DUI 3 years ago, my BAC was the same .34 that the guy in the Netherlands had. The nurse in the ER who did the blood test wondered just how I was conscious at that level.

I pled guilty, lost my license for 90 days, got a year's probation, and had to carry pricey SR-22 insurance for 3 years.

Soon after being sentenced, I sold my car and no longer drive. I don't drink very often anymore either, but between losing my job, paying off the probation fees, court costs, and increased insurance, I could no longer afford to drive.

There is zero evidence that Tesla "Autopilot prevents accidents."

There are also probably no studies suggesting that covering a toaster in cling film is a bad idea, but I don’t need a study to know that.