Tesla on Tuesday escalated its media battle with the family of Apple engineer Walter Huang, making its clearest statement yet that Huang, not Tesla, bore responsibility for his death. Huang died in Silicon Valley last month when his Model X crashed into a concrete lane divider on a Mountain View freeway at high speed; Tesla's Autopilot driver-assistance system was engaged at the time.

Huang's family has hired an attorney to sue Tesla. In an on-camera interview with local television station ABC 7, Huang's wife, Sevonne, said that prior to his death, Huang had complained to her that the car had a tendency to drive toward the exact traffic barrier that ultimately killed him.

But in a statement to ABC 7 on Tuesday evening, Tesla turned this argument around.

"We are very sorry for the family's loss," Tesla wrote. "According to the family, Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location."

Tesla didn't stop there.

"The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so."

Especially since the 2016 death of Tesla owner Joshua Brown, Tesla has emphasized that Autopilot is a driver-assistance system rather than a full self-driving system. Drivers are warned to keep their hands on the wheel while Autopilot is engaged.

Unsurprisingly, Huang's family hasn't been happy with Tesla's combative response to his death. "It appears that Tesla has tried to blame the victim," the family's lawyer said to ABC 7 on Tuesday.

Tesla's response is an unorthodox one for a manufacturer facing a fatality involving one of its products. Most companies in Tesla's position would have just stopped at "we are very sorry for the family's loss"—perhaps citing the ongoing government investigation or likely litigation as reasons not to comment further. Engaging in a long-running argument with a grieving widow seems unlikely to improve Tesla's image. And federal investigators have already complained about Tesla talking publicly about the crash before the official investigation has concluded.

But Tesla argues that, despite occasional deaths, Autopilot actually saves lives overall. If people get the mistaken impression that Autopilot is unsafe, Tesla says, there's a danger that fewer people will use it—which could lead to more people dying on the roads overall.

"The reason that other families are not on TV is because their loved ones are still alive," Tesla wrote.

There is some evidence for Tesla's argument that Autopilot makes driving safer. A 2017 study by the National Highway Traffic Safety Administration (NHTSA) found that the crash rate of Autopilot-equipped Tesla cars dropped by 40 percent after the technology was activated. But NHTSA didn't break down the severity of those crashes, or tease out which specific functions of the system were responsible for the decrease (automatic emergency braking versus lane keeping, for example), leaving open the possibility that Autopilot prevents a lot of minor crashes but is less effective at preventing deadly collisions like the one that killed Huang.
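For context on how that figure was derived: NHTSA compared airbag-deployment crash rates before and after Autosteer installation, reportedly about 1.3 crashes per million miles beforehand versus 0.8 afterward. That works out to a 0.5/1.3 reduction, roughly 38 percent, rounded up to 40. Nothing in that ratio distinguishes a fender-bender from a fatal wreck, which is why the severity question matters.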

532 Reader Comments

So at the end of the day, the driver is still responsible for what happens with the car. Not too surprising. The lawsuit and decision in this case will have far-reaching answers that insurance companies and the legal system have been looking for in terms of whose fault it is in accidents involving automated or assisted driving. Very unfortunate for the family that he continued to test the system when he knew it could fail.

If Autopilot does things like drive into concrete barriers and the backs of fire trucks I'd say it's pretty unsafe. It may only be safer (which is up for debate anyway) when a human is able to catch all of its errors. Unfortunately, paying attention while the car is mostly driving itself is not something we're wired to do.

I'd wager if you're incapable of paying attention while the car is mostly driving itself that's a YOU problem and not a car problem.

Nobody forces you to engage autopilot.....

Not trying to shill for Tesla here, but I put these up there with spraying Windex in your eyes and eating Tide Pods. There are warnings in place telling you to keep your hands on the wheel and pay attention. If you choose to ignore them, what more can the manufacturer do at that point?

Marketing aside, Autopilot at its core is adaptive cruise control and lane keeping (Autosteer). It also has some other minor systems, like blind spot detection and auto parallel park, but at its core it is a combination of those two existing systems.

Is a car with adaptive cruise control safe? Is a car with lane keeping safe? Does it become unsafe when you combine the two?

This is a question that goes beyond just Tesla. Pretty much every major brand now has, or will soon have, Level 2 driver-assist technology. It also won't remain in the luxury segment for long, just like airbags didn't.

If you treat Autopilot like it is a Level 5 self-driving car, it will eventually kill you. You might luck out for a while, but eventually it will kill you. Of course, that is true of any Level 2 driver-assist tech.
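To make the "it's just two existing systems combined" point concrete, here is a toy sketch of a Level 2 control loop. It is purely illustrative: the function names, gains, and sensor fields are all invented here, and real systems are vastly more complex.

```python
# Toy Level 2 driver-assist loop: adaptive cruise control (longitudinal)
# plus lane centering (lateral). Everything here is invented for
# illustration; it is not Tesla's code or architecture.

def acc_accel(ego_speed, set_speed, lead_gap, lead_speed,
              min_gap=30.0, k_speed=0.5, k_gap=0.2):
    """Longitudinal control: track the set speed, but yield to a lead car."""
    accel = k_speed * (set_speed - ego_speed)        # cruise toward set speed
    if lead_gap is not None and lead_gap < min_gap:  # lead car too close
        accel = min(accel, k_gap * (lead_gap - min_gap) + (lead_speed - ego_speed))
    return accel

def lane_steer(lane_center_offset, k_center=0.1):
    """Lateral control: steer to null the offset from the perceived lane center."""
    return -k_center * lane_center_offset

def level2_step(sensors):
    """One control tick. The 'Autopilot' in this sketch is nothing more
    than the two controllers above running side by side."""
    accel = acc_accel(sensors["ego_speed"], sensors["set_speed"],
                      sensors["lead_gap"], sensors["lead_speed"])
    steer = lane_steer(sensors["lane_center_offset"])
    return accel, steer
```

Neither controller knows anything the other doesn't; combining them adds convenience, not understanding, which is why the supervision requirement doesn't go away.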

It is a driving tool. However, there is an issue with the latest version of Autopilot with centering. The last update introduced an issue with left-hand exits, something called "barrier lust." Here's a link to a video of barrier lust in a Tesla: https://www.reddit.com/r/teslamotors/co ... st_201812/ You can see what happens: the car tries to center itself, and due to the left exit it thinks the lane is widening, so it converges toward the new center. And it follows it until it hits the barrier.

TL;DR: The latest update has an issue, but Autopilot is NOT autonomous, so it's still the driver's fault. The issue should be fixed, though.
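A rough sketch of why naive centering produces exactly that behavior (toy numbers and a made-up perception interface, not Tesla's actual stack): if the controller aims for the midpoint of the two detected lane edges, then when the left edge peels away at an exit, the computed "center" migrates left, straight toward the gore point.

```python
# Toy illustration of the "barrier lust" failure mode described above.
# Positions are meters left (+) or right (-) of the car; all numbers
# are invented for illustration.

def lane_center(left_edge, right_edge):
    """Naive centering target: the midpoint of the detected lane edges."""
    return (left_edge + right_edge) / 2.0

right_edge = -1.8  # the right lane line stays put
# As the car approaches a left exit, the left lane line diverges:
for dist, left_edge in [(0, 1.8), (25, 2.8), (50, 4.0), (75, 5.5)]:
    print(f"{dist:3d} m ahead: perceived center at {lane_center(left_edge, right_edge):+.2f} m")

# Output drifts from +0.00 m to +1.85 m left of the true lane center.
# A controller that blindly chases this midpoint steers the car into
# the gore area between the lanes, where the barrier sits.
```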

Quote:

In an on-camera interview with local television station ABC 7, Huang's wife, Sevonne, said that prior to his death, Huang had complained to her that the car had a tendency to drive toward the exact traffic barrier that ultimately killed him.

Quote:

"It appears that Tesla has tried to blame the victim," the family's lawyer said to ABC 7 on Tuesday.

No, his wife did that, on camera. She's making him sound Far Cry 3 insane.

The sample size for determining the overall safety of Autopilot versus human drivers is still pretty small, particularly with regard to serious incidents. Of course that will not stop conclusion-jumping on either side. Sigh. And condolences to the family involved, in any case.

Quote:

"I'd wager if you're incapable of paying attention while the car is mostly driving itself that's a YOU problem and not a car problem."

Liability aside, designing a system for human use, that doesn't take into account human tendencies, is a design failure.

"Huang's wife, Sevonne, said that prior to his death, Huang had complained to her that the car had a tendency to drive toward the exact traffic barrier that ultimately killed him."

And yet they sue. Apparently, Huang's family feels that you do not bear "personal responsibility" for your actions. If you knew the tech was faulty in that area and you chose to use it anyway, how is that Tesla's fault? Add to that, he was not paying attention in that area, knowing that area was a problem.

We have become a society of money-grabbing apes (and I mean no disrespect to apes). We do not want to bear personal responsibility for our actions, and the first thing we do is go for money.

One crashed near where I live in Williston, and once again Tesla blamed the driver.

"Huang's wife, Sevonne, said that prior to his death, Huang had complained to her that the car had a tendency to drive toward the exact traffic barrier that ultimately killed him."

And yet.. they Sue.. apparently, Huang's family feel that you do not bare "personal responsibility" for your actions. if you know the tech was faulty in that area.. and you choose to use it anyways.. how is that Tesla's fault. ? Add to that, you were not paying attention in that area, knowing that area was a problem.

We have become a society of money grabbing apes.. and i mean no disrespect to apes. we do not want to bare personal responsibility for our actions and the first thing we do is go for money.

I understand it. They lost their son. They are grieving, and grieving people very often want someone to blame. It isn't always just a cash grab. They want the world to see (through a court victory) that Tesla 'killed' their son, and they want Tesla to pay for it.

I've seen Tesla do this before; this is their standard MO: get aggressive, blame the victim, and use every scrap of info in the car they can to discredit the driver. The fact that this is the best they can come up with, in this case, is probably a pretty strong sign that their Autopilot fucked up bad.

I'm not sure why Tesla always gets a pass from the tech crowd. They don't seem too friendly to me.

I feel like at this point the only reason they don't change the name from "Autopilot" is that it would be an admission of guilt on Tesla's part.

And while I don't get why Tesla chooses to comment on the incident instead of staying quiet, I do agree that a person who seemed to be smart enough to work for Apple should know not to rely on Autopilot when he knew it wasn't very good. That's the strangest part about this. Was he overworked and asleep at the wheel? Was he trying to commit suicide? It's just crazy to me that he relied on Autopilot at that point.

I personally feel like any driver-assistance system that will drive for periods of time yet still requires your attention is potentially lulling you into complacency.

If I had a Tesla and could choose how the AI assists me, I would never engage the full driving capabilities and instead set it to engage only when I'm about to rear-end someone or shift lanes inadvertently (as an accident prevention system, I guess).

Seems lame to do that even to me, but I don't trust my mind to pay attention if I'm not engaged in the activity.

The thing that frustrates me (and others) about Tesla claiming Autopilot reduced crashes by 40 percent is that there's no way to say whether that was due to AEB, which everyone agrees is a good thing, or to lane keeping assist.

Quote:

"...the car tries to center itself, and due to the left exit it thinks the lane is widening... Autopilot is NOT autonomous, so it's still the driver's fault."

So the vehicle is actively trying to steer into the barrier, which the driver has to physically fight against, but it’s only the driver’s fault if the result is a crash? That is horse shit.

A few weeks ago I found myself in a Tesla showroom. I had made a passing remark on how distracting their large touch screens must be and a Tesla sales rep interjected by telling me that because all of their cars drive themselves with no driver input being necessary, they could afford to have such an interface.

I'm wondering if he actually had previous problems with that guard rail. His wife's comments smack of 'I really don't know what I'm doing, but I think this makes Tesla sound more guilty, so I'm just going to lie to establish an ongoing problem' (whether conscious or subconscious).

Quote:

"Liability aside, designing a system for human use, that doesn't take into account human tendencies, is a design failure."

As someone who gets destroyed in here for saying the same thing about autonomous car companies: no doubt, I agree.

But someone with the financial means to buy a $150,000 car should have gained enough sense in that time. There have been HIGHLY publicized accidents stemming from not paying attention while Autopilot is on. I mean, it's all good to blame the manufacturer, I guess, but don't people have any sense of self-preservation? The answer can't just be damn the torpedoes, let the courts sort it out.

I personally don't use either lane keeping or adaptive cruise control on longer drives, because I find it's hard to stay as alert as I would be if I needed to do those things myself. I also think a big part of the problem with Autopilot is how it's marketed. If Tesla changed the name to "adaptive cruise control and lane assist," people would probably be more cautious using it; of course, then they'd be admitting that their current marketing was a problem.

Whatever happened to "shut up and wait for the trial"? Both sides are damaging their cases with these public statements.

Their lawyer doesn't want a trial; it would probably just get thrown out. They probably figure they can get a settlement or something. More likely, they'd get money out of California for not maintaining the road properly.

Quote:

"The thing that frustrates me (and others) about Tesla claiming Autopilot reduced crashes by 40 percent is that there's no way to say whether that was due to AEB, which everyone agrees is a good thing, or to lane keeping assist."

If I had to hazard a guess, I would say AEB makes up most of that 40 percent improvement. Hell, it might even be something like AEB reducing crashes by 50 percent while other AP functionality makes things 10 points worse, netting out to a 40 percent reduction.

On highways, the most common accident is a rear-end collision with the car in the lane ahead, and AEB can improve that a lot. It should also get even better with time, as manufacturers get more comfortable with the systems and can tune them to be a bit more aggressive.
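To make that net-net arithmetic concrete (the numbers are purely illustrative): if a fleet logged 100 crashes per period before, AEB alone cutting crashes in half would leave 50; if lane keeping then added 10 points back (50 to 60 crashes), the observed result would still be a 40 percent net reduction, even though one component was actively making things worse.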

Quote:

"...a Tesla sales rep interjected by telling me that because all of their cars drive themselves with no driver input being necessary, they could afford to have such an interface."

This is not an uncommon experience. I had a salesperson tell me much the same when I got a Model S four years ago.

IMO Tesla needs to be very careful with the marketing of 'Autopilot' and a bit more sensitive with the immediate statements whenever there is a death.

Condolences to all who have been hurt by this event. We need to be sure that this sort of thing doesn't become common.

This awful experience needs to be a wake-up call to the whole industry and government at large. Here we have a software engineer, not a hayseed, and he knew there was a problem with the control system in this location. That location also appears to be especially problematic, and yet he apparently lost concentration and let the car handle it.

When you can rely on a technology to be correct 99.9 percent of the time, I don't think it is reasonable to expect a human to pay attention the other 0.1 percent. It's in our nature to become complacent.

I completely disagree.

I have been driving for the last four years and 120,000 km with a car that has so-called adaptive cruise control. When I'm doing 120 km/h on the highway with the cruise control on, I feel more relaxed than if I were doing all the driving myself. BUT I know the system is not perfect, and I can anticipate the moments when it might fail (a very slow driver in front of me, or a bit of a strange truck), which primes me to be ready to take over the accelerator/brake.

What is even worse in this story is that the driver had mentioned a problem with the Autopilot in that exact spot. That's Darwin hard at work.
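To put rough numbers on the 99.9 percent framing above (assuming, purely for illustration, that it means per-mile reliability): 0.001 × 12,000 miles per year works out to about 12 failures a year, each one demanding an instant takeover after long stretches of uneventful monitoring.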

Quote:

"I mean, it's all good to blame the manufacturer, I guess, but don't people have any sense of self-preservation? The answer can't just be damn the torpedoes, let the courts sort it out."

I am pretty sure they do have a sense of self-preservation. The problem is, the system is pretty good, and as people, they have other senses too: a sense of boredom and sleepiness, and lots of others. If the system weren't so good, the sense of self-preservation would likely outweigh the rest. It is when it is very good that we may be lulled into a false sense of security. Does that make Tesla liable? Not in my view, but it is still a design failure.

It's a system designed for machine use and the machine isn't mature enough yet. You shouldn't ignore human tendencies when you are depending on human supervision.

Quote:

"So the vehicle is actively trying to steer into the barrier, which the driver has to physically fight against, but it’s only the driver’s fault if the result is a crash? That is horse shit."

It did steer into the barrier, but there is no physically fighting the car. If you turn the wheel, it disengages Autopilot. It isn't like you have to muscle the car to the right while the Autopilot servos are whining and trying to force the wheel to the left. Your survival isn't dependent on your upper body strength. If your hands aren't on the wheel and you aren't watching the road, though, you will crash.
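That override behavior is easy to picture as a simple rule. Here is a minimal sketch, with the threshold and names invented for illustration (this is not Tesla's actual logic):

```python
# Toy driver-override rule: measurable torque on the steering wheel
# hands control back to the driver. Threshold and names are invented
# for this sketch.

OVERRIDE_TORQUE_NM = 1.5   # assumed: a modest effort, not a wrestling match

def autosteer_engaged(driver_torque_nm, currently_engaged):
    """Autosteer stays engaged only while the driver isn't steering."""
    if not currently_engaged:
        return False
    if abs(driver_torque_nm) > OVERRIDE_TORQUE_NM:
        return False   # any deliberate steering input disengages lane keeping
    return True
```

The point of the sketch is that disengagement is triggered by input, not won by force: the system yields as soon as it detects the driver steering.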

Quote:

"Whatever happened to 'shut up and wait for the trial'? Both sides are damaging their cases with these public statements."

What happened to looking at the data before coming to conclusions? My understanding is that the car was so badly damaged that Tesla does not have detailed telemetry. They have no idea how hard the car turned or how much time the driver had to respond.