The first accounts from police about the self-driving Uber car that struck and killed Elaine Herzberg in Tempe, Arizona, rushed to absolve Uber and blame the victim. They described Herzberg, 49, as appearing in the roadway “out of the shadows” “like a flash” and emphasized that she was not in a crosswalk.

It’s a template that police have followed after countless pedestrian deaths caused by human drivers. Every action of the victim is conveyed in the most accusatory light, while the driver’s actions aren’t questioned at all.

As with many of those cases, now that video from the Uber car has been released, the victim-blaming narrative doesn’t hold up. The images should alarm anyone living in an area where these vehicles are being operated on public streets.

The exterior video shows that Herzberg, pushing a bike to ferry belongings, had already crossed most of the street at the moment of impact — she didn’t hurl herself into the car’s path. She was outside a crosswalk, but a main selling point of autonomous vehicles is that they’re supposed to detect and safely react to such situations.

Writing at Forbes, Jim McPherson, a legal consultant on autonomous vehicle issues, says Uber’s sensors should have been able to detect Herzberg even in poor lighting. Other industry experts relayed similar conclusions to the Associated Press. Instead, the vehicle did not brake until impact.

Either there was some sort of technical failure, says McPherson, or the car was programmed not to swerve out of the lane in order to protect the vehicle and its occupants, which raises a whole host of ethical questions.

In the event of such a breakdown, a human “safety driver” is supposed to take control of the vehicle. But interior video shows the backup driver looking away from the road for significant stretches of time immediately before the crash:

“Tempe Police Vehicular Crimes Unit is actively investigating the details of this incident that occurred on March 18th. We will provide updated information regarding the investigation once it is available.” — Tempe Police on Twitter

It may be a year before the National Transportation Safety Board releases a detailed report on the collision, providing more definitive answers.

For now, Uber has temporarily suspended its AV testing. Other automakers have not. They may have better technology and protocols to prevent collisions, but the public has startlingly few assurances that necessary safeguards are in place.

That’s what public safety watchdogs have been trying to communicate. The regulation of AVs has been too lax as companies beta-test the technology on roads where a glitchy sensor, bad code, or momentary lapse of attention by the human back-up can easily prove fatal.

You need to prove your case that this vehicle could have avoided the homeless woman without taking a greater risk from other traffic. Your personal attacks merely demonstrate that you have no confidence you can make your case.

Is there any evidence that the driver could have stopped the vehicle when the homeless woman jumped out in front of it? I doubt that many human drivers could have avoided her either, without taking the risk of swerving into traffic.

She didn’t “jump out” in front of the car. The video is deceiving, since you can only see 20′ ahead of the car (if that were the actual visibility, the car shouldn’t have been driving over 20 mph). The actual street is well lit, and she would have been visible for hundreds of feet.

Well that’s what we don’t know isn’t it? The cops appear to think she was at least partly at fault. But there may have been a flaw in the program logic as well. I’d assume that a decision had to be made whether to hit the woman or swerve left or right, both of which could have been risky.

Wait until they own the sole rights to use the streets, deeded to them by cities seeking funding for infrastructure repairs. “Increasing safe access to major roads for bicycles and walkers, for example, might offer an unexpected benefit should the robocarpocalypse arrive: Neighborhoods with deliberately-built, well-maintained paths for non-automotive transit will retain the freedom of entrance and egress that the car currently insures everywhere.” — from an Atlantic magazine article predicting this.

If you outrun your field of vision, then you are driving too fast for the road conditions; hence you were speeding, if we are to take the Uber video at face value. Of course, it is safe to assume that the video has been doctored. At least, that is more plausible than believing that the dashcam in Uber vehicles is no better than a ’90s webcam.

Has the NTSB released any reports (or preliminary ones) detailing the causes of the accident, containing recommendations or mandatory changes for manufacturers and operators of autonomous vehicles, as the FAA does after any airplane crash?

I don’t know if it’s an Uber problem or an industry wide problem and I really don’t care.

What I want is for the NTSB to be like the FAA, which after any incident publishes reports for the whole industry with its findings, recommendations and mandatory changes, no matter whether the airplanes were Boeing, Airbus or Embraer, or whether they were operated by United, Southwest, British Airways or Lufthansa.

TL;DR: What I would like is for the NTSB, together with the industry, to analyze any incident involving an autonomous car (crashes, near misses, anything…) and release timely reports on the causes, with mandatory changes and recommendations for both makers and operators.

Each year, motorists on American streets kill nearly 5,000 pedestrians. The loss of life is enormous — equivalent to 12 jumbo jets crashing with no survivors — but the steady drumbeat of pedestrian fatalities doesn’t register as an urgent public safety crisis. Maybe it would seem more urgent if the press covered pedestrian deaths as the preventable […]