Police Report Untangles Uber Crash Mystery

MADISON, Wis. — EE Times has obtained from the Tempe (Ariz.) Police Department the report of a traffic collision involving an Uber vehicle on March 24.

The full report tells a story different from earlier, sketchy accounts of the crash. It also reveals details that might prompt more questions about Uber’s actions — or inaction — at the intersection where the accident occurred.

EE Times got the help of Mike Demler, senior analyst at The Linley Group, to decipher the police report.

Before getting into detail, here’s a recap of what we reported earlier:

Until the police information became public, the prevailing press narrative, echoed by this publication, was that the crash resulted when the driver of a second vehicle “failed to yield” to the Uber car while making a left turn.

Observers were quick to conclude that this was an accident caused by a reckless “human” driver. Uber — which was in self-driving mode — was not at fault.

Intersection of South McClintock Dr. and East Don Carlos Ave. where the Uber accident took place on March 24, 2017

When we put together the statements by the drivers and an eyewitness included in the police report, a different picture begins to emerge.

First, the driver who hit Uber’s Volvo was in the intersection waiting to turn left and was therefore moving slowly. She wasn’t exactly making a sudden, reckless move.

Second, Uber’s Volvo, in self-driving mode, was moving at 38 mph in a 40 mph zone and failed to detect the left-turning vehicle. Further, although the Uber’s driver remembers the traffic light changing to yellow when his car entered the intersection, the Uber Volvo didn’t react, neither hurrying nor hesitating.

Fortunately, nobody got hurt. After a dormant weekend, Uber announced that it had resumed its driverless pilot project, picking up passengers in both Tempe and Pittsburgh.

Uber, originally quick to announce its decision to “ground” its driverless vehicles after the crash, has since fallen silent. The company has ceased all comment about the accident. An Uber spokesperson told us via email, “What I've shared to date is all we are sharing on our findings.” She referred to Tempe Police Department for any more information on the incident.

So, to understand what really happened, and what — if anything — we can learn from this crash, we turned to the police report, provided to us by Detective Lily Duran, Media Relations Unit, at the Tempe Police Department.

One thing not mentioned is something I encounter every day. Many cities have rules against blocking intersections. In crawling traffic, most drivers will stop or slow before entering an intersection even on green, just in case the line of cars on the other side stalls before the light turns. Stacking onto a stalled lane is not only discourteous; in some places you may get a ticket if the police happen by. It would be interesting to see whether the Uber software exhibits this courtesy, or is aware of intersection non-blocking rules where they are actual law. The speed at which it entered the intersection despite the stopped left lanes suggests not.

Yes, the algorithm needs to be tweaked. I'm surprised that the "proceed with caution" type of programming was not already in place. You don't maintain the same speed when approaching a busy intersection with lots of stopped traffic - that is insanity.

You have to anticipate that in stopped or nearly stopped traffic, unusual situations will suddenly appear in front of you: animals in the roadway, people, perhaps disabled vehicles, etc. You don't approach those situations at near posted speeds.
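The "proceed with caution" behavior the commenters describe can be sketched as a simple speed-capping heuristic. To be clear, this is a hypothetical illustration, not Uber's actual software; the function name, inputs, and thresholds are all invented for the example:

```python
def caution_speed_mph(posted_limit_mph, adjacent_lanes_stopped, view_occluded):
    """Cap approach speed at an intersection based on context.

    Illustrative heuristic only: the scaling factor and the 15 mph
    floor are assumptions, not values from any real driving stack.
    """
    speed = posted_limit_mph
    if adjacent_lanes_stopped:
        # Stopped traffic alongside can hide cross traffic and left-turners.
        speed = min(speed, 0.5 * posted_limit_mph)
    if view_occluded:
        # If the whole intersection isn't visible, crawl through slowly
        # enough to stop for whatever emerges from the blind spot.
        speed = min(speed, 15.0)
    return speed

# The Tempe scenario: 40 mph zone, left lanes stopped, view blocked.
print(caution_speed_mph(40.0, adjacent_lanes_stopped=True, view_occluded=True))
# prints 15.0, far below the 38 mph the Volvo reportedly held
```

The point of the sketch is only that both risk cues were available to the vehicle, and either one alone would have argued for slowing down.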

As for the comments stating that the Uber did not run the light: sometimes you can anticipate that a green has been on for an extended period. Experience tells you it's likely to change soon. Armed with that information, plus the fact that the intersection was full, the Uber clearly shares much of the fault for driving recklessly into the intersection given the circumstances.

Legally allowed vs. what is the right thing to do in this situation is really what we're talking about.

While I do expect them to figure it out, I don't expect anticipating human drivers' mistakes as well as humans do to be easy. Needless to say, automated cars will have to coexist with human drivers for many years.

For all of the commentary on "typical driver" or "responsible driver" or "conservative driver" we neglect to recognize the critical issue of the learning curve.

Today, when you drive around on public roads, you endure the risk that EVERY driver in EVERY vehicle on that road must progress through a learning curve to reach the level of being a "typical" or "responsible" driver. Having taught my four children to drive, and having endured riding with them frequently during the first year (and less frequently during subsequent years) of their driving, and having of course endured my own learning curve as I learned to drive, I can hardly resist raising this point.

Come on folks -- 'fess up. You KNOW how many bad decisions new drivers make on the road. You see it frequently. Even if they will grow and mature to become very capable drivers in time, when any of us first start we are VERY POOR at anticipating, at recognizing risk scenarios, at reacting appropriately to unfamiliar and sudden variances. The statistics bear out the disproportionate share of traffic accidents caused by inexperienced/new drivers.

So before we all jump up and down about how an autonomous vehicle might be at fault, or might have done better (even if not at fault), recognize that whatever the learning curve is, it need only be endured ONE time before it can be applied to that company's entire fleet of vehicles. One accident, and every autonomous "driver" from Uber (or whichever other autonomous car maker), now and for untold years into the future, learns from it??? Did that EVER happen when a teen driver got hit making a left turn at the wrong moment?

Now we are just getting the basics figured out, with small fleets of autonomous vehicles operating under experimental licenses. Still, as they progress up the learning curve, every future vehicle in the fleet will reflect the learning from these test experiences. As autonomous vehicles deploy more widely, we'll see thousands, then hundreds of thousands, then millions of daily experiences forming the base on which all other cars in that provider's fleet will learn.

The question isn't whether autonomous cars are perfect. They aren't. There will be accidents. But which is the better scenario: a renewable crop of millions of new drivers on the road each year, all making the same rookie errors, every year, forever; or a one-time crop of a few thousand test vehicles on the road, each making rookie errors once, that may well never be repeated again?

True enough, Junko. Looking at just this one specific case, had the Uber car gone slower through the intersection, e.g. because it sensed the line of traffic to its left and the consequent lack of visibility across the entire intersection, then its sensors would have had the time to see the oncoming Honda and prevent the collision.

So, the autonomous algorithm had all the information it needed to be safer. Had there been full visibility of the intersection, going through at the posted limit would have been fine.
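Bert's point can be made quantitative with basic stopping-distance math: a vehicle should travel no faster than the speed at which it can stop within the distance it can actually see. The numbers below are illustrative assumptions (a roughly 20 m clear sightline past the stopped queue, 1 s reaction latency, 6 m/s² braking), not figures from the crash report:

```python
import math

def max_safe_speed_mps(sight_distance_m, reaction_time_s=1.0, decel_mps2=6.0):
    """Highest speed at which the vehicle can stop within its sightline.

    Requires: reaction travel + braking distance <= sight distance, i.e.
    v*t + v^2 / (2a) = d. Solving the quadratic for v gives the formula
    below. All parameter values here are illustrative assumptions.
    """
    t, a, d = reaction_time_s, decel_mps2, sight_distance_m
    return a * (-t + math.sqrt(t * t + 2.0 * d / a))

MPH_PER_MPS = 2.23694
# Suppose the stopped left lanes limit the clear sightline to ~20 m.
v = max_safe_speed_mps(20.0)
print(round(v * MPH_PER_MPS, 1))  # roughly 24 mph, well under the 38 mph held
```

Under these assumed numbers, an occlusion-aware speed limit would have been well below the speed the Volvo maintained, which is the gap between "legal" and "safe" that the commenters are pointing at.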

It's going to involve a series of these tweaks, over multiple traffic scenarios, before the algorithms will behave in ways acceptable to most passengers.

But I think this situation was not the same as that one fatality, caused by clearly inadequate vertical resolution of the sensor.

My point being, the Autonomous driving algorithm can handle that maneuver if programmed to do so. Obviously, in this case it does not appear it had any alternative accident avoidance routine(s) enabled.

Thanks, Bert. I see your point. Let's not blow things out of proportion. That said, I think one of the things the engineering community is still grappling with is how we teach machines to negotiate with traffic.

You may call it just "tweaking algorithms." But figuring out the right algorithms to negotiate in dense traffic won't be easy, from what I've heard from experts.

From the article, "The Uber driver said in his statement that he saw the traffic light change to yellow as his car entered the intersection".

Taking this at face value, the Uber vehicle was not trying to run the yellow light.

I agree with the poster who said let's not blow this out of proportion. This is a situation where some drivers might have slowed down and proceeded cautiously through the intersection, while others would have just maintained the posted speed. The Uber vehicle did the latter. From my experience watching what happens at intersections when the light turns yellow (or even red, for that matter), so would most other drivers.

Driving 101: always slow down at a traffic intersection. Zooming through it at nearly 40 mph is an accident waiting to happen, and it did. The yellow phase is designed to let left-turning vehicles clear the intersection, not to license oncoming vehicles to run the light.

I have often been in exactly this situation, except replace the Honda with a pedestrian crossing in front, having no visibility of oncoming traffic in the curb lane.

Experience teaches you to slow down, even if you have a right turn arrow, because people or vehicles, unable to see you coming, will show up suddenly. Makes NO DIFFERENCE who is right and who is wrong. It will still spoil your whole day.

The algorithm simply needs to be tweaked. No sense getting overly dramatic about this.