QuietBlue wrote:True, but it's puzzling (and concerning) that the bus was not able to take any evasive action to avoid being hit.

Do we wait until autonomous vehicles can not only drive perfectly among other autonomous vehicles, but can also perfectly avoid sleepy, drunk, sloppy, or even terroristic drivers? We will wait forever.

Instead, ticket (or jail) the human driver who made the mistake. And get human drivers off the road ASAP.

Yeah, I agree. Lots of people will be excited about self-driving cars, but there's going to be a very, very long tail on standard cars. Some people just want to get from point A to point B. Others want to do so but demand their own hermetic compartment. They're good candidates for self-driving. But there are a lot of folks who actually like driving, or who see it as an expression of their mobility, and won't be comfortable handing a corporation the power to decide where they can or can't go. There's going to be a "cold, dead hands" movement, for sure.

This is part of why I don't think self-driving cars are going to be as transformative as some others say. They won't really increase road capacity or reduce travel times (since they'll have to cope with standard cars). They will increase safety...to a point. The environmental benefit of producing fewer cars is undercut by the fact that people wealthy enough to own a self-driving car won't want to share it, and people who currently bike or take transit may switch to self-driving carsharing. And if people cared so much about getting back their commute time so they could read/work/whatever, they'd already be taking the bus.

The biggest impacts IMO are:
- Cheaper/faster replacement for taxis and local buses in sprawling, transit-poor cities like MSP
- Filling in last-mile gaps in cities with built-up metro/commuter rail networks
- Small quality of life improvement for the wealthy
- Simpler/reduced/centralized parking in urban environments

Do we wait until autonomous vehicles can not only drive perfectly among other autonomous vehicles, but can also perfectly avoid sleepy, drunk, sloppy, or even terroristic drivers? We will wait forever.

Instead, ticket (or jail) the human driver who made the mistake. And get human drivers off the road ASAP.

I don't think expecting an autonomous vehicle to be able to do something as simple as backing up is asking all that much. Just because the human driver was at fault does not absolve the autonomous vehicle (or rather, its engineers/programmers) of the responsibility of protecting its occupants.

Edit: Alternately, a human bus driver in this situation could have sounded their horn to alert the truck driver. Do these vehicles not have the capability to do that?

Do we wait until autonomous vehicles can not only drive perfectly among other autonomous vehicles, but can also perfectly avoid sleepy, drunk, sloppy, or even terroristic drivers?

Yes, we do! These vehicles run algorithms trained via deep learning over millions of simulation cycles. They are not intentionally programmed to react to specific situations. That is both good and bad. Good in that simulation can often surface weird edge cases and the computer can learn how to respond properly. Bad in that there is no guarantee it will find all such situations (indeed, it is guaranteed not to), including some that people can fairly easily anticipate.

These things need TONS of testing before they hit the road because algorithms can go off in very strange directions when presented with unexpected input. I don't think it's technical engineering that's holding autonomous vehicles back, it's our (proper) lack of trust in how these computers are programming themselves.
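To make the point concrete, here's a toy sketch (purely illustrative, not any vendor's actual stack): a "policy" trained only on scenarios seen in simulation can respond only with actions it has seen, so a genuinely novel situation gets mapped to whatever training example happens to be nearest, which may be a poor response. The scenarios and action names here are all invented for the example.

```python
# Toy 1-nearest-neighbor "policy": pick the action of the closest
# training scenario. A real AV stack is vastly more complex; the
# failure mode (no good response to unseen input) is the same in spirit.

# Each scenario: (obstacle_distance_m, obstacle_closing_speed_mps);
# closing_speed > 0 means the obstacle is approaching.
TRAINING = [
    ((30.0, 0.0), "proceed"),     # far, stationary obstacle
    ((10.0, 0.0), "slow"),        # near, stationary obstacle
    ((15.0, 5.0), "brake"),       # obstacle approaching at speed
    ((5.0, 8.0), "brake_hard"),   # fast closing, very near
]

def policy(scenario):
    """Return the action of the nearest training scenario."""
    dist2 = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    _, action = min(TRAINING, key=lambda ex: dist2(ex[0], scenario))
    return action

# In-distribution input: behaves sensibly.
print(policy((28.0, 0.0)))  # -> proceed

# Out-of-distribution input: a truck slowly backing toward a stopped
# shuttle. Nothing in training covered "back up" or "honk", so the
# policy can only choose among actions it has seen -- none of which
# actually avoid the collision.
print(policy((2.0, 1.0)))   # -> brake_hard
```

The second call illustrates the bus incident in miniature: the system doesn't do something wrong so much as it lacks any trained response that fits, and no amount of tuning the existing actions fixes that; the gap has to be found (in simulation or on the road) and filled.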

QuietBlue wrote:I don't think expecting an autonomous vehicle to be able to do something as simple as backing up is asking all that much. Just because the human driver was at fault does not absolve the autonomous vehicle (or rather, its engineers/programmers) of the responsibility of protecting its occupants.

Edit: Alternately, a human bus driver in this situation could have sounded their horn to alert the truck driver. Do these vehicles not have the capability to do that?

Are you blaming the victim or the perpetrator?

Should pedestrians wear high-visibility vests and blinking lights at night?

Should pedestrians take the blame if they get hit in a crosswalk because they’re looking at their phone?

Google/Waymo, Uber, etc., are collecting this data, too. Tesla’s system is constantly learning by comparing what it would do if Autopilot were on, with what the driver does with Autopilot off.

I’m not saying it’s ready today, for all situations; quite the opposite. But it does seem ripe for testing in limited situations, in ideal conditions, at low speeds, such as low-speed shuttles running defined loops. People can choose whether or not to participate.

If anything, these news items are showing us just how bad human drivers are. They even crash into low-speed shuttle buses.

Oh yes, it's ready for limited road testing. I don't believe it will be ready for mass consumption for some time. I have no idea what that timeline is but I'm pretty sure it's at least 5 years out, maybe 10.

And you're right, it's probably some of both. But I suspect it's heavily biased to deep learning. It's all the rage, as the kids say.

Should pedestrians wear high-visibility vests and blinking lights at night?

Should pedestrians take the blame if they get hit in a crosswalk because they’re looking at their phone?

These kinds of questions put the blame on the victim.

Well, the victims, in this case, would be the passengers on the bus, though there were no injuries. So no, I'm not blaming them. Why would I?

This is new territory: our existing laws and concepts of fault in motor vehicle accidents don't take this kind of situation into account yet, but they will have to very soon. Here, though, I'd argue that the vehicle itself should be considered defective. So I'd blame the human driver for what they did, but also the manufacturer for designing an unsafe product.

Yes, it would be impossible to design a vehicle that could always avoid collisions, but something like this seems very avoidable.

We add safety features to cars all the time. It doesn’t mean that humans can’t drive until all the cars have these safety features. They’re a bonus, not a requirement for all current vehicles.

So, sure, add safety features to self-driving cars. It doesn’t mean they’re not ready for testing in real situations yet. (In fact, features like backing up and honking the horn are probably software features that can be added quickly via updates.)

That depends on whether you see them as something extra, or something that should be part of the minimum standards. To me, it's the latter. Especially since they don't seem like a difficult thing to fix.

I'm not against self-driving vehicles, or even limited road testing. I'm just puzzled by the lack of foresight on the manufacturer's part here. Hopefully they can correct it soon.

The former is clearly too restrictive--millions of people get on airplanes every day, despite the fact that they're only mostly, not completely, safe. The latter is clearly too permissive--we treat engineering and human errors differently, and when an autonomous car drives someone's kid off a cliff, nobody's going to be all that comforted by the idea that, statistically, it was very unlikely.

It's a hard, open, and somewhat philosophical question where we set the bar. I can guarantee Silicon Valley is going to go to Congress and push hard for the latter standard, and taxi drivers are going to take to the streets in support of the former.

Another wrinkle to that issue is that, while neither human drivers nor self-driving cars are completely safe, our road system is full of engineered mitigations of specific ways that human drivers are unsafe (curb reaction space, rumble strips, bike lane buffers, etc). However, self-driving vehicles, even if they are overall as safe as human drivers, are, as we're seeing, unsafe in totally different ways. It's going to take time and experience to even figure out what the shortcomings of self-driving cars are, and even longer to retrofit hundreds of thousands of lane miles with whatever mitigations traffic engineers come up with. Until then, self-driving cars are going to keep getting into collisions that seem trivially avoidable to human eyes, and each one will be a significant setback in public acceptance of autonomous vehicles, no matter how statistically safe they are.

Silophant wrote:Another wrinkle to that issue is that, while neither human drivers nor self-driving cars are completely safe, our road system is full of engineered mitigations of specific ways that human drivers are unsafe (curb reaction space, rumble strips, bike lane buffers, etc). However, self-driving vehicles, even if they are overall as safe as human drivers, are, as we're seeing, unsafe in totally different ways. It's going to take time and experience to even figure out what the shortcomings of self-driving cars are, and even longer to retrofit hundreds of thousands of lane miles with whatever mitigations traffic engineers come up with. Until then, self-driving cars are going to keep getting into collisions that seem trivially avoidable to human eyes, and each one will be a significant setback in public acceptance of autonomous vehicles, no matter how statistically safe they are.

As we see the occasional failings of self-driving cars, and as the best cities are closing off their densest business districts to any & all cars, I think self-driving cars will work best initially on divided highways and the newest suburbs’ major roads, where roads are straight, lanes are wide, striping is reflective, bright, & consistently applied, lanes are dedicated (straight-only or left-turn-only, e.g.), and traffic lights govern traffic flow.

Those seem to be the easiest problems to solve, the low-hanging fruit. Over time, as machine learning gets better (more data), they may also work in inner ring suburbs, residential areas of cities, and older cities.

But I think gas-powered cars will be out sooner than most people think.