Posted by Soulskill on Monday November 21, 2011 @05:27PM
from the we-who-are-about-to-die-salute-you dept.

fergus07 writes "Toyota will show an autonomous Prius at the Tokyo Motor Show. Dubbed the Toyota AVOS (Automatic Vehicle Operation System), the car will be available for members of the public to take 'back seat' rides at the show, demonstrating first hand how the Prius can avoid obstacles, be summoned from a parking garage, and park itself."

I hate to agree with you, but I think it's true: no one will tolerate a self-driving car crash, even if it is just one. Even trains crash head-on from time to time, something we think should be impossible.
Being benevolent, let's assume one of those cars crashes and it's another driver's fault: not a clear-cut case, but his fault. What are the makers going to do, defend themselves with system logs?

Under current law, the person behind the wheel in the driver's seat is considered the operator, and is liable for whatever the vehicle does. The owner's liability (assuming they weren't driving) depends on their insurance, and the fact that the vehicle is autonomous is irrelevant. The developers, assuming they hadn't signed an unprecedented and absurdly ill-advised employment contract, have no personal liability. Toyota could only be found liable if it were proven that a defect in the vehicle caused the crash.

The simple fact is that before autonomous cars become commercially viable, a lot of laws have to change, mainly around manufacturer liability, since the manufacturer is taking on more responsibility. Most likely, though, the operator will retain the majority of the liability, and we're unlikely to see in our lifetimes a car where you can punch in a destination and take a nap. It'll be more like an advanced cruise control: the operator still has full ability to control the vehicle, is required to keep hands on the wheel and attention on the road at all times, and is responsible for intervening in an emergency.

I don't think this is as big a deal as people always fear. The person operating a machine normally takes responsibility for what it does under their direction. Nobody says, "that backhoe just dug a cellar," they say, "I dug a cellar" (even though 99.99% of the caloric expenditure was by the backhoe). Nobody says, "Excel just computed our monthly budget," they say, "I just worked out our monthly budget" (even if Excel did 99.99% of the calculations). Only when we're thinking into a future we don't yet understand does it seem like the machines will be making all these "intelligent" decisions. Once the machine is in hand and understood, we feel like we are making the decisions (even though the machine is actually making thousands every second, as with an airplane autopilot). Our perception of intelligence on the part of the machine disappears. Once we know what to expect from them we simply laugh at those who don't and assume they are idiots (pertinent example [snopes.com]). People even feel this way when working through human subordinates. "George Washington crossed the Delaware River." It doesn't mean he rowed the boat.

Yet no one seems to care. 500 US troops dying a year in the Middle East is a huge deal, yet here are 35,000 deaths that could easily be avoided, and that's only in the United States. Yeah, there'll be a few deaths, but probably 99% of the 35,000 will be avoided. Everyone should be forced to own one of these, considering how many pedestrians are run over. People will have to get over their own selfish urge to drive a car fast, though.
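The arithmetic behind that claim is trivial, but worth making explicit. A quick sketch using the commenter's own figures (35,000 US road deaths per year and a 99% avoidance rate are the comment's assumptions, not official statistics):

```python
# Rough arithmetic using the figures cited above (illustrative, not official data).
annual_us_road_deaths = 35_000
avoided_fraction = 0.99          # the commenter's estimate, not a measured rate

avoided = round(annual_us_road_deaths * avoided_fraction)
remaining = annual_us_road_deaths - avoided
print(avoided, remaining)        # 34650 350
```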

This is actually very easy to deal with. The driver is still liable. The insurers decide, based on the cars, the expected crash rate for autonomous vehicles. They don't really care about individual situations; they care about overall numbers. They can choose how much to charge for an automated driver and how much for a human driver, and pay out when things go wrong. It's really not a hard system. If autonomous vehicles are safer drivers, they will take over much faster thanks to significantly reduced insurance costs relative to human drivers.
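That pricing logic can be sketched in a few lines. Everything here is illustrative: the crash rates, claim cost, and loading factor are invented numbers, not actuarial data. The point is only that a premium tracks expected payout, so a lower crash rate translates directly into a lower price:

```python
# Hypothetical actuarial sketch: premium = expected yearly payout * loading.
# All numbers are made up for illustration; real pricing is far more involved.
def annual_premium(crash_rate_per_year: float,
                   avg_claim_cost: float,
                   loading: float = 1.25) -> float:
    """Expected yearly claim cost, scaled by an overhead/profit loading."""
    return crash_rate_per_year * avg_claim_cost * loading

human = annual_premium(crash_rate_per_year=0.05, avg_claim_cost=20_000)
auto = annual_premium(crash_rate_per_year=0.01, avg_claim_cost=20_000)
print(human, auto)  # 1250.0 250.0
```

The insurer never needs to assign fault in any individual autonomous crash to set the price; the aggregate crash rate is enough, which is the parent comment's point.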

Self-driving cars are the way of the future. Why drive when you don't have to? Once people get over the fear of trusting the software they will realize that their time is far too valuable to waste driving.

This seems like a pretty narrow concept of freedom. I'm kind of uncomfortable with self-driving cars myself; I have the control-freak instinct, and I currently drive a stick-shift mostly for that reason. But it really is pretty hard to argue against either the safety or the practicality of self-driving cars. I'm assuming that a self-driving car really is more like a taxi than a bus, in that if I decide halfway to my destination that I want a different destination, I can just make it so, and that will be that; and furthermore, if I want to take the scenic route down along the creek instead of the freeway, I can get that too.

So, I can still pick my time of departure, my route, and my destination, and change my mind in mid-drive, only my freedom to operate the vehicle has been removed. Yeah, it bugs me a bit, but I don't know if I'm ready to die for it.

And where's the line? In my city, it's hopelessly impractical (and maybe illegal) for me to ride a horse to and from work. Is that an unacceptable infringement on freedom of movement? Should I die for that one too?


There is a legal principle... I don't remember the Latin, but a rough translation is 'the law is not stupid.' Legal decisions are made by judges, not by bureaucrats or computers blindly following the rules. That's the essence of a common-law system: it rests on an understanding that reality is too complex to legislate completely, and judges have the authority to interpret how the law applies to reality as necessary. A literal interpretation is best when possible, but judges have leeway. Precedent then exists to ensure that the law, as actually applied, is consistent.

So, I suspect that if you try just sitting in the passenger seat and get into an accident, the judge will determine that:

1. You're still the operator.
2. You're an idiot.

There are only so many jobs to be had, and when two stupid people have nine children, they've just created seven people who are more likely than the first two to be unemployed. Get a damn condom.

Given the projected ratios of earners to SSA recipients in the next 50 years, those seven extras are going to be needed to keep the SSA from collapsing.

Do remember that the SSA wasn't designed to operate with fewer workers than recipients. And our lower-than-replacement-rate growth, combined with increased life expectancy (I just read that a baby born this year in the West (USA, Canada, Western Europe) has about a 50% chance of reaching 100), will make that whole Social Security thing a real problem by and by.