Roger Ratcliffe says ‘baa’ to the driverless car concept

I don’t know about you but the thought of driverless vehicles absolutely scares the multigrade lubricant out of me.

They didn’t get off to a very encouraging start in the US. Last month a self-driving bus hit a truck within two hours of its debut on the streets of Las Vegas. Clearly, the techie eggheads who invented them are gambling on us all loving the idea of being chauffeured about, but my money is on the whole concept being, well, a bit of a car crash.

I mean, how can onboard computers possibly be savvy enough to deal with every driving situation? Here’s an example. Anyone who regularly travels along the moorland road between Burley Woodhead and Ilkley in Yorkshire knows that the free-roaming sheep either have a death wish or just enjoy playing chicken with motorists, because some of them are liable to wait until the last possible moment before they cease rubbing their behinds on a drystone wall and lazily walk out in front of cars.

Drivers eventually realise that the kamikaze ones are those that exude the greatest air of indifference to traffic, and so it’s necessary to get ready to swerve or do an emergency stop. A car programmed in California won’t know this. Ah, say the techies, the cars will have “smart” computers. They will learn from real-life road situations and store them on their aptly named hard drives. I see, so they’ll be able to spot the mischievous glint in the eyes of Ilkley Moor sheep? Hmmm.

Actually, I’m not comforted by the thought of the computers learning as they go along. I wouldn’t want to be in a driverless car when it crosses the M62 between Yorkshire and Lancashire for the first time and sees that farm in the middle of the carriageways. Who hasn’t had a bit of a WTF wobble when coming across that?

The more I think about it, the more I’m terrified by the idea of cars having inbuilt artificial intelligence. I keep thinking of the all-seeing, all-knowing computer aboard the spaceship in Douglas Adams’s Hitchhiker’s Guide to the Galaxy, which sensed some imminent catastrophe ahead. Would driverless car computers do the same and, like the one in Hitchhiker, start to sing You’ll Never Walk Alone?

Driverless cars are supposed to have ears, too. This means the computer will hear when another vehicle toots its horn. But it won’t know whether the horn means a) the other driver is giving way to let you pull round an obstruction ahead, b) your right indicator bulb has gone, or your last manoeuvre almost caused a multiple pile-up so you’d better get ready for a bit of road rage, mate, or c) England have just beaten Germany at Wembley, the toots being accompanied by chants of “Eng-er-land, Eng-er-land”.

Of course, the computer that’s outside the reach of hackers has yet to be invented, so satnav-controlled cars would be a big temptation for the Kremlin’s coders. Sitting in their hacking farms east of the Urals, they have already accessed voting systems in the US. Think of the chaos they could cause by directing every vehicle to clog up the centre of Leeds or Manchester.

Some RTAs are, by their nature, unavoidable.
Consider this situation: a collision with a pedestrian is, for whatever reason, unavoidable. Option 1 is to swerve left and have a fatal collision with a child; option 2 is to swerve right and have a fatal collision with a senior citizen. How does an AI make a moral decision in these cases about who lives and who dies: the person with their whole life ahead of them, or the person who would likely sacrifice what life they have left to save the child? And ultimately, who carries the liability for that decision, the coder or the driver for not recovering control?
There are so many issues other than the technology to solve on this matter.

Matt

13 Dec 2017 15:09

Your examples are extremely weak arguments. It is clear the author has ZERO knowledge of self-driving tech and the programming algorithms associated with it. Never mind the fact that 1.2 million HUMANS die in car accidents each year, with 90+% of those occurring because a HUMAN made a wrong decision. Take a little time to do some research before you write about a topic you clearly aren't educated on.