Forget Google’s self-driving cars; here’s something entirely different. Researchers at Oxford University have developed a new auto-drive technology, driven by Apple’s iPad, that allows drivers to hand control of the wheel to a robot system. Unlike Google’s self-driving cars, this technology combines the best of both worlds: drivers can control their vehicle themselves, but the iPad can optionally take over when the system determines it knows the route.

The team built the tablet into the dashboard of a Nissan Leaf, and the driver can activate the autonomous driving mode with a single tap. Of course, this technology is still in its infancy and far from commercialization. Currently, the prototype navigation system costs a whopping £5,000, or about $7,500, but researchers believe that over time it will work its way down to just £100, or approximately $150. I’ve included a bunch of interesting clips just past the break…

CleanTechnica quotes Professor Paul Newman of Oxford University’s Department of Engineering Science, who said that “it’s easy to imagine that this kind of technology could be in a car you could buy.”

He also opines in an article published on the University of Oxford’s website:

Instead of imagining some cars driving themselves all of the time we should imagine a time when all cars can drive themselves some of the time. The sort of very low cost, low footprint autonomy we are developing is what’s needed for everyday use.

Though the car itself only moves in 2D, it senses in 3D. It must learn what its environment looks like before it can take over from the driver.

The system relies on cameras and lasers built into the body of the car. 3D laser mapping enables the system to rapidly build up a detailed picture of its surroundings, rather than relying on third-party maps, which are often outdated.
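The researchers haven’t published their mapping code, but the basic idea of turning laser range readings into a picture of the surroundings can be sketched in a few lines. Here’s a minimal, hypothetical example (all names and parameters are my own, and a real system would build a full 3D map with probabilistic occupancy, not a toy 2D grid):

```python
import math

def scan_to_occupancy(ranges, angle_min, angle_step, cell_size=0.5, grid_dim=40):
    """Rasterize one laser scan into a 2D occupancy grid (sensor at grid centre).

    ranges: distances in metres returned by the laser, one per beam
    angle_min, angle_step: beam angles in radians
    Cells hit by a laser return are marked 1 (occupied); all others stay 0.
    """
    grid = [[0] * grid_dim for _ in range(grid_dim)]
    cx = cy = grid_dim // 2  # sensor sits in the middle of the grid
    for i, r in enumerate(ranges):
        theta = angle_min + i * angle_step
        # Convert the polar reading (range, bearing) to a grid cell
        x = cx + int(round(r * math.cos(theta) / cell_size))
        y = cy + int(round(r * math.sin(theta) / cell_size))
        if 0 <= x < grid_dim and 0 <= y < grid_dim:
            grid[y][x] = 1
    return grid

# Example: three beams, each hitting an obstacle 5 m away,
# spread across a 90-degree arc in front of the sensor
grid = scan_to_occupancy([5.0, 5.0, 5.0],
                         angle_min=-math.pi / 4,
                         angle_step=math.pi / 4)
```

Accumulating such scans as the car drives is what lets the system recognize a place later without consulting an external map.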


AI is heavily involved in interpreting data from the lasers and onboard cameras, drawing on probabilistic mathematics and machine learning, and retrieving additional data from aerial photos and on-the-fly web queries.
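The “mathematics of probability” here is essentially Bayesian: each new camera frame or laser scan nudges the system’s confidence that it recognizes the route. The sketch below is my own illustrative reduction of that idea to a single Bayes update, not the project’s actual algorithm:

```python
def update_route_belief(prior, match_likelihood, mismatch_likelihood):
    """One Bayesian update of the belief that the car is on a known route.

    prior: current probability that the route is recognized
    match_likelihood: P(this observation | on the known route)
    mismatch_likelihood: P(this observation | somewhere else)
    """
    numerator = match_likelihood * prior
    return numerator / (numerator + mismatch_likelihood * (1 - prior))

belief = 0.5  # start undecided about whether this place is known
# A run of sensor frames that match stored imagery well pushes belief up
for _ in range(4):
    belief = update_route_belief(belief,
                                 match_likelihood=0.9,
                                 mismatch_likelihood=0.2)
```

Once the belief is high enough, the system could offer the “I know this route, do you want me to drive?” prompt that Professor Newman describes below.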

Mind you, they are not using GPS at all because it’s not always available and does not offer the accuracy required for robots to make decisions about how and when to move safely.

“Even if it did,” the project’s web page explains, “it would say nothing about what is around the robot, and that has a massive impact on autonomous decision-making.”

“Because our cities don’t change very quickly, robotic vehicles will know and look out for familiar structures as they pass by, so that they can ask a human driver ‘I know this route, do you want me to drive?’ and the driver can choose to let the technology take over,” said Professor Newman.

The prototype system is currently being tested at Begbroke Science Park, near Oxford. There are plans to commercialize the technology, but that won’t happen until the system can be programmed to understand complex traffic flows and to decide the best routes to take.