Google’s Robot Car Can’t Explore New Roads, and That’s a Big Problem

Photo: Ariel Zambelich/WIRED

The roughly three-mile ride we took around Mountain View in Google’s self-driving car was so impressive it was kind of boring. With the exception of a few jerky maneuvers, the car drove just like a human. So really, it was kind of like being driven by a teenager who got his license a few weeks ago.

The modified Lexus RX450h is covered in laser, camera, and radar systems, but those aren’t the only things that make it more advanced than any earlier autonomous car technology.

The key advantage is that the car isn’t just seeing and figuring out the world as it drives along. It’s basing its actions on vast amounts of data the Google Self-Driving Car Project has already compiled about every road it travels. Before the car drives itself into new territory, the project team collects detailed information on permanent features: lane markers, the precise location of the curbs, the height of traffic lights, local speed limits, and so forth.
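The idea of a prior map of permanent features can be sketched as a simple lookup structure. This is a hypothetical illustration of the concept, not Google’s actual data format; all the names and fields here are invented:

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    """Permanent features surveyed before the car may drive a stretch of road."""
    lane_marker_offsets_m: list    # lateral positions of lane lines
    curb_positions_m: tuple        # (left, right) distance to the curbs
    traffic_light_heights_m: list  # height of each signal head
    speed_limit_mph: int

class PriorMap:
    """Lookup table of pre-surveyed road segments."""
    def __init__(self):
        self._segments = {}

    def add(self, segment_id, segment):
        self._segments[segment_id] = segment

    def can_drive(self, segment_id):
        # The car refuses to enter territory it has no map for.
        return segment_id in self._segments

m = PriorMap()
m.add("castro_st_01", RoadSegment([-1.8, 1.8], (-3.5, 3.5), [5.2], 25))
print(m.can_drive("castro_st_01"))           # True
print(m.can_drive("uncharted_cul_de_sac"))   # False
```

The point of the sketch is the last line: without a surveyed entry, the system simply declines, which is exactly the constraint Chatham describes below.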

“We require digital maps in order for our cars to be able to drive,” Andrew Chatham, who leads mapping on the project, said at a press event Tuesday. That data “makes the job of building the self-driving car software much simpler.”

The car has a good idea of what to expect from any stretch of road, freeing up the software to deal with cars, pedestrians, cyclists, construction, and any other new obstacles in real time.

That’s the “magic of maps,” Software Lead Dmitri Dolgov said. But that “magic” inherently limits the range of the self-driving car to areas Google has the data for. As Chatham pointed out, “If we have not already built our own maps in an area, the car cannot drive there.” He noted that as the car’s sensors improve, the cars will rely less on perfectly accurate maps, but Chris Urmson, the project director, emphasized the key role these maps play.

“It’s also a really important part of the safety story,” he said. If the car is constrained to areas its system is familiar with, “you can be confident that it’s gonna behave well in those situations.”

Battery-range anxiety is a major hurdle to the adoption of electric cars. Is this the dawn of map-range anxiety?

Chatham described gathering the required information as “an awful lot of work,” and the fact that Google Maps has already recorded much of the world in remarkable detail doesn’t help here: that data isn’t good enough. All those roads have to be manually driven again, several times. For the self-driving car to conquer the world, the world must first be mapped in far greater detail than Google has already done.

The self-driving SUVs are equipped with the hardware to do the necessary mapping. Mounted on the roof, the car’s laser rotates 10 times per second, its 64 vertically stacked beams constantly measuring the distance to surrounding objects, including lane lines (they register because they’re reflective). It provides a 3D, 360-degree view of what’s around the car. The camera and radar systems bring in extra information, all of which is stitched together by team members into a map they can use to let the car tackle the road solo.
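The geometry behind that 3D view can be sketched in a few lines: each laser return is a range at a known rotation angle and beam elevation, which converts to a Cartesian point. This is an illustrative sketch, not Google’s software; the elevation spread roughly approximates a Velodyne-style 64-beam unit and is an assumption:

```python
import math

def beam_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (spherical coordinates) to Cartesian x, y, z."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# 64 beams spread from about -24.8 to +2.0 degrees (assumed, Velodyne-like).
ELEVATIONS = [-24.8 + i * (26.8 / 63) for i in range(64)]

# One full rotation, coarsely sampled for illustration: every 10 degrees,
# all 64 beams fire, each contributing one point to the cloud.
cloud = []
for azimuth in range(0, 360, 10):
    for el in ELEVATIONS:
        cloud.append(beam_to_point(30.0, azimuth, el))

print(len(cloud))  # 36 azimuth steps x 64 beams = 2304 points
```

A real unit samples azimuth far more finely and returns a measured range per beam rather than a constant, so a single rotation yields on the order of a hundred thousand points; ten such sweeps arrive every second.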

Asked if this challenge — doing this for everywhere someone would want to go in their Google ride — is a major issue for the project, Dolgov said he wasn’t worried. The cars can be driven by humans, after all, and he said he could imagine a scenario where owners help out with data collection.

Each self-driving car has the laser, camera, and radar systems necessary for mapping. It’s possible that an owner whose cul-de-sac Google hasn’t mapped could manually drive the car through it a few times. Then the system would have the data it needs, and he could relax all the way home.