The vehicle was terribly considerate to those approaching crosswalks, halting the moment it looked like they might cross the street.

Cruise Automation

Nothing will make you hate humans—capricious, volatile, unplanned, erratic humans—like sitting in the back of a self-driving car. When I hitched a ride in one, a white and orange General Motors Cruise autonomous vehicle, during a press event in San Francisco on Tuesday, every movement was a cause for alarm. Two pedestrians darted out in front of the car during my roughly 20-minute, 3-mile ride, blissfully ignorant that they were trusting their lives to a piece of software. Two cyclists made unexpected but sweeping turns. Human-operated vehicles whipped around corners and rolled through stop signs. Why couldn’t they be like this autonomous vehicle: extra cautious, considerate, aware?

But this chaos—this unpremeditated waltz of oops, no, you go and nope, buster, me first—is reality. It’s how cities work. Which means that if a car is going to drive itself, no human drivers involved, it must get very good at doing something very hard: interpreting and anticipating the behavior of humans.

In this regard, the electric, self-driving Chevrolet Bolt seems to be doing just OK. My trip was far from smooth, the vehicle so careful that it jolted, disconcertingly, to a stop at even the whisper of a collision. If the Silicon Valley motto is “move fast and break things,” Detroit’s seems to be “move below the speed limit and ensure you don’t kill anyone.”

My herky-jerky ride in an autonomous vehicle showed that Cruise Automation, acquired by General Motors in 2016, has made serious progress. No San Franciscans were hurt during the making of this article. But the biggest vehicle manufacturer in America has some big work to do before humans take the back seat—for good.

Let’s Take a Self-Drive

When Cruise opened the doors of its self-driving cars to journalists this week, it marked the first time non-investors and non-GM-ers were allowed inside. But Cruise launched its own ride-hailing app, Cruise Anywhere, in August, and employees have been hitching free self-driving rides around San Francisco ever since. So when it was time to begin my trip, an employee handed me an iPhone, and I used the app to hail a car. I chose one of three predetermined destinations—a basketball court in Mission Bay—and a Bolt nicknamed Pickle accepted my ride.

And then it cancelled. And then no other cars were around to get me. I am here to tell you: The future feels a lot like the present. Finally, another vehicle—named and labeled Chinchilla—pulled up outside, and the ride began.

The in-vehicle display, shown on iPads mounted in the back seat of the Cruise self-driving car, shows the route, car name, and destination.

Cruise Automation

Cruise’s driverless rides aren’t human-free, not yet. Today, two autonomous vehicle trainers sit in the front—one safety driver, with her feet resting on the brake pedal and her hands loosely around the wheel, and a helper in the passenger seat, who sits with laptop in lap, softly intoning directions and words of caution, sending messages to coworkers through Slack, and taking notes on the ride. (This is in contrast with Waymo, Google’s self-driving car unit, which has taken drivers out of its test vehicles in a Phoenix suburb and plans to launch a completely driverless taxi service there in a few months.)

So the whole thing felt very safe. I felt good about the well-being of the pedestrians and cyclists around me, too. The car gave a bike rider cycling next to the curb plenty of space—we inched behind him for minutes, refusing to deviate from our lane. And Chinchilla was terribly considerate to those approaching crosswalks, braking, hard, the moment it looked like a person might cross the street. Towards the end of the ride, the car began to make a left turn into a crosswalk, and a woman pushing a stroller on the sidewalk accelerated toward the street. Not the baby, I pleaded silently, before she turned to cross the perpendicular street instead. Our car, meanwhile, had jerked to a stop—in the middle of the intersection. Cruise employees later told me they’ve programmed their cars to anticipate the actions of pedestrians. But right now, they don’t always get it right.

For humans driving regular cars, these auto-matons must be a nuisance. They are slow—we stayed at about 15 to 20 miles per hour for most of our trip. They stop at the hint of danger, sometimes slamming on the brakes and throwing passengers forward in their seats. (I would not choose to ride in this self-driving car if I were, say, already suffering from a migraine.) And occasionally, they get confused and just kinda freeze. At one point, Chinchilla approached a public bus pulled over to the side of a one-way street. There was plenty of room to navigate around it. Chinchilla braked and considered its impending circumnavigation. And considered. And considered. About two minutes later, the safety driver finally flipped off the self-driving mode and piloted the car around the bus. No vehicles were waiting behind us, but, oh, if there had been—the honking! (Kyle Vogt, Cruise’s CEO, later told me the lidar sensors that usually determine how much clearance the vehicles have on their sides have been suffering from technical issues for the past few weeks, so the cars are even more cautious about going around obstacles than they normally are.)

Yes, these cars are more conservative than your uncle who forwards you those chain emails. Cruise says they’re programmed like that on purpose. “We will not launch until we have safety perfect,” General Motors President Dan Ammann said during Tuesday’s press event, referring to plans to put driverless cars on the road. (Vogt declined to answer questions about how the company will determine what is safe enough.)

As a result, the cars are more likely to get hit than to be the hitters. According to mandatory reports filed with the California Department of Motor Vehicles, Cruise cars have been involved in 21 incidents in 2017 alone. Overall, that’s pretty good: Cruise won’t say how many miles of testing it has under its drive belt, but 100 vehicles operate in San Francisco, and the company tests 24 hours a day. Still, 13 of those fender-benders happened because the self-driving cars got rear-ended. If a human driver is tailgating, or texting, or letting her mind wander while behind a safety-conscious autonomous car, she could miss a quick and cautious brake. Cruise officials told me drivers should act normally around these testing robots—just drive like you always should, they said. Pay attention. But humans are flawed. And impatient. When they’re around these cars, people might have to drive better than they usually do. Maybe that’s asking too much.

Or, more likely, maybe these cars represent something all drivers, humans or not, should aspire to—it may not be long until riding in these cars feels more like riding with an experienced adult instead of a responsible teen with a learner's permit. And the teen is doing OK. During my ride, the car navigated around a garbage truck, a roundabout, and a dicey, crowded left-hand turn with the finesse and patience of a well-rested cab driver.

“Autonomous driving is the most challenging engineering problem of the decade, if not the century,” Vogt told reporters. If my quick trip through a quiet section of San Francisco is any demonstration—well, yeah. And GM is about to make these exercises a lot harder for itself. Last month, the company announced it will start testing in downtown Manhattan. And a chorus of “I’m walking here!” rang through the streets.