How Driverless Cars May Interact With People

SAN FRANCISCO — There are plenty of unanswered questions about how self-driving cars would function in the real world, like understanding local driving customs and handing controls back to a human in an emergency.

Now a start-up called Drive.ai, based in Mountain View, Calif., is trying to address how an autonomous car would communicate with other drivers and pedestrians. The company is emphasizing what is known in the artificial intelligence field as “human-machine interaction” as a key to navigating confusing road situations.

How does a robot, for example, tell everyone what it plans to do at an intersection, where human drivers and people in crosswalks go through an informal ballet to decide who will go first and who will yield?

“Most people’s first interaction with self-driving cars will not be as a rider, but more likely as a pedestrian crossing the street,” said Carol Reiley, the co-founder and president of Drive.ai. “I think it is so important for everyone to trust this type of technology.”

The start-up gained some attention earlier this year when it received a license from the State of California to test driverless cars on the road. But Tuesday was the first time its executives outlined, at least in broad terms, what they planned to do. They would not discuss the company’s investors.

The Drive.ai cars won’t speak with pedestrians and bicyclists. But they will try to communicate with visual displays that go beyond today’s turn signals, perhaps with bannerlike text and easily identifiable sounds, company officials said.

The company, populated by graduate students and researchers from the Stanford Artificial Intelligence Laboratory, is entering a crowded field in the race to self-driving vehicles. There are about 20 self-driving car projects in Silicon Valley and more than four dozen around the country.

Unlike many of the efforts, however, Drive.ai will not attempt to build cars. Instead, it plans to retrofit commercial fleets for tasks like parcel delivery and taxi services.

The company is leaning on a technology called deep learning, a machine-learning technique that has gained wide popularity among Silicon Valley firms. It is used for a variety of tasks, like understanding human speech and improving the ability to recognize objects in computer vision systems.

An Israeli firm, Mobileye, is the dominant supplier of vision technology to the automotive industry, but Silicon Valley companies like Nvidia are also starting to compete for that business.

The self-driving cars of the future will need to be transparent about what their intentions are, how they make decisions and what they see, said Ms. Reiley, who is a roboticist with a background in designing underwater robotics and medical systems. They will need to communicate clearly both with the world around them and with their passengers.

“There’s the left brain in which a lot of discussion has taken place, what algorithms and what sensors, the logical side,” she said. “A lot of the discussion around self-driving cars has no human component, which is really weird because this is the first time a robotic system is going out in the world and interacting with people.”

A version of this article appears in print on Page B3 of the New York edition with the headline: Start-Up Imagines Driverless Vehicles That Can Tell You Where They’re Going.