SAN JOSE, Calif. – When the Florida Highway Patrol pulled him over this month for driving too fast, Brooks Weisblat didn't bother telling the officer that his Tesla Model S had been driving itself.

"That would have definitely got me a ticket," said Weisblat, who got a warning instead.

Florida doesn't have a driver's handbook dictating robot rules of the road. No state does, but California could become the global model next year when it publishes first-in-the-world consumer rules for self-driving cars.

Those regulations are already a year behind schedule. Among the problems vexing officials with the Department of Motor Vehicles is how to handle not just the vehicles but their overtrusting owners.

"The technology is ready. I'm not sure the people are ready," said Weisblat. Neither he nor his Model S, running its new Autopilot feature, noticed the sign warning that the freeway speed limit had dropped by 10 miles per hour as the car approached Miami. "You still need to pay attention."

Google has for years been testing vehicles near its Mountain View headquarters that are meant to be fully autonomous, requiring no human intervention except a rider's voice saying "Take me to the supermarket." But most carmakers developing self-driving technology are working on tools that relieve but don't entirely replace human drivers.

That leaves some in the industry worried about "The Handoff," that moment — perhaps just a split-second — when humans must regain control to avert disaster. Today, drivers of cars equipped with semiautonomous tools such as automatic braking, adaptive cruise control and sensors that help keep the car in its lane are supposed to be monitoring and supervising whatever the car can do on its own. But as the cars get smarter and more capable of navigating themselves, the humans in the driver's seat will increasingly grow comfortable checking text messages, scanning a newspaper and opening up the makeup kit.

"Humans are really bad at evaluating low-probability events," said Steven Waslander, an engineering professor at the University of Waterloo in Canada. "You've been driving the same way to and from work every month and then there's one moment when suddenly you have to be paying attention."

This is not just a theoretical problem anymore, now that Tesla Motors is allowing drivers to switch on its "Autopilot Mode," which includes adaptive cruise control and lets the car change lanes by itself after the human driver turns on a signal.

That mode falls just short of crossing the line of California's rules, which authorize corporate road-testing but not the consumer use of truly self-driving cars. As rumors emerged earlier this fall that Tesla would be introducing its semiautonomous features in tens of thousands of vehicles already on the road, the DMV sent a letter to the Palo Alto-based company clarifying that the cars wouldn't actually be driving by themselves. State officials met with Tesla just a day before the company reprogrammed its electric cars to allow the new tools.

California categorizes Tesla's new tools as Level 2 technology, which means it is "just helping drivers make better decisions," said DMV spokeswoman Jessica Gonzalez. It is not considered a Level 3 or Level 4 car, which would mean the "car is making the decisions," she said.

The public use of such cars that let humans cede control to machines is still banned in California, though 10 companies — including Google, Tesla and some major auto firms — are permitted to test them with trained safety drivers.

To discourage drivers from relying too heavily on its imperfect technology, Tesla's Autopilot is supposed to beep after about 10 seconds of hands-free driving to nudge drivers to grab the wheel again, and after being ignored it can sound louder warnings and turn the radio off.

"We were very comfortable with what they're doing," Gonzalez said. "We didn't tell them to do the 10-second thing, but that's why they did the 10-second thing. They also are saying the driver needs to be in control. They're not ready to say, 'Hey, let's sit back and relax.' "

Not all Tesla drivers are heeding that advice. Videos posted to YouTube since the new tools were introduced on Oct. 15 show drivers around the country letting the cars take control — with mixed results.

"It's definitely a big update to wake up and have your car be able to drive by itself," Weisblat said. "No one's making you read the manual to know how it works."

As Google's more than 70 self-driving vehicles — considered to have more advanced sensing and navigation technology than Tesla's — continue to cautiously roll around Mountain View streets, Tesla is pushing out new tools that could make its entire customer base an experiment in semiautonomous technology.

That puts pressure on California to finish its long-awaited draft rules, which were due in January, as other states and foreign governments look to the Golden State for guidance and hope to secure their own piece of the red-hot automotive innovation economy.

"California wants to take its time and do this properly," said Susan Shaheen, a professor with the University of California, Berkeley's Transportation Sustainability Research Center. "They're taking this relatively slowly compared to what Google wants to see."