Sooner or later, consumers will be able to buy cars that rely on computers — not the owner — to do the driving.
With that in mind, the California Department of Motor Vehicles held an initial public hearing Tuesday as it puzzles through how to regulate the public's use of a technology that is still being tested.
Among the complex questions officials wanted to unravel:
How will the state know the cars are safe?
Does a driver even need to be behind the wheel?
Can manufacturers mine data from onboard computers to make product pitches based on where the car goes or set insurance rates on how it is driven?
Once the stuff of science fiction, driverless cars could be commercially available by decade's end. Before then, the DMV wants to decide how to integrate the cars — often called autonomous vehicles — onto public roads.
Three other states have passed driverless car laws, but those rules mostly focus on testing. California's Legislature passed a law in 2012 that mandated rules on testing and public operation, and the DMV expects within weeks to finalize regulations dictating what companies must do to test the technology on public roads.
Those rules came after Google Inc. had already driven its fleet of Priuses and Lexuses, fitted with an array of sensors including radar and lasers, hundreds of thousands of miles in California. Major automakers also have tested their own models.
Now, the DMV is scrambling to regulate the broader use of the cars. With the federal government apparently years away from developing regulations, California's rules could effectively become the national standard.
California's DMV must finalize the regulations by the end of the year.
Much of the initial discussion Tuesday focused on privacy concerns.
California's law requires autonomous vehicles to log records of operation so the data could be used to reconstruct an accident.
But the cars "must not become another way to track us in our daily lives," John M. Simpson of the nonprofit Consumer Watchdog said at the hearing. Simpson called out Google, saying the Internet giant rebuffed attempts to add privacy guarantees when it pushed the 2012 legislation.
Seated across from Simpson at the hearing's head table was a representative from Google, who offered no comment on the data privacy issue.
Discussion also touched on how to know a car is safe and whether an owner knows how to safely operate it. In initial iterations, human drivers would be expected to take control in an instant if the automated driving fails.
Ron Medford, Google's director of safety for its "self-driving car" project, suggested that manufacturers should be able to self-certify that their cars are safe. He cautioned that it would get complicated, fast, if the state tried to assume that role.
DMV attorney Brian Soublet asked who would ensure that owners know how to use the new technology. Should the onus be on dealers, manufacturers, owners?
Representatives of automakers suggested they shouldn't be asked to guarantee the capability of owners. One, from Mercedes-Benz, said the DMV could test owners on basics such as starting and stopping automated driving.
Automaker representatives also expressed concerns that other states could pass regulations substantially different from California's. That would create the kind of patchwork of rules that businesses hate.
DMV representatives at the hearing said other states had contacted them and were following California's rule-making process closely.
The DMV has said it aimed to strike a balance between public safety and private-sector innovation when the agency drafted its rules on testing driverless cars.
Google and carmakers have argued that development of the technology would be stifled if regulations are too onerous. They also said there is a strong incentive to sell only safe cars.
At the same time, the DMV was aware that it would be scrutinized about whether it did enough to protect the public when automated cars crash.