There is little doubt that the technology behind driverless cars is nearly advanced enough for mainstream use.
Google made its biggest public display yet of its cars Tuesday, taking reporters on spins around Mountain View, Calif. Carmakers like BMW and Toyota are also preparing to sell cars that drive themselves.
Instead, the bigger question about driverless cars is a legal one. Who is responsible when something goes wrong?
Driverless cars are supposed to be much safer than cars driven by people because they do not make human errors.
But accidents seem inevitable. What happens when a driverless car kills someone? Or less drastically, who pays the ticket when it does not notice a no-parking sign, or when an error in Google Maps sends it the wrong way down a one-way street?
As robots become mainstream, lawmakers will have to grapple with how to govern machines and hold software accountable.
Only four states and the District of Columbia have passed laws specific to driverless cars; some merely allow manufacturers to test the cars, and none answers every legal question that might come up.
But lawyers, academics and the car’s designers say none of these issues are likely to prevent self-driving cars from hitting the road, because current liability laws already provide some guidance. A bigger obstacle than the law might turn out to be people’s own visceral fears of robots.
Here is what to expect. In cases of parking or traffic tickets, the owner of the car would most likely be held responsible for paying the ticket, even if the car and not the owner broke the law.
In the case of a crash that injures or kills someone, many parties would be likely to sue one another, but ultimately the car’s manufacturer, like Google or BMW, would probably be held responsible, at least for civil penalties.
Product-liability law, which holds manufacturers responsible for faulty products, tends to adapt well to new technologies, John Villasenor, a fellow at the Brookings Institution and a professor at the University of California, Los Angeles, wrote in a paper last month proposing guiding principles for driverless-car legislation.
A manufacturer’s responsibility for problems discovered after a product is sold — like a faulty software update for a self-driving car — is less clear, Villasenor wrote. But there is legal precedent, particularly with cars, as anyone following the recent spate of recalls knows.
Because the cars record video and other data about each drive, reconstructing accidents and assigning blame in lawsuits could become more clear-cut, said Sebastian Thrun, an inventor of driverless cars.
“I often joke that the big losers are going to be the trial lawyers,” he said.
Insurance companies would also benefit from this data, and might even reward customers for using driverless cars, Villasenor wrote.
Ryan Calo, who studies robotics law at the University of Washington School of Law, predicted a renaissance in no-fault car insurance, under which an insurer covers its customer's damages regardless of who is at fault.
Criminal penalties are a different story, for the simple reason that robots cannot be charged with a crime.
“Criminal law is going to be looking for a guilty mind, a particular mental state — should this person have known better?” Calo said. “If you’re not driving the car, it’s going to be difficult.”
The first deadly accident could be a bigger headache for the carmaker’s public-relations department than for its lawyers.
“It’s the one headline, ‘Machine Kills Child,’ rather than the 30,000 obituaries we have every year from humans killed on the roads,” said Bryant Walker Smith, a fellow at Stanford University’s Center for Automotive Research.
“It’s the fear of robots. There’s something scarier about a machine malfunctioning and taking away control from somebody. We saw that in the Toyota unintended acceleration cases, when people would describe their horror at feeling like they could lose control of their car.”
Robot cars scare people less than some other new technologies, though. Nearly half of Americans say they would ride in one, according to the Pew Research Center, making driverless cars far more popular than technologies like drones or implantable memory chips.