While the public is most afraid of marauding vehicles without a driver behind the wheel, the reality is that self-driving vehicles are overly cautious.
WASHINGTON — As auto accidents go, it wasn’t much: twelve minutes before noon on a cool June day, a Chevrolet Bolt was rear-ended as it crawled from a stoplight in downtown San Francisco.
What made this fender bender noteworthy was the Bolt’s driver: a computer.
In California, where companies like Cruise Automation and Waymo are ramping up testing of self-driving cars, human drivers keep running into them in low-speed fender benders. The run-ins highlight an emerging culture clash between humans who often treat traffic laws as guidelines and autonomous cars that refuse to roll through a stop sign or exceed the speed limit.
“They don’t drive like people. They drive like robots,” said Mike Ramsey, an analyst at Gartner who specializes in advanced automotive technologies. “They’re odd, and that’s why they get hit.”
Companies testing autonomous vehicles are closely watching how they interact with their human-driven counterparts.
What they’ve found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law — unlike humans.
Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy.
“If the cars drive in a way that’s really distinct from the way that every other motorist on the road is driving, there will be in the worst case accidents and in the best case frustration,” he said. “What that’s going to lead to is a lower likelihood that the public is going to accept the technology.”
Sensors embedded in autonomous cars allow them to “see” the world with far more precision than humans, but the cars struggle to translate visual cues on the road into predictions about what might happen next, Iagnemma said. They also struggle to handle new scenarios they haven’t encountered before.
California is the only state that specifically requires reports when an autonomous vehicle is involved in an accident.
The records show vehicles in autonomous mode have been rear-ended 13 times in the state since the beginning of 2016, out of 31 collisions involving self-driving cars in total.
The collisions also almost always occur at intersections rather than in free-flowing traffic. A Cruise autonomous vehicle was rear-ended last month, for example, while braking to avoid a vehicle drifting into its lane from the right as traffic advanced from a green light.
Waymo’s now-retired “Firefly” autonomous vehicle prototypes were rear-ended twice at the same intersection in Mountain View, Calif., in separate instances less than a month apart in 2016. In both cases, the Waymos were preparing to make a right-hand turn and had stopped to yield to oncoming traffic when they were hit from behind.
In another incident, a cyclist rear-ended a vehicle after it braked to avoid another car. And a truck racing to pass a slow-moving self-driving vehicle before a stop sign clipped it as it scooted back to the right.
The state’s crash reports don’t assign blame and provide only terse summaries of the incidents, but a few themes are common. They’re almost always low-speed fender benders with no injuries. The Bolt, for example, was traveling at less than 1 mile per hour when it was rear-ended. While they represent a minuscule share of crashes in the state, autonomous vehicles are also a very small share of the vehicles on the road.
“You put a car on the road which may be driving by the letter of the law, but compared to the surrounding road users, it’s acting very conservatively,” Iagnemma said. “This can lead to situations where the autonomous car is a bit of a fish out of water.”
Maybe the autonomous cars will have to be programmed to lighten up.
“Humans violate the rules in a safe and principled way, and the reality is that autonomous vehicles in the future may have to do the same thing if they don’t want to be the source of bottlenecks,” Iagnemma said.