Of the 48 self-driving cars currently navigating the streets of California, four of them have crashed in the past nine months, casting doubt on whether autonomous vehicles are ready for widespread adoption.
California began allowing companies to test autonomous cars on public roads in September 2014. Since then, two accidents occurred while the cars were in self-driving mode and two while a human driver was steering. Each accident happened when the car was traveling under 10 miles per hour and resulted in minimal damage. Three of the cars belonged to Google and the fourth to Delphi Automotive, a source familiar with the accident reports told the Associated Press.
While the Department of Motor Vehicles confirmed that there have been four accidents, it declined to comment further, as collision reports are confidential in California. The lack of disclosure has worried some observers, who argue that the public should be better informed about emerging technology being tested on public roads.
Bryant Walker Smith, a law professor at the University of South Carolina who has written extensively about autonomous technology, says that interest in accidents involving self-driving cars will remain high, particularly if the car is at fault.
“For a lot of reasons, more might be expected of these test vehicles and of the companies that are deploying them and the drivers that are supervising them than we might expect of a 17-year-old driver in a 10-year-old car,” Smith told the Associated Press.
Likewise, Bill Gurley, a venture capitalist and early investor in Uber, told the Washington Post that people will expect higher safety standards from autonomous cars because human error is easier to forgive than technological malfunction, particularly if the accidents cause injury or death.
“I would argue that for a machine to be out there that weighs three tons that’s moving around at that speed, it would need to have at least four nines because the errors would be catastrophic,” Gurley said, referring to a 99.99 percent safety guarantee.
Teaching self-driving cars to avoid serious accidents, which have the potential to reverse public opinion and stall innovation, is a priority for many companies developing autonomous vehicles, says Raj Rajkumar, a pioneer of the technology at Carnegie Mellon University.
However, in this transitional stage, when self-driving cars must still be supervised by a human, the standards for who can sit in the driver's seat are high. Alex Davis, who wrote a piece for Wired about his experience test driving an autonomous Audi, had to go through an entire day of training before getting behind the wheel of the Audi A7.
“[I]f you need to grab the wheel, the odds are something’s gone terribly amiss,” Mr. Davis wrote. “A nicer way of saying this is it takes a lot of skill to be better than Audi’s robot.”
Regardless of where the public lands in the man-versus-machine debate that semi-autonomous cars tend to ignite, these four accidents complicate the already murky regulatory landscape surrounding self-driving vehicles and could slow the testing process.
Experts say courts would likely rule that there is nothing the National Highway Traffic Safety Administration can do to stop automakers from selling self-driving cars until they are proven to present an unreasonable safety risk. It remains to be seen whether these accidents will be viewed as such.