Google Claims Some Responsibility for Its Driverless Car Crashing
The self-driving car made a humanlike error, and it won’t be the last.
Google has accepted some of the heat after one of its self-driving Lexuses crunched into a California municipal bus on February 14: “In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision,” the company admitted in a statement released today.
Minus the A.I., it’s a fairly mundane fender bender: Edging around sandbags in an intersection, the Lexus, moving at 2 mph, hit a bus traveling at 15 mph. The Google employee in the front seat of the autonomous car didn’t grab the steering wheel, believing the bus would brake. The Lexus busted a fender and a sensor. No one was hurt. But don’t expect that to last.
Driverless cars have a complicated track record: On a per-mile basis, one October 2015 study suggests they’re more likely to be involved in an accident, though the difference isn’t statistically significant. But, until now, no driverless car has been found at fault; some experts hypothesized the elevated rates could be because human drivers don’t expect A.I. cars to adhere so rigidly to the rules of the road.
Google chalked the accident up to the grey areas of normal driving. The company sent this segment from its February 2016 monthly report (which comes out Tuesday) to the Associated Press:
This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.
Prior to this incident, Google had maintained that any accidents were the other guy’s fault. But the sample is small: as of January 2016, its 55 autonomous cars had driven about 1.4 million miles on public streets since 2009. That sounds like a lot of miles, but it’s peanuts in comparison with the roughly three trillion miles Americans log annually.
The National Highway Traffic Safety Administration reports fatality rates per hundred million miles driven. Should Google continue at its rate of 15,000 autonomous miles a week, it would take about 128 years to reach 100 million miles, which was, in 2013, roughly the distance at which you’d expect one fatality, on average.
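The timeline math checks out; here it is as a quick Python sanity check (the weekly-mileage figure and the 100-million-mile threshold are the article’s, the script is just the arithmetic):

```python
# Back-of-the-envelope check of the 128-year figure.
MILES_PER_WEEK = 15_000       # Google's reported weekly autonomous mileage
TARGET_MILES = 100_000_000    # ~one expected fatality per 100M miles (2013 rate)

weeks = TARGET_MILES / MILES_PER_WEEK   # ~6,667 weeks
years = weeks / 52                      # ~128 years
print(f"{weeks:,.0f} weeks, or about {years:.0f} years")
```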
But let’s say, in 2020, there are 10 million driverless cars on the road, and each car drives 10,000 miles a year. That’s 100 billion autonomous miles annually. If, charitably, driverless cars are twice as safe as human drivers, at one death every 200 million miles, they’d be involved in (very roughly) 500 fatal accidents a year.
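The same projection as a short sketch, under the article’s own hypotheticals (fleet size, per-car mileage, and the doubled safety rate are all assumptions, not data):

```python
# Hypothetical 2020 projection using the article's assumed figures.
CARS = 10_000_000                  # assumed driverless fleet size
MILES_PER_CAR_PER_YEAR = 10_000    # assumed annual mileage per car
MILES_PER_DEATH = 200_000_000      # "twice as safe": one death per 200M miles

total_miles = CARS * MILES_PER_CAR_PER_YEAR   # 100 billion miles a year
fatal_accidents = total_miles / MILES_PER_DEATH
print(f"{total_miles:,} miles -> roughly {fatal_accidents:,.0f} fatal accidents a year")
```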
Google is treating the accident as a lesson: “From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.”
We should take this as a lesson, too: Self-driving cars may very well be much safer than human drivers. They will still kill someone because machines, like humans, are fallible.