This article was published on March 20, 2018

Uber’s self-driving car may not have been at fault for killing a pedestrian, but you should still be worried


Following the horrific news that a woman was killed by one of Uber’s self-driving cars in Tempe, Arizona, on Sunday night, local police said the accident would likely have been very difficult to avoid.

The San Francisco Chronicle reported that Tempe police chief Sylvia Moir said of the accident:

The driver said it was like a flash, the person walked out in front of them. His first alert to the collision was the sound of the collision.

The vehicle, which had a human safety driver behind the wheel, was reportedly traveling at 38mph in a 35mph zone when it struck 49-year-old Elaine Herzberg, who was pushing a bicycle and abruptly walked from a center median into a lane of traffic.

Moir added that, from viewing the videos obtained from the cameras in the self-driving Volvo SUV, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway.”

From this account, it seems that the autonomous vehicle system may not have been at fault. It’s also worth noting that this is the first time ever that a pedestrian has been killed in an incident involving a self-driving car, and that this technology is still in the testing phase.

That said, the incident raises a number of other concerns that aren’t easy to grapple with. The first is that this tragedy shows just how difficult it is to build a foolproof self-driving system that guarantees the safety of everyone involved.

By the end of 2017, Uber’s test vehicles had covered two million miles in roughly three years, logging about 84,000 miles a week on public roads. Notably, in all that time, this seems to be only the second accident an Uber autonomous car has caused – but we’re also setting a high bar for these vehicles’ ability to keep us safe.

Next, it stands to reason that even with impeccably trained self-driving systems, accidents can still happen. Humans make mistakes and are prone to impulsive behavior, and so we can’t rule out situations where people – whether they’re pedestrians or driving a vehicle manually – throw our machines a curve ball in the form of an unexpected maneuver on the road. Perfecting our systems can only reduce the chances of self-driving vehicles causing accidents; it can’t prevent freak accidents entirely.

We also have to think of the legal issues that come into play in case of an accident involving a self-driving vehicle, and how we’ll tackle them when they become more commonplace. In this instance, police chief Moir said Uber may not be at fault, but the backup driver in the vehicle could face charges. That’s going to be a hard case to fight, and it’ll test not just lawyers’ abilities to argue their sides, but also our faith in self-driving technology.

It remains to be seen how this affects Uber’s plans to launch a fleet of self-driving cabs next year, as well as what it spells for state regulations in the US that allow for testing these cars, and the autonomous vehicle industry at large.
