
This article was published on September 12, 2017

US authorities conclude last year’s fatal Tesla crash was mostly human error


Image by: Tesla Motors

The US National Transportation Safety Board (NTSB) today determined that a fatal 2016 accident involving a semi-truck and a partially autonomous car was caused by a combination of factors. Among them was the board’s assessment that the car, a Tesla Model S, played a ‘major role’ in the accident by allowing its driver to keep autopilot engaged on a road it wasn’t designed for. There were larger contributing factors, however, according to the board.

It’s never easy to take a pragmatic approach to unraveling an event that cost a person their life. The driver of the Tesla was Joshua Brown, a 40-year-old technology lover who had served as a US Navy SEAL. Board members took time to express their condolences to Brown’s family before proceedings began today.

The first concern the board members expressed was that the car allowed Brown to operate in autopilot mode on a road the autonomous features weren’t yet approved for. According to the NTSB chairman, who spoke during the meeting:

This was the first known case of a highway fatality with a vehicle with this level of automated systems. As you will hear, the driverless features were not intended for all roads. The operational limitation played a major role in this accident.

The car warned Brown several times that he should take over. It detected his hands weren’t on the steering wheel seven times and gave auditory warnings (which only come after visual cues have been ignored for 15 seconds) six times during the 41 minutes he drove prior to the crash. Brown, according to the board, was over-reliant on the autopilot.
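To make that escalation pattern concrete, here’s a minimal sketch of how a tiered hands-on-wheel warning might be structured, assuming a simple polling loop and a torque-style sensor reading. The function, state layout, and return values are illustrative assumptions; only the 15-second visual-cue window comes from the NTSB’s description, and none of this is Tesla’s actual code.

# Illustrative sketch of a tiered driver-attention warning, not Tesla's code.
# Assumption: a polling loop calls monitor_hands() with a hands-on-wheel
# reading; a visual cue is shown first, and an auditory warning fires only
# after the cue has been ignored for 15 seconds (per the NTSB description).

VISUAL_CUE_TIMEOUT_S = 15  # seconds a visual cue may go unheeded before escalating

def monitor_hands(hands_on_wheel, now, state):
    if hands_on_wheel:
        state["cue_started"] = None       # driver complied; reset the cue
        return "ok"
    if state["cue_started"] is None:
        state["cue_started"] = now        # hands-off detected: show visual cue
        return "visual_cue"
    if now - state["cue_started"] >= VISUAL_CUE_TIMEOUT_S:
        return "auditory_warning"         # cue ignored for 15 s: sound the alarm
    return "visual_cue"

state = {"cue_started": None}
print(monitor_hands(False, 0.0, state))   # visual_cue
print(monitor_hands(False, 16.0, state))  # auditory_warning
print(monitor_hands(True, 17.0, state))   # ok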

Tesla’s Model S is not a self-driving car. It has Level 2 autonomy, which means it can control steering and acceleration but still requires an alert driver at the wheel. The car is designed to augment human drivers, not replace them. Brown would have had to dismiss an on-screen message acknowledging as much every time he drove the car.

The board also recognized that the driver of the truck was at fault. He was traveling westbound and failed to yield while making a left turn across the eastbound lanes of the highway. The Tesla, traveling at 119 km/h (74 mph), struck the truck and slid underneath it, killing Brown instantly.

The car didn’t detect the truck, the truck’s driver didn’t properly yield when crossing the highway intersection, and Brown didn’t have his hands on the wheel. The accident was a tragic combination of mistakes that cost one man his life.

Tesla, for its part, has addressed the role its software played in the accident. The truck wasn’t detected because its trailer, which was sideways across the lanes of the highway, was white and didn’t register on the car’s cameras. That has since been remedied with the implementation of a radar system. The software that allowed Brown to ignore warnings has also been updated: after several ignored warnings, the car will now slow to a stop, and autopilot won’t engage again until the car has been placed in park.
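As a rough illustration of that updated behavior, here’s a minimal sketch of a strike-and-lockout policy, assuming a three-warning threshold (the number of warnings is unspecified beyond ‘several’). The class and method names are hypothetical; this shows the described logic, not Tesla’s implementation.

# Illustrative sketch of the post-update lockout logic, not Tesla's code.
# Assumption: MAX_IGNORED_WARNINGS stands in for the unspecified "several".

MAX_IGNORED_WARNINGS = 3

class AutopilotLockout:
    def __init__(self):
        self.ignored_warnings = 0
        self.locked_out = False

    def record_ignored_warning(self):
        # Each ignored warning counts toward the lockout threshold.
        self.ignored_warnings += 1
        if self.ignored_warnings >= MAX_IGNORED_WARNINGS:
            self.locked_out = True
            return "slow_to_stop"   # car decelerates to a controlled stop
        return "warn_again"

    def can_engage(self):
        # Autopilot refuses to re-engage while locked out.
        return not self.locked_out

    def shifted_to_park(self):
        # Placing the car in park clears the lockout.
        self.ignored_warnings = 0
        self.locked_out = False

lockout = AutopilotLockout()
for _ in range(3):
    lockout.record_ignored_warning()
print(lockout.can_engage())   # False until the car is placed in park
lockout.shifted_to_park()
print(lockout.can_engage())   # True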

Brown shouldn’t have engaged the autopilot on that highway, and the truck driver should have obeyed traffic laws by waiting until the roadway was clear before driving across the lanes. Failure to use the technology properly caused the accident, and, sad as it is, it’s almost always the same story with car accidents: human error is to blame most of the time.

This is why even the NTSB, fully aware of all the factors that contributed to the accident, today continued to emphasize that, while they aren’t perfect, driverless cars are the future. That’s not in spite of Brown’s death, but because of it.
