
This article was published on September 4, 2019

Another Tesla crashes due to misuse of Autopilot [Updated]


The US National Transportation Safety Board (NTSB) today revealed that faulty software and human error were to blame for a recent accident involving Tesla’s “Autopilot”.

According to the report, the Tesla’s driver was operating his Model S P85 with the vehicle’s driver-assistance feature turned on. Data obtained by the NTSB shows the driver’s hands hadn’t been on the steering wheel for approximately thirteen minutes before the vehicle collided with a parked fire truck. Per the NTSB:

The Tesla, which had its “Autopilot” system engaged, was traveling in the HOV lane, behind another vehicle. After the lead vehicle changed lanes to the right, the Tesla remained in the HOV lane, accelerated and struck the rear of the fire truck at a recorded speed of about 31 mph. A forward collision warning alert occurred 0.49 seconds prior to impact but the automatic emergency braking system did not engage. The driver’s hands were not detected on the steering wheel during this sequence nor did the driver apply steering or braking prior to impact.

Credit: NTSB

We’ve seen the same chain of events play out in many accidents involving the company’s “Autopilot”: the driver overestimates the capabilities of the vehicle’s software and, sometimes, ends up dead. In each previous case, Tesla has been quick to point out that it expressly warns drivers that they must keep their hands on the wheel and their eyes on the road at all times. The car even has an alarm that warns you if your hands aren’t on the wheel. Tesla’s “Autopilot,” after all, is not an autopilot system.

So why does the same refrain – driver takes hands off steering wheel, Autopilot fails to properly identify an obstruction, vehicle crashes – return in nearly every fatal Tesla crash? Perhaps these drivers believe their vehicles are fully self-driving because they’ve been sold a car with a piece of software called “Autopilot,” connected to a system named “Full Self-Driving.”

Credit: Tesla
Straight from Tesla’s website.

Maybe it’s also because Tesla founder and CEO Elon Musk went on national television and misused the Autopilot system in much the same way as many of the people who’ve died over-relying on it. If the billionaire technology genius thinks it’s safe enough to take his hands off the wheel, who the hell are the suits at the NTSB to tell drivers how they should use their Teslas? Elon invented Teslas!


Here’s what Musk had to say last year when one of his customers died in a Tesla, via The Verge:

When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency.

According to the Tesla Owner’s Manual, you should always keep your hands on the steering wheel — Musk’s excused because he has blood on his.

Edit 4 September 6:00 PM PST: A previous version of this article’s headline erroneously reported that the most recent Tesla crash was fatal. Per the NTSB, there were no injuries reported in the Model S accident involving a parked fire truck.

