This article was published on April 19, 2021

The latest Tesla crash is another tragic reminder that cars can’t drive themselves


Image by: Bram van Oost, Unsplash

Last week, we read the sad story of two men who died when the Tesla they were traveling in left the road at high speed and crashed into a tree. The car subsequently burst into flames, and it took firefighters hours to subdue the blaze.

It’s believed that Autopilot was engaged at the time, and local media reports that no one was in the driver’s seat. The men, one aged 59 and the other 69, were reportedly in the front and rear passenger seats of the vehicle.

Moments before setting off, the two men had been discussing the car’s Autopilot feature, according to Constable Mark Herman, who is investigating the incident.

This incident serves as yet another painful reminder that cars are not capable of driving themselves, and that drivers are being grossly misled by overzealous and irresponsible marketing.

Tesla’s Autopilot and Full Self-Driving systems are not capable of driving a vehicle without significant human assistance. Before the system can be engaged, the car warns drivers to pay full attention to the road at all times and to be ready to take control at any moment.

The company’s self-driving suite is a Level 2 system under the SAE classification, meaning it is designed to assist the driver with partially automated features such as speed control and lane keeping.

However, considering how Full Self-Driving and Autopilot are named and marketed, it’s easy to see why drivers would believe their car is capable of taking full control of driving duties, even though that’s not the case.

Spend some time on YouTube, the Tesla forums, and Tesla subreddits, and it becomes clear there is a significant contingent of fans who truly believe the vehicles are capable of driving themselves, and that it’s just regulations and The Man standing in Musk’s way.

What we’re actually witnessing is autonowashing, a term coined by human-machine interaction researcher Liza Dixon. It describes the gap between what drivers think their partially automated vehicle is capable of doing and what it’s actually capable of doing.

It’s far from a harmless misunderstanding of technology, and, as we’ve witnessed again this weekend, it can result in people losing their lives.

It’s worth educating ourselves more deeply on the state of supposedly autonomous cars, and the potential dangers that can result from the misunderstanding and misuse of advanced driver assistance systems.

Ed Niedermeyer, who is part of the PAVE (Partners for Automated Vehicle Education) campaign, has listed some useful resources that are essential reading for anyone interested in the self-driving car debate.

You should also read SHIFT’s interview with Liza Dixon here, and our deep dive into self-driving car technology and what happens when we don’t get it right.


Do EVs excite your electrons? Do ebikes get your wheels spinning? Do self-driving cars get you all charged up? 

Then you need the weekly SHIFT newsletter in your life. Click here to sign up.
