Tesla is recalling nearly 54,000 vehicles in the US, as a recent Full Self-Driving (FSD) Beta update allows its cars to perform a “rolling stop.” This means the vehicle doesn’t come to a complete halt at a stop sign, instead moving slowly through it.
Unsurprisingly, the National Highway Traffic Safety Administration (NHTSA) flagged the feature as a safety risk, prompting a recall:
Failing to stop at a stop sign can increase the risk of a crash.
Thanks for that, NHTSA.
According to the NHTSA, Tesla introduced the rolling stop functionality last October through a “limited access” over-the-air update.
As part of this, Tesla owners could choose between different profiles for their car’s self-driving features: Chill, Average, and Assertive.
The Assertive mode was accompanied by a warning that the vehicle “may perform rolling stops.”
I guess “Road Rage Mode” didn’t fit on the screen
— David Zipper (@DavidZipper) January 9, 2022
The NHTSA’s report specifies the rolling stop feature “allows the vehicle to travel through all-way-stop intersections at up to 5.6 mph before coming to a complete stop” — if certain conditions are met first.
The most important are:
- No vehicles, pedestrians, or cyclists are detected near the intersection.
- The vehicle has sufficient visibility when approaching the intersection.
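Put plainly, the report describes an all-or-nothing gate: the rolling stop is only permitted when every condition holds at once. A purely illustrative sketch of that logic might look like this (the function and parameter names here are invented for illustration; this is not Tesla’s actual code):

```python
# Hypothetical sketch of the gating conditions described in the
# NHTSA report -- NOT Tesla's actual implementation.

ROLLING_STOP_MAX_MPH = 5.6  # speed cap cited in the NHTSA report

def may_roll_through_stop(speed_mph, detected_road_users, visibility_ok):
    """Return True only if every reported condition is met.

    detected_road_users: vehicles, pedestrians, or cyclists sensed
    near the intersection (assumed sensor output).
    visibility_ok: whether the vehicle has sufficient visibility
    approaching the intersection.
    """
    return (
        speed_mph <= ROLLING_STOP_MAX_MPH
        and not detected_road_users
        and visibility_ok
    )

# Clear intersection at 4 mph: the rolling stop would be permitted.
print(may_roll_through_stop(4.0, [], True))                  # True
# A detected pedestrian forces a complete stop.
print(may_roll_through_stop(4.0, ["pedestrian"], True))      # False
# So does exceeding the 5.6 mph cap.
print(may_roll_through_stop(6.0, [], True))                  # False
```

The point of the sketch is that failing any single check is supposed to force a complete stop, which is exactly why the NHTSA treats a misjudged condition as a crash risk.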
What does this mean for Tesla?
The EV maker has decided to voluntarily (not that there was any other option) recall the functionality from the affected 2016-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles.
This doesn’t mean that owners of these vehicles will have to take any physical action. Tesla will release another over-the-air update that will disable the rolling stop feature, starting in early February.
Musk’s snark and… contradiction
Naturally, Elon rushed to defend Tesla’s software and — why the hell not? — insult a journalist for “misreporting.”
cue the moron journalists completely misreporting this story
it did not run any stop signs due to this issue. it slowed down to 2 miles per hour rather than 0 miles per hour if there was no one there to maximize comfort
— Whole Mars Catalog (@WholeMarsBlog) February 1, 2022
He’s actually a lobbyist, not a journalist. There are many who pose as the latter while behaving like the former. No integrity.
Indeed, there were no safety issues. The car simply slowed to ~2 mph & continued forward if clear view with no cars or pedestrians.
— Elon Musk (@elonmusk) February 1, 2022
Indeed, Tesla notified the NHTSA that, as of January 27, 2022, it’s not aware of “any warranty claims, field reports, crashes, injuries or fatalities related to this condition.”
But Musk’s nonchalant justification of the software raises deeper concerns.
During Tesla’s Q4 2021 earnings call, the celebrity CEO predicted that FSD would be safer than human drivers by the end of 2022:
I would be shocked if we do not achieve Full Self-Driving safer than a human this year. I would be shocked.
But programming autonomous software to take the same kinds of shortcuts a human driver does undermines the whole “autonomous cars will be safer” argument.
Especially since we still don’t know whether autonomous systems can make the judgment calls a human driver makes when bending the rules.
And there’s another issue here: it looks like the feature was designed to break laws. Because, you guessed it, performing a rolling stop in the US is, well, illegal. And it’d get you a nice fat ticket.
So why would Tesla enable a feature that’s technically illegal?
Was it a programming oversight? Does the company believe that “stopping” actually means “slowing down?” Or does it simply think it has enough money to be above the law?
No matter the answer, there’s one thing I know for sure: we’ll never trust an automated vehicle that makes human mistakes — and we don’t need a bunch of rogue Teslas deliberately breaking road laws.