Cars are evolving quickly — but it’s important that people know the difference between ADAS and autonomous driving. It’s also important that both categories of vehicle receive a comparable level of due diligence when it comes to ensuring the safety of car occupants and other road users.
And well, we have a problem. Let me explain what’s got me riled up.
This week we’ve seen a clash of the titans between autonomous vehicle industry experts, The New York Times, and Tesla drivers.
It started with a rallying cry against the use of misleading terms like ‘self-driving’ to describe car functionalities that sit firmly in Level 2 of vehicle automation.
A New York Times journalist wrote an article about trying out a Cadillac in Super Cruise mode that showed a troubling inability to differentiate between ADAS and autonomous driving, and it alarmed many people.
Even when a couple of auto academics asked the journalist in question, Farhad Manjoo (as well as the Opinion copy chief), to correct or clarify the piece, the response was:
The @nytopinion copy chief responded in part: “In short, although the auto industry may distinguish between advanced driver-assistance systems and autonomous or self-driving vehicles, most people do not, and we do not expect them to.” Really, @NYTimes?
— Bryant Walker Smith (@bwalkersmith) February 9, 2022
The ruckus continued on Twitter:
Everyone who walks or bikes should understand that Tesla is risking your life with its “Full-Self Driving” feature.
Like it or not, we’re all part of beta tests being conducted on our public streets — while @NHTSAgov sits on its hands.
A quick ?: pic.twitter.com/9z3iifjOKc
— David Zipper (@DavidZipper) February 9, 2022
Tesla users were able to request access to beta testing last year via a button in their car, which lets the company evaluate owners’ driving for seven days before including them in the beta testing group. It’s not clear what the criteria are for rescinding access due to bad or unsafe driving.
For example, a video released this week shows two Tesla owners likening driving to gaming. They film their car nearly hitting a cyclist, laugh about it, and then the driver says he needs to cut it from the video. Hiding the evidence?
This raised many conversations on Twitter about how Tesla drivers understand and experience their “Full Self-Driving” subscription.
Most Tesla owners would not have made that save, lets be 100% honest with ourselves. They would be texting/distracted and let their car run someone over and just respond “Oops, not my fault” or “thats why its in beta”
— Ace (@LN2_R0gue) February 8, 2022
And it’s not like Tesla’s safety record hasn’t been in the spotlight time and time again.
Is Tesla getting an easy ride?
Earlier this year, a lobby group called The Dawn Project took out a full-page ad in the New York Times. It slammed Tesla’s ‘Full Self-Driving’ (FSD) beta software.
Ok, my first thought was “old man shouts at cloud”, but I think there’s something here.
The organization published a safety analysis of 22 videos covering over seven hours of driving under FSD Beta by members of the public, noting:
The drivers repeatedly praise or congratulate FSD or Autopilot for doing something as well as a human driver. They repeatedly excuse the mistakes that put lives in danger.
Part of me thinks okay, Tesla is ostensibly beta testing, so there’s some margin for error.
But the other part of me thinks, when did road users, including pedestrians and cyclists, sign up to be guinea pigs (or sitting ducks) while the tech is ironed out?
I’ve previously asserted that part of the problem is fuckwit Tesla drivers, a category that certainly doesn’t include all Tesla drivers by any means.
But it raises worrying questions about parity. Especially when we compare Tesla’s beta testing to the efforts of vehicle tech companies making autonomous taxis.
Let’s compare this to the robotaxi driving experience
There are huge contrasts between Tesla’s “FSD” and robotaxis, which don’t have a human driver behind the wheel.
Take a look at China’s regulations that allow AV testing on selected highways and city roads. Driving is restricted to specific zones, and safety is prioritized.
Then, there’s Cruise, which is now offering autonomous commercial services in San Francisco.
Ok, the robotaxis generally only operate from 11PM to 5AM. This shields them from some of the conventional driving challenges like peak hour traffic, road work, or rushes of pedestrians and cyclists — so you could argue they get a safer ride.
But the autonomous robotaxis have undergone a gradual process of rigorous testing and piloting.
Yet, Tesla drivers can go where they like. And, too bad if they treat their car as autonomous when it’s only driver-assist. Further, robotaxis have been involved in fewer untoward incidents, while FSD has caused several.
What’s more dangerous? An autonomous robotaxi or a Tesla in “Full Self-Driving” (driver-assist mode)? We’ve seen little effective effort to curtail Tesla’s rights on the road.
The privilege of driving a car is changing
The way we experience cars is changing massively, especially in how we define what it means to drive a car.
Even learning to drive needs to change to effectively accommodate the different responsibilities of the driver, as cars gain increased autonomy.
Increasing the levels of vehicle automation is a hard sell to many people outside of the tech bubble. I want individuals to feel confident and excited about the technology. And I want them to feel safe co-existing with cars that are fully autonomous as well as those in driver-assist mode.
Until that happens, we’re sitting in an uncomfortable position, waiting for news of the next car crash in driver-assist.