In a pre-recorded message to attendees of the World AI Conference in Shanghai, Tesla’s flamboyant CEO Elon Musk bragged that the company has pretty much figured out self-driving. The company, he said, is “very close” to having a fully autonomous vehicle that requires no driver at all.
According to a Bloomberg report, Musk claimed that the company has resolved many of the fundamental challenges associated with building a Level 5 self-driving vehicle:
I’m confident that we will have the basic functionality of L5 autonomous driving this year. There are no fundamental challenges.
This might just be Elon being Elon, but current Tesla vehicles are nowhere near being capable of driving themselves.
The misleadingly named Autopilot and Full Self-Driving packages sit at Level 2 on the SAE automation classification scale; they’re just partially automated driver aids. In other words, drivers must always be present, keep their hands on the steering wheel, and remain cognizant of their surroundings.
This latest statement from Musk is very vague, perhaps intentionally so. There’s no mention of how much Tesla’s cars will need to change before they can be considered Level 5, or whether that functionality will be deployed to all cars via over-the-air update, which is the company’s usual strategy.
What’s more, having the fundamentals “figured out” is very different from saying that a vehicle can be safely deployed into operation.
But, let’s suppose for a second that Tesla does have a fully automated Level 5 vehicle. The next hurdle would be to overcome regulations and get the green light to put the car into production, which won’t come easy — if at all.
Tesla’s approach to supposed self-driving tech is coming under an increasing amount of scrutiny.
The company’s partially autonomous Autopilot system has been at the center of a number of investigations following a string of crashes where the technology was in use. In one high-profile case, the driver was reminded to put their hands back on the steering wheel numerous times before the car ended up being involved in a fatal collision.
Most of the time, these collisions occurred because drivers were misusing the technology, believing it to be more capable than it actually is.
Engineers from other industries, such as aerospace, have been critical of Tesla’s lack of LiDAR, and regularly cite the numerous Autopilot-related accidents as evidence that its technology is below par. Its camera-based system regularly struggles to identify objects crossing the vehicle’s path, and performs poorly when the sun is low on the horizon.
Advertising and competition regulators around the world have taken issue with the Autopilot name, saying that it’s dangerously misleading. Academics have also criticized this overselling of autonomous technologies, dubbing the practice “autonowashing” — the autonomy equivalent of greenwashing.
One thing is for sure: Tesla cars, in their current state, cannot drive themselves. There are countless hours of dashcam footage floating around the internet demonstrating the inherent dangers of using Autopilot as more than just a driver aid.
In the short term, Musk’s statement is careless and only serves to further confuse customers into thinking their vehicles are more capable than they actually are — which has been shown to have lethal consequences.
While Musk can brag that Tesla has figured out autonomous driving, we’ve yet to see any evidence that it should be trusted to have done so.
Published July 9, 2020 — 10:15 UTC