

Where are all the robotaxis and autonomous cars we were promised?

Still just a few years away...



We’re a couple of weeks away from All Hallows’ Eve, and there are plenty of things that go bump in the night for us to fear this year.

Things such as dumb cars that can’t figure out how cul-de-sacs work without a human operator, or Tesla FSD’s proclivity for driving the wrong way down clearly marked one-way streets.

Up front: Rumors of the advent of truly driverless cars have been, for the most part, greatly exaggerated.

Elon Musk’s now-infamous 2019 claim that Tesla would have one million fully autonomous robotaxis on the road by the end of 2020 has aged exactly as well as experts said it would. That is to say, the exact number of autonomous vehicles Tesla has on the road right now is zero.

Investors are trembling at the prospect of yet another year without a driverless car. A few years ago we were promised autonomous vehicles would be the norm by now. A few years before that… it was the same.

And, if you listen to the C-suite executives running the companies building these technologies, they’re still promising the same thing.

The big question is whether these technologies will always be a few years away, or if there’s a threshold we can pass through where the future looks a bit less fuzzy.

Background: Ford, Cruise, Waymo, and Tesla aren’t the only companies working on the problem. But you’d be forgiven for thinking so because they’re the ones getting all the media coverage.

There are myriad other technology companies involved in solving the autonomy problem. And, at the risk of simplifying the issue, most of the ones making progress aren’t also saddled with the marketing problem of selling a $40,000 vehicle on the promise of what it’ll be able to do at a later, unspecified date.

Neural spoke with Allen Steinhardt, the chief scientist at AEye, to find out what was happening behind the scenes in the world of transportation autonomy.

AEye is a company that specializes in LiDAR technologies. It holds over 100 patents – including its “iDAR” (intelligent detection and ranging) tech, which introduces a bistatic architecture (separate transmit and receive channels) that enables the integration of deterministic AI into LiDAR.

And Steinhardt’s the former Chief Scientist for DARPA. His background in cutting-edge military technologies, combined with his work solving global-scale transportation problems, makes him a leading world expert on autonomous vehicle technologies.

The first thing we wanted to know was “why.” Why have driverless car technologies been ‘just a few years away’ for what seems like a decade now?

Steinhardt relayed that the problem was a combination of invention and application. According to him, the challenge comes in creating useful technologies capable of utilizing what’s already out there:

Our infrastructure was built for humans. … And, this is true whether it’s the military or in business, it’s a bad idea not to leverage what’s already out there.

In this case, we’ve already got stop signs, traffic lights, and myriad visual cues designed to alert human drivers to what’s going on. We combine roadside signage with GPS systems and accelerometers to arrive at the modern paradigm of your smartphone or vehicle console giving you directions – turn right in 500 meters.
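To make that paradigm concrete, here’s a minimal, hypothetical sketch of the kind of logic a navigation system runs: compare the latest GPS fix against the next route waypoint and announce the maneuver once the driver is within a set distance. The function names, coordinates, and 500-meter threshold are illustrative assumptions, not any vendor’s actual API.

```python
import math

# Hypothetical sketch: turn-by-turn prompts from a GPS fix and route waypoints.
# Names and thresholds are illustrative only.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def next_prompt(position, waypoint, announce_at_m=500):
    """Return a spoken prompt once the vehicle is within announce_at_m of the turn."""
    dist = haversine_m(*position, *waypoint["coords"])
    if dist <= announce_at_m:
        return f"Turn {waypoint['direction']} in {int(round(dist, -1))} meters"
    return None

# Example: a fix roughly 480 m south of a right turn.
print(next_prompt((52.3700, 4.8950), {"coords": (52.3743, 4.8952), "direction": "right"}))
```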

The challenge: Consumers tend to think of autonomous vehicle development from the vision-only point of view espoused by Elon Musk and Tesla. The big idea a few years back was that vision – putting cameras all over the car and then training AI to interpret the images – was the perfect replacement for humans.
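As a rough illustration of what “vision-only” means in practice, here’s a deliberately simplified sketch: frames from several cameras go into a learned model, and a steering and speed command comes out, entirely on the vehicle itself. Every name here is a placeholder; this is not Tesla’s (or anyone’s) actual stack.

```python
# Simplified, hypothetical sketch of a vision-only driving loop:
# camera frames in, a learned model's steering/speed command out.
# None of these names refer to any real production system.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Command:
    steering_deg: float     # positive = steer right
    target_speed_mps: float

def capture_frames(camera_ids: List[str]) -> Dict[str, bytes]:
    """Placeholder for grabbing one synchronized frame per camera."""
    return {cam: b"<jpeg bytes>" for cam in camera_ids}

def vision_policy(frames: Dict[str, bytes]) -> Command:
    """Placeholder for the learned model that maps images to a driving command."""
    # In a real system this would be a deep network; here it just holds the lane.
    return Command(steering_deg=0.0, target_speed_mps=13.9)  # ~50 km/h

def control_loop_once(camera_ids: List[str]) -> Command:
    frames = capture_frames(camera_ids)  # perception input: cameras only
    return vision_policy(frames)         # no map, radar, or network required

print(control_loop_once(["front", "left_pillar", "right_pillar", "rear"]))
```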

But, realistically, do we want machines that drive like humans? How far can a vision-only approach take us?

The killer app that Musk’s hoping to unlock by using deep learning to brute force a better vision system is a car that doesn’t need a network to operate. Nobody wants their autonomous vehicle to break down on the side of the road because it can’t establish a network connection.

However, as Steinhardt told us, encryption isn’t really an issue (data’s either encrypted or it isn’t). The real problem is one of leveraging what’s already there – signage, roadways, GPS data, etc. – while also adding communications arrays that don’t rely on existing network infrastructure. Even if a vehicle can’t or shouldn’t leverage an existing network, it needs to be able to establish its own.

The problem, according to Steinhardt, is that most comm arrays – 5G included – are congested and risk data loss. This can be a crucial point of failure for vehicles that rely on external data. 
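To illustrate why that matters, here’s a generic sketch (not AEye’s or any carrier’s protocol) of a safety message crossing a congested, lossy link: the sender has to keep retransmitting until the message gets through, and past a certain loss rate the warning may simply never arrive.

```python
import random

# Illustrative sketch (not any vendor's protocol): a safety message sent over a
# congested, lossy channel is retransmitted until the receiver acknowledges it.
random.seed(7)

def lossy_send(message: str, loss_rate: float) -> bool:
    """Simulate one transmission attempt; True means the receiver got it."""
    return random.random() > loss_rate

def send_with_retries(message: str, loss_rate: float, max_attempts: int = 5) -> int:
    """Return how many attempts delivery took, or -1 if it never succeeded."""
    for attempt in range(1, max_attempts + 1):
        if lossy_send(message, loss_rate):
            return attempt
    return -1  # point of failure: the hazard warning never arrived

# On a lightly loaded link most messages land first try; at 60% loss some never arrive.
for loss in (0.05, 0.60):
    attempts = send_with_retries("bridge out ahead", loss)
    print(f"loss={loss:.0%} -> delivered after {attempts} attempt(s)" if attempts > 0
          else f"loss={loss:.0%} -> delivery failed")
```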

Quick take: The reason we’re not all cruising around in driverless cars is that the technology isn’t quite there yet. It’s that simple.

The vision approach solves some problems, but more is needed if we want cars to drive as well as or better than humans.

AEye and its partners are working toward large-scale production of the company’s LiDAR systems in hopes of solving more than just getting vehicles to drive in a straight line and stop at red lights.

Setting aside the steering wheel and pedals issues, there’s more to driving than just obeying the traffic laws and not crashing. Steinhardt mentioned a scenario a few years back where flooding caused a bridge to wash out.

These situations can be dangerous and even fatal for motorists and, even in the modern world, it can be hard to relay information during a disaster. Who knows how many vehicles could be affected before someone’s able to report the situation, or whether the typical lines of communication will even be available if infrastructure goes down.

AEye’s iDAR has the patented ability to transform LiDAR sensors into an optical communications network, enabling the sensor not only to determine what’s happening on the road, but also to create real-time communications arrays that help coordinate vehicle ‘knowledge’ across multiple nodes. So even if the internet’s down, or 4G and 5G networks are congested, such a network would still function.

As Steinhardt put it, the goal should be giving vehicles the ability to create an in-motion system capable of doing “everything the internet does without infrastructure.”

Not only would such a system be able to, for example, warn vehicles within a specified range of changes to roadways in real time (a washed-out bridge), but it’d be capable of innumerable other tasks, such as sharing anonymous statistics about passenger entertainment choices or aligning nearby travelers in gaming or trivia challenges.
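As a rough sketch of how that infrastructure-free sharing could work in principle (a generic flooding/gossip pattern, not AEye’s implementation), each vehicle rebroadcasts an alert to every neighbor within range, so a warning about a washed-out bridge hops from car to car without ever touching the internet.

```python
import math

# Generic flooding/gossip sketch (not AEye's implementation): each vehicle
# rebroadcasts an alert to every neighbor within range until all reachable
# vehicles have heard it -- no internet or cell tower involved.

VEHICLES = {  # vehicle id -> (x, y) position in meters (assumed positions)
    "A": (0, 0), "B": (400, 0), "C": (800, 100), "D": (1500, 0), "E": (1100, 50),
}
RANGE_M = 500  # assumed sensor/radio communication range

def neighbors(vid):
    x, y = VEHICLES[vid]
    return [o for o, (ox, oy) in VEHICLES.items()
            if o != vid and math.hypot(ox - x, oy - y) <= RANGE_M]

def flood_alert(origin, alert):
    """Propagate an alert hop by hop; return the alert and who heard it."""
    heard, frontier = {origin}, [origin]
    while frontier:
        nxt = []
        for vid in frontier:
            for nb in neighbors(vid):
                if nb not in heard:
                    heard.add(nb)   # nb receives the alert and will rebroadcast it
                    nxt.append(nb)
        frontier = nxt
    return alert, sorted(heard)

print(flood_alert("A", "bridge washed out ahead"))
# A reaches B directly, B reaches C, C reaches E, E reaches D -- no infrastructure needed.
```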

Steinhardt tossed out the idea of an “auctioning system” that could work in real time to seamlessly offload physical products or sell services in a fantastic take on the age-old traveling salesman problem.

Imagine a world where it was cost-effective and environmentally friendly to conduct sales and deliveries from a moving platform. In effect, a canceled order or vehicle accident could be handled seamlessly in such a way as to eliminate the ripple effect.
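Here’s a purely hypothetical sketch of what such an on-the-move auction could look like: when an order is canceled, the carrying vehicle solicits bids from nearby vehicles and awards the delivery to the cheapest workable offer, absorbing the disruption instead of routing everything back through a depot. The vehicle names, cutoff, and prices are all invented for illustration.

```python
# Purely hypothetical sketch of a real-time reassignment auction between moving
# vehicles: a canceled delivery is offered to nearby carriers and awarded to the
# cheapest bid whose detour is acceptable.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Bid:
    vehicle_id: str
    detour_km: float   # extra distance the bidder would absorb
    price_cents: int   # compensation the bidder asks for

def run_auction(bids: List[Bid]) -> Optional[Bid]:
    """Award the canceled order to the cheapest workable bid, if any."""
    workable = [b for b in bids if b.detour_km <= 5.0]  # assumed detour cutoff
    return min(workable, key=lambda b: b.price_cents, default=None)

offers = [
    Bid("van-12", detour_km=1.2, price_cents=340),
    Bid("sedan-07", detour_km=0.4, price_cents=410),
    Bid("truck-03", detour_km=9.8, price_cents=150),  # too far out of its way
]
winner = run_auction(offers)
print(f"reassigned to {winner.vehicle_id}" if winner else "no taker; fall back to depot")
```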

Perhaps most exciting of all, if vehicles run into a situation where they’re underperforming – for example, getting stuck in cul-de-sacs – the system could either collaborate with other vehicles experiencing similar issues to source a solution, or simply warn other cars away from the particular space causing the errors.

And the applications go far beyond consumer vehicles and robo-taxis. The railway, airline, shipping, and trucking industries are all ripe for disruption in the form of an interlaced data-driven autonomy and communications system that can get people and things where they’re going faster, cheaper, and safer than the status quo.

At the end of the day, we’re still left wondering if we’ll see these technologies in mass market abundance on city streets around the globe any time soon.

But the difference between the pie-in-the-sky approaches that rely on future breakthroughs to make sense and what AEye’s doing is obvious. AEye’s work on iDAR, LiDAR, and its 100 or so other patented technologies is part of the tide that manufacturers and AI developers are hoping will lift their vessels.

Maybe we’re finally, really, actually ‘just a few years away’ from level 4 and 5 autonomous vehicles hitting the consumer market.

 
