Need a way to turn people off autonomous vehicles (AVs)? That’s what happened with Toyota’s e-Palette at the Tokyo Paralympic Games this week. One of the company’s self-driving transportation pods collided with and injured a visually impaired pedestrian. In response, the company suspended all self-driving e-Palette transportation pods at the Games village.
What exactly happened?
While Toyota claims the accident happened during manual control, the company has suspended the use of all e-Palette self-driving pods.
In a YouTube video, Toyota Chief Executive Akio Toyoda explained that the vehicle was under manual control at the time of the accident.
The driver used the control joystick to stop at a T-junction and then — when attempting to turn — hit the athlete while traveling at around 1 or 2 kilometers per hour.
Luckily, the athlete wasn’t badly hurt and was able to walk back to their residence after receiving medical attention. Toyota is cooperating with local police to determine the cause of the accident and will conduct its own investigation.
To date, it’s not clear whether it was the vehicle, the operator holding the joystick, or a combination of both that caused the accident.
Passengers aren’t the only ‘client’
Toyota’s attention to accessibility when it comes to the e-Palette is commendable, but you could argue it didn’t do enough.
For the people inside the vehicle, it’s great. The e-Palette is a low-speed self-driving pod at SAE level 4. It features large sliding doors, low floors, and electric ramps, which ease access. It’s also capable of transporting up to four passengers in wheelchairs along with standing room, which is impressive.
But when it comes to people outside the vehicle it gets trickier.
The e-Palette has headlamps designed to mimic eye contact, which is neat… but isn’t much help to the visually impaired. That’s a problem, because a fundamental pillar of car-pedestrian communication is that cars need to be both seen and heard.
Since 2019, electric vehicle makers have been required to include an Acoustic Vehicle Alerting System (AVAS). This is in response to claims that EVs are too quiet to be heard by blind people or their guide dogs.
It’s of course still unclear whether the pedestrian was able to hear the vehicle in this incident, but Toyota’s Chief Executive did say it showed that “autonomous vehicles are not yet realistic for normal roads.”
Is the problem automation… or semi-automation?
The Toyota Chief’s comment is understandable — and frankly correct — but it also skirts around one of the bigger issues: people.
When carmakers promised us the dream of self-driving vehicles, the promise was effectively cars that drive themselves while we take a nap or lead a meeting.
[Read: The Taliban love Toyota… but why?]
But what we’ve got so far is semi-autonomous vehicles like the e-Palette, where a driver sits behind the wheel. The driver is ideally hyper-alert, hands at the ready to take over control. But are drivers capable of interacting safely with semi-automation?
In most cases, yes. However, semi-autonomous vehicles (with the help of humans) have been killing people since 2018, when a safety driver in a self-driving Uber failed to notice a pedestrian until it was too late.
While these vehicles are equipped with alarms, automatic deceleration, and other bells and whistles to ensure driver vigilance, they are hardly a posterchild for further automation.
Do I trust machines more than humans? Kinda
If Toyota’s e-Palette was fully autonomous, it wouldn’t have to rely on the driving skills or the focus of humans. In some respects, I trust a road full of completely autonomous vehicles more than a mix of semi-autonomous and zero automation cars.
Why? Because humans are unpredictable. They drive while drunk and stoned, and speed for fun. Autonomous vehicles do not. So if we can work out the details (ok, there are quite a few) — like their ability to distinguish between stop signs on the street and stop signs printed on trucks, and to identify and stop for humans as well as animals — things might just get interesting.
But there’s no way AVs will roll out without a gradual increase in autonomous functionality, and a lot of drivers don’t seem able to cope with even the current level of automation. That’s the rub, and I can’t see an easy way to reconcile it.
Meanwhile, we see more and more crashes and other incidents that fail to convince the average person of the future safety of autonomous vehicles. The road between L4 and L5 automation is long, and it’s proving painful.