This article was published on September 7, 2017

Hackers can access voice assistants using inaudible commands

Story by Bryan Clark
Former Managing Editor, TNW

Bryan is a freelance journalist.

Chinese researchers recently hacked popular smart assistants like Siri and Alexa. Using inaudible voice commands outside the range of human hearing, the researchers successfully tricked the assistants into completing a range of commands.

Calling out “hey Siri” to an iPad or “Alexa” to an Amazon Echo triggers an expected response: a digital assistant waiting for your command. And these commands are usually something simple like checking the weather or making a calendar entry.

When paired with other smart devices, however, these digital assistants open up an entirely new set of possibilities for bad actors. Asking Alexa to unlock the front door, for example, poses a serious threat when an unintended party is doing the asking. Situations like this get markedly worse when you can’t even hear what requests are being made.

For researchers at Zhejiang University, all it took was a cell phone and $3 worth of hardware to hack the devices in a way that, they say, is easily replicated.
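The core trick behind attacks of this kind is amplitude modulation: an ordinary voice command is shifted onto an ultrasonic carrier above the roughly 20 kHz limit of human hearing, and non-linearities in a microphone's hardware demodulate it back into the audible band before speech recognition runs. The sketch below illustrates that modulation step in Python with NumPy; the sample rate, carrier frequency, and modulation depth are illustrative assumptions, not the researchers' actual parameters.

```python
import numpy as np

SAMPLE_RATE = 192_000  # Hz; assumed rate, high enough to represent the carrier
CARRIER_HZ = 25_000    # assumed carrier, just above the ~20 kHz audible limit


def modulate_ultrasonic(voice: np.ndarray,
                        sample_rate: int = SAMPLE_RATE,
                        carrier_hz: int = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier.

    The transmitted signal contains no energy in the audible band, yet a
    microphone's non-linear response can recover the original command.
    """
    t = np.arange(len(voice)) / sample_rate
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    # Classic AM: offset the signal so it stays positive, then multiply.
    return (1.0 + 0.8 * voice) * carrier


# Toy "voice": a 400 Hz tone standing in for a spoken command (1 second).
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
voice = 0.5 * np.sin(2 * np.pi * 400 * t)
signal = modulate_ultrasonic(voice)

# All spectral energy now sits near 25 kHz, outside the audible range.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum)]
```

Inspecting `peak_hz` confirms the dominant frequency is the ultrasonic carrier, with the command encoded in sidebands around it, which is why a human standing next to the device hears nothing.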

As household mainstays like door locks and thermostats continue being replaced with smarter successors, experts worry about the implications for household security. Devices connected to the so-called 'Internet of Things' have proven again and again to be highly vulnerable to security breaches, often via attacks mounted with nothing more than cheap hardware and open-source code.

For manufacturers of traditionally non-connected items (like door locks), the challenge of adapting to industry demand for smart devices is leading to critical flaws in the design of popular products.

And chances are, it’ll get worse before it gets better.
