Self-Hacked

Academic researchers have discovered that if you get close enough to an Amazon Alexa device, you can "hack" it into executing commands the owner may not want. I put the word hack in quotes because this is less a security failure and more a "that's just how this thing works" aspect of how the Echo (or any voice assistant) operates. Dan Goodin at Ars Technica has more.

The attack works by using the device’s speaker to issue voice commands. As long as the speech contains the device wake word (usually “Alexa” or “Echo”) followed by a permissible command, the Echo will carry it out, researchers from Royal Holloway University in London and Italy’s University of Catania found. Even when devices require verbal confirmation before executing sensitive commands, it’s trivial to bypass the measure by adding the word “yes” about six seconds after issuing the command. Attackers can also exploit what the researchers call the “FVV,” or full volume vulnerability, which allows Echos to make self-issued commands without temporarily reducing the device volume.

Ars Technica

What this amounts to is someone using a device to play spoken words in the same room as an Amazon Echo. By interacting with one voice assistant or speaker, you can tell another nearby assistant to do things and confirm those actions a few seconds later. The Echo doesn't know the voice isn't human; in fact, many speaking devices sound human enough these days. While Amazon may be able to mitigate this type of "attack", doing so is likely quite difficult. How could the device ever know whether a voice belongs to a person or comes from another speaker?

I find it amusing that this is considered a hack, because I've used the technique myself. When I needed my Echo to do something, I would have my home automation system play a text-to-speech file that spoke the commands I needed. The Echo didn't know it was listening to a speaker and did as I asked. It was a very low-tech way to chain commands together across incompatible systems.
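For the curious, here is a minimal sketch of that kind of chaining in Python. It assumes a machine with a speaker sitting near the Echo and uses the pyttsx3 text-to-speech library as a stand-in for whatever TTS your home automation system provides; the specific command is a placeholder, and the six-second pause mirrors the timing the researchers describe.

```python
import time

import pyttsx3  # offline text-to-speech; stands in for any TTS your setup provides

engine = pyttsx3.init()

def say(text):
    """Speak text aloud through the machine's speaker."""
    engine.say(text)
    engine.runAndWait()

# Issue a command to a nearby Echo. "Alexa" is the wake word;
# the command itself is just an example.
say("Alexa, turn off the living room lights")

# For commands that require verbal confirmation, the researchers
# found that saying "yes" roughly six seconds later is enough.
time.sleep(6)
say("yes")
```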

The article goes on to say that Amazon fixed several of the weaknesses, including one that used Alexa skills to self-wake devices. By securing skills, which are really just apps for the platform, Amazon is at least trying to stop a programmatic method from kicking these things off. But chaining commands from a speaker to Alexa, a HomePod, or Google Home is likely impossible to truly fix.
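For a sense of why skills were a viable vector, here is a hedged sketch of a bare-bones skill handler using Amazon's ASK SDK for Python. A skill is essentially a small app that hands Alexa text to speak, so speech containing the wake word used to come out of the device's own speaker; the skill and its speech text below are hypothetical, not the researchers' actual exploit.

```python
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type

class LaunchRequestHandler(AbstractRequestHandler):
    """Runs when the (hypothetical) skill is opened."""

    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        # A skill's output is just text the Echo speaks aloud. The
        # self-wake weakness came from speech like this playing through
        # the device's own speaker, a path the article says Amazon closed.
        speech = "Alexa, what time is it?"
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(LaunchRequestHandler())
lambda_handler = sb.lambda_handler()  # entry point when the skill is hosted on AWS Lambda
```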