Siri and Alexa can be Hacked using Inaudible Voice Commands

In many homes, people have grown accustomed to talking to digital assistants like Apple's Siri and Amazon's Alexa, but researchers have reported that these assistants can also secretly take commands from someone else.

Because speech recognition systems, which translate spoken words into machine-readable commands, are accessible and efficient, they are now used extensively to control all kinds of devices.

Researchers at Zhejiang University reported that these digital assistants can respond to voice commands inaudible to humans, an attack they termed DolphinAttack. Using this attack, the researchers demonstrated the following:

Visiting a malicious website. The device can open a malicious website, which can launch a drive-by-download attack or exploit a device with 0-day vulnerabilities.

Spying. An adversary can make the victim device initiate outgoing video/phone calls, therefore getting access to the image/sound of device surroundings.

Injecting fake information. An adversary may instruct the victim device to send fake text messages and emails, to publish fake online posts, to add fake events to a calendar, etc.

Denial of service. An adversary may inject commands to turn on airplane mode, disconnecting all wireless communications.

Concealing attacks. The screen display and voice feedback may expose the attacks. The adversary may reduce the odds of detection by dimming the screen and lowering the volume.
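DolphinAttack works by amplitude-modulating a recorded voice command onto an ultrasonic carrier above the range of human hearing; the nonlinearity of a smartphone microphone's circuitry then demodulates the command back into the audible band, where the speech recognizer processes it as if it were spoken aloud. The sketch below, a simplified illustration rather than the researchers' actual tooling, models this with a toy 1 kHz tone standing in for speech and a square-law term standing in for microphone nonlinearity:

```python
import numpy as np

SAMPLE_RATE = 192_000  # Hz; a high rate is needed to represent the ultrasonic carrier
CARRIER_HZ = 25_000    # above the ~20 kHz upper limit of human hearing

def modulate_ultrasonic(command, sample_rate=SAMPLE_RATE, carrier_hz=CARRIER_HZ):
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    The transmitted signal contains no energy in the audible band, so a
    human nearby hears nothing.
    """
    t = np.arange(len(command)) / sample_rate
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    return (1.0 + command) * carrier  # standard AM, modulation depth 1

def microphone_nonlinearity(signal):
    """Square-law model of a microphone amplifier's nonlinearity.

    The x**2 term shifts a copy of the command back down into the
    audible baseband, which the speech recognizer then hears.
    """
    return signal ** 2

# Toy "command": a 1 kHz tone standing in for recorded speech
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.5 * np.sin(2 * np.pi * 1000 * t)

tx = modulate_ultrasonic(command)        # inaudible over-the-air signal
rx = microphone_nonlinearity(tx)         # what the victim device captures

# The demodulated signal has a spectral peak back at the original 1 kHz
spectrum = np.abs(np.fft.rfft(rx))
freqs = np.fft.rfftfreq(len(rx), 1 / SAMPLE_RATE)
band = (freqs > 500) & (freqs < 1500)
print(freqs[band][np.argmax(spectrum[band])])  # peak near 1000 Hz
```

Filtering out ultrasonic content before the amplifier, or detecting energy above 20 kHz, would defeat this particular delivery channel, which is one reason hardware-level defenses have been proposed.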

Another group of researchers, at the University of California, Berkeley, reported that they could hide audio commands inside recordings of music. A clip that sounds like ordinary music to a human listener can instruct a digital assistant to open doors or to secretly add items to a shopping list.
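Hiding commands in music is an instance of an adversarial example: the attacker uses gradient descent to find a tiny perturbation of the audio that flips the recognizer's output while staying too quiet for a human to notice. The toy sketch below, an illustration of the principle rather than the Berkeley team's method (which targeted a real deep speech-to-text network), uses a made-up linear "recognizer" and a loudness bound `epsilon`:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a speech recognizer: a linear classifier over raw
# audio samples. Real recognizers are deep networks, but the attack
# principle -- gradient descent on the input signal -- is the same.
n_samples, n_classes = 256, 4
W = rng.normal(size=(n_classes, n_samples))

def logits(x):
    return W @ x

music = rng.normal(scale=0.1, size=n_samples)  # stand-in for a music clip
target = 2  # the command class the attacker wants the recognizer to hear

delta = np.zeros(n_samples)            # the adversarial perturbation
lr, epsilon = 0.01, 0.05               # epsilon bounds the perturbation's loudness

for _ in range(500):
    z = logits(music + delta)
    # Softmax cross-entropy loss, pushing the output toward `target`
    p = np.exp(z - z.max())
    p /= p.sum()
    grad_z = p.copy()
    grad_z[target] -= 1.0
    grad_delta = W.T @ grad_z          # chain rule through the linear model
    delta -= lr * grad_delta
    delta = np.clip(delta, -epsilon, epsilon)  # keep the change inaudible

# The perturbed clip still sounds like music (delta is tiny) but is
# classified as the attacker's target command.
print(np.argmax(logits(music + delta)))
```

The key tension the real attack exploits is the same as here: the recognizer's decision can be moved a long way by many coordinated sample-level changes, each individually far below the threshold of human perception.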

“We wanted to see if we could make it even more stealthy,” said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors. “There is no evidence that these techniques have left the lab,” Mr. Carlini added.

Response

Amazon said that it doesn’t disclose specific security measures, but it has taken steps to ensure its Echo smart speaker is secure. Google said security is an ongoing focus and that its Assistant has features to mitigate undetectable audio commands. Both companies’ assistants employ voice recognition technology to prevent devices from acting on certain commands unless they recognize the user’s voice.

Apple said its smart speaker, HomePod, is designed to prevent commands from doing things like unlocking doors, and it noted that iPhones and iPads must be unlocked before Siri will act on commands that access sensitive data or open apps and websites, among other measures.