Virtual personal assistants can be hijacked by subliminal messages embedded in music
We've seen how a young child can place an order for a dollhouse using Alexa, and how a television anchorman repeating the story accidentally caused other Echo units to order dollhouses too. Now comes a story with even scarier implications. A report in Thursday's New York Times revealed that, back in 2016, students from the University of California, Berkeley, and Georgetown University were able to hide subliminal commands for virtual personal assistants inside white noise played over loudspeakers. The commands included enabling airplane mode and opening a website on a smartphone.
Major smart speaker manufacturers like Amazon, Google and Apple say that they have safeguards in place to prevent their assistants from being hijacked. Amazon and Google say they use technology to detect and block commands that humans cannot hear. To keep Alexa from being accidentally activated on viewers' devices during Amazon's Super Bowl ad (which was about Alexa losing her voice), the company played a tone in the 3,000 to 6,000 Hz range. In addition, voice recognition technology prevents Alexa and Google Assistant from carrying out tasks requested by an unknown voice. Apple says that its HomePod won't act on commands that unlock doors, and iPhone and iPad models must be unlocked before Siri can access personal data, call up websites or open apps.
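For the technically curious, here is a minimal sketch of what a tone confined to that band looks like in code. It generates a two-second sweep between 3,000 and 6,000 Hz and writes it to a WAV file; this is only an illustration of the frequency range involved, not Amazon's actual mechanism, and the amplitude, duration and file name are arbitrary choices.

    import numpy as np
    from scipy.io import wavfile

    SAMPLE_RATE = 44_100            # standard audio sampling rate, in Hz
    DURATION_S = 2.0                # length of the clip, in seconds
    LOW_HZ, HIGH_HZ = 3_000, 6_000  # the band cited in reports about the ad

    t = np.linspace(0.0, DURATION_S, int(SAMPLE_RATE * DURATION_S), endpoint=False)

    # Linear chirp: instantaneous frequency rises from LOW_HZ to HIGH_HZ.
    sweep_rate = (HIGH_HZ - LOW_HZ) / DURATION_S
    phase = 2 * np.pi * (LOW_HZ * t + 0.5 * sweep_rate * t ** 2)
    tone = 0.3 * np.sin(phase)      # keep amplitude well below clipping

    wavfile.write("band_tone.wav", SAMPLE_RATE, (tone * 32767).astype(np.int16))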
Last year, a Burger King commercial asked, "O.K., Google, what is the Whopper burger?" The idea was to get Android devices to read aloud the burger's ingredients from the Whopper's Wikipedia page. But when the page was overrun with obviously false ingredients like "toe nail clippings," the ad was pulled.
Perhaps the scariest scenario was demonstrated by Chinese and American researchers, who discovered that virtual personal assistants would follow commands hidden inside music played from the radio or over YouTube. So if your phone or smart speaker starts acting strangely and performs tasks you didn't request, it could be reacting to a message embedded in a song you are listening to on the radio. Or it could be responding to a message embedded in a sound you can't even hear.
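The "sound you can't even hear" variant relies on the fact that human hearing tops out around 20 kHz, while microphones still respond to higher frequencies. Here is a minimal sketch of the idea, assuming a mono recording of a voice command saved as command.wav at a 96 kHz sampling rate; the file names and the 24 kHz carrier are illustrative assumptions, not details from the research papers.

    import numpy as np
    from scipy.io import wavfile

    CARRIER_HZ = 24_000    # above the roughly 20 kHz ceiling of human hearing
    SAMPLE_RATE = 96_000   # must exceed twice the carrier frequency (Nyquist)

    # Load the voice command (assumed to be recorded at SAMPLE_RATE).
    rate, voice = wavfile.read("command.wav")
    assert rate == SAMPLE_RATE, "resample the recording to 96 kHz first"
    if voice.ndim > 1:
        voice = voice[:, 0]             # use the first channel if stereo
    voice = voice.astype(np.float64)
    voice /= np.max(np.abs(voice))      # normalize to the range [-1, 1]

    t = np.arange(len(voice)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)

    # Classic amplitude modulation: the speech rides on an ultrasonic carrier,
    # so most of its energy sits above the range of human hearing. Nonlinearities
    # in a microphone's front end can demodulate it back into ordinary speech.
    modulated = 0.5 * (1.0 + voice) * carrier

    wavfile.write("ultrasonic.wav", SAMPLE_RATE, (modulated * 32767).astype(np.int16))

Playing such a file back requires hardware that can actually reproduce ultrasound, which ordinary consumer speakers generally cannot; that is part of why these attacks are harder to pull off in the wild than in a lab.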