Alexa, Siri, and Google Assistant can hear silent commands that you can’t

Researchers have shown that voice assistants like Alexa, Google Assistant, and Siri can hear silent commands that the human ear can't pick up. In some cases, researchers were able to embed commands directly into white noise or musical recordings without listeners being able to perceive them.
A series of studies has shown that it's possible to issue silent commands to voice assistants like Amazon Alexa and Google Assistant without their owners ever knowing.
According to the New York Times, researchers in both China and the U.S. have carried out a series of experiments proving that it's possible to issue commands, undetectable to the human ear, to voice assistants like Siri, Alexa, and Google Assistant.
The findings bring to light a variety of security concerns, as they reveal just how vulnerable voice assistants and the data they handle could be.
In a 2016 study conducted at Georgetown University and the University of California, Berkeley, student researchers successfully hid voice commands inside white noise. By burying the commands in white noise that was played through YouTube videos and loudspeakers, the students were able to get smart devices to switch to airplane mode and navigate to websites.
White noise essentially masks any other sounds around it because it is a mixture of all the sound frequencies the human ear can detect. By inserting a smart-speaker command into white noise, the researchers effectively camouflaged the commands from human listeners.
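To make that masking idea concrete, here is a minimal sketch in Python, assuming nothing about the researchers' actual tooling: a quiet stand-in "command" signal (a pure tone instead of real speech) is mixed into much louder white noise, producing audio a human would hear as plain static. The sample rate, amplitudes, and tone frequency are all illustrative assumptions.

```python
# A minimal sketch, not the researchers' actual method: mix a quiet
# stand-in "command" into much louder white noise with NumPy.
import numpy as np

SAMPLE_RATE = 16_000                 # 16 kHz, a common rate for speech audio
DURATION_S = 2.0
n = int(SAMPLE_RATE * DURATION_S)

t = np.arange(n) / SAMPLE_RATE
command = 0.05 * np.sin(2 * np.pi * 440 * t)   # low-amplitude "command" signal

rng = np.random.default_rng(seed=0)
noise = 0.5 * rng.standard_normal(n)           # white noise: flat power spectrum

# To a listener the mixture is indistinguishable from static, but a machine
# detector tuned to the embedded signal can still recover it (e.g., by
# correlating against the known waveform).
mixture = np.clip(command + noise, -1.0, 1.0)
print(f"command RMS: {command.std():.3f}, noise RMS: {noise.std():.3f}")
```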
Taking the research a step further, some of the same Berkeley researchers published a paper this month demonstrating that they could insert silent commands into spoken text and music files. While you might think you're listening to an audiobook or a piece of classical music, your smart speaker could be receiving a litany of commands telling it to change its settings or purchase items from your Amazon account.
So far, there's nothing to suggest that such hidden commands have been used successfully outside of a research setting, but one of the Berkeley paper's authors believes it's just a matter of time: "My assumption is that the malicious people already employ people to do what I do," he told the New York Times.
As the Times points out, voice recognition systems are set up to map each sound you make to a letter, which the system then assembles into complete words and phrases.
What these studies show is that it's possible to manipulate speech recognition systems by making minute changes to speech or other audio files. Doing so effectively overrides the message the voice assistant is supposed to receive, substituting sounds that the system interprets differently and thereby issuing a different command that is virtually undetectable to the human ear.
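As a conceptual illustration only, the sketch below runs a gradient-based loop against a toy differentiable "recognizer", nudging an audio clip until it is classified as an attacker-chosen target while each sample changes by at most a tiny budget. The actual Berkeley research attacked a full speech-to-text model; the linear model, class count, and budget here are hypothetical stand-ins chosen so the example runs self-contained.

```python
# Conceptual sketch of a gradient-based adversarial-audio loop against a toy
# recognizer. Everything here (the linear model, ten fake "commands", the
# EPSILON budget) is a hypothetical stand-in, not the published attack.
import numpy as np

rng = np.random.default_rng(seed=0)
N_SAMPLES, N_CLASSES = 1_000, 10          # tiny "audio clip", 10 fake commands

W = 0.01 * rng.standard_normal((N_CLASSES, N_SAMPLES))   # toy recognizer weights
audio = rng.standard_normal(N_SAMPLES)                    # the benign recording
target = 3                                                # attacker's desired command

def predict(x):
    return int(np.argmax(W @ x))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

delta = np.zeros(N_SAMPLES)   # the perturbation to be learned
EPSILON = 0.1                 # max per-sample change: the "inaudibility" budget
LR = 0.05

for _ in range(1_000):
    probs = softmax(W @ (audio + delta))
    grad_logits = probs.copy()
    grad_logits[target] -= 1.0                 # d(cross-entropy)/d(logits)
    delta -= LR * (W.T @ grad_logits)          # step toward the target output
    delta = np.clip(delta, -EPSILON, EPSILON)  # keep the change tiny per sample

print(f"before: class {predict(audio)}, after: class {predict(audio + delta)} "
      f"(target {target}); max |delta| = {np.abs(delta).max():.2f}")
```

The key design point the sketch captures is the trade-off the researchers exploit: the perturbation is constrained to stay imperceptibly small per sample, yet accumulated across the whole clip it is enough to steer the recognizer's output.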
According to the coverage in the Times, both Amazon and Google have taken measures to protect their smart speakers and voice assistants against such manipulation. Individual voice recognition is one safeguard that could prevent silent commands from succeeding: if your smart speaker is calibrated to respond only to your voice, a silent command should, in theory, have no effect on it.
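One way such a safeguard might work, sketched here under the assumption of a generic speaker-embedding model, is to compare the incoming audio's voiceprint with the enrolled owner's and drop any command that falls below a similarity threshold. The embed function below is a random-projection stand-in, not any vendor's actual system.

```python
# A hypothetical sketch of the "respond only to the enrolled voice" safeguard.
# embed() is a random-projection stand-in for a real speaker-embedding model.
import numpy as np

def embed(audio: np.ndarray) -> np.ndarray:
    """Stand-in speaker embedding: a fixed random projection, unit-normalized."""
    rng = np.random.default_rng(seed=42)   # fixed seed keeps the projection stable
    projection = rng.standard_normal((64, audio.shape[0]))
    v = projection @ audio
    return v / np.linalg.norm(v)

def voice_matches(incoming: np.ndarray, voiceprint: np.ndarray,
                  threshold: float = 0.8) -> bool:
    """Accept a command only if its speaker embedding is close to the owner's."""
    return float(embed(incoming) @ voiceprint) >= threshold

# Enrollment, then two checks: the owner's own audio passes, a stranger's fails.
owner = np.random.default_rng(1).standard_normal(16_000)
stranger = np.random.default_rng(2).standard_normal(16_000)
voiceprint = embed(owner)
print(voice_matches(owner, voiceprint))      # True: similarity is 1.0
print(voice_matches(stranger, voiceprint))   # False: similarity near zero
```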
