
Voice assistants could be fooled by commands you can't even hear


This article is from thenextweb.com. Source URL: https://thenextweb.com/artificial-intelligence/2018/05/11/voice-assistants-could-be-fooled-by-commands-you-cant-even-hear/


Many people already consider voice assistants too invasive to let them listen in on conversations in their homes — but that's not the only thing they should worry about. Researchers from the University of California, Berkeley, want you to know that they might also be vulnerable to attacks that you'll never hear coming.

In a new paper (PDF), Nicholas Carlini and David Wagner describe a method to imperceptibly modify an audio file so as to deliver a secret command; the embedded instruction is inaudible to the human ear, so there’s no easy way of telling when Alexa might be asked by a hacker to add an item to your Amazon shopping cart, or worse.
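The core idea is that of an adversarial example: search, by gradient descent, for a perturbation small enough to be imperceptible but chosen so the recognition model outputs an attacker-picked result. Carlini and Wagner's actual attack optimizes against DeepSpeech's full speech-to-text pipeline; the sketch below is only a toy illustration of the same principle, with a hypothetical fixed linear classifier standing in for the model and class indices standing in for transcriptions.

```python
import numpy as np

# Toy "recogniser": logits = W @ x.  (Hypothetical stand-in for a real
# speech-to-text model such as DeepSpeech.)
W = np.eye(4)
audio = np.array([1.0, 0.2, 0.1, 0.0])  # clip the model hears as class 0
target = 2                              # class the attacker wants instead
eps = 0.5                               # perturbation amplitude budget

def predict(x):
    return int(np.argmax(W @ x))

delta = np.zeros_like(audio)
for _ in range(300):
    logits = W @ (audio + delta)
    p = np.exp(logits - logits.max())
    p /= p.sum()                          # softmax probabilities
    p[target] -= 1.0                      # d(cross-entropy)/d(logits)
    delta -= 0.05 * (W.T @ p)             # gradient step toward the target
    delta = np.clip(delta, -eps, eps)     # keep the change "inaudible"

print(predict(audio))          # 0: the original clip
print(predict(audio + delta))  # 2: same clip, now carrying the hidden command
```

A real attack replaces the toy gradient with backpropagation through the full model and measures perceptibility in decibels rather than a simple amplitude clip, but the optimization loop has the same shape.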

To demonstrate this, Carlini hid the message, “OK Google, browse to evil.com,” in a seemingly innocuous sentence, as well as in a short clip of Verdi’s ‘Requiem,’ which fooled Mozilla’s open-source DeepSpeech transcription software.

Speaking to The New York Times, Carlini – who, in 2016, demonstrated how he and his team could embed commands in white noise played along with other audio to get voice-activated devices to do things like turn on airplane mode – said that while such attacks haven’t yet been reported, it’s possible that “malicious people already employ people to do what I do.”

Thanks for the cheerful thought, Nicholas.

There have been other (unfortunately successful) attempts to fool voice assistants, and there aren't many ways to prevent such audio from being broadcast at people's 'smart' devices. One method, called DolphinAttack, even muted the target phone before issuing inaudible commands, so the owner wouldn't hear the device's responses.
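DolphinAttack works differently from the Berkeley technique: it amplitude-modulates a voice command onto an ultrasonic carrier above the ~20 kHz limit of human hearing, and nonlinearity in the phone's microphone circuitry demodulates the envelope back into the audible band, where the recognizer picks it up. The sketch below illustrates that signal trick with illustrative frequencies and a crude square-law model of the microphone nonlinearity (the values are assumptions, not DolphinAttack's actual parameters).

```python
import numpy as np

fs = 192_000                    # sample rate high enough to carry ultrasound
t = np.arange(fs) / fs          # one second of signal
command = np.sin(2 * np.pi * 400 * t)   # stand-in for the voice command
carrier_hz = 30_000                     # well above human hearing
modulated = (1 + 0.8 * command) * np.sin(2 * np.pi * carrier_hz * t)

# The transmitted signal's energy sits above the audible range:
spectrum = np.abs(np.fft.rfft(modulated))
peak_hz = int(np.argmax(spectrum) * fs / len(modulated))
print(peak_hz)       # 30000 — inaudible to people

# A square-law nonlinearity (crude microphone model) demodulates the
# envelope, so the command reappears in the audible band:
baseband = np.abs(np.fft.rfft(modulated ** 2))
recovered_hz = int(np.argmax(baseband[1:20_000]) + 1)   # skip the DC bin
print(recovered_hz)  # 400 — what the speech recogniser ends up hearing
```

This is why the attack is hard to counter in software alone: by the time the signal reaches the recognizer it looks like ordinary audible speech.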

We need hardware makers and AI developers to tackle such subliminal messages, particularly for devices that don’t have screens to give users visual feedback and warnings about having received secret commands. In demonstrating what’s possible with this method, Carlini’s goal is to encourage companies to secure their products and services so users are protected from inaudible attacks.

Let’s hope Google, Amazon, Apple, and Microsoft are listening.
