Tech

Hackers can use white noise to break into your Alexa

You know how only dogs can hear a dog whistle? Well, it turns out that there are certain types of white noise that only services like Amazon’s Alexa and Apple’s Siri can hear — and experts think hackers know how to use them.

This month, researchers demonstrated the ability to send instructions to smart devices that are undetectable to the human ear. Students from the University of California, Berkeley, and Georgetown University published research showing that they were able to bury hidden commands in music or white noise.
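The core idea behind this kind of attack is an adversarial perturbation: a small, quiet tweak to an ordinary sound that a human won't notice but that pushes a speech-recognition system toward hearing a command. The real research targets deep neural networks; the sketch below is only a toy illustration of the principle, where a hypothetical "classifier" is stood in for by a simple correlation check against a command template.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a speech classifier: it labels a clip
# "command" when its correlation with a reference template is high.
# Real voice assistants use deep networks; this is only illustrative.
template = rng.standard_normal(16000)  # one second of "command" pattern at 16 kHz

def classifier(audio):
    score = np.dot(audio, template) / len(audio)
    return "command" if score > 0.01 else "noise"

# An innocuous carrier: plain white noise, which the classifier ignores.
carrier = rng.standard_normal(16000) * 0.1
assert classifier(carrier) == "noise"

# Adversarial step: nudge the carrier a tiny amount in the direction of
# the template until the classifier flips, keeping the added energy
# small relative to the carrier itself (i.e., hard for a person to hear).
direction = template / np.linalg.norm(template)
epsilon = 0.0
audio = carrier.copy()
while classifier(audio) != "command":
    epsilon += 0.01
    audio = carrier + epsilon * direction

# The perturbation is only a small fraction of the carrier's energy,
# yet the classifier now "hears" a command in what sounds like static.
ratio = np.linalg.norm(audio - carrier) / np.linalg.norm(carrier)
print(classifier(audio), round(ratio, 2))
```

This mirrors the attack only in spirit: against a real system the perturbation must be crafted against the network's actual decision function, which is what makes the published research notable.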

Researchers say that scammers could use this technique to tell your device to shop online, wire money or unlock doors. In fact, they’re probably doing so already.

“My assumption is that the malicious people already employ people to do what I do,” Nicholas Carlini, a Ph.D. student in computer security at UC Berkeley and co-author of the study, tells WRAL Tech Wire.

Amazon says that it has put security measures in place to prevent such attacks, although the company didn’t specify what they are. Apple says that its HomePod speaker blocks certain commands, such as unlocking doors, and that iPhones must be unlocked before Siri will act on voice commands.

This is unsettling news, but hardly surprising. In April, The Sun reported that Amazon filed a patent that explains that “a smartphone or tablet computer can actively listen to audio data for a user, such as may be monitored during a phone call or recorded when a user is within a detectable distance of the device.”

Carlini hopes that his team’s research will make people more conscious of their devices.

“We want to demonstrate that it’s possible,” he says, “and then hope that other people will say, ‘OK, this is possible. Now, let’s try and fix it.’”