Inaudible commands can hack digital assistants
Researchers at Zhejiang University were able to hack voice assistants like Alexa and Siri by issuing commands at ultrasonic frequencies, too high for humans to hear but still picked up by the devices' microphones.
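The core trick behind attacks like this is to amplitude-modulate an ordinary audible command onto an ultrasonic carrier; a microphone's nonlinear response then demodulates the envelope back into the audible band even though the broadcast itself is silent to humans. The sketch below illustrates the modulation step only, with made-up parameter values (the 25 kHz carrier and 96 kHz sample rate are illustrative assumptions, not the researchers' actual setup).

```python
import math

def am_ultrasonic(samples, fs=96_000, carrier_hz=25_000, depth=0.8):
    """Amplitude-modulate a baseband (audible) signal onto an
    ultrasonic carrier. The output contains no audible tones, yet
    nonlinearity in a microphone's hardware can recover the envelope.
    All parameters here are illustrative, not from the actual attack."""
    out = []
    for n, x in enumerate(samples):
        carrier = math.cos(2 * math.pi * carrier_hz * n / fs)
        # Keep the envelope (1 + depth*x) positive so the audible
        # content lives entirely in the amplitude of the carrier.
        out.append((1 + depth * x) * carrier)
    return out

# A 400 Hz tone stands in for a spoken command.
fs = 96_000
base = [math.sin(2 * math.pi * 400 * n / fs) for n in range(fs // 10)]
sig = am_ultrasonic(base, fs=fs)
```

Every zero crossing of the output comes from the 25 kHz carrier, so the transmitted signal sits entirely above the range of human hearing.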
Why it's scary: Many "hey Siri" functions may be harmless, but the researchers could also tell an Amazon Echo to "unlock the back door," redirect the navigation system of an Audi Q3, or install malware on an iPhone. The researchers used a smartphone with just $3 worth of extra hardware. With a little technological expertise, this method could easily be replicated.
Why it matters: Security experts worry that the so-called Internet of Things (including connected devices like digital assistants) is highly vulnerable to security breaches and other attacks not anticipated by manufacturers.