According to a report by Wired, cybersecurity researcher Takeshi Sugawara and a group of researchers from the University of Michigan say they can vary the intensity of a laser beam and point it at a smart speaker's microphone to send commands that the speaker interprets as normal voice commands.
Consumers are increasingly outfitting their homes with smart speakers and displays that use wide-field microphones to pick up voice commands.
Additional researchers from the University of Electro-Communications in Tokyo and the University of Michigan contributed to the work.
Recently, researchers from the University of Electro-Communications in Japan and the University of Michigan in the United States discovered that light sources such as laser beams and flashlights can be used to control certain voice-controlled digital assistants.
Microphones convert sound into electrical signals.
"This opens up an entirely new class of vulnerabilities," said Kevin Fu, associate professor of electrical engineering and computer science at the University of Michigan. The light hits the diaphragm inside the smart speaker's microphone, causing it to vibrate just as it would if someone had spoken the command. The team successfully fooled virtual assistants by pointing a specially crafted laser beam at a range of devices, including the Google Home, Amazon Echo, iPhone XR, Google Pixel 2, and Facebook Portal Mini. The method is aptly named Light Commands.
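In broad strokes, the technique amounts to amplitude modulation: a steady bias keeps the laser on, and the voice waveform rides on top of it, so the light's intensity fluctuates the way air pressure would in real speech. The sketch below illustrates that idea only; the `bias` and `depth` parameters and the 1 kHz test tone are illustrative assumptions, not values from the researchers' actual setup.

```python
import numpy as np

def voice_to_laser_intensity(voice, bias=0.5, depth=0.4):
    """Map a voice waveform in [-1, 1] to a normalized laser drive level in [0, 1].

    A DC bias keeps the laser diode emitting; the voice signal modulates the
    intensity around that bias, so the light striking the microphone's
    diaphragm vibrates it much like a sound wave would.
    """
    voice = np.clip(voice, -1.0, 1.0)          # keep the input in range
    intensity = bias + depth * voice           # amplitude modulation around the bias
    return np.clip(intensity, 0.0, 1.0)        # laser drive cannot go negative

# Hypothetical 1 kHz test tone standing in for a recorded voice command
t = np.linspace(0, 0.01, 441, endpoint=False)  # 10 ms at 44.1 kHz
tone = np.sin(2 * np.pi * 1000 * t)
drive = voice_to_laser_intensity(tone)
```

Silence (a zero waveform) maps to the constant bias level, and the loudest peaks stay within the laser's valid drive range.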
Until now, hacking such systems has meant sending them audible commands without the owner's knowledge. But the researchers found there are limits to the stealth of a light-command attack: to carry it out, they had to aim a laser at the target device's microphone using a telephoto lens and a tripod, modulating its intensity at a specific frequency. The Google Home, for example, has microphones facing up and slightly back.
What these researchers have demonstrated is that security attacks are getting increasingly advanced, and that companies need to consider not just software attacks but also physical attacks against their devices.
The researchers have already notified Tesla, Ford, Amazon, Apple and Google about the issue - an important step toward getting the problem fixed, since simply covering microphones with tape wouldn't solve it. "The risks associated with these attacks range from benign to frightening depending on how much a user has tied to their assistant." Apple declined to comment. As a first precaution, make sure your device is not in the direct line of sight of any light source, whether at close range or through a window across the street.