Alexa, Google, Facebook, Siri Microphones Can All Be Remotely Attacked by Light Commands

Researchers have demonstrated a proof-of-concept attack that exploits the photoacoustic effect: by aiming laser light at the MEMS (micro-electro-mechanical systems) microphones used in popular voice assistants, they can trick the devices into interpreting modulated light as spoken commands. This technique is known as laser-based audio injection. In the lab, the researchers used an oscilloscope to characterize the microphone's response and worked out how to modulate the light to mimic voice commands. The attack can be carried out with a cheap laser pointer, provided you know how to tune it with an oscilloscope, and it worked at distances of up to 110 meters (about 360 feet). Devices protected by a PIN would likely have to be brute-forced. This is part of a broader research trend of attacking devices through unexpected side channels, such as injecting commands via sound, reading the light emitted by LCD screens, or even recovering readable data from the sounds made by hard drives.
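The core trick is amplitude modulation: the laser's optical power is held at a steady bias level and varied in step with an audio waveform, so the microphone's diaphragm responds as if it were hearing the command. A minimal sketch of that modulation math (using a stand-in 440 Hz tone for the voice command; all names and parameter values here are illustrative, not from the paper):

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz
DURATION = 1.0        # seconds

t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# Stand-in for a recorded voice command (a plain 440 Hz tone here).
audio = np.sin(2 * np.pi * 440 * t)

# Amplitude modulation: P(t) = P_dc * (1 + m * audio(t)),
# with modulation depth m < 1 so the optical power never goes negative.
P_DC = 0.5  # mean laser power, arbitrary units
m = 0.8     # modulation depth
intensity = P_DC * (1 + m * audio / np.max(np.abs(audio)))

# A driver circuit would turn `intensity` into laser diode current;
# light intensity is physically non-negative, which m < 1 guarantees.
assert intensity.min() >= 0
```

Keeping the modulation depth below 1 is the key design constraint: unlike a loudspeaker signal, optical power cannot swing negative, so the audio must ride on a DC bias.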

Abstract—“We propose a new class of signal injection attacks on microphones based on the photoacoustic effect: converting light to sound using a microphone. We show how an attacker can inject arbitrary audio signals to the target microphone by aiming an amplitude-modulated light at the microphone’s aperture. We then proceed to show how this effect leads to a remote voice-command injection attack on voice-controllable systems. Examining various products that use Amazon’s Alexa, Apple’s Siri, Facebook’s Portal, and Google Assistant, we show how to use light to obtain full control over these devices at distances up to 110 meters and from two separate buildings. Next, we show that user authentication on these devices is often lacking or non-existent, allowing the attacker to use light-injected voice commands to unlock the target’s smartlock-protected front doors, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford) that are connected to the target’s Google account. Finally, we conclude with possible software and hardware defenses against our attacks.

Index Terms—Signal Injection Attack, Transduction Attack, Voice-Controllable System, Photoacoustic Effect, Laser, MEMS”

Full research paper:

Read the Ars Technica summary

