When light becomes a weapon: laser-based command injection attacks on voice assistants

via Dev.to, by HelixCipher

The research introduces Light Commands, a novel class of signal-injection attacks that converts amplitude-modulated light into audio signals at a microphone's aperture, enabling attackers to inject arbitrary voice commands into popular voice-controllable systems (Alexa, Siri, Google Assistant, Portal) from tens of meters away and through windows and other structures. This isn't science fiction: the team demonstrated control at distances up to ~110 m using commercially available lasers.

Why it matters: voice assistants and smart-home devices increasingly control sensitive assets (locks, vehicles, payments, home automation). Light-induced audio injection bypasses the traditional acoustic channel and human hearing, letting a remote attacker issue real commands, from unlocking smart locks to triggering purchases, with zero physical access and no audible evidence.

Key technical takeaways:

• Physical signal injection via light: MEMS microphones can unintentionally interpret amplitude-modulated light as sound
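The core mechanism, encoding an audio waveform as variations in light intensity, can be sketched in a few lines. This is an illustrative model only, not the researchers' tooling: the function name, bias point, and modulation depth are hypothetical, and a real attack would drive a laser diode's current with this signal.

```python
import numpy as np

# Illustrative sketch of amplitude modulation onto laser intensity.
# All names and constants here are hypothetical, not from the paper.
SAMPLE_RATE = 44_100       # audio sample rate, Hz
BIAS = 0.5                 # DC operating point of the laser (normalized)
DEPTH = 0.4                # how strongly the audio swings the intensity

def modulate_onto_laser(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a non-negative laser
    intensity signal: I(t) = BIAS + DEPTH * s(t)."""
    audio = np.clip(audio, -1.0, 1.0)
    intensity = BIAS + DEPTH * audio
    # Light intensity cannot go negative; the DC bias keeps the signal
    # in the diode's linear range so the audio shape is preserved.
    return np.clip(intensity, 0.0, 1.0)

# Example: a 440 Hz tone standing in for a spoken command.
t = np.arange(0, 0.01, 1 / SAMPLE_RATE)
tone = np.sin(2 * np.pi * 440 * t)
intensity = modulate_onto_laser(tone)
```

A MEMS microphone illuminated by this beam responds to the intensity envelope as if it were sound pressure, recovering the original waveform on the device side.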

Continue reading on Dev.to
