MEMS Vulnerability Makes It Possible to Hijack Voice Assistants, Including Alexa, Google Home, and Siri
Virtual assistants were created to make our lives easier by performing various tasks. Over the years, they have evolved considerably, and today voice assistants like Alexa and Siri can control multiple smart devices and carry out tasks on verbal command. Many users rely on them to play music, check the weather, or shop online. Naturally, to perform these tasks, a voice assistant needs access to its user’s accounts and other smart home devices. Thus, hijacking one might allow attackers to shop in the victim’s name, turn their devices on and off, and so on. Unfortunately, recent findings revealed that cybercriminals might not even need malicious tools to do so. Apparently, a hacker could take over a smart assistant with the help of a laser pointer, an audio amplifier, and a few other components. To learn more about this discovery, we invite you to read our full blog post.
While some people cannot imagine their lives without smart homes, others avoid having too many smart devices out of concern for their security. Events like the attack on owners of Nest cameras or the discovery of a vulnerability in Amazon smart doorbells only prove that using such devices always carries some risk. However, having a device with a voice assistant that may be connected to all of the smart gadgets in your home might be even more dangerous. A few months ago, cybersecurity specialists proved that hackers could employ voice assistants, via malicious applications, to spy on their users. Now, researchers have discovered an attack on voice assistants that could allow cybercriminals to hijack them and do much more than listen to what you are saying.
How could hackers take over smart assistants?
According to researchers from the University of Electro-Communications and the University of Michigan, who published a paper called Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems, an attacker could abuse the so-called MEMS (micro-electro-mechanical systems) vulnerability to hijack devices running Alexa, Portal, Google Assistant, or Siri. The microphones on these gadgets can exhibit the mentioned weakness, and it can be exploited with equipment such as a laser pointer, an audio amplifier, an audio cable, and a laser current driver. By combining these components, it is possible to build a tool that injects a recorded command into a voice assistant. In other words, attackers can communicate with voice assistants by transforming sound into light: the laser’s intensity is modulated with a recorded command, and the targeted device’s microphone responds to the incoming light as if it were sound.
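To make the idea of "turning sound into light" concrete, here is a minimal, hypothetical sketch of the modulation step: a recorded audio signal is mapped onto a laser intensity level around a fixed operating point. The sample rate, bias, and modulation depth are illustrative assumptions, not values from the paper, and a real attack would drive actual laser hardware rather than produce a list of numbers.

```python
import math

# Illustrative parameters (assumptions, not from the Light Commands paper)
SAMPLE_RATE = 16_000   # audio samples per second
DC_BIAS = 0.5          # laser operating point, normalized to 0..1
MOD_DEPTH = 0.4        # how strongly the audio modulates the beam

def modulate_command(samples):
    """Map audio samples in [-1, 1] onto normalized laser intensities in [0, 1]."""
    beam = []
    for s in samples:
        s = max(-1.0, min(1.0, s))                 # clip the audio signal
        level = DC_BIAS + MOD_DEPTH * s            # amplitude modulation
        beam.append(max(0.0, min(1.0, level)))     # keep intensity physical
    return beam

# A 0.1-second 440 Hz tone stands in for a recorded voice command.
t = [n / SAMPLE_RATE for n in range(int(0.1 * SAMPLE_RATE))]
command = [0.8 * math.sin(2 * math.pi * 440 * x) for x in t]
beam = modulate_command(command)
```

The key point the sketch captures is that the light never has to sound like anything to a human: the MEMS diaphragm reacts to the modulated beam directly, so a silent, invisible-to-bystanders signal is decoded by the device as a spoken command.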
What do hackers need to carry out the laser-based audio injection attacks?
Luckily, to hijack a device and issue the so-called light commands, hackers would need a clear line of sight to the gadget they are targeting and would have to be within roughly 110 meters of it. Researchers also note that in some cases attackers might need to be even closer to exploit the MEMS vulnerability; it all depends on the targeted device. As for what attackers might do once they hijack your voice assistant, that depends on what kind of smart devices you have and which accounts are linked to the assistant. For example, if you linked Alexa to your Amazon account so it could buy things for you, an attacker could misuse this to purchase goods for themselves. What might sound even scarier is that attackers could brute-force the PIN codes of your smart locks or even start your car if it is linked to your speaker.
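The brute-force scenario works because each PIN guess is just another spoken command the attacker can inject. A hypothetical sketch of enumerating the guesses for a 4-digit smart lock PIN, assuming the lock accepts voice-spoken PINs without a guess limit (the command template below is invented for illustration):

```python
def pin_guess_commands(template="unlock the front door with PIN {pin}"):
    """Yield every candidate spoken command for a 4-digit PIN (0000-9999)."""
    for n in range(10_000):
        yield template.format(pin=f"{n:04d}")

guesses = list(pin_guess_commands())
# Only 10,000 candidates exist, which is why a lock that does not
# rate-limit or lock out repeated guesses offers little protection.
```

This is also why rate limiting and lockout policies matter: against a device that throttles failed attempts, the same keyspace becomes impractical to exhaust.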
How to protect your voice assistant from laser-based audio injection attacks?
Researchers say that the lack of authentication mechanisms on microphones is what makes successful laser-based audio injection attacks possible. Of course, the specialists who discovered this have already notified the companies that create voice-controllable systems about the MEMS vulnerability. However, it may take time for them to fully understand the problem and find a way to protect their devices from such attacks. Until then, you may have to stop hackers from hijacking your voice assistant yourself.
As mentioned earlier, the success of such attacks does not rely on tricking users into revealing information that would help gain access to targeted devices, or on dropping malicious applications that could grant it. Thus, the usual safety precautions, such as watching out for suspicious applications or not sharing sensitive information carelessly, might be of no use in this particular case. Instead, it is advisable to keep your device away from windowsills or other spots in your home that are visible through the windows. If attackers cannot point their lasers at your device’s microphone, your voice assistant should be safe.
Overall, the discovery of laser-based audio injection attacks reminds us that while a smart home might make life more comfortable and more fun, it can also be hazardous. However, that does not mean you have to say goodbye to your smart home devices to stay safe. What we recommend instead is that you learn how to enhance your smart home security. Also, stay on top of the latest cybersecurity news so that you know how to protect your devices against emerging threats.