Smart assistants such as Apple's Siri, Amazon's Alexa, and Google Home have proven somewhat useful and extremely convenient in many aspects of our everyday lives. These devices can turn on lights, unlock doors, and even start certain vehicles, all at the sound of our voice. However, these products have recently been shown to have a major vulnerability.

After teaming up with cybersecurity researcher Takeshi Sugawara from the University of Electro-Communications in Japan, Professor Kevin Fu and a team of researchers from the University of Michigan discovered that these devices are surprisingly easy to hack using nothing more than an average laser pointer. Dubbed 'Light Commands', the attack is really quite simple, which makes it a terrifying demonstration of just how at risk users of these devices are. They published their findings here.

A smart assistant's primary function is to respond to commands, either by answering questions posed by the user or by executing tasks the user gives it. Since this is typically done by speaking to the device, it's safe to say that the internal microphone is its most important component, and therefore the key component in hacking these devices.

What exactly is the hack?

The hack works by manipulating the internal microphones, tricking the device into responding and executing tasks even in the absence of a verbal command. Remarkably, this can be accomplished from a distance, and even through a window.

How is it done?

For starters, microphones are a very basic technology: they work by converting sound into electrical signals. Inside every microphone is a small plate called a diaphragm; when you speak into the microphone, the diaphragm vibrates, creating electrical signals. The device's software then translates those electrical signals, recognizes the command, and provides the proper response. According to the research, light can produce the same effect: by rapidly varying the intensity of a laser beam to match the waveform of a spoken command, an attacker can make the microphone generate the same electrical signals it would produce for real speech, silently "speaking" to voice-activated devices.

"It's possible to make microphones respond to light as if it were sound," says Sugawara. "This means that anything that acts on sound commands will act on light commands."

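To make that concrete, below is a minimal Python sketch of the amplitude-modulation idea, assuming an idealized microphone. The 440 Hz tone standing in for a spoken command, the bias and depth values, and everything else here are illustrative choices, not figures from the paper.

    import numpy as np

    SAMPLE_RATE = 44_100   # audio samples per second
    DURATION = 1.0         # seconds of simulated signal
    TONE_HZ = 440.0        # stand-in for the waveform of a spoken command

    t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE

    # The "spoken command" the attacker wants the assistant to hear.
    command = np.sin(2 * np.pi * TONE_HZ * t)

    # Amplitude-modulate the laser: light intensity can't be negative, so the
    # command waveform rides on top of a constant bias.
    bias, depth = 0.5, 0.4
    laser_intensity = bias + depth * command

    # The diaphragm turns the incoming intensity fluctuations into an
    # electrical signal; once the constant bias is filtered out, what's left
    # is proportional to the original command.
    recovered = laser_intensity - laser_intensity.mean()
    recovered /= np.max(np.abs(recovered))

    # The speech recognizer sees, up to scale, the same waveform it would
    # have seen had the command been spoken aloud.
    assert np.allclose(recovered, command / np.max(np.abs(command)), atol=1e-6)
    print("Light-borne signal matches the spoken command's waveform.")

In the real attack, the same principle lets the beam carry arbitrary recorded speech rather than a test tone.
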
However, that may not be the only explanation for how the hack works. According to Paul Horowitz, a professor emeritus of physics and electrical engineering at Harvard and co-author of The Art of Electronics, there is another possible mechanism behind the light-as-speech effect demonstrated by the research team's experiments.

Horowitz says that if the components of the device aren't completely opaque, there's a chance that the beam of light bypasses the microphone entirely and directly contacts the microchip responsible for interpreting its vibrations. This, too, could alter the behavior of the device, causing the chip to register electrical signals in response to the fluctuating light and mistake the laser for an actual voice command.

"There's no dearth of theories, one or more of which is happening here," Horowitz says.

What are the companies doing about it?

It has been reported that Google and Amazon are working with Sugawara's team to enhance security features. One suggestion is a reflective cover over the microphones; another is adding a second microphone on an adjacent side of the device, since it would be difficult to manipulate multiple areas of the device at once. Apple and Facebook have not replied, according to the researchers.
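
To illustrate why a second microphone on another side would help, here is a hypothetical sketch of such a cross-check; the function name, threshold, and RMS-loudness comparison are illustrative assumptions, not a feature any of these vendors has announced. The idea is that a real voice reaches every microphone, while a laser beam usually illuminates only one of them.

    import numpy as np

    def command_is_plausible(mic_signals, level_ratio_threshold=0.25):
        """Accept a voice command only if every microphone picked it up.

        A genuine sound wave reaches all of the device's microphones, while a
        laser aimed through a window tends to hit just one, leaving the other
        channels nearly silent.  mic_signals is a list of 1-D arrays, one per
        microphone.
        """
        # Root-mean-square loudness of each channel.
        levels = [np.sqrt(np.mean(np.square(sig))) for sig in mic_signals]
        loudest = max(levels)
        if loudest == 0:
            return False  # nothing was heard on any channel
        # Reject the command if any channel is far quieter than the loudest.
        return all(level / loudest >= level_ratio_threshold for level in levels)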

How to Personally Safeguard Your Device from Potential Hackers

While these tech companies continue working with the international team to resolve the issue and improve security, it's a good idea to take precautionary measures of your own. First, avoid placing your device close to a window or anywhere with an unobstructed path to the outside. Also, check your security settings and make sure your device is as secure as possible. Lastly, enable a spoken PIN if your device offers the option.