The French government security agency ANSSI has found that both Apple’s Siri and Android’s Google Now can be exploited from as far as 16 feet away. The attack uses a radio transmitter to target headphones with an in-line microphone that are plugged into the mobile device: the headphone cable acts as an antenna, picking up the radio waves and feeding them to the phone as if they were voice input.
Part of this should not be a surprise, as Apple has used headphone cables in the past to enable FM radio reception on the iPod Nano, which shows just how readily they can serve as radio antennas. ANSSI found that this property can be exploited to make an iPhone or Android device believe the connected microphone is issuing voice commands when in reality it is not. A hacker does not have to say a word for this to work, because the radio signal itself tells Google Now or Siri to send those texts and make those calls. If the hacker tells Siri to dial the hacker’s own phone number, the device can quickly be turned into an eavesdropping tool. The exploit could also steer the browser on the Android or iOS device to a malware site, or send spam and phishing messages from the victim’s email or social media accounts.
At close range, about six feet, the attack can be carried out with equipment small enough to fit inside a small purse or backpack. A more powerful version of the hack works from up to 16 feet away, although that equipment would have to be housed in something bigger, such as a car. The good news is that the hack only works on phones with headphones plugged in, and Siri has to be accessible from the lock screen, although that is the default setting on Apple iPhones.
The bad news is that the hack works on both newer iOS devices, such as the iPhone 6S, and older ones. On the newest devices, “Hey Siri” is always listening by default, while on older devices Siri is activated by a button on the headphones, and the hack simply spoofs that button press. For instance, the hackers can spoof the activation button on Apple’s EarPods, or on any other headphones that have a Siri button. We already knew that anyone holding the device can access Siri; this technique means an attacker can do so remotely and stealthily, making it far more likely the user will never notice what is happening.
Newer Android devices now ship with voice recognition for Google Now access, which makes this hack harder to pull off, but Apple has not built comparable protection into Siri, and older Android devices remain the most at risk. Starting with iOS 9, “Hey Siri” is trained on the individual user, meaning the digital assistant can learn to recognize its owner’s voice, which could eventually lead to voice recognition becoming a security feature down the road.
For people concerned about this type of hack, the best defense is to disable Siri access from the lock screen. Open “Settings” on the iOS device, tap “Touch ID & Passcode,” scroll down to “Allow Access When Locked,” and switch Siri off. From the same screen you can also disable lock-screen access to other features, such as notifications, Reply with Message, and Wallet. Of these, Wallet is the other big one to turn off, since doing so eliminates the possibility of a hacker pulling your financial or payment information through this radio-wave attack.
You can also go to Settings, open “Control Center,” and turn off access on the lock screen, which prevents a thief from enabling airplane mode on a stolen device without unlocking it or powering it off. ANSSI has told both Google and Apple that they should add shielding to their headphone cords so it would be more difficult for a hacker to co-opt the headphones as antennas. Using electromagnetic sensors as a security measure could also work, if Google or Apple wants to adopt that feature in future devices.