Security researchers have discovered a way to covertly hijack Siri and other smartphone digital assistants using ultrasonic waves, sounds that cannot normally be heard by humans.
The attack, inaudible to the human ear, can be used to send messages, make fraudulent phone calls, or take pictures without the user's knowledge.
Nicknamed SurfingAttack, the exploit uses high-frequency, inaudible sound waves to activate and interact with a digital assistant in the vicinity. While similar attacks have been demonstrated in the past, SurfingAttack focuses on transmitting these waves through solid materials, such as tables.
The researchers found they could use a $5 piezoelectric transducer, attached to the underside of a table, to send these ultrasonic waves and activate a voice assistant without the user's knowledge.
Using these inaudible ultrasonic waves, the team was able to wake voice assistants and issue commands to make phone calls, take pictures, or read out a text message that contained a two-factor authentication passcode.
To further hide the attack, the researchers first sent an inaudible command to turn down the device's volume before recording its responses with a second device hidden under the table.
SurfingAttack was tested on a total of 17 devices and found to be effective against most of them. Several Apple iPhone, Google Pixel, and Samsung Galaxy devices are vulnerable to the attack, although the research does not specify which iPhone models were tested.
All the digital assistants, including Siri, Google Assistant, and Bixby, are vulnerable.
Only the Huawei Mate 9 and the Samsung Galaxy Note 10+ were safe from the attack, although the researchers attribute this to the different acoustic properties of their body materials. They also noted that the attack was less effective on tables covered by a tablecloth.
The technique rests on exploiting the non-linearity of a device's MEMS microphone, which is used in most voice-controlled devices and includes a small membrane that translates sound waves into electrical signals the device can interpret as commands.
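The researchers' actual signal processing is more involved, but the core demodulation effect can be sketched in a few lines of Python. Here an audible 1 kHz tone (standing in for a voice command) is amplitude-modulated onto a 30 kHz ultrasonic carrier; passing the result through a toy quadratic non-linearity, a crude stand-in for the MEMS microphone's imperfect response, recreates energy at the original 1 kHz. The carrier frequency, modulation depth, and non-linearity coefficient are illustrative assumptions, not figures from the paper.

```python
import numpy as np

fs = 192_000          # sample rate high enough to represent a 30 kHz carrier
fc = 30_000           # ultrasonic carrier, above human hearing (assumed value)
fm = 1_000            # audible baseband tone standing in for a voice command
t = np.arange(0, 0.05, 1 / fs)

baseband = np.sin(2 * np.pi * fm * t)
carrier = np.sin(2 * np.pi * fc * t)
# Amplitude-modulated signal: all its energy sits near 30 kHz, so it is inaudible.
transmitted = (1 + 0.5 * baseband) * carrier

# Model the microphone's imperfect response as y = x + a*x^2;
# the quadratic term is what demodulates the signal back to baseband.
a = 0.1
received = transmitted + a * transmitted**2

spectrum = np.abs(np.fft.rfft(received))
freqs = np.fft.rfftfreq(len(received), 1 / fs)

# The non-linear term recreates a spectral peak at the 1 kHz command tone,
# even though the transmitted signal contained no audible-band energy.
bin_1khz = np.argmin(np.abs(freqs - fm))
print(f"energy at {freqs[bin_1khz]:.0f} Hz after non-linearity: {spectrum[bin_1khz]:.1f}")
```

A perfectly linear microphone would see nothing at 1 kHz; the demodulated command only appears because of the quadratic term, which is why the attack targets MEMS microphones specifically.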
Although effective against smartphones, the team found that SurfingAttack does not work on smart speakers such as the Amazon Echo or Google Home. The main risk appears to come from attack hardware planted in advance on the underside of coffee tables, desks, and other similar surfaces.
The research was published by an international team from Washington University in St. Louis, Michigan State University, the Chinese Academy of Sciences, and the University of Nebraska-Lincoln. It was first presented at the Network and Distributed System Security Symposium (NDSS) on Feb. 24 in San Diego.
SurfingAttack is far from the first time inaudible sound waves have been used to exploit vulnerabilities. The research builds on several previous projects, including the similarly named DolphinAttack.