Robot vacuum cleaners can spy on private conversations | LidarPhone Attack | Acoustic Eavesdropping

Channel: unknown

When your robot vacuum cleaner does its work around the house, beware that it could pick up private conversations along with the dust and dirt.
Computer scientists from the National University of Singapore (NUS) have demonstrated that it is indeed possible to spy on private conversations using a common robot vacuum cleaner and its built-in Light Detection and Ranging (Lidar) sensor.
The novel method, called LidarPhone, repurposes the Lidar sensor that a robot vacuum cleaner normally uses for navigating around a home into a laser-based microphone to eavesdrop on private conversations.
The proliferation of smart devices – including smart speakers and smart security cameras – has increased the avenues for hackers to snoop on our private moments. The NUS researchers' method shows it is now possible to gather sensitive data just by using something as harmless as a household robot vacuum cleaner. Their work demonstrates the urgent need to find practical solutions to prevent such malicious attacks.
The core of the LidarPhone attack is the Lidar sensor, a device which fires out an invisible scanning laser and creates a map of its surroundings. By reflecting lasers off common objects such as a dustbin or a takeaway bag located near a person’s computer speaker or television soundbar, the attacker could obtain information about the original sound that made the objects’ surfaces vibrate.
Using applied signal processing and deep learning algorithms, speech could be recovered from the audio data, and sensitive information could potentially be obtained.
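The transcript does not describe the researchers' exact pipeline, so the following is only a minimal sketch of the general idea, assuming the attacker can read a stream of reflected-intensity samples while the Lidar points at one nearby object: the tiny fluctuations caused by the object's vibration are isolated with a band-pass filter and rescaled into an audio-like waveform. The sampling rate, filter band, and function names are illustrative assumptions, not the NUS team's actual method.

```python
# Hypothetical sketch: turning a series of lidar return-intensity readings
# (aimed at one reflective object) into an audio-band vibration signal.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 6000  # assumed effective sampling rate of the intensity readings (Hz)

def recover_vibration(intensity: np.ndarray, low_hz: float = 100.0,
                      high_hz: float = 2000.0) -> np.ndarray:
    """Band-pass the reflected-intensity trace to isolate audio-band vibrations."""
    # Remove the slowly varying reflectivity baseline (DC level and drift).
    centered = intensity - np.mean(intensity)
    # Keep only the band where speech/music energy is expected.
    b, a = butter(4, [low_hz / (FS / 2), high_hz / (FS / 2)], btype="band")
    audio = filtfilt(b, a, centered)
    # Normalise so the result can be saved or played back as audio.
    return audio / (np.max(np.abs(audio)) + 1e-12)

# Toy usage: a 440 Hz vibration buried in sensor noise.
t = np.arange(0, 1.0, 1 / FS)
toy_intensity = 0.5 + 0.001 * np.sin(2 * np.pi * 440 * t) + 0.0005 * np.random.randn(t.size)
recovered = recover_vibration(toy_intensity)
```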
In their experiments, the researchers used a common robot vacuum cleaner with two sources of sound. One was the voice of a person reading out numbers, played from a computer speaker, while the other was music clips from television shows played through a television soundbar.
The team collected more than 19 hours of recorded audio files and passed them through deep learning algorithms that were trained to either match human voices or identify musical sequences.
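The narration only says that deep learning models were trained to match human voices or identify musical sequences; the architecture is not given. The sketch below is therefore a generic stand-in, not the researchers' model: a small convolutional network that takes log-spectrogram patches of the recovered audio and predicts one of ten digit classes. The input shape, layer sizes, and random stand-in training data are all placeholder assumptions.

```python
# Illustrative sketch only: a small CNN classifying spectrograms of recovered
# audio into 10 digit classes. Real inputs would be spectrograms of the
# lidar-recovered speech, labelled with the digit that was spoken.
import torch
import torch.nn as nn

class DigitClassifier(nn.Module):
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):  # x: (batch, 1, 64, 64) log-spectrogram patches
        return self.classifier(self.features(x))

# Toy training step on random stand-in data.
model = DigitClassifier()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
spectrograms = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 10, (8,))
loss = loss_fn(model(spectrograms), labels)
loss.backward()
optimiser.step()
```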
The system was able to detect the digits being spoken aloud, which could constitute a victim’s credit card or bank account numbers. Music clips from television shows could potentially disclose the victim’s viewing preferences or political orientation.
The system achieved a classification accuracy of 91 per cent when recovering spoken digits, and 90 per cent when classifying music clips. Both results are significantly higher than the 10 per cent accuracy that random guessing would achieve.
The researchers also experimented with common household materials to test how well they reflected the Lidar laser beam, and found that the accuracy of audio recovery varied between different materials. They discovered the best material for reflecting the laser beam was a glossy polypropylene bag, while the worst was glossy cardboard.
To prevent Lidars from being misused, the researchers recommend that users consider not connecting their robot vacuum cleaners to the Internet.
The team also recommends that Lidar sensor manufacturers incorporate a mechanism that cannot be overridden and that prevents the internal laser from firing when the Lidar is not rotating.
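As a rough illustration of that recommendation, the hypothetical interlock below only permits laser pulses while the turret is spinning at its normal scanning speed. In a real product this protection would have to sit in hardware or firmware that software cannot override; the names and the threshold here are invented for the sketch.

```python
# Hypothetical rotation interlock: a stationary (or barely rotating) lidar
# pointed at one spot is what makes the eavesdropping attack possible,
# so refuse to pulse the laser unless the turret is actually scanning.
MIN_OPERATING_RPM = 250  # assumed minimum rotation speed for normal scanning

def laser_may_fire(measured_rpm: float) -> bool:
    """Permit laser pulses only while the turret rotates within its normal range."""
    return measured_rpm >= MIN_OPERATING_RPM

def fire_laser_pulse(measured_rpm: float) -> None:
    if not laser_may_fire(measured_rpm):
        raise RuntimeError("Laser interlock: lidar is not rotating at operating speed")
    # ... drive the laser emitter here ...

# Example: a stationary turret is rejected, a spinning one is allowed.
assert not laser_may_fire(0.0)
assert laser_may_fire(300.0)
```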
The team is working on applying ideas learnt from LidarPhone to autonomous vehicles – which also use Lidar sensors – as these sensors could likewise be used to eavesdrop on conversations happening in nearby cars through minute vibrations of the car windows.
They are also looking at the vulnerability of active laser sensors found on the latest smartphones, which could reveal further privacy issues.