LiDARs play a critical role in Autonomous Vehicles' (AVs) perception and their safe operation. Recent work has demonstrated that LiDAR return signals can be spoofed to elicit fake objects. In this work, we show how the same physical capabilities can be used to mount a new, even more dangerous class of attacks: Object Removal Attacks (ORAs). ORAs aim to force 3D object detectors to fail. We leverage the default single-return-per-direction setting of LiDARs to perturb point clouds in the region of interest (RoI) of 3D objects. By injecting illegitimate points behind the target object, we effectively shift points away from the target object's RoI. Our initial results, using a simple random point selection strategy, show that the attack effectively degrades the performance of commonly used 3D object detection models.
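The core mechanism can be sketched as follows. This is a minimal illustrative simulation, not the authors' implementation: the function name, parameters, and the assumption of a sensor-centered coordinate frame are all hypothetical. It models the single-return behavior by pushing a randomly selected subset of RoI points further along their rays, so the recorded return lands behind the target object.

```python
import numpy as np

def object_removal_attack(points, roi_mask, shift_distance=5.0,
                          attack_ratio=0.5, seed=0):
    """Simulate an ORA on a LiDAR point cloud (illustrative sketch).

    points:    (N, 3) array of returns in a sensor-centered frame.
    roi_mask:  boolean mask selecting points inside the target's RoI.

    In single-return mode the LiDAR keeps only one return per direction,
    so a spoofed later return on a ray replaces the legitimate point.
    """
    rng = np.random.default_rng(seed)
    attacked = points.copy()
    roi_idx = np.flatnonzero(roi_mask)
    # Random point selection strategy: attack a subset of RoI points.
    n_attack = int(len(roi_idx) * attack_ratio)
    chosen = rng.choice(roi_idx, size=n_attack, replace=False)
    # Push each chosen point further along its ray from the sensor origin,
    # i.e. the single recorded return now lies behind the target object.
    dirs = attacked[chosen] / np.linalg.norm(attacked[chosen],
                                             axis=1, keepdims=True)
    attacked[chosen] += shift_distance * dirs
    return attacked
```

After the perturbation, the RoI is depleted of returns, which is what causes the detector to miss the object.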
