Zhongyuan Hau, Kenneth Co, Soteris Demetriou, and Emil Lupu (Imperial College London)

Best Short Paper Award Runner-up!

LiDARs play a critical role in Autonomous Vehicles' (AVs) perception and in their safe operation. Recent works have demonstrated that it is possible to spoof LiDAR return signals to elicit fake objects. In this work, we demonstrate how the same physical capabilities can be used to mount a new, even more dangerous class of attacks, namely Object Removal Attacks (ORAs). ORAs aim to force 3D object detectors to fail. We leverage the default setting of LiDARs, which record a single return signal per direction, to perturb point clouds in the region of interest (RoI) of 3D objects. By injecting illegitimate points behind the target object, we effectively shift points away from the target object's RoI. Our initial results using a simple random point selection strategy show that the attack is effective in degrading the performance of commonly used 3D object detection models.
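
The following is a minimal sketch of the attack idea described above, not the authors' implementation. It assumes the target object's RoI is approximated by an axis-aligned box and models the single-return-per-direction behavior by relocating a randomly selected subset of RoI points farther along their rays, as a spoofed later return would replace the legitimate one. The function name and all parameters (e.g., num_attack_points, shift_distance) are illustrative assumptions.

```python
import numpy as np

def object_removal_attack(points, roi_min, roi_max,
                          num_attack_points=60, shift_distance=8.0, rng=None):
    """Illustrative Object Removal Attack (ORA) sketch on a LiDAR point cloud.

    points: (N, 3) array of x, y, z returns in the sensor frame.
    roi_min, roi_max: corners of an axis-aligned box approximating the
        target object's region of interest (RoI) -- an assumption here.
    num_attack_points: number of RoI points the attacker can relocate,
        bounded by the spoofer's capability (hypothetical parameter).
    shift_distance: how far behind the object the fake returns are placed.
    """
    rng = np.random.default_rng() if rng is None else rng
    attacked = points.copy()

    # Points falling inside the target object's RoI.
    in_roi = np.all((points >= roi_min) & (points <= roi_max), axis=1)
    roi_idx = np.flatnonzero(in_roi)
    if roi_idx.size == 0:
        return attacked

    # Simple random point selection strategy: pick a subset of RoI points.
    chosen = rng.choice(roi_idx,
                        size=min(num_attack_points, roi_idx.size),
                        replace=False)

    # Because the LiDAR keeps a single return per direction, an injected
    # farther return along the same ray replaces the legitimate one,
    # shifting the point out of the object's RoI.
    rays = attacked[chosen]
    ranges = np.linalg.norm(rays, axis=1, keepdims=True)
    directions = rays / np.maximum(ranges, 1e-6)
    attacked[chosen] = directions * (ranges + shift_distance)

    return attacked
```

Under these assumptions, a 3D object detector run on the attacked point cloud sees fewer returns inside the object's RoI, which is the degradation effect the abstract reports.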
