Zhongyuan Hau, Kenneth Co, Soteris Demetriou, and Emil Lupu (Imperial College London)

Best Short Paper Award Runner-up!

LiDARs play a critical role in Autonomous Vehicles' (AVs) perception and their safe operation. Recent works have demonstrated that it is possible to spoof LiDAR return signals to elicit fake objects. In this work, we demonstrate how the same physical capabilities can be used to mount a new, even more dangerous class of attacks, namely Object Removal Attacks (ORAs). ORAs aim to force 3D object detectors to fail. We leverage the default setting of LiDARs that record a single return signal per direction to perturb point clouds in the region of interest (RoI) of 3D objects. By injecting illegitimate points behind the target object, we effectively shift legitimate points away from the target object's RoI. Our initial results, using a simple random point selection strategy, show that the attack is effective in degrading the performance of commonly used 3D object detection models.
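The sketch below is a minimal, hypothetical illustration (not the authors' implementation) of the point-shifting idea described in the abstract: because a single-return LiDAR keeps only one return per direction, a spoofed return placed farther along the same ray replaces the legitimate point on the object. The function name simulate_ora, the axis-aligned RoI box, and the attack_ratio and shift_dist parameters are assumptions made for this example, using NumPy arrays to stand in for a point cloud.

```python
import numpy as np

def simulate_ora(points, box_min, box_max, attack_ratio=0.8, shift_dist=10.0, seed=0):
    """Hypothetical simulation of an Object Removal Attack (ORA) on a point cloud.

    points      : (N, 3) array of x, y, z coordinates, with the sensor at the origin.
    box_min/max : axis-aligned bounds of the target object's region of interest (RoI).
    attack_ratio: fraction of RoI points to relocate (simple random selection strategy).
    shift_dist  : distance (metres) the spoofed return lies behind the legitimate one.
    """
    rng = np.random.default_rng(seed)

    # Points falling inside the target object's RoI.
    in_roi = np.all((points >= box_min) & (points <= box_max), axis=1)
    roi_idx = np.flatnonzero(in_roi)

    # Randomly pick which RoI points the attacker perturbs.
    n_attack = int(attack_ratio * roi_idx.size)
    chosen = rng.choice(roi_idx, size=n_attack, replace=False)

    # A single-return LiDAR records one return per direction, so pushing the
    # chosen points farther along their viewing rays models the legitimate
    # returns being replaced by spoofed returns behind the object.
    attacked = points.copy()
    rays = attacked[chosen]
    dists = np.linalg.norm(rays, axis=1, keepdims=True)
    attacked[chosen] = rays * (1.0 + shift_dist / np.maximum(dists, 1e-6))
    return attacked


# Example usage with synthetic data: a cluster of "object" points 10 m ahead
# of the sensor plus random background points.
car = np.array([10.0, 0.0, 0.0]) + np.random.uniform(-1.0, 1.0, size=(200, 3))
background = np.random.uniform(-30.0, 30.0, size=(2000, 3))
cloud = np.vstack([car, background])
spoofed = simulate_ora(cloud,
                       box_min=np.array([9.0, -1.0, -1.0]),
                       box_max=np.array([11.0, 1.0, 1.0]))
```

In this toy setting, the perturbed cloud would contain fewer legitimate returns inside the target's RoI, which is the condition the attack relies on to degrade 3D object detection.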
