Bo Yang (Zhejiang University), Yushi Cheng (Tsinghua University), Zizhi Jin (Zhejiang University), Xiaoyu Ji (Zhejiang University) and Wenyuan Xu (Zhejiang University)

With the boom of autonomous driving, in which LiDAR plays a critical role in environment perception, the reliability of LiDAR has drawn much attention recently. LiDAR-based perception typically relies on deep neural models for 3D point cloud processing, which have been shown to be vulnerable to imperceptible adversarial examples. However, prior work usually manipulates point clouds in the digital domain without accounting for the physical working principle of actual LiDARs. As a result, the generated adversarial point clouds may be realizable and effective in simulation but cannot be perceived by physical LiDARs. In this work, we analyze the physical principle of LiDARs and propose a new method for generating 3D adversarial point clouds that conforms to it and can achieve two types of spoofing attacks: object hiding and object creation. We also evaluate the effectiveness of the proposed method against two 3D object detectors on the KITTI vision benchmark.
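To give a sense of the physical constraint the abstract alludes to: a LiDAR return can only be displaced along the laser ray between the sensor and the surface, so a physically realizable attack may only shift each point's range, not move it arbitrarily in 3D. The sketch below illustrates this idea with a toy, fully hypothetical "detector score" (a Gaussian point-density surrogate, not any real 3D detector) and a gradient attack restricted to per-point range shifts. All names, the object location, and the score function are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

CENTER = np.array([0.0, 10.0, 0.0])  # hypothetical object location (illustrative)

def detector_score(points):
    """Toy stand-in for a 3D detector: point density near the object.
    Higher score means the object is more likely to be detected."""
    d2 = ((points - CENTER) ** 2).sum(axis=1)
    return np.exp(-d2).sum()

def hide_object(points, steps=100, lr=0.1, max_shift=2.0):
    """Gradient attack constrained to range shifts along each laser ray
    (the sensor sits at the origin), i.e. the only displacement a
    physical LiDAR spoofer can plausibly induce on a return."""
    pts = points.copy()
    rays = pts / np.linalg.norm(pts, axis=1, keepdims=True)  # unit ray per point
    shift = np.zeros(len(pts))                               # per-point range offset
    for _ in range(steps):
        diff = pts + shift[:, None] * rays - CENTER
        w = np.exp(-(diff ** 2).sum(axis=1))
        grad_pts = -2.0 * diff * w[:, None]           # d(score)/d(point)
        grad_shift = (grad_pts * rays).sum(axis=1)    # project gradient onto the ray
        shift = np.clip(shift - lr * grad_shift, -max_shift, max_shift)
    return pts + shift[:, None] * rays

# Usage: points sampled around the object; the attack lowers the toy score.
points = CENTER + 0.1 * np.random.default_rng(0).standard_normal((64, 3))
before = detector_score(points)
after = detector_score(hide_object(points))
```

An "object creating" attack would be the mirror image: ascend the same score with spoofed returns, again constrained to lie along feasible sensor rays.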
