Christopher DiPalma, Ningfei Wang, Takami Sato, and Qi Alfred Chen (UC Irvine)

Robust perception is crucial for autonomous vehicle security. In this work, we design a practical adversarial patch attack against camera-based obstacle detection. We identify the back of a box truck as an effective attack vector, and we improve attack robustness by considering a variety of input frames associated with the attack scenario. This demo includes videos showing that our attack can cause end-to-end consequences on a representative autonomous driving system in a simulator.
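As a rough illustration of the multi-frame robustness idea described above, the sketch below optimizes a single patch so that a detector's obstacle confidence drops on every frame of the attack scenario, rather than on a single camera view. It is a minimal, hypothetical example, not the authors' implementation: the differentiable detector, the frame tensors, and the patch_box placement region are assumed stand-ins.

# Hypothetical sketch of multi-frame adversarial patch optimization.
# Assumes: `detector(images)` is a differentiable model returning obstacle
# confidence scores, and `frames` is a list of (3, H, W) tensors in [0, 1]
# taken at different distances/viewpoints of the box truck's rear.
import torch

def optimize_patch(detector, frames, patch_box, steps=200, lr=0.01):
    """Optimize one patch whose loss is averaged over all scenario frames."""
    x0, y0, x1, y1 = patch_box                       # pixel region the patch covers
    patch = torch.rand(3, y1 - y0, x1 - x0, requires_grad=True)
    opt = torch.optim.Adam([patch], lr=lr)

    for _ in range(steps):
        losses = []
        for frame in frames:                          # variety of frames -> robustness
            img = frame.clone()
            img[:, y0:y1, x0:x1] = patch.clamp(0, 1)  # paste patch onto the truck's rear
            conf = detector(img.unsqueeze(0))         # obstacle confidence score(s)
            losses.append(conf.max())                 # suppress the strongest detection
        loss = torch.stack(losses).mean()             # average loss across all frames
        opt.zero_grad()
        loss.backward()
        opt.step()
    return patch.detach().clamp(0, 1)

Averaging the loss over frames captured at different distances and viewpoints is one way the "variety of input frames" idea can be realized: the optimized patch then has to fool the detector throughout the approach to the truck, not just for one camera pose.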

View More Papers

WeepingCAN: A Stealthy CAN Bus-off Attack

Gedare Bloom (University of Colorado Colorado Springs). Best Paper Award Winner ($300 cash prize)!


CANCloak: Deceiving Two ECUs with One Frame

Li Yue, Zheming Li, Tingting Yin, and Chao Zhang (Tsinghua University)


VISAS-Detecting GPS spoofing attacks against drones by analyzing camera's...

Barak Davidovich (Ben-Gurion University of the Negev), Ben Nassi (Ben-Gurion University of the Negev), and Yuval Elovici (Ben-Gurion University of the Negev)


When DNS Goes Dark: Understanding Privacy and Shaping Policy...

Vijay K. Gurbani and Cynthia Hood (Illinois Institute of Technology), Anita Nikolich (University of Illinois), Henning Schulzrinne (Columbia University), and Radu State (University of Luxembourg)
