Christopher DiPalma, Ningfei Wang, Takami Sato, and Qi Alfred Chen (UC Irvine)

Robust perception is crucial for autonomous vehicle security. In this work, we design a practical adversarial patch attack against camera-based obstacle detection. We identify that the back of a box truck is an effective attack vector. We also improve attack robustness by optimizing over a variety of input frames associated with the attack scenario. This demo includes videos showing that our attack can cause end-to-end consequences on a representative autonomous driving system in a simulator.
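The abstract does not give the optimization details, but improving robustness "by considering a variety of input frames" is in the spirit of expectation-over-transformation-style patch attacks. The sketch below is a minimal, hypothetical PyTorch illustration of that idea, not the authors' pipeline: `detector`, the patch size, and the placement coordinates are all assumptions, and a real attack would also model perspective warping and printability.

```python
import torch

# Hypothetical sketch (assumed names, not the authors' code): optimize an
# adversarial patch so that a differentiable camera-based obstacle detector's
# confidence drops across many frames sampled from the attack scenario.

def apply_patch(frame, patch, top, left):
    """Paste the patch onto a (3, H, W) frame at the given (top, left) offset."""
    patched = frame.clone()
    _, ph, pw = patch.shape
    patched[:, top:top + ph, left:left + pw] = patch
    return patched

def optimize_patch(detector, frames, placements, steps=200, lr=0.01):
    """
    detector:   assumed model returning obstacle-confidence scores for an image batch.
    frames:     list of (3, H, W) tensors from the attack scenario
                (e.g. different distances, angles, lighting).
    placements: list of (top, left) locations where the patch appears in each
                frame, e.g. the projected back of the box truck.
    """
    patch = torch.rand(3, 64, 64, requires_grad=True)  # random initial patch
    opt = torch.optim.Adam([patch], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.zeros(())
        for frame, (top, left) in zip(frames, placements):
            x = apply_patch(frame, patch.clamp(0, 1), top, left)
            conf = detector(x.unsqueeze(0))   # obstacle-confidence score(s)
            loss = loss + conf.max()          # push detection confidence down
        loss.backward()
        opt.step()
        with torch.no_grad():
            patch.clamp_(0, 1)                # keep the patch a valid image
    return patch.detach()
```

Averaging the loss over many frames of the same scene is what makes the resulting patch effective across the approach trajectory rather than at a single viewpoint.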
