Christopher DiPalma, Ningfei Wang, Takami Sato, and Qi Alfred Chen (UC Irvine)

Robust perception is crucial for autonomous vehicle security. In this work, we design a practical adversarial patch attack against camera-based obstacle detection. We identify that the back of a box truck is an effective attack vector. We also improve attack robustness by considering a variety of input frames associated with the attack scenario. This demo includes videos that show our attack can cause end-to-end consequences on a representative autonomous driving system in a simulator.
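Below is a minimal sketch (not the authors' implementation) of the multi-frame patch-optimization idea described above: the patch texture is optimized jointly over several frames of the attack scenario so it remains effective across viewpoints and distances. The names `detector`, `frames`, `placements`, and `apply_patch` are hypothetical stand-ins for a camera-based obstacle detector, dashcam frames showing the box truck, per-frame patch locations, and a patch-rendering step.

```python
# Hedged sketch of robust adversarial patch optimization over multiple frames.
# All model/data names here are hypothetical placeholders, not the paper's code.
import torch

def apply_patch(frame, patch, top, left):
    """Paste the patch onto the frame at (top, left) -- a stand-in for
    perspective-aware rendering of the patch on the truck's rear door."""
    patched = frame.clone()
    h, w = patch.shape[-2:]
    patched[..., top:top + h, left:left + w] = patch
    return patched

def optimize_patch(detector, frames, placements, steps=500, lr=0.01):
    """Minimize the detector's obstacle confidence jointly over all
    attack-scenario frames (an expectation-over-frames objective)."""
    patch = torch.rand(3, 64, 64, requires_grad=True)  # patch texture in [0, 1]
    opt = torch.optim.Adam([patch], lr=lr)
    for _ in range(steps):
        loss = 0.0
        for frame, (top, left) in zip(frames, placements):
            x = apply_patch(frame, patch.clamp(0, 1), top, left)
            obstacle_score = detector(x.unsqueeze(0))   # hypothetical: P(obstacle)
            loss = loss + obstacle_score.mean()         # push detection confidence down
        opt.zero_grad()
        loss.backward()
        opt.step()
    return patch.detach().clamp(0, 1)
```

Averaging the loss over many frames of the scenario, rather than optimizing against a single image, is what gives the patch its robustness to the changing viewpoint as the victim vehicle approaches the truck.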
