Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD's decision system failing to take the latest perception signals into account once it enters a specific mode. We call such problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
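To make the failed-state-update pattern concrete, the following is a minimal, hypothetical Python sketch of the bug class the abstract describes; the `Planner` class, the `CREEP` mode name, and the `Perception` type are our own illustrative inventions, not Tesla's code or the authors' artifact.

```python
# Hypothetical sketch of a "failed state update": once the planner enters a
# special mode, it stops refreshing its cached perception state, so a decision
# is made on stale data (e.g., a light that has since turned red).

from dataclasses import dataclass


@dataclass
class Perception:
    light: str  # "red", "green", or "unknown"


class Planner:
    def __init__(self) -> None:
        self.mode = "NORMAL"
        self.cached = Perception(light="unknown")

    def update(self, perception: Perception) -> str:
        if self.mode == "NORMAL":
            self.cached = perception       # state refreshed as expected
            if perception.light == "green":
                self.mode = "CREEP"        # enters the special mode
        # BUG: in CREEP mode, self.cached is never refreshed, so the
        # decision below acts on a stale "green" after the light turns red.
        return "go" if self.cached.light == "green" else "stop"


planner = Planner()
print(planner.update(Perception(light="green")))  # "go"  - enters CREEP mode
print(planner.update(Perception(light="red")))    # "go"  - stale state wins
```

In this sketch the fix is simply to refresh `self.cached` on every call regardless of mode; the point of the pattern is that a mode transition silently severs the link between fresh perception signals and the decision logic.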
