Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior is caused by Tesla FSD’s decision system failing to incorporate the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
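To make the failure pattern concrete, the minimal Python sketch below illustrates a decision loop that latches the perceived traffic-light state when it switches modes and never re-reads perception afterwards. The sketch is entirely hypothetical: the Planner, Mode, and Light names are ours for illustration and do not reflect Tesla's actual implementation; it only shows the kind of failed state update the abstract describes.

from dataclasses import dataclass
from enum import Enum, auto


class Light(Enum):
    GREEN = auto()
    RED = auto()


class Mode(Enum):
    WAITING = auto()
    PROCEEDING = auto()


@dataclass
class Planner:
    """Toy decision module illustrating a failed state update.

    Once the planner enters PROCEEDING, it keeps acting on the traffic-light
    state it latched at the mode transition instead of the live perception
    feed -- the stale-state pattern the abstract calls Pringles Syndrome.
    This is an illustrative sketch, not Tesla FSD code.
    """
    mode: Mode = Mode.WAITING
    latched_light: Light = Light.RED

    def step(self, perceived_light: Light) -> str:
        if self.mode is Mode.WAITING:
            if perceived_light is Light.GREEN:
                # Bug: latch the perception result once, then switch modes.
                self.latched_light = perceived_light
                self.mode = Mode.PROCEEDING
            return "hold"
        # Buggy branch: decides on the latched value and ignores the
        # fresh perceived_light argument entirely.
        return "go" if self.latched_light is Light.GREEN else "hold"


if __name__ == "__main__":
    planner = Planner()
    print(planner.step(Light.GREEN))  # "hold": latches GREEN, enters PROCEEDING
    print(planner.step(Light.RED))    # "go": the red light is ignored

In this toy run, the second step still returns "go" even though perception now reports a red light, because the decision branch never consumes the updated signal; a correct planner would re-read perception on every cycle regardless of mode.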
