Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that the behavior occurs because Tesla FSD's decision system fails to take in the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
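To make the failure pattern concrete, the following is a minimal, hypothetical sketch (not Tesla's actual code) of a decision loop that latches a perception snapshot when entering a mode and never refreshes it. All names here, including `enter_creep_mode` and the `CREEP` mode, are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class Perception:
    light_is_red: bool


class DecisionSystem:
    def __init__(self) -> None:
        self.mode = "NORMAL"
        self.latched: Perception | None = None  # snapshot frozen on mode entry

    def enter_creep_mode(self, perception: Perception) -> None:
        # Hypothetical mode transition: the snapshot taken here becomes stale.
        self.mode = "CREEP"
        self.latched = perception

    def should_proceed(self, current: Perception) -> bool:
        if self.mode == "CREEP":
            # Failed state update: decides on the latched snapshot,
            # ignoring the fresh `current` perception signal.
            return not self.latched.light_is_red
        return not current.light_is_red


# The light is green when the mode is entered...
ds = DecisionSystem()
ds.enter_creep_mode(Perception(light_is_red=False))
# ...then turns red, but the stale snapshot still says "proceed".
assert ds.should_proceed(Perception(light_is_red=True)) is True
```

In this sketch an attacker never touches the sensors: it is enough to control the world state at the moment the mode is entered, because the decision system stops consuming perception updates afterward.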
