Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensor, and without physical access to the vehicle. We infer that this behavior occurs because Tesla FSD’s decision system fails to take in the latest perception signals once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that this failed state update is a common failure pattern that deserves special attention in autonomous driving software testing and development.
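To make the failed-state-update pattern concrete, below is a minimal, entirely hypothetical Python sketch; it is not Tesla's actual code, and all class, mode, and field names (DecisionModule, CREEP, light_state, etc.) are illustrative assumptions. It shows how a decision module that stops consuming fresh perception after a mode change can keep acting on a stale traffic-light state.

```python
# Hypothetical sketch of the "Pringles Syndrome" failure pattern:
# after entering a special mode, the decision module silently stops
# refreshing its cached perception. All names are assumptions for
# illustration, not taken from any real FSD codebase.

from dataclasses import dataclass


@dataclass
class Perception:
    light_state: str  # "RED", "YELLOW", or "GREEN"


class DecisionModule:
    def __init__(self) -> None:
        self.mode = "NORMAL"
        self.cached = Perception(light_state="GREEN")

    def decide(self, fresh: Perception) -> str:
        if self.mode == "NORMAL":
            # Normal path: the latest perception drives the decision.
            self.cached = fresh
        # BUG: in any other mode (e.g. a hypothetical "CREEP" mode),
        # the stale cached perception is used and `fresh` is ignored.
        return "STOP" if self.cached.light_state == "RED" else "PROCEED"


planner = DecisionModule()
planner.decide(Perception("GREEN"))        # approaches on a green light
planner.mode = "CREEP"                     # attacker-induced mode change
# The light turns red, but the cached state still says green:
print(planner.decide(Perception("RED")))   # prints "PROCEED" — runs the red
```

Under this (assumed) structure, the fix is to refresh the cached perception in every mode, or to re-check safety-critical signals such as light_state immediately before committing to a maneuver.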
