Zhisheng Hu (Baidu), Shengjian Guo (Baidu) and Kang Li (Baidu)

In this demo, we disclose a potential bug in the Tesla Full Self-Driving (FSD) software. A vulnerable FSD vehicle can be deterministically tricked into running a red light. Attackers can cause a victim vehicle to behave this way without tampering with or interfering with any sensors, and without physically accessing the vehicle. We infer that this behavior arises because Tesla FSD's decision system fails to take the latest perception signals into account once it enters a specific mode. We call this problematic behavior Pringles Syndrome. Our study of multiple other autonomous driving implementations shows that such a failed state update is a common failure pattern that needs special attention in autonomous driving software testing and development.
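To make the failure pattern concrete, the following is a minimal, hypothetical Python sketch of a planner that caches its perception snapshot when entering a maneuver mode and then keeps deciding on that stale snapshot. It is not Tesla FSD code; the class names, mode names, and fields are illustrative assumptions used only to show how a "failed state update" can lead a vehicle to ignore a light that turns red after mode entry.

```python
from dataclasses import dataclass


@dataclass
class Perception:
    traffic_light: str  # "green", "yellow", or "red"


class Planner:
    """Hypothetical planner illustrating the stale-state ('Pringles Syndrome') pattern."""

    def __init__(self):
        self.mode = "CRUISE"
        self._snapshot = None  # perception captured once at mode entry

    def enter_intersection_mode(self, perception: Perception):
        self.mode = "INTERSECTION"
        self._snapshot = perception  # state is captured here and never refreshed

    def decide(self, latest: Perception) -> str:
        if self.mode == "INTERSECTION":
            # BUG: decision reads the cached snapshot instead of `latest`,
            # so a red light observed after mode entry is never acted on.
            signal = self._snapshot.traffic_light
        else:
            signal = latest.traffic_light
        return "STOP" if signal == "red" else "PROCEED"


planner = Planner()
planner.enter_intersection_mode(Perception(traffic_light="green"))
# The light turns red after the mode was entered:
print(planner.decide(Perception(traffic_light="red")))  # prints "PROCEED" (the bug)
```

A correct implementation would re-read the latest perception input on every decision cycle regardless of mode; the point of the sketch is that the bug requires no sensor tampering, only a decision path that stops consuming fresh state.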
