Rui Xiao (Zhejiang University), Sibo Feng (Zhejiang University), Soundarya Ramesh (National University of Singapore), Jun Han (KAIST), Jinsong Han (Zhejiang University)

As deep neural networks (DNNs) are increasingly adopted in safety-critical applications such as autonomous driving and face recognition, they have also become targets for adversarial attacks. However, confidential information about DNNs, including the model architecture, is typically hidden from attackers. As a result, adversarial attacks are often launched in black-box settings, which limits their effectiveness. In this paper, we propose ModelSpy, a stealthy DNN architecture snooping attack based on GPU electromagnetic (EM) leakage. ModelSpy is capable of extracting the complete architecture from several meters away, even through walls. ModelSpy is based on the key observation that GPUs emanate far-field EM signals that exhibit architecture-specific amplitude modulation during DNN inference. We develop a hierarchical reconstruction model to recover fine-grained architectural details from the noisy EM signals. To enhance scalability across diverse and evolving architectures, we design a transfer-learning scheme that exploits the correlation between external EM leakage and internal GPU activity. We design and implement a proof-of-concept system to demonstrate ModelSpy's feasibility. Our evaluation on five high-end consumer GPUs shows ModelSpy's high accuracy in architecture reconstruction, including 97.6% in layer segmentation and 94.0% in hyperparameter estimation, with a working distance of up to 6 m. Furthermore, ModelSpy's reconstructed DNN shows performance comparable to the victim architecture and can effectively enhance black-box adversarial attacks.
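The core observation, that per-layer GPU activity modulates the amplitude of the emanated EM signal, can be illustrated with a toy sketch. The code below is not ModelSpy's hierarchical reconstruction model; it is a minimal envelope-and-threshold segmentation on a synthetic trace, with all function names, window sizes, and thresholds chosen by us for illustration.

```python
import math

def rms_envelope(signal, window):
    """Short-window RMS amplitude envelope of a sampled signal."""
    half = window // 2
    env = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half)
        chunk = signal[lo:hi]
        env.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return env

def segment_layers(envelope, threshold):
    """Return (start, end) index pairs where the envelope exceeds threshold,
    i.e., candidate high-activity (layer-execution) regions."""
    segments, start = [], None
    for i, v in enumerate(envelope):
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(envelope)))
    return segments

# Synthetic trace: two high-amplitude bursts (standing in for two DNN
# layers) separated by a low-amplitude idle gap.
trace = []
for amp, n in [(1.0, 200), (0.05, 100), (0.6, 200)]:
    trace += [amp * math.sin(0.3 * i) for i in range(n)]

env = rms_envelope(trace, window=40)
layers = segment_layers(env, threshold=0.2)
print(len(layers))  # the two bursts yield two segments
```

In the real attack the trace is far noisier and the segmentation must also recover layer types and hyperparameters, which is why the paper uses a learned hierarchical model rather than a fixed threshold.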
