Saisai Xia (Institute of Information Engineering, CAS), Wenhao Wang (Institute of Information Engineering, CAS), Zihao Wang (Nanyang Technological University (NTU)), Yuhui Zhang (Institute of Information Engineering, CAS), Yier Jin (University of Science and Technology of China), Dan Meng (Institute of Information Engineering, CAS), Rui Hou (Institute of Information Engineering, CAS)

Publicly available large pretrained models (i.e., backbones) and lightweight adapters for parameter-efficient fine-tuning (PEFT) have become standard components in modern machine learning pipelines. However, preserving the privacy of both user inputs and fine-tuned adapters (often trained on sensitive data) during inference remains a significant challenge. Applying cryptographic techniques, such as multi-party computation (MPC), to PEFT settings still incurs substantial encrypted computation across both the backbone and adapter, mainly due to the inherent two-way communication between them. To address this limitation, we propose CryptPEFT, the first PEFT solution specifically designed for private inference scenarios. CryptPEFT introduces a novel one-way communication (OWC) architecture that confines encrypted computation solely to the adapter, significantly reducing both computational and communication overhead. To maintain strong model utility under this constraint, we explore the design space of OWC-compatible adapters and employ an automated architecture search algorithm to optimize the trade-off between private inference efficiency and model utility. We evaluated CryptPEFT using Vision Transformer backbones across widely used image classification datasets. Our results show that CryptPEFT significantly outperforms existing baselines, delivering speedups ranging from $20.62\times$ to $291.48\times$ in simulated wide-area network (WAN) and local-area network (LAN) settings. On CIFAR-100, CryptPEFT attains 85.47% accuracy with just 2.26 seconds of inference latency. These findings demonstrate that CryptPEFT offers an efficient and privacy-preserving solution for modern PEFT-based inference.
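The one-way communication idea described above can be illustrated with a minimal toy sketch: a frozen public backbone runs entirely in plaintext, and a small adapter head reads tapped features from the backbone without ever feeding activations back into it, so only the adapter would need to run under MPC. All function names, dimensions, and the aggregate-then-bottleneck structure here are illustrative assumptions, not CryptPEFT's actual architecture.

```python
import math
import random

random.seed(0)
DIM = 4  # toy feature dimension

def matvec(W, v):
    """Plain matrix-vector product over Python lists."""
    return [sum(W[i][j] * v[j] for j in range(len(v))) for i in range(len(W))]

def rand_mat(rows, cols):
    return [[random.gauss(0, 1 / math.sqrt(cols)) for _ in range(cols)]
            for _ in range(rows)]

# Frozen public backbone: runs entirely in plaintext on public weights.
backbone_weights = [rand_mat(DIM, DIM) for _ in range(3)]

def backbone_features(x):
    """Return the intermediate features of every backbone layer (the 'taps')."""
    feats, h = [], x
    for W in backbone_weights:
        h = [math.tanh(v) for v in matvec(W, h)]
        feats.append(h)
    return feats

# OWC adapter: reads the taps but never writes back into the backbone,
# so in the private-inference setting only this small head is encrypted.
W_down, W_up = rand_mat(2, DIM), rand_mat(3, 2)  # hypothetical bottleneck head

def owc_adapter(feats):
    z = [sum(f[i] for f in feats) for i in range(DIM)]  # aggregate the taps
    hidden = [max(v, 0.0) for v in matvec(W_down, z)]   # down-project + ReLU
    return matvec(W_up, hidden)                         # up-project to logits

logits = owc_adapter(backbone_features([0.5, -0.3, 0.1, 0.9]))
print(len(logits))  # 3 class logits
```

By contrast, a two-way adapter such as LoRA adds its output back into each backbone layer, which would force every touched backbone layer into the encrypted computation as well; the one-way constraint is what allows the backbone to stay in cheap plaintext.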
