Wen-jie Lu (Ant Group), Zhicong Huang (Ant Group), Zhen Gu (Alibaba Group), Jingyu Li (Ant Group & Zhejiang University), Jian Liu (Zhejiang University), Cheng Hong (Ant Group), Kui Ren (Zhejiang University), Tao Wei (Ant Group), WenGuang Chen (Ant Group)

Large transformer-based models have achieved state-of-the-art performance on many real-world tasks, such as natural language processing and computer vision.
However, with the increasing sensitivity of the data and tasks they handle, privacy has become a major concern during model deployment.
In this work, we focus on private inference in two-party settings, where one party holds private inputs and the other holds the model.
We introduce BumbleBee, a fast and communication-friendly two-party private transformer inference system.
Our contributions are three-fold:
First, we propose optimized protocols for matrix multiplication that reduce communication costs by 80% -- 90% compared to previous techniques.
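To make the two-party setting concrete, the sketch below shows a generic secret-shared matrix multiplication using a Beaver triple over the ring Z_{2^64}. This is a textbook baseline for illustration only, not BumbleBee's optimized protocol; the trusted-dealer setup, ring choice, and share layout are our own assumptions.

```python
# Minimal sketch: additive secret sharing + a Beaver triple for 2PC matrix
# multiplication over Z_{2^64} (int64 wraparound).  Illustrative baseline only,
# NOT BumbleBee's optimized protocol.
import numpy as np

RNG = np.random.default_rng(0)
I64 = np.iinfo(np.int64)


def rand_ring(shape):
    """Uniform random matrix over Z_{2^64}, represented as wrapping int64."""
    return RNG.integers(I64.min, I64.max, size=shape, dtype=np.int64, endpoint=True)


def share(x):
    """Split x into two additive shares: x = x0 + x1 (mod 2^64)."""
    x0 = rand_ring(x.shape)
    return x0, x - x0


def beaver_triple(n, m, p):
    """Trusted dealer (assumed here): random A, B and C = A @ B, all shared."""
    A, B = rand_ring((n, m)), rand_ring((m, p))
    return share(A), share(B), share(A @ B)


def shared_matmul(X_sh, Y_sh, triple):
    """Mask the shares, 'open' E = X - A and F = Y - B, then derive shares of X @ Y."""
    (A0, A1), (B0, B1), (C0, C1) = triple
    X0, X1 = X_sh
    Y0, Y1 = Y_sh
    # E and F would be exchanged and reconstructed by both parties in a real run.
    E = (X0 - A0) + (X1 - A1)
    F = (Y0 - B0) + (Y1 - B1)
    Z0 = E @ F + E @ B0 + A0 @ F + C0   # party 0's output share
    Z1 = E @ B1 + A1 @ F + C1           # party 1's output share
    return Z0, Z1


if __name__ == "__main__":
    X = np.arange(6, dtype=np.int64).reshape(2, 3)    # e.g. client activation
    W = np.arange(12, dtype=np.int64).reshape(3, 4)   # e.g. server weight
    Z0, Z1 = shared_matmul(share(X), share(W), beaver_triple(2, 3, 4))
    assert np.array_equal(Z0 + Z1, X @ W)             # reconstruct and check
```

The point of such protocols is that the opened values E and F are uniformly random masks, so they leak nothing about X or W; the communication cost of exchanging them (and of generating the triples) is exactly what optimized matrix-multiplication protocols aim to shrink.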
Second, we develop a methodology for constructing efficient protocols tailored to the non-linear activation functions employed in transformer models.
The proposed activation protocols are significantly faster and reduce communication costs by 80% -- 95% compared with two prior methods.
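For intuition, the sketch below shows the kind of activation such protocols must evaluate: exact GELU involves erf/tanh, which is expensive on secret shares, so a common approach is a piecewise low-degree polynomial surrogate that only needs additions, multiplications, and comparisons. The interval, degree, and tail rules here are illustrative placeholders, not BumbleBee's actual segments or coefficients.

```python
# numpy-only illustration of a polynomial surrogate for GELU.  The interval
# [-5, 5], degree 6, and piecewise tail rules are illustrative choices, NOT
# BumbleBee's actual approximation.
import numpy as np


def gelu(x):
    """tanh-based GELU, as used in many transformer implementations."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))


# Fit a low-degree polynomial on a bounded interval; evaluating it needs only
# additions and multiplications, which are cheap on secret shares.
xs = np.linspace(-5.0, 5.0, 10_001)
coeffs = np.polyfit(xs, gelu(xs), deg=6)


def gelu_piecewise(x):
    """Surrogate: ~0 for very negative x, ~x for large x, fitted polynomial in between."""
    return np.where(x < -5.0, 0.0, np.where(x > 5.0, x, np.polyval(coeffs, x)))


if __name__ == "__main__":
    grid = np.linspace(-8.0, 8.0, 1_001)
    print("max |gelu - surrogate|:", np.max(np.abs(gelu(grid) - gelu_piecewise(grid))))
```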
Lastly, we perform extensive benchmarks on five transformer models.
BumbleBee demonstrates its capability by evaluating the LLaMA-7B model, generating one token in approximately 8 minutes using CPUs.
Our results further reveal that BumbleBee outperforms Iron (NeurIPS22) by over an order of magnitude and is three times faster than BOLT (Oakland24) with one-tenth the communication.
