Differentially Private Dataset Condensation

Tianhang Zheng (University of Missouri-Kansas City), Baochun Li (University of Toronto)

Recent work in ICML'22 established a connection between dataset condensation (DC) and differential privacy (DP); unfortunately, this connection is problematic. To correctly connect DC and DP, we propose two differentially private dataset condensation (DPDC) algorithms, LDPDC and NDPDC. LDPDC is a linear DC algorithm that can be executed on a low-end Central Processing Unit (CPU), while NDPDC is a nonlinear DC algorithm that leverages neural networks to extract and match the latent representations of real and synthetic data. Through extensive evaluations, we demonstrate that, despite its simplicity, LDPDC has performance comparable to recent DP generative methods. Compared to distribution matching (DM), NDPDC provides acceptable DP guarantees with only a mild utility loss. Additionally, NDPDC allows a flexible trade-off between synthetic-data utility and the DP budget.
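The abstract describes NDPDC as matching latent representations of real and synthetic data under a DP guarantee. The snippet below is a minimal, hypothetical sketch of one distribution-matching-style update made differentially private via per-sample embedding clipping and Gaussian noise; the feature extractor, clipping bound, noise multiplier, and learning rate are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: DP distribution matching for dataset condensation.
# Only a clipped, noised mean embedding of the real batch is used, which is
# what makes the released statistic differentially private (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins: a "real" dataset and a small learnable synthetic (condensed) set.
real_data = torch.randn(256, 1, 28, 28)
syn_data = torch.randn(10, 1, 28, 28, requires_grad=True)

# Randomly initialized feature extractor, as in distribution matching (DM).
embed = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 64))
for p in embed.parameters():
    p.requires_grad_(False)  # only the synthetic data is optimized

clip_norm = 1.0   # per-sample L2 clipping bound (assumed value)
noise_mult = 1.0  # Gaussian noise multiplier (assumed value)
batch_size = 64
opt = torch.optim.SGD([syn_data], lr=0.1)

for step in range(100):
    idx = torch.randint(0, real_data.size(0), (batch_size,))
    with torch.no_grad():
        real_emb = embed(real_data[idx])
        # Clip each real sample's embedding to L2 norm <= clip_norm.
        norms = real_emb.norm(dim=1, keepdim=True)
        real_emb = real_emb * (clip_norm / norms).clamp(max=1.0)
        # Noisy mean embedding: the only statistic of the real data that is released.
        noisy_mean = real_emb.mean(0) + torch.randn(real_emb.size(1)) * noise_mult * clip_norm / batch_size

    # Move the synthetic set so its mean embedding matches the noisy real mean.
    syn_mean = embed(syn_data).mean(0)
    loss = ((syn_mean - noisy_mean) ** 2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Under these assumptions, the privacy cost of each step comes only from the clipped-and-noised mean embedding, so a standard Gaussian-mechanism accountant could be used to trade utility against the DP budget, mirroring the flexible trade-off the abstract mentions.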
