Tianhang Zheng (University of Missouri-Kansas City), Baochun Li (University of Toronto)

Recent work in ICML’22 established a connection between dataset condensation (DC) and differential privacy (DP), which is unfortunately problematic. To correctly connect DC and DP, we propose two differentially private dataset condensation (DPDC) algorithms, namely LDPDC and NDPDC. LDPDC is a linear DC algorithm that can be executed on a low-end Central Processing Unit (CPU), while NDPDC is a nonlinear DC algorithm that leverages neural networks to extract and match the latent representations of real and synthetic data. Through extensive evaluations, we demonstrate that LDPDC achieves performance comparable to recent DP generative methods despite its simplicity. Compared to distribution matching (DM), NDPDC provides acceptable DP guarantees with only a mild utility loss. Additionally, NDPDC allows a flexible trade-off between synthetic data utility and the DP budget.
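To make the latent-representation matching idea concrete, the following is a minimal sketch of DM-style dataset condensation with a Gaussian mechanism, not the authors' exact NDPDC procedure: per-sample latent vectors of real data are norm-clipped to bound sensitivity, Gaussian noise is added to their mean, and synthetic data are optimized to match the noisy target. The encoder architecture, clip_norm, sigma, and other parameter names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Toy feature extractor producing latent representations (illustrative)."""
    def __init__(self, in_dim=784, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                 nn.Linear(256, latent_dim))

    def forward(self, x):
        return self.net(x)

def noisy_mean_embedding(encoder, real_batch, clip_norm=1.0, sigma=1.0):
    """Clip each real sample's latent vector, then add Gaussian noise to the mean.

    Clipping bounds each sample's contribution (sensitivity), so the added
    Gaussian noise yields a DP guarantee on the released mean embedding.
    """
    with torch.no_grad():
        z = encoder(real_batch)                                   # (B, d)
        norms = z.norm(dim=1, keepdim=True).clamp(min=1e-12)
        z = z * torch.clamp(clip_norm / norms, max=1.0)           # per-sample clipping
        mean = z.mean(dim=0)
        noise = torch.normal(0.0, sigma * clip_norm / real_batch.shape[0],
                             size=mean.shape)
        return mean + noise

def condense(real_loader, n_syn=100, in_dim=784, n_steps=200, lr=0.1):
    """Optimize synthetic data so its latent mean matches the noisy real mean."""
    syn = torch.randn(n_syn, in_dim, requires_grad=True)
    opt = torch.optim.SGD([syn], lr=lr)
    for _ in range(n_steps):
        encoder = Encoder(in_dim)                  # freshly sampled random encoder, DM-style
        real_batch, _ = next(iter(real_loader))
        target = noisy_mean_embedding(encoder, real_batch.view(real_batch.size(0), -1))
        loss = ((encoder(syn).mean(dim=0) - target) ** 2).sum()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return syn.detach()
```

In this sketch, only the noisy mean embeddings of the real data are consumed by the optimization, so the synthetic data inherit the DP guarantee by post-processing; larger sigma tightens privacy at the cost of matching fidelity, which mirrors the utility/DP-budget trade-off described above.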
