Harry Halpin (Nym Technologies)

With the ascendance of artificial intelligence (AI), one of the largest problems facing privacy-enhancing technologies (PETs) is how they can successfully counteract the large-scale surveillance required to collect the data, and metadata, needed to train AI models. While there has been a flurry of research into the foundations of AI, the field of privacy-enhancing technologies still appears to be a grab bag of techniques without an overarching theoretical foundation. However, we point to a potential unification of AI and PETs via the concepts of signal and noise, as formalized by information-theoretic metrics such as entropy. We give an overview of the concept of entropy ("noise") and its applications in both AI and PETs. For example, mixnets can be thought of as noise-generating networks, and thus as the inverse of neural networks. We then defend the use of entropy as a metric for comparing different PETs with one another, as well as PETs with AI systems.
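As a minimal illustration of the entropy metric the abstract invokes, the sketch below computes Shannon entropy over a hypothetical anonymity set, the standard information-theoretic measure of mixnet anonymity. The distributions and set size are illustrative assumptions, not data from the paper:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical anonymity set of 4 possible senders.
# A uniform distribution maximizes entropy (log2(4) = 2 bits):
# the adversary learns nothing about who sent the message.
uniform = [0.25, 0.25, 0.25, 0.25]

# A skewed distribution, where the adversary strongly suspects one
# sender, yields lower entropy and hence weaker anonymity.
skewed = [0.85, 0.05, 0.05, 0.05]

print(shannon_entropy(uniform))  # 2.0 bits: full anonymity among 4 senders
print(shannon_entropy(skewed))   # below 1 bit: anonymity largely lost
```

Under this view, a mixnet "adds noise" by pushing the adversary's posterior over senders toward the uniform (maximum-entropy) distribution, whereas a trained neural network "extracts signal" by concentrating probability mass, i.e., lowering entropy.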
