Harry Halpin (Nym Technologies)

With the ascendance of artificial intelligence (AI), one of the largest problems facing privacy-enhancing technologies (PETs) is how they can successfully counteract the large-scale surveillance required to collect the data (and metadata) needed to train AI models. While there has been a flurry of research into the foundations of AI, the field of privacy-enhancing technologies still appears to be a grab bag of techniques without an overarching theoretical foundation. However, we point to a potential unification of AI and PETs via the concepts of signal and noise, as formalized by information-theoretic metrics such as entropy. We give an overview of the concept of entropy ("noise") and its applications in both AI and PETs. For example, mixnets can be thought of as noise-generating networks, and thus as the inverse of neural networks. We then defend the use of entropy as a metric for comparing different PETs with each other, as well as PETs with AI systems.
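As a brief illustration of the entropy metric the abstract refers to (a sketch, not code from the paper), the snippet below computes the Shannon entropy of an adversary's posterior distribution over possible senders, the standard information-theoretic way to quantify how much "noise" a mixnet adds; the probability values are hypothetical.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), in bits.

    Here p_i is the adversary's probability that user i sent a given
    message; higher entropy means a larger effective anonymity set,
    i.e. more "noise" from the adversary's point of view.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical posterior distributions over four possible senders.
uniform = [0.25, 0.25, 0.25, 0.25]   # ideal mixnet: 2.0 bits
skewed  = [0.70, 0.10, 0.10, 0.10]   # leaky mixnet: ~1.36 bits

print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(skewed))   # ~1.357
```

The same scalar can, in principle, be compared across very different systems (a mixnet's anonymity set, noise added by differential privacy, or the uncertainty in a model's output distribution), which is what makes entropy attractive as a common yardstick.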
