Harry Halpin (Nym Technologies)

With the ascendance of artificial intelligence (AI), one of the largest problems facing privacy-enhancing technologies (PETs) is how they can successfully counteract the large-scale surveillance required to collect the data (and metadata) needed to train AI models. While there has been a flurry of research into the foundations of AI, the field of privacy-enhancing technologies still appears to be a grab bag of techniques without an overarching theoretical foundation. However, we point to a potential unification of AI and PETs via the concepts of signal and noise, as formalized by information-theoretic metrics such as entropy. We give an overview of the concept of entropy (“noise”) and its applications in both AI and PETs. For example, mixnets can be thought of as noise-generating networks, and so as the inverse of neural networks. We then defend the use of entropy as a metric for comparing different PETs with each other, as well as for comparing PETs with AI systems.
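To make the entropy-as-metric idea concrete, the sketch below computes the Shannon entropy of an adversary's probability distribution over possible senders, in the spirit of the standard entropy-based anonymity metrics used to evaluate mixnets. This is an illustrative example only, not the paper's own formalization; the function name and the example probabilities are assumptions for demonstration.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    Here `probabilities` is read as the adversary's posterior
    distribution over which sender produced an observed message.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical example: after one round of mixing, the adversary
# assigns these probabilities to four candidate senders.
posterior = [0.4, 0.3, 0.2, 0.1]
print(f"Anonymity (bits of entropy): {shannon_entropy(posterior):.3f}")

# For comparison, a perfect mix of n equally likely senders yields
# the maximum entropy log2(n); for 4 senders that is 2 bits.
print(f"Maximum for 4 senders: {math.log2(4):.1f} bits")
```

Under this reading, a mixnet adds noise that pushes the adversary's distribution toward uniform (raising entropy), whereas an AI model trained on surveillance data extracts signal that concentrates the distribution (lowering entropy), which is the sense in which the two can be compared on a common scale.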
