Harry Halpin (Nym Technologies)

With the ascendance of artificial intelligence (AI), one of the largest problems facing privacy-enhancing technologies (PETs) is how to counteract the large-scale surveillance required to collect the data (and metadata) needed to train AI models. While there has been a flurry of research into the foundations of AI, the field of privacy-enhancing technologies still appears to be a grab bag of techniques without an overarching theoretical foundation. However, we point to a potential unification of AI and PETs via the concepts of signal and noise, as formalized by information-theoretic metrics such as entropy. We give an overview of the concept of entropy ("noise") and its applications in both AI and PETs. For example, mixnets can be thought of as noise-generating networks, and thus as the inverse of neural networks. We then defend the use of entropy as a metric for comparing different PETs with one another, as well as for comparing PETs with AI systems.
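To make the entropy metric mentioned above concrete, the following is a minimal sketch (not from the paper) of how Shannon entropy can score a mixnet's anonymity: the adversary's probability distribution over possible senders is scored in bits, with higher entropy meaning better mixing. The example distributions are illustrative assumptions.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical adversary distributions over 4 possible senders:
uniform = [0.25, 0.25, 0.25, 0.25]   # perfect mixing
skewed  = [0.7, 0.1, 0.1, 0.1]       # leaky mixing, one likely sender

print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 senders
print(shannon_entropy(skewed))   # strictly less than 2.0 bits
```

The same quantity appears in AI as the entropy of a model's predictive distribution, which is what makes it a candidate common metric for both fields.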
