Harry Halpin (Nym Technologies)

With the ascendance of artificial intelligence (AI), one of the largest problems facing privacy-enhancing technologies (PETs) is how they can successfully counteract the large-scale surveillance required to collect the data (and metadata) necessary for training AI models. While there has been a flurry of research into the foundations of AI, the field of privacy-enhancing technologies still appears to be a grab bag of techniques without an overarching theoretical foundation. However, we point to a potential unification of AI and PETs via the concepts of signal and noise, as formalized by information-theoretic metrics such as entropy. We overview the concept of entropy ("noise") and its applications in both AI and PETs. For example, mixnets can be thought of as noise-generating networks, and so as the inverse of neural networks. We then defend the use of entropy as a metric for comparing different PETs with one another, as well as for comparing PETs with AI systems.
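As a minimal illustration of entropy as a PETs metric (a sketch, not drawn from the paper itself), the Shannon entropy of an adversary's probability distribution over possible senders quantifies the "noise" a mixnet adds: a uniform distribution over the anonymity set maximizes entropy, while a skewed distribution leaks information. The function name and example distributions below are illustrative.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An ideal mixnet makes each of 8 candidate senders equally likely,
# yielding the maximum entropy log2(8) = 3 bits.
uniform = [1 / 8] * 8

# A partially compromised or poorly mixed network skews the adversary's
# posterior toward a few senders, lowering the entropy.
skewed = [0.5, 0.3, 0.1, 0.05, 0.05]

print(shannon_entropy(uniform))  # 3.0 bits
print(shannon_entropy(skewed))   # roughly 1.79 bits
```

On this view, a higher-entropy adversarial posterior means a stronger PET, which gives a common yardstick for comparing, say, a mixnet against a simple VPN.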
