Harry Halpin (Nym Technologies)

With the ascendance of artificial intelligence (AI), one of the largest problems facing privacy-enhancing technologies (PETs) is how they can successfully counteract the large-scale surveillance required to collect the data, and metadata, necessary for training AI models. While there has been a flurry of research into the foundations of AI, the field of privacy-enhancing technologies still appears to be a grab bag of techniques without an overarching theoretical foundation. However, we point to a potential unification of AI and PETs via the concepts of signal and noise, as formalized by information-theoretic metrics such as entropy. We overview the concept of entropy ("noise") and its applications in both AI and PETs. For example, mixnets can be thought of as noise-generating networks, and thus as the inverse of neural networks. We then defend the use of entropy as a metric for comparing different PETs with one another, as well as for comparing PETs with AI systems.
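To make the signal/noise framing concrete, the following sketch computes Shannon entropy over a probability distribution. The interpretation in the comments is an illustrative assumption in the spirit of the abstract, not a claim from it: a PET such as a mixnet aims to push an adversary's belief about message origins toward the high-entropy uniform case, while a trained AI model aims to push its output distribution toward a peaked, low-entropy case.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 4 outcomes: maximal uncertainty, 2 bits of entropy.
# (Illustrative reading: an ideal mixnet leaves the adversary here.)
uniform = [0.25, 0.25, 0.25, 0.25]

# Peaked distribution: low entropy, high "signal".
# (Illustrative reading: a confident classifier's output looks like this.)
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(peaked))   # well below 2.0
```

The same scalar thus serves as a common yardstick: entropy extracted (reduced) by an AI system versus entropy injected (added) by a PET.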
