Kaiming Cheng (University of Washington), Mattea Sim (Indiana University), Tadayoshi Kohno (University of Washington), Franziska Roesner (University of Washington)

Augmented reality (AR) headsets are now commercially available, including major platforms like Microsoft's HoloLens 2, Meta's Quest Pro, and Apple's Vision Pro. Compared to widely deployed smartphone and web platforms, emerging AR headsets introduce new sensors, including eye-tracking and hand-tracking sensors, that capture substantial and potentially privacy-invasive data about users. As millions of users begin to explore AR for the first time with the release of these headsets, it is crucial to understand the current technical landscape of these new sensing technologies and how end-users perceive and understand their associated privacy and utility implications. In this work, we investigate the current eye-tracking and hand-tracking permission models for three major platforms (HoloLens 2, Quest Pro, and Vision Pro): what is the granularity of eye-tracking and hand-tracking data made available to applications on these platforms, and what information is provided to users asked to grant these permissions (if at all)? We conducted a survey on Prolific with 280 participants who had no prior AR experience to investigate (1) people's comfort with the idea of granting eye- and hand-tracking permissions on these platforms, (2) their perceived and actual comprehension of the privacy and utility implications of granting these permissions, and (3) the self-reported factors that influence their willingness to try eye-tracking and hand-tracking enabled AR technologies in the future. Based on (mis)alignments we identify between comfort, perceived and actual comprehension, and decision factors, we discuss how future AR platforms can better communicate existing privacy protections, improve privacy-preserving designs, or better communicate risks.
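To make the permission-model question concrete: on Meta's Android-based Quest runtime, for example, eye-tracking access is gated behind a runtime permission that an application must declare and request, and the operating system (not the app) controls the dialog text shown to the user. The sketch below is a minimal, illustrative example of that flow, assuming the documented com.oculus.permission.EYE_TRACKING permission string and standard AndroidX permission helpers; exact strings and behavior may differ across SDK versions and platforms, and the other headsets use different mechanisms entirely.

```kotlin
// Minimal sketch: checking and requesting eye-tracking access on a
// Quest-style Android runtime. The permission string below follows Meta's
// published documentation; treat it as an assumption for illustration.
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

const val EYE_TRACKING_PERMISSION = "com.oculus.permission.EYE_TRACKING"
const val REQUEST_CODE_EYE_TRACKING = 1001

fun requestEyeTrackingIfNeeded(activity: Activity) {
    // Check whether the user has already granted eye-tracking access.
    val granted = ContextCompat.checkSelfPermission(
        activity, EYE_TRACKING_PERMISSION
    ) == PackageManager.PERMISSION_GRANTED

    if (!granted) {
        // Triggers the platform's permission dialog; the wording the user
        // sees is determined by the OS, not by the requesting application.
        ActivityCompat.requestPermissions(
            activity,
            arrayOf(EYE_TRACKING_PERMISSION),
            REQUEST_CODE_EYE_TRACKING
        )
    }
}
```

A corresponding manifest declaration for the same permission string would also be required, and the hand-tracking flow is analogous in structure, though how (and whether) each platform surfaces a user-facing prompt varies, which is precisely the gap this work examines.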
