Workshop on Measurements, Attacks, and Defenses for the Web (MADWeb) 2023 Program
Best paper award
Can You Tell Me the Time? Security Implications of the Server-Timing Header by Vik Vanderlinden, Wouter Joosen, and Mathy Vanhoef (imec-DistriNet, KU Leuven)
Best paper runner-up award
Bridging the Privacy Gap: Enhanced User Consent Mechanisms on the Web by Carl Magnus Bruhner (Linköping University), David Hasselquist (Linköping University, Sectra Communications), and Niklas Carlsson (Linköping University)
Best presentation award
Applying Accessibility Metrics to Measure the Threat Landscape for Users with Disabilities by John Breton and AbdelRahman Abdou (Carleton University)
Find the updated information on the workshop's website.
Friday, 3 March
Web privacy measurement has often focused on the implementation specifics of various tracking techniques, developing ways to block them, and producing browser add-ons that demonstrate such blocking. However, while over 20 years of this focus has yielded many papers, citations, and media coverage, it has had limited real-world impact. A much more promising approach to effecting systemic change at scale is to shift attention away from how tracking is performed and towards evaluating whether such tracking complies with a growing body of applicable regulations.
In this talk I will offer perspectives on compliance measurement at scale, drawing lessons from my experience in the worlds of academic research, civil liberties advocacy, class litigation, and industry. Common themes will be explored and large-scale compliance measurement technologies will be presented in depth. Likewise, insights on how computer scientists may effectively work across and between disciplinary boundaries will be presented. Ultimately, the most effective means to achieve change at scale is not to build another add-on but to build coalitions of experts working together to ensure technology, business, and regulation exist in harmony.
Carl Magnus Bruhner (Linköping University), David Hasselquist (Linköping University, Sectra Communications), Niklas Carlsson (Linköping University)
In the age of the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), privacy and consent control have become even more apparent to everyday web users. Privacy banners in all shapes and sizes ask for permission through more or less challenging designs, often making privacy control more of a struggle than a help. In this paper, we present a novel solution expanding the Advanced Data Protection Control (ADPC) mechanism to bridge current gaps in user data and privacy control. Our solution moves consent control to the browser interface to give users a seamless and hassle-free experience, while at the same time offering content providers a way to be legally compliant with legislation. Through an extensive review, we evaluate previous works and identify current gaps in user data control. We then present a blueprint for future implementation and suggest features to support privacy control online for users globally. Given browser support, the solution provides a tangible path to legally compliant privacy and consent control in a user-oriented manner, allowing users to once again browse the web seamlessly.
John Breton, AbdelRahman Abdou (Carleton University)
The link between user security and web accessibility is a new but growing field of research. To understand the potential threat landscape for users who require accessibility tools to access the web, we created the WATER framework. WATER measures websites using three security-related base accessibility metrics. Upon analyzing 30,000 websites from three distinct popularity ranges, we discovered that the risk of information leakage and phishing attacks is higher for these users. Over half of the analyzed websites had an accessibility percentage of less than 75%, a statistic that exposes these websites to potential accessibility-related lawsuits. Our data suggests that the current WCAG 2.1 standards may need to be revised to avoid assigning Level AA conformance to websites that undermine the security of users requiring accessibility tools. We make the WATER framework publicly available in the hope that it can be used in future research.
Hugo Jonker (Open University Netherlands), Stefan Karsch (TH Köln), Benjamin Krumnow (TH Köln), Godfried Meesters (Open University Netherlands)
Online vendors typically offer different stores to sell their items, such as a desktop site, a mobile site, country-specific sites, etc. Online rumours and news media reports persist that item prices differ between such views. While several academic works have investigated price differentiation, to date no systematic method for analysing this question has been put forth. We devise an approach to investigate such store-based price differentiation, based on three pillars: a framework that can perform cross-store data acquisition synchronously, a method to perform cross-store item matching, and constraints to limit client-side noise factors. We test our method in an initial case study to investigate store effects on flight pricing. We gather pricing data of 824 flights from 15 stores (incl. desktop sites, mobile apps, and mobile sites) over a 38-day period. Our experiment shows that price differences occur frequently. Moreover, even in a limited run we find strong indications of store-specific pricing for certain vendors. We conclude that (i) a larger study into store-based price differentiation is needed to better gauge this effect; (ii) future research in this general domain should take store-based differences into account in its study design.
The Internet has become a hostile place for users’ traffic. Network-based actors, including ISPs and governments, increasingly practice sophisticated forms of censorship, content injection, and traffic throttling, as well as surveillance and other privacy violations. My work attempts to expose these threats and develop technologies to better safeguard users. Detecting and defending against adversarial networks is challenging, especially at global scale, due to the Internet’s vast size and heterogeneity, the powerful capabilities of in-network threat actors, and the lack of ground-truth on the counterfactual traffic that would exist in the absence of interference. Overcoming these challenges requires new techniques and systems, both for collecting and interpreting evidence of hostile networks and for building defensive tools that effectively meet user needs.
In this talk, I’ll first cover my approach to monitoring Internet censorship. I introduced an entirely new family of censorship measurement techniques, based on network side-channels, that can remotely detect censorship events occurring between distant pairs of network locations. To overcome the systems and data science challenges of operating these techniques and synthesizing their results into a holistic view of online censorship, my students and I created Censored Planet, a censorship observatory that continuously tests the reachability of thousands of popular or sensitive sites from over 100,000 vantage points in 221 countries. Next, I’ll discuss our efforts to understand and defend the consumer VPN ecosystem. Although millions of end-users rely on VPNs to protect their privacy and security, this multibillion-dollar industry includes numerous snake-oil products, is laxly regulated, and remains severely understudied. To address this, my lab created VPNalyzer, a project that aims to bring transparency and better security to consumer VPNs. Our work includes a cross-platform test suite that crowd-sources VPN security testing, coupled with large-scale user studies that aim to understand the needs and threat models of VPN users.
Tamara Bondar, Hala Assal, AbdelRahman Abdou (Carleton University)
In efforts to understand the reasons behind Internet-connected devices remaining vulnerable for a long time, previous literature analyzed the effectiveness of large-scale vulnerability notifications on remediation rates. Herein we focus on the perspective of system administrators. Through an online survey study with 89 system administrators worldwide, we investigate factors affecting their decisions to remediate or ignore a security vulnerability. We use Censys to find servers with vulnerable public-facing services, extract the abuse contact information from WHOIS, and email an invitation to fill out the survey. We found no evidence that awareness of the existence of a vulnerability affects remediation plans, which explains the consistently small remediation rates following notification campaigns conducted in previous research. More interestingly, participants did not agree on a specific factor as the primary cause for lack of remediation. Many factors appeared roughly equally important, including backwards compatibility, technical knowledge, available resources, and motive to remediate.
Vik Vanderlinden, Wouter Joosen, Mathy Vanhoef (imec-DistriNet, KU Leuven)
Performing a remote timing attack typically entails collecting many timing measurements in order to overcome noise due to network jitter. If an attacker can reduce the amount of jitter in their measurements, they can exploit timing leaks using fewer measurements. To reduce the amount of jitter, an attacker may use timing information that is made available by a server. In this paper, we exploit the use of the Server-Timing header, which was created for performance monitoring and in some cases exposes millisecond-accurate information about server-side execution times. We show that the header is used increasingly often, with an uptick in adoption rates in recent months. The websites that use the header often host dynamic content whose generation time can potentially leak sensitive information. Our new attack techniques, one of which collects the header timing values from an intermediate proxy, improve performance over standard attacks using round-trip times. Experiments show that, overall, our new attacks (significantly) decrease the number of samples required to exploit timing leaks. The attack is especially effective against geographically distant servers.
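To illustrate the kind of server-side timing information the paper is concerned with, the sketch below parses a Server-Timing header value into per-metric durations. This is a minimal illustration of the standard header syntax (comma-separated metrics, each with an optional `dur` parameter in milliseconds); the metric names shown are hypothetical and not taken from the paper.

```python
import re

def parse_server_timing(header):
    """Parse a Server-Timing header value into {metric: duration_ms}.

    The header is a comma-separated list of metrics, each optionally
    carrying a dur=<milliseconds> parameter, e.g. "db;dur=53, app;dur=47.2".
    """
    timings = {}
    for metric in header.split(","):
        parts = [p.strip() for p in metric.strip().split(";")]
        name = parts[0]
        for param in parts[1:]:
            m = re.match(r"dur=([0-9.]+)", param)
            if m:
                timings[name] = float(m.group(1))
    return timings

# Any client (or intermediate proxy) that can read the response headers
# sees these server-side execution times without network jitter added.
print(parse_server_timing("db;dur=53, app;dur=47.2"))
```

Because these durations are measured on the server itself, they bypass the round-trip jitter that normally forces a remote attacker to average over many samples.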
Takahito Sakamoto, Takuya Murozono (DataSign Inc)
Software-as-a-Service (SaaS) products die. Several SaaS products shut down every day because continuing the business becomes too difficult. A SaaS product lets website owners install a small piece of code, called a tag, on a website to extend the website's functionality. However, sometimes that tag becomes a zombie. In this paper, we coordinate two studies to reveal the danger of the zombification of tags: (1) a study of the domains used by dead SaaS tags, and (2) an investigation of websites with dead tags. The results of our work show that of the 53 domains used by 49 dead SaaS tags, 18 domains have already been re-registered by a third party or are ready to be re-registered. We also scanned about 1.15 million websites of domestic companies and found 26 dead SaaS tags on approximately 18,000 websites. Finally, we found that three new SaaS tags have been abused by attackers, indicating the danger of tag zombification.