Privacy Limitations of Interest-based Advertising on the Web: A Post-mortem Empirical Analysis of Google's FLoC

Bibliographic Details
Main Authors: Berke, Alex; Calacci, Dan
Format: Article
Language: English
Description
Summary: In 2020, Google announced it would disable third-party cookies in the Chrome browser to improve user privacy. In order to continue to enable interest-based advertising while mitigating risks of individualized user tracking, Google proposed FLoC. The FLoC algorithm assigns users to "cohorts" that represent groups of users with similar browsing behaviors so that ads can be served to users based on their cohort. In 2022, after testing FLoC in a real world trial, Google canceled the proposal with little explanation. In this work, we provide a post-mortem analysis of two critical privacy risks for FLoC by applying an implementation of FLoC to a browsing dataset collected from over 90,000 U.S. devices over a one year period. First, we show how, contrary to its privacy goals, FLoC would have enabled cross-site user tracking by providing a unique identifier for users available across sites, similar to the third-party cookies FLoC was meant to be an improvement over. We show how FLoC cohort ID sequences observed over time can provide this identifier to trackers, even with third-party cookies disabled. We estimate that more than 50% of users in our dataset could be uniquely identified by their FLoC ID sequences after 3 weeks, and more than 95% after 4 weeks. We also show how these risks increase when cohort data are combined with browser fingerprinting, and how our results underestimate the true risks FLoC would have posed in a real-world deployment. Second, we examine the risk of FLoC leaking sensitive demographic information. Although we find statistically significant differences in browsing behaviors between demographic groups, we do not find that FLoC significantly risks exposing race or income information about users in our dataset. Our contributions provide insights and example analyses for future approaches that seek to protect user privacy while monetizing the web.
DOI: 10.48550/arxiv.2201.13402
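
The abstract's re-identification claim rests on treating the sequence of weekly cohort IDs assigned to a device as a fingerprint: once enough weeks are observed, few devices share the same sequence. The sketch below is not the authors' implementation; it is a minimal Python illustration, assuming a hypothetical panel structure (a dict mapping device IDs to weekly cohort IDs) and toy values, of how the fraction of uniquely identifiable devices after k weeks could be measured.

```python
from collections import Counter

def unique_fraction(weekly_cohorts, k):
    """Fraction of devices whose sequence of weekly cohort IDs is unique
    after observing the first k weeks.

    weekly_cohorts: dict mapping device_id -> list of cohort IDs, one per week.
    """
    prefixes = [tuple(cohorts[:k]) for cohorts in weekly_cohorts.values()]
    counts = Counter(prefixes)
    unique = sum(1 for prefix in prefixes if counts[prefix] == 1)
    return unique / len(prefixes)

# Toy panel: three devices observed over four weeks (values are illustrative only).
panel = {
    "device_a": [1101, 2040, 2040, 3177],
    "device_b": [1101, 2040, 2051, 3177],
    "device_c": [1101, 2040, 2040, 3090],
}
print(unique_fraction(panel, 2))  # 0.0  -- all two-week prefixes collide
print(unique_fraction(panel, 3))  # ~0.33 -- device_b is now unique
print(unique_fraction(panel, 4))  # 1.0  -- every sequence is unique
```

A uniqueness curve of this general shape, computed over the paper's panel of 90,000+ devices, is presumably what underlies reported figures such as "more than 50% after 3 weeks and more than 95% after 4 weeks", though the paper's actual methodology should be consulted for details.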