Privacy pitfalls of releasing in-vehicle network data

Bibliographic details
Published in: Vehicular Communications, 2023-02, Vol. 39, p. 100565, Article 100565
Authors: Gazdag, András; Lestyán, Szilvia; Remeli, Mina; Ács, Gergely; Holczer, Tamás; Biczók, Gergely
Format: Article
Language: English
Abstract: The ever-increasing volume of vehicular data has enabled different service providers to access and monetize the in-vehicle network data of millions of drivers. However, such data often carry personal or even potentially sensitive information, so service providers either need to ask for drivers' consent or anonymize the data in order to comply with data protection regulations. In this paper, we show that both fine-grained consent control and adequate anonymization of in-vehicle network data are very challenging. First, by exploiting the fact that in-vehicle sensor measurements are inherently interdependent, we are able to effectively i) re-identify a driver even from raw, unprocessed CAN data with 97% accuracy, and ii) reconstruct the vehicle's complete location trajectory knowing only its speed and steering wheel position. Since such signal interdependencies are hard to identify even for data controllers, drivers' consent will arguably not be informed and hence may become invalid. Second, we show that the non-systematic application of different standard anonymization techniques (e.g., aggregation, suppression, signal distortion) often yields volatile, empirical privacy guarantees for the population as a whole but fails to provide a strong, worst-case privacy guarantee to every single individual. Therefore, we advocate the application of principled privacy models (such as Differential Privacy) to anonymize data with strong worst-case guarantees.
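
The trajectory-reconstruction result rests on the physical coupling between CAN signals: speed and steering-wheel position together determine the vehicle's motion. As a minimal illustrative sketch of that idea (not the authors' actual method, which the paper details), the routine below integrates a kinematic bicycle model via dead reckoning; the function name, sampling interval, wheelbase, and steering ratio are assumed values for illustration.

    import math

    def dead_reckon(speeds, wheel_positions, dt=0.1, wheelbase=2.7, steering_ratio=15.0):
        # Kinematic bicycle model: the steering-wheel position (rad) is scaled
        # by a vehicle-specific steering ratio to approximate the front-wheel angle.
        x, y, heading = 0.0, 0.0, 0.0
        path = [(x, y)]
        for v, pos in zip(speeds, wheel_positions):         # v in m/s, pos in rad
            delta = pos / steering_ratio                    # approximate road-wheel angle
            heading += (v / wheelbase) * math.tan(delta) * dt   # integrate yaw rate
            x += v * math.cos(heading) * dt
            y += v * math.sin(heading) * dt
            path.append((x, y))
        return path  # relative trajectory; map matching can anchor it to real roads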
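
The closing recommendation, Differential Privacy, provides exactly the per-individual, worst-case guarantee that ad-hoc anonymization lacks. As a hedged sketch of one standard instantiation (the Laplace mechanism, not necessarily the mechanism the authors evaluate), the example below releases a noisy mean of clipped speed readings; the clipping bound, epsilon value, and function name are assumptions chosen for illustration.

    import numpy as np

    def laplace_release(value, sensitivity, epsilon, rng=None):
        # Laplace mechanism: adding noise with scale sensitivity/epsilon makes
        # the released statistic epsilon-differentially private.
        rng = rng or np.random.default_rng()
        return value + rng.laplace(0.0, sensitivity / epsilon)

    # Clipping each speed to [0, 200] km/h bounds one record's influence on the
    # mean, so the sensitivity of the mean over n records is 200 / n.
    speeds = np.clip([42.0, 57.5, 63.1, 88.9], 0.0, 200.0)
    noisy_mean = laplace_release(speeds.mean(), sensitivity=200.0 / len(speeds), epsilon=1.0)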
ISSN: 2214-2096
DOI: 10.1016/j.vehcom.2022.100565