On the Relationship Between Information-Theoretic Privacy Metrics and Probabilistic Information Privacy
Format: Article
Language: English
Abstract: Information-theoretic (IT) measures based on $f$-divergences have recently gained interest as measures of privacy leakage because they allow privacy to be traded off against utility using a single-value characterization. However, their operational interpretations in the privacy context are unclear. In this paper, we relate the notion of probabilistic information privacy (IP) to several IT privacy metrics based on $f$-divergences. We interpret probabilistic IP under both the detection and estimation frameworks and link it to differential privacy, thus allowing a precise operational interpretation of these IT privacy metrics. We show that the $\chi^2$-divergence privacy metric is stronger than those based on total variation distance and Kullback-Leibler divergence. We therefore further develop a data-driven empirical risk framework based on the $\chi^2$-divergence privacy metric, realized using deep neural networks. This framework is agnostic to the adversarial attack model. Empirical experiments demonstrate the efficacy of our approach.
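For reference, the $f$-divergences named in the abstract follow the standard definitions (not restated from the paper): for distributions $P$ and $Q$ with densities $p$ and $q$ and a convex function $f$ with $f(1) = 0$,

$$D_f(P \| Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,$$

where $f(t) = \tfrac{1}{2}|t - 1|$ yields the total variation distance, $f(t) = t \log t$ the Kullback-Leibler divergence, and $f(t) = (t - 1)^2$ the $\chi^2$-divergence.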
DOI: 10.48550/arxiv.2301.08401
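As one illustrative sketch (not the authors' construction), a $\chi^2$-divergence between two sample sets can be estimated with a small neural network by maximizing the variational lower bound $\chi^2(P \| Q) \ge \mathbb{E}_P[g] - \mathbb{E}_Q[g] - \tfrac{1}{4}\mathbb{E}_Q[g^2]$, which follows from the convex conjugate of $f(t) = (t - 1)^2$. The toy Gaussian data, network architecture, and optimizer settings below are assumptions chosen only to make the example runnable.

```python
# Illustrative sketch only: estimate the chi^2-divergence between samples from
# two distributions P and Q with a small neural network "critic" g, by
# maximizing the variational lower bound
#     chi^2(P || Q) >= E_P[g] - E_Q[g] - (1/4) * E_Q[g^2],
# which comes from the convex conjugate of f(t) = (t - 1)^2.
# This is not the paper's framework; the data and network are assumed toys.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data (assumption): P = N((1, 0), I), Q = N((0, 0), I) in 2 dimensions.
# For these, the true chi^2-divergence is exp(1) - 1 ~= 1.72.
n = 2048
x_p = torch.randn(n, 2) + torch.tensor([1.0, 0.0])
x_q = torch.randn(n, 2)

# Small MLP critic g: R^2 -> R.
g = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(g.parameters(), lr=1e-3)

for step in range(2000):
    opt.zero_grad()
    g_p = g(x_p)  # critic evaluated on samples from P
    g_q = g(x_q)  # critic evaluated on samples from Q
    # Variational lower bound on chi^2(P || Q); ascend it by minimizing its negative.
    bound = g_p.mean() - g_q.mean() - 0.25 * (g_q ** 2).mean()
    (-bound).backward()
    opt.step()

print(f"estimated chi^2 lower bound: {bound.item():.3f}")
```

For these two unit-covariance Gaussians the true value is $e - 1 \approx 1.72$, which the learned bound should approach from below up to finite-sample and optimization error.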