Privacy-Aware Randomized Quantization via Linear Programming
Saved in:
Format: Article
Language: English
Online access: Order full text
Abstract: Differential privacy mechanisms such as the Gaussian or Laplace mechanism have been widely used in data analytics for preserving individual privacy. However, they are mostly designed for continuous outputs and are unsuitable for scenarios where discrete values are necessary. Although various quantization mechanisms have been proposed recently to generate discrete outputs under differential privacy, the outcomes are either biased or have an inferior accuracy-privacy trade-off. In this paper, we propose a family of quantization mechanisms that is unbiased and differentially private. It has a high degree of freedom, and we show that some existing mechanisms can be considered special cases of ours. To find the optimal mechanism, we formulate a linear program that can be solved efficiently with standard linear programming tools. Experiments show that our proposed mechanism attains a better privacy-accuracy trade-off than the baselines.
DOI: 10.48550/arxiv.2406.02599