Heterogeneous Differential Privacy
Main authors: | |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | The massive collection of personal data by personalization systems has made the preservation of individuals' privacy increasingly difficult. Most approaches proposed to preserve privacy in personalization systems address this issue uniformly across users, ignoring the fact that users have different privacy attitudes and expectations (even among their own personal data). In this paper, we propose to account for this non-uniformity of privacy expectations by introducing the concept of heterogeneous differential privacy. This notion captures both the variation of privacy expectations among users and across different pieces of information related to the same user. We also describe an explicit mechanism achieving heterogeneous differential privacy, which is a modification of the Laplacian mechanism by Dwork, McSherry, Nissim, and Smith. In a nutshell, this mechanism achieves heterogeneous differential privacy by manipulating the sensitivity of the function using a linear transformation on the input domain. Finally, we evaluate on real datasets the impact of the proposed mechanism with respect to a semantic clustering task. The results of our experiments demonstrate that heterogeneous differential privacy can account for different privacy attitudes while sustaining a good level of utility, as measured by the recall of the semantic clustering task. |
DOI: | 10.48550/arxiv.1504.06998 |
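
The abstract above describes the mechanism only at a high level: per-item privacy expectations are encoded as a linear transformation (a rescaling) of the input domain before a standard Laplace mechanism is applied. Below is a minimal sketch of what such a stretching-style mechanism could look like; the function name `hdp_stretching_mechanism`, the `weights` parameter, and the noise calibration are illustrative assumptions, not details taken from this record.

```python
import numpy as np

def hdp_stretching_mechanism(x, f, sensitivity, epsilon, weights, rng=None):
    """Hedged sketch of a heterogeneous-DP "stretching" mechanism.

    Each input coordinate is rescaled by a per-item privacy weight in
    [0, 1] (the linear transformation of the input domain mentioned in
    the abstract), then the query f is answered with standard Laplace
    noise calibrated to f's global sensitivity and the global epsilon.
    """
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x, dtype=float)
    weights = np.asarray(weights, dtype=float)

    # Linear transformation of the input domain: shrink each coordinate
    # according to how private the corresponding item is meant to be
    # (weight 0 masks the item entirely, weight 1 keeps it unchanged).
    stretched = weights * x

    # Standard Laplace mechanism applied to the transformed input.
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return f(stretched) + noise


# Illustrative usage: a noisy sum over a user's profile vector where the
# first item is treated as more sensitive (weight 0.2) than the others.
profile = [1.0, 0.5, 0.8]
weights = [0.2, 1.0, 1.0]
noisy_sum = hdp_stretching_mechanism(profile, np.sum, sensitivity=1.0,
                                     epsilon=0.5, weights=weights)
print(noisy_sum)
```

In this sketch, setting every weight to 1 recovers the ordinary (homogeneous) Laplace mechanism, while smaller weights damp the contribution of the corresponding items before the query is evaluated.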