Private and Utility Enhanced Recommendations With Local Differential Privacy and Gaussian Mixture Model
Published in: IEEE Transactions on Knowledge and Data Engineering, 2023-04, Vol. 35 (4), p. 4151-4163
Main authors: , , , ,
Format: Article
Language: English
Keywords:
Abstract: Recommendation systems rely heavily on behavioural and preferential data (e.g., ratings and likes) of a user to produce accurate recommendations. However, such unethical data aggregation and analytical practices of Service Providers (SP) cause privacy concerns among users. Local differential privacy (LDP) based perturbation mechanisms address this concern by adding noise to users' data at the user side before sending it to the SP. The SP then uses the perturbed data to perform recommendations. Although LDP protects the privacy of users from the SP, it causes a substantial decline in recommendation accuracy. We propose an LDP-based Matrix Factorization (MF) with a Gaussian Mixture Model (MoG) to address this problem. The LDP perturbation mechanism, i.e., Bounded Laplace (BLP), regulates the effect of noise by confining the perturbed ratings to a predetermined domain. We derive a sufficient condition on the scale parameter for BLP to satisfy ε-LDP. We use the MoG model at the SP to estimate the noise added locally to the ratings and the MF algorithm to predict missing ratings. Our LDP-based recommendation system improves predictive accuracy without violating LDP principles. We demonstrate that our method offers a substantial increase in recommendation accuracy under a strong privacy guarantee through empirical evaluations on three real-world datasets, i.e., Movielens, Libimseti and Jester.
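The abstract describes a user-side Bounded Laplace (BLP) perturbation that keeps noisy ratings inside a predetermined domain. As a rough, hedged illustration only (not the paper's exact formulation, and without its derived scale-parameter condition for ε-LDP), a bounded Laplace mechanism is commonly realized by resampling Laplace noise until the perturbed value falls within the allowed range; the function name `bounded_laplace` and its parameters below are illustrative assumptions.

```python
import numpy as np

def bounded_laplace(rating, lo, hi, scale, rng=None):
    """Illustrative BLP-style perturbation via rejection sampling.

    Laplace noise is redrawn until the perturbed rating lands in [lo, hi],
    so the output stays in the predetermined rating domain. The scale must
    be chosen so the mechanism satisfies epsilon-LDP; the paper derives a
    sufficient condition for this, which is not reproduced here.
    """
    rng = rng or np.random.default_rng()
    while True:
        perturbed = rating + rng.laplace(loc=0.0, scale=scale)
        if lo <= perturbed <= hi:
            return perturbed

# Hypothetical usage: perturb a 1-5 star rating locally before it is
# sent to the service provider.
if __name__ == "__main__":
    print(bounded_laplace(rating=4.0, lo=1.0, hi=5.0, scale=1.2))
```

In this sketch the service provider would receive only the perturbed value; estimating the injected noise (e.g., with a mixture-of-Gaussians model, as the abstract indicates) and predicting missing ratings with matrix factorization happen on the provider side and are not shown.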
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2021.3126577