SoK: A Review of Differentially Private Linear Models For High-Dimensional Data
Saved in:

Main authors: | , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Linear models are ubiquitous in data science but are particularly prone to overfitting and data memorization in high dimensions. Differential privacy can be used to guarantee the privacy of training data. Many papers have proposed optimization techniques for high-dimensional differentially private linear models, but no systematic comparison between these methods exists. We close this gap by providing a comprehensive review of optimization methods for private high-dimensional linear models. Empirical tests on all methods demonstrate that robust and coordinate-optimized algorithms perform best, which can inform future research. Code implementing all methods is released online. |
DOI: | 10.48550/arxiv.2404.01141 |
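To make the topic concrete, the following is a minimal, illustrative sketch of one common way to train a differentially private linear model: gradient descent with per-example gradient clipping and Gaussian noise (the DP-SGD recipe). This is not necessarily any of the methods compared in the paper; the function name and hyperparameters here are our own illustrative choices.

```python
import numpy as np

def dp_gd_linear(X, y, epochs=100, lr=0.1, clip=1.0, noise_mult=1.0, seed=0):
    """Noisy gradient descent for linear regression with per-example
    gradient clipping (Gaussian mechanism). Illustrative sketch only:
    the name and defaults are ours, not taken from the paper."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        residuals = X @ w - y                        # shape (n,)
        grads = residuals[:, None] * X               # per-example gradients, (n, d)
        # Clip each example's gradient to L2 norm <= clip
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / clip)
        # Add Gaussian noise calibrated to the clipping bound, then step
        noise = rng.normal(0.0, noise_mult * clip, size=d)
        w -= lr * (grads.sum(axis=0) + noise) / n
    return w

# Toy usage on synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = np.array([0.5, -0.3, 0.2, 0.0, 0.4])
y = X @ w_true + 0.1 * rng.normal(size=200)
w_priv = dp_gd_linear(X, y)
```

The clipping bound caps each example's influence on the update (bounding sensitivity), so the added Gaussian noise yields a differential privacy guarantee whose strength depends on `noise_mult` and the number of iterations; a full accounting of the privacy budget is omitted here.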