Agnostic Private Density Estimation for GMMs via List Global Stability
Format: Article
Language: English
Abstract: We consider the problem of private density estimation for mixtures of unrestricted high-dimensional Gaussians in the agnostic setting. We prove the first upper bound on the sample complexity of this problem; previously, private learnability of high-dimensional GMMs was known only in the realizable setting [Afzali et al., 2024].
To prove our result, we exploit the notion of $\textit{list global stability}$ [Ghazi et al., 2021b,a], originally introduced in the context of private supervised learning. We define an agnostic variant of this notion and show that its existence is sufficient for agnostic private density estimation. We then construct an agnostic list globally stable learner for GMMs.
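To make the object of study concrete, the sketch below fits a one-dimensional two-component GMM by plain expectation-maximization. This is only a toy, non-private baseline for GMM density estimation; it is not the paper's private or agnostic learner, and all names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: mixture of N(-2, 1^2) and N(3, 0.5^2) (assumed for illustration).
data = np.concatenate([rng.normal(-2, 1.0, 300), rng.normal(3, 0.5, 200)])

def em_gmm_1d(x, k=2, iters=100):
    """Plain EM for a 1-D k-component GMM (non-private baseline)."""
    n = x.size
    # Initialize means from quantiles so components start apart.
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted updates of weights, means, variances.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

pi, mu, var = em_gmm_1d(data)
```

The paper's contribution lies in achieving such estimation under differential privacy without distributional assumptions; a private learner would need to replace these exact sufficient statistics with stabilized, noise-calibrated counterparts.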
DOI: 10.48550/arxiv.2407.04783