Rates of convergence for density estimation with generative adversarial networks
Format: Article
Language: eng
Online access: Order full text
Abstract: In this work we undertake a thorough study of the non-asymptotic properties of vanilla generative adversarial networks (GANs). We prove an oracle inequality for the Jensen-Shannon (JS) divergence between the underlying density $\mathsf{p}^*$ and the GAN estimate, with a significantly better statistical error term than in previously known results. The advantage of our bound becomes clear in its application to nonparametric density estimation. We show that the JS divergence between the GAN estimate and $\mathsf{p}^*$ decays as fast as $(\log{n}/n)^{2\beta/(2\beta + d)}$, where $n$ is the sample size and $\beta$ determines the smoothness of $\mathsf{p}^*$. This rate of convergence coincides (up to logarithmic factors) with the minimax optimal rate for the considered class of densities.
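For reference, JS here is the standard Jensen-Shannon divergence, i.e. the symmetrized Kullback-Leibler divergence taken against the mixture of the two densities:

$$\mathrm{JS}(\mathsf{p}, \mathsf{q}) = \frac{1}{2}\,\mathrm{KL}\Big(\mathsf{p}\,\Big\|\,\frac{\mathsf{p}+\mathsf{q}}{2}\Big) + \frac{1}{2}\,\mathrm{KL}\Big(\mathsf{q}\,\Big\|\,\frac{\mathsf{p}+\mathsf{q}}{2}\Big).$$

As a quick illustrative sketch (not taken from the paper; the helper name `js_rate` is hypothetical), the stated rate can be evaluated numerically to see how the exponent $2\beta/(2\beta + d)$ degrades as the dimension $d$ grows relative to the smoothness $\beta$:

```python
import math

def js_rate(n: int, beta: float, d: int) -> float:
    """Evaluate the rate (log n / n)^(2*beta / (2*beta + d)) from the abstract."""
    return (math.log(n) / n) ** (2 * beta / (2 * beta + d))

# Smoother densities (larger beta) and lower dimension d give faster decay.
for beta, d in [(1.0, 1), (2.0, 5), (2.0, 20)]:
    rates = ", ".join(f"n=1e{k}: {js_rate(10**k, beta, d):.2e}" for k in (3, 5, 7))
    print(f"beta={beta}, d={d} -> {rates}")
```

For example, with $\beta = 2$ the exponent is $4/9$ when $d = 5$ but only $1/6$ when $d = 20$, the usual curse of dimensionality for nonparametric rates.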
DOI: 10.48550/arxiv.2102.00199