Uniform approximation rates and metric entropy of shallow neural networks


Detailed description

Saved in:
Bibliographic details
Published in: Research in the Mathematical Sciences, 2022-09, Vol. 9 (3), Article 46
Main authors: Ma, Limin, Siegel, Jonathan W., Xu, Jinchao
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: We study the approximation properties of the variation spaces corresponding to shallow neural networks with respect to the uniform norm. Specifically, we consider the spectral Barron space, which consists of the convex hull of decaying Fourier modes, and the convex hull of indicator functions of half-spaces, which corresponds to shallow neural networks with sigmoidal activation function. Up to logarithmic factors, we determine the metric entropy and nonlinear dictionary approximation rates for these spaces with respect to the uniform norm. Combined with previous results with respect to the L^2-norm, this also gives the metric entropy up to logarithmic factors with respect to any L^p-norm with 1 ≤ p ≤ ∞. In addition, we study the approximation rates for high-order spectral Barron spaces using shallow neural networks with ReLU^k activation function. Specifically, we show that for a sufficiently high-order spectral Barron space, ReLU^k networks are able to achieve an approximation rate of n^{-(k+1)} with respect to the uniform norm.
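The headline rate can be written schematically as follows. This is a paraphrase of the abstract's statement only; the precise smoothness conditions, constants, and logarithmic factors are in the paper itself:

```latex
% Schematic form of the ReLU^k approximation rate from the abstract:
% for f in a sufficiently high-order spectral Barron space, and f_n
% ranging over shallow ReLU^k networks with n neurons,
\[
  \inf_{f_n} \, \| f - f_n \|_{L^\infty} \;\lesssim\; n^{-(k+1)},
\]
% where \lesssim hides constants (and possible logarithmic factors)
% depending on the Barron norm of f and on k.
```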
ISSN:2522-0144
2197-9847
DOI:10.1007/s40687-022-00346-y