Exponential ReLU DNN Expression of Holomorphic Maps in High Dimension

Bibliographic Details
Published in: Constructive Approximation 2022-02, Vol. 55 (1), pp. 537–582
Authors: Opschoor, J. A. A., Schwab, Ch., Zech, J.
Format: Article
Language: English
Abstract: For a parameter dimension $d \in \mathbb{N}$, we consider the approximation of many-parametric maps $u : [-1,1]^d \to \mathbb{R}$ by deep ReLU neural networks. The input dimension $d$ may possibly be large, and we assume quantitative control of the domain of holomorphy of $u$: i.e., $u$ admits a holomorphic extension to a Bernstein polyellipse $\mathcal{E}_{\rho_1} \times \cdots \times \mathcal{E}_{\rho_d} \subset \mathbb{C}^d$ of semiaxis sums $\rho_i > 1$ containing $[-1,1]^d$. We establish the exponential expression rate $O(\exp(-b N^{1/(d+1)}))$ in terms of the total NN size $N$ and of the input dimension $d$ of the ReLU NN, in the norm of $W^{1,\infty}([-1,1]^d)$. The constant $b > 0$ depends on $(\rho_j)_{j=1}^d$, which characterizes the coordinate-wise sizes of the Bernstein ellipses for $u$. We also prove exponential convergence in stronger norms for the approximation by DNNs with more regular, so-called "rectified power unit" (RePU) activations. Finally, we extend the DNN expression rate bounds to two classes of non-holomorphic functions: to $d$-variate, Gevrey-regular functions and, by composition, to certain multivariate probability distribution functions with Lipschitz marginals.
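
For orientation, the rate claim can be written in display form. The following is a hedged paraphrase of the bound as quoted above, with $C > 0$ the constant implicit in the $O(\cdot)$ notation; the precise notion of NN size and the exact dependence of $b$ on $(\rho_j)_{j=1}^d$ are specified in the paper, not here.

% Paraphrase of the expression-rate bound from the abstract (not the
% paper's verbatim theorem): for each size budget N, there is a ReLU
% NN \tilde{u}_N of total size at most N with
\[
  \| u - \tilde{u}_N \|_{W^{1,\infty}([-1,1]^d)}
  \;\le\; C \exp\!\bigl( -b\, N^{1/(d+1)} \bigr),
\]
% where C > 0 and b > 0 are independent of N, and b depends on the
% semiaxis sums (\rho_j)_{j=1}^d of the Bernstein polyellipse.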
ISSN: 0176-4276
eISSN: 1432-0940
DOI: 10.1007/s00365-021-09542-5