Error bounds for approximations with deep ReLU neural networks in $W^{s,p}$ norms
Format: Article
Language: English
Abstract: We analyze approximation rates of deep ReLU neural networks for
Sobolev-regular functions with respect to weaker Sobolev norms. First, we
construct, based on a calculus of ReLU networks, artificial neural networks
with ReLU activation functions that achieve certain approximation rates.
Second, we establish lower bounds for approximation by ReLU neural networks
for classes of Sobolev-regular functions. Our results extend recent advances in
the approximation theory of ReLU networks to the regime that is most relevant
for applications in the numerical analysis of partial differential equations.
DOI: 10.48550/arxiv.1902.07896
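As a minimal illustration of the kind of result the abstract describes (not code from the paper itself): a one-hidden-layer ReLU network can exactly realize the piecewise-linear interpolant of a smooth function, and for f(x) = x^2 on a uniform grid of width h the sup-norm error of that interpolant is h^2/4. All function names below are hypothetical, chosen for this sketch.

```python
# Sketch: express the piecewise-linear interpolant of f(x) = x^2 on [0, 1]
# as a one-hidden-layer ReLU network and measure its sup-norm error.
# The classical bound for linear interpolation gives error h^2/8 * max|f''|,
# which for f(x) = x^2 is exactly h^2/4, attained at the cell midpoints.

def relu(t):
    return t if t > 0.0 else 0.0

def interpolant_as_relu_net(f, n):
    """Return (bias, [(c_k, x_k)]) so that
    g(x) = bias + sum_k c_k * relu(x - x_k)
    equals the piecewise-linear interpolant of f on n uniform pieces of [0, 1]."""
    h = 1.0 / n
    knots = [k * h for k in range(n + 1)]
    slopes = [(f(knots[k + 1]) - f(knots[k])) / h for k in range(n)]
    # Each kink at an interior knot contributes the jump in slope.
    coeffs = [(slopes[0], 0.0)] + [
        (slopes[k] - slopes[k - 1], knots[k]) for k in range(1, n)
    ]
    return f(0.0), coeffs

def evaluate(bias, coeffs, x):
    return bias + sum(c * relu(x - xk) for c, xk in coeffs)

f = lambda x: x * x
n = 8                                     # 8 pieces -> hidden width 8, h = 1/8
bias, coeffs = interpolant_as_relu_net(f, n)
grid = [j / 8000 for j in range(8001)]    # fine grid that hits the midpoints
err = max(abs(f(x) - evaluate(bias, coeffs, x)) for x in grid)
print(err)                                # h^2/4 = 1/256 = 0.00390625
```

Doubling the width n halves h and divides the error by four, the O(h^2) rate that lower-bound results of the kind mentioned in the abstract show is essentially optimal for shallow ReLU networks on such function classes.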