Error bounds for ReLU networks with depth and width parameters



Bibliographic details
Published in: Japan journal of industrial and applied mathematics 2023, Vol. 40 (1), p. 275-288
Main authors: Kang, Jae-Mo, Moon, Sunghwan
Format: Article
Language: English
Online access: Full text
Description
Abstract: Neural networks have shown highly successful performance in a wide range of tasks, but further studies are needed to improve their performance. We construct a specific neural network architecture with local connections that is a universal approximator, and analyze its approximation error. This locally connected network has broader applicability than a fully connected one, because it can be used to describe diverse neural networks such as CNNs. Our error estimate depends on two parameters: one controlling the depth of the network and the other the width of the hidden layers.
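To make the architecture described in the abstract concrete, the following is a minimal illustrative sketch (not the paper's actual construction) of a locally connected ReLU network, where each hidden unit sees only a small window of the previous layer rather than all of it. The window size, random weights, and the `depth`/`width` parameters here are hypothetical choices for demonstration only.

```python
import numpy as np

def relu(x):
    """Rectified linear unit, applied elementwise."""
    return np.maximum(x, 0.0)

def locally_connected_layer(x, weights, biases, window):
    """Each output unit j has its own weights over the input window x[j:j+window].

    This differs from a fully connected layer, where every unit would
    use all of x, and from a convolutional layer, where all units would
    share the same window weights.
    """
    out_dim = len(biases)
    y = np.empty(out_dim)
    for j in range(out_dim):
        y[j] = x[j:j + window] @ weights[j] + biases[j]
    return relu(y)

def locally_connected_net(x, depth, width, window, rng):
    """Stack `depth` locally connected ReLU layers, each with `width` units."""
    h = x
    for _ in range(depth):
        # Zero-pad so every output unit has a full input window.
        h = np.pad(h, (0, max(0, width + window - 1 - len(h))))
        W = rng.standard_normal((width, window)) / np.sqrt(window)
        b = np.zeros(width)
        h = locally_connected_layer(h, W, b, window)
    return h

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
y = locally_connected_net(x, depth=3, width=8, window=3, rng=rng)
print(y.shape)  # (8,)
```

With weight sharing across the units of each layer, the same code would describe a 1-D convolutional layer, which is the sense in which locally connected networks subsume CNNs.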
ISSN: 0916-7005, 1868-937X
DOI:10.1007/s13160-022-00515-0