Towards Understanding the Importance of Shortcut Connections in Residual Networks
| Main authors: | , , , , , |
| --- | --- |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
Abstract: Residual Network (ResNet) is undoubtedly a milestone in deep learning. ResNet is equipped with shortcut connections between layers and exhibits efficient training using simple first-order algorithms. Despite this great empirical success, the reason behind it is far from well understood. In this paper, we study a two-layer non-overlapping convolutional ResNet. Training such a network requires solving a non-convex optimization problem with a spurious local optimum. We show, however, that gradient descent combined with proper normalization avoids being trapped by the spurious local optimum and converges to a global optimum in polynomial time, when the weight of the first layer is initialized at 0 and that of the second layer is initialized arbitrarily in a ball. Numerical experiments are provided to support our theory.
DOI: 10.48550/arxiv.1909.04653
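
A minimal sketch of the setting described in the abstract, in Python/NumPy. This is an assumed instantiation, not the paper's exact model: the patch count `k`, patch size `d`, the ReLU activation, the patch-mean shortcut, and the unit-ball radius are illustrative choices; only the zero initialization of the first layer and the in-ball initialization of the second layer are taken from the abstract.

```python
import numpy as np

# Hypothetical two-layer non-overlapping convolutional network with a
# shortcut connection: the input is split into k disjoint patches of
# size d; a shared filter w acts as the first (convolutional) layer,
# and a combines the k activations as the second layer. The patch-mean
# skip term below is an illustrative choice, not the paper's exact form.

rng = np.random.default_rng(0)

k, d = 4, 8                  # assumed patch count and patch size
w = np.zeros(d)              # first layer initialized at 0 (as in the abstract)
a = rng.normal(size=k)       # second layer: arbitrary point in a ball ...
a *= min(1.0, 1.0 / np.linalg.norm(a))  # ... here projected into the unit ball

def forward(x, w, a):
    """x has shape (k, d): k non-overlapping patches of size d."""
    hidden = np.maximum(x @ w, 0.0)  # non-overlapping "convolution" + ReLU
    shortcut = x.mean(axis=1)        # identity-style skip term (assumed form)
    return a @ (hidden + shortcut)   # second layer combines both paths

# The paper's analysis couples plain gradient descent on (w, a) with a
# normalization step; that step is omitted here and detailed in the paper.
x = rng.normal(size=(k, d))
print(forward(x, w, a))
```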