Tandem Blocks in Deep Convolutional Neural Networks
Saved in:
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Due to the success of residual networks (resnets) and related architectures, shortcut connections have quickly become standard tools for building convolutional neural networks. The explanations in the literature for the apparent effectiveness of shortcuts are varied and often contradictory. We hypothesize that shortcuts work primarily because they act as linear counterparts to nonlinear layers. We test this hypothesis by using several variations on the standard residual block, with different types of linear connections, to build small image classification networks. Our experiments show that other kinds of linear connections can be even more effective than the identity shortcuts. Our results also suggest that the best type of linear connection for a given application may depend on both network width and depth.
DOI: 10.48550/arxiv.1806.00145
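The hypothesis in the abstract, that a shortcut is just a linear counterpart to a nonlinear path, can be sketched in a few lines. Below is a minimal NumPy illustration, not the paper's actual architecture: the paper builds convolutional blocks, while this sketch uses dense matrices, and the names `residual_block` and `tandem_block` are invented here for clarity. The key point it shows is that the standard residual block is the special case of a tandem-style block where the linear path is fixed to the identity.

```python
import numpy as np

def relu(x):
    # Elementwise rectifier used on the nonlinear path
    return np.maximum(x, 0.0)

def residual_block(x, W):
    # Standard residual block: nonlinear transform plus identity shortcut
    return relu(W @ x) + x

def tandem_block(x, W_nl, W_lin):
    # Tandem-style block (sketch): nonlinear path in parallel with a
    # *learned* linear path, generalizing the identity shortcut
    return relu(W_nl @ x) + W_lin @ x

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W = rng.standard_normal((4, 4))

# Fixing the linear path to the identity recovers the residual block
assert np.allclose(tandem_block(x, W, np.eye(4)), residual_block(x, W))
```

In this framing, varying `W_lin` (identity, a free matrix, or in the convolutional setting a 1x1 or 3x3 convolution) corresponds to the different linear connections the abstract says were compared experimentally.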