DenseNets Reloaded: Paradigm Shift Beyond ResNets and ViTs
Saved in:
Main authors: , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: This paper revives Densely Connected Convolutional Networks (DenseNets) and reveals their underrated effectiveness over predominant ResNet-style architectures. We believe DenseNets' potential was overlooked because their training methods were left untouched and traditional design elements did not fully reveal their capabilities. Our pilot study shows that dense connections through concatenation are strong, demonstrating that DenseNets can be revitalized to compete with modern architectures. We methodically refine suboptimal components: architectural adjustments, block redesign, and improved training recipes, all aimed at widening DenseNets and boosting memory efficiency while keeping concatenation shortcuts. Our models, employing simple architectural elements, ultimately surpass Swin Transformer, ConvNeXt, and DeiT-III, key architectures in the residual learning lineage. Furthermore, our models exhibit near state-of-the-art performance on ImageNet-1K, competing with the very recent models, and on downstream tasks: ADE20K semantic segmentation and COCO object detection/instance segmentation. Finally, we provide empirical analyses that uncover the merits of concatenation over additive shortcuts, steering a renewed preference towards DenseNet-style designs. Our code is available at https://github.com/naver-ai/rdnet.
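The structural distinction the abstract draws, concatenation versus additive shortcuts, can be made concrete with a minimal PyTorch sketch. This is an illustrative comparison only, not the paper's RDNet blocks; the layer layouts and channel counts are assumptions chosen for clarity.

```python
# Minimal sketch (not the paper's RDNet) contrasting ResNet-style additive
# shortcuts with DenseNet-style concatenation shortcuts. Shapes/layouts are
# illustrative assumptions.
import torch
import torch.nn as nn

class AdditiveBlock(nn.Module):
    """ResNet-style: output = f(x) + x. Features are summed into one
    tensor, so the channel count stays fixed."""
    def __init__(self, channels: int):
        super().__init__()
        self.f = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.f(x) + x  # earlier features are mixed by addition

class ConcatBlock(nn.Module):
    """DenseNet-style: output = [x, f(x)]. Earlier features are kept
    intact, and channels grow by `growth_rate` per block."""
    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.f = nn.Sequential(
            nn.Conv2d(in_channels, growth_rate, 3, padding=1),
            nn.BatchNorm2d(growth_rate),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return torch.cat([x, self.f(x)], dim=1)  # inputs preserved verbatim

x = torch.randn(1, 64, 56, 56)
print(AdditiveBlock(64)(x).shape)                # torch.Size([1, 64, 56, 56])
print(ConcatBlock(64, growth_rate=32)(x).shape)  # torch.Size([1, 96, 56, 56])
```

The sketch shows why concatenation preserves earlier features verbatim while addition mixes them, and why widening and memory efficiency become the central design concerns once channels grow with every block.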
DOI: 10.48550/arxiv.2403.19588