Convolutional neural network training with dynamic epoch ordering
Saved in:
Main authors: | , , , |
Format: | Conference proceedings |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Summary: | This paper presents a novel approach to feeding data to a Convolutional Neural Network (CNN) during training. Normally, neural networks are fed shuffled data with no control over which types of examples a mini-batch contains. When data are abundant and there is no imbalance between classes, shuffling the training data is enough to ensure balanced mini-batches. In contrast, most real-world problems yield databases in which some classes predominate over others, biasing the trained network toward learning those classes while neglecting the rest. For such cases, the most common methods simply discard samples until the data are balanced, whereas this paper proposes an ordered feeding method that preserves randomness in mini-batch composition while using all available samples. This method has proven to solve the problem of imbalanced data sets while remaining competitive with other methods. Moreover, the paper focuses on a well-known CNN architecture, Deep Residual Networks. |
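The abstract describes class-balanced mini-batch construction that keeps every sample instead of discarding majority-class data. The paper's exact ordering algorithm is not given in this record; the following is only a minimal sketch of one such scheme, in which minority classes are reshuffled and recycled so each batch holds an equal number of samples per class while the batch composition stays random (the function name and signature are hypothetical):

```python
import random
from collections import defaultdict

def balanced_batches(labels, batch_size, seed=0):
    """Yield index batches with equal per-class representation.

    Hypothetical helper, not the paper's algorithm: minority classes
    are recycled (reshuffled and reused) so no sample is discarded,
    and shuffling keeps each mini-batch's composition random.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    classes = sorted(by_class)
    per_class = max(1, batch_size // len(classes))
    # Enough batches to cover the largest class exactly once.
    longest = max(len(v) for v in by_class.values())
    n_batches = -(-longest // per_class)  # ceiling division
    pools = {c: rng.sample(v, len(v)) for c, v in by_class.items()}
    cursors = {c: 0 for c in classes}
    for _ in range(n_batches):
        batch = []
        for c in classes:
            for _ in range(per_class):
                if cursors[c] >= len(pools[c]):
                    # Minority class exhausted: reshuffle and recycle.
                    pools[c] = rng.sample(by_class[c], len(by_class[c]))
                    cursors[c] = 0
                batch.append(pools[c][cursors[c]])
                cursors[c] += 1
        rng.shuffle(batch)
        yield batch
```

With 10 samples of class 0 and 2 of class 1 and `batch_size=4`, this yields 5 batches, each containing 2 samples of each class; every class-0 sample appears exactly once per epoch, while the class-1 samples are reused.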
Peer Reviewed |
DOI: | 10.3233/FAIA190113 |