CapsNet regularization and its conjugation with ResNet for signature identification
Published in: Pattern Recognition, 2021-12, Vol. 120, p. 107851, Article 107851
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Summary:
- We propose a regularized version of CapsNet aimed at better generalization and avoidance of overfitting.
- Adding the regularization term allows removing the decoder of the baseline CapsNet, which yields a noticeable reduction in learnable parameters and faster convergence.
- We also propose a conjugation of the regularized CapsNet with ResNet to handle small numbers of training samples and large input images.
- We evaluate our approach on three well-known, publicly available datasets.
We propose a new regularization term for CapsNet that significantly improves the generalization power of the original method on small training sets while requiring far fewer parameters, making it suitable for large input images. We also propose a very efficient DNN architecture that integrates CapsNet with ResNet to combine the advantages of both: CapsNet provides a powerful understanding of an object's components and their positions, while ResNet provides efficient feature extraction and description. Our approach is general, and we demonstrate it on the problem of signature identification from images. To show the superiority of our approach, we provide several evaluations under different protocols, and we show that it outperforms the state of the art through thorough experiments on three publicly available datasets: CEDAR, MCYT, and UTSig.
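The abstract builds on the baseline CapsNet formulation, in which class capsules are vectors whose length encodes class presence; removing the decoder leaves the margin loss over capsule lengths as the classification objective. The paper's specific regularization term is not given in this record, so the sketch below only illustrates, with numpy, the baseline squash nonlinearity and margin loss (hyperparameter values `m_pos`, `m_neg`, and `lam` are the commonly used defaults, assumed here, not taken from this paper):

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # CapsNet squashing nonlinearity: rescales a capsule vector so its
    # length lies in [0, 1) while its direction is preserved.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def margin_loss(v_lengths, targets, m_pos=0.9, m_neg=0.1, lam=0.5):
    # Baseline CapsNet margin loss over class-capsule lengths.
    # v_lengths: (batch, classes) capsule norms; targets: one-hot (batch, classes).
    pos = targets * np.maximum(0.0, m_pos - v_lengths) ** 2
    neg = lam * (1.0 - targets) * np.maximum(0.0, v_lengths - m_neg) ** 2
    return np.sum(pos + neg, axis=-1).mean()
```

In the regularized variant described above, a penalty term added to this loss would stand in for the decoder's reconstruction objective, which is what permits dropping the decoder and its parameters.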
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2021.107851