Multi-Attention Ghost Residual Fusion Network for Image Classification

Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 81421-81431
Authors: Jia, Xiaofen; Du, Shengjie; Guo, Yongcun; Huang, Yourui; Zhao, Baiting
Format: Article
Language: English
Abstract: To achieve high-efficiency and high-precision multi-image classification, a multi-attention ghost residual fusion network (MAGR) is proposed. MAGR is formed by cascading a basic feature extraction network (BFE), a ghost residual mapping network (GRM), and an image classification network (IC). The BFE uses spatial and channel attention mechanisms to help MAGR extract low-level features of the input image in a targeted manner. The GRM is formed by cascading four multi-branch group convolutional ghost residual blocks (MGR-Blocks); each MGR-Block cascades a dimension reducer with several ghost residual sub-networks (GRSs). The GRS integrates ghost convolution and residual connections, and the use of ghost convolution significantly reduces parameters and enables highly efficient classification. The GRS is a parallel convolution structure with 32 branches, which gives the GRM enough width to extract high-level features and capture as much feature information as possible, so as to achieve high-precision classification. The IC fuses the effective channel attention mechanism, global average pooling, and a SoftMax layer to aggregate high-dimensional channel feature information, which significantly improves the classification accuracy of MAGR. Simulation experiments show that MAGR has excellent classification capability while remaining efficient and lightweight. Compared with VGG16, MAGR reduces parameters on CIFAR-10 by 94.8% while increasing classification accuracy by 1.18%. Compared with MobileNetV2, MAGR reduces parameters on CIFAR-100 by 33.9% while increasing classification accuracy by 15.6%.
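
The abstract's core building block is a ghost residual sub-network that combines ghost convolution, a 32-branch (grouped) convolution, and a residual shortcut. The PyTorch snippet below is a minimal sketch of that idea only; the module names (GhostConv, GhostResidualSub), the ghost ratio of 2, and all channel sizes are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class GhostConv(nn.Module):
    """Ghost convolution: produce part of the output channels with a regular
    convolution and the remainder with a cheap depthwise operation.
    Assumes ratio=2 so the cheap branch mirrors the primary branch."""
    def __init__(self, in_ch, out_ch, kernel_size=3, ratio=2):
        super().__init__()
        primary_ch = out_ch // ratio          # "intrinsic" feature maps
        cheap_ch = out_ch - primary_ch        # "ghost" feature maps
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, primary_ch, kernel_size,
                      padding=kernel_size // 2, bias=False),
            nn.BatchNorm2d(primary_ch),
            nn.ReLU(inplace=True),
        )
        self.cheap = nn.Sequential(           # depthwise conv on the primary maps
            nn.Conv2d(primary_ch, cheap_ch, 3, padding=1,
                      groups=primary_ch, bias=False),
            nn.BatchNorm2d(cheap_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)

class GhostResidualSub(nn.Module):
    """GRS-style block: ghost convolution followed by a 32-branch grouped
    convolution, wrapped with an identity (residual) shortcut."""
    def __init__(self, channels, groups=32):
        super().__init__()
        self.body = nn.Sequential(
            GhostConv(channels, channels),
            nn.Conv2d(channels, channels, 3, padding=1,
                      groups=groups, bias=False),   # 32-branch grouped conv
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.body(x))     # residual connection

# Quick shape check with assumed sizes: 64 channels, 32x32 input
if __name__ == "__main__":
    block = GhostResidualSub(channels=64, groups=32)
    print(block(torch.randn(2, 64, 32, 32)).shape)  # torch.Size([2, 64, 32, 32])
```

The parameter savings claimed in the abstract come from the same mechanism sketched here: the cheap depthwise branch and the grouped convolution replace most of the dense 3x3 convolutions a plain residual block would use.
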
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3079435