Semisupervised Learning-Based SAR ATR via Self-Consistent Augmentation

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2021-06, Vol. 59 (6), p. 4862-4873
Authors: Wang, Chen; Shi, Jun; Zhou, Yuanyuan; Yang, Xiaqing; Zhou, Zenan; Wei, Shunjun; Zhang, Xiaoling
Format: Article
Language: English
Subjects:
Online Access: Order full text
Description
Abstract: In synthetic aperture radar (SAR) automatic target recognition, annotating targets is expensive and time-consuming. Training a network with a small amount of labeled data and plenty of unlabeled data has therefore attracted the attention of many researchers. In this article, we design a semisupervised learning framework comprising a self-consistent augmentation rule, a mixup-based mixture, and a weighted loss, which allows a classification network to utilize unlabeled data during training and ultimately alleviates the demand for labeled data. The proposed self-consistent augmentation rule forces samples before and after augmentation to share the same labels so that the unlabeled data can be exploited; by balancing the amounts of labeled and unlabeled samples in a minibatch, it preserves the prominent effect of the supervised part of the framework during training and leads to better network performance. A mixture method is then introduced to mix the labeled, unlabeled, and augmented samples so that label information is better involved in the mixed samples. Using cross-entropy loss for the mixed-labeled mixtures and mean-squared error loss for the mixed-unlabeled mixtures, the total loss is defined as their weighted sum. Experiments on the MSTAR and OpenSARShip data sets show that the method is not only far better than the state of the art among current semisupervised classifiers but also close to the state of the art among supervised learning-based networks.
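The abstract describes mixing labeled, unlabeled, and augmented samples and training with a weighted sum of a cross-entropy term on the mixed-labeled mixtures and a mean-squared error term on the mixed-unlabeled mixtures. The PyTorch sketch below illustrates that loss composition under stated assumptions; the helper names, the Beta-distributed mixing weight, and the softmax label-guessing step are illustrative choices, not the authors' released implementation.

# Minimal sketch of the weighted semisupervised loss summarized in the abstract.
# Assumptions: mixup(), semisupervised_loss(), the Beta(0.75, 0.75) mixing weight,
# and softmax label guessing are illustrative, not taken from the paper's code.
import torch
import torch.nn.functional as F


def mixup(x1, y1, x2, y2, alpha=0.75):
    # Mix two batches and their (soft) labels with a Beta-drawn weight; keeping the
    # first argument dominant preserves the character of its labels in the mixture.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    lam = max(lam, 1.0 - lam)
    return lam * x1 + (1.0 - lam) * x2, lam * y1 + (1.0 - lam) * y2


def semisupervised_loss(model, x_lab, y_onehot, x_unlab, x_aug, lambda_u=1.0):
    # Self-consistent augmentation rule: augmented copies share the label guessed
    # for their unlabeled originals, so the unlabeled data can be utilized.
    with torch.no_grad():
        p_guess = torch.softmax(model(x_unlab), dim=1)

    # Pool unlabeled and augmented samples, then mix them with the labeled batch
    # (mixup-based mixture); assumes the pool is at least as large as x_lab.
    pool_x = torch.cat([x_unlab, x_aug], dim=0)
    pool_y = torch.cat([p_guess, p_guess], dim=0)
    perm = torch.randperm(pool_x.size(0))

    x_mix_l, y_mix_l = mixup(x_lab, y_onehot,
                             pool_x[perm[: x_lab.size(0)]], pool_y[perm[: x_lab.size(0)]])
    x_mix_u, y_mix_u = mixup(pool_x, pool_y, pool_x[perm], pool_y[perm])

    # Cross-entropy for the mixed-labeled mixtures, mean-squared error for the
    # mixed-unlabeled mixtures, combined as a weighted sum.
    loss_l = -(y_mix_l * F.log_softmax(model(x_mix_l), dim=1)).sum(dim=1).mean()
    loss_u = F.mse_loss(torch.softmax(model(x_mix_u), dim=1), y_mix_u)
    return loss_l + lambda_u * loss_u
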
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2020.3013968