SAR ATR of Ground Vehicles Based on LM-BN-CNN
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2018-12, Vol. 56 (12), pp. 7282-7293
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: In recent studies, synthetic aperture radar (SAR) automatic target recognition (ATR) algorithms based on convolutional neural networks (CNNs) have achieved high recognition rates on the moving and stationary target acquisition and recognition (MSTAR) data set. However, these algorithms ignore the correlation between clutter in the training and test sets: although most of them use only the center part of each image, this removes much of the clutter but not all of it, which may yield better performance than would be achieved in operational scenarios. To tackle this problem, we propose a target segmentation method based on morphological operations to generate data sets without clutter. We then design the large-margin softmax (LM-softmax) batch-normalization CNN (LM-BN-CNN) structure, which uses the LM-softmax classifier in the last layer to increase the separability of samples after clutter removal. In addition, this structure performs BN with constant mean and variance to increase convergence speed and reduce overfitting. Experiments on the MSTAR data set show that LM-BN-CNN outperforms existing CNNs designed for SAR ATR of ground vehicles and is robust to large depression-angle variation, configuration variants, and version variants.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2018.2849967
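The abstract describes the morphological target segmentation only at a high level. As a rough sketch (not the paper's actual pipeline), clutter around a bright target can be suppressed by thresholding the amplitude image and applying a morphological opening followed by a closing; the percentile threshold and the 3x3 structuring element below are illustrative assumptions, and the helper names are hypothetical:

```python
import numpy as np

def erode3x3(mask):
    # 3x3 binary erosion: a pixel survives only if its full 3x3
    # neighborhood (with False padding at the borders) is set.
    h, w = mask.shape
    p = np.pad(mask, 1, constant_values=False)
    out = np.ones((h, w), dtype=bool)
    for di in (0, 1, 2):
        for dj in (0, 1, 2):
            out &= p[di:di + h, dj:dj + w]
    return out

def dilate3x3(mask):
    # 3x3 binary dilation: a pixel is set if any pixel in its
    # 3x3 neighborhood is set.
    h, w = mask.shape
    p = np.pad(mask, 1, constant_values=False)
    out = np.zeros((h, w), dtype=bool)
    for di in (0, 1, 2):
        for dj in (0, 1, 2):
            out |= p[di:di + h, dj:dj + w]
    return out

def segment_target(img, pct=90):
    # Initial mask from an amplitude threshold (percentile is an
    # assumption, not the paper's value).
    mask = img > np.percentile(img, pct)
    mask = dilate3x3(erode3x3(mask))  # opening: removes isolated clutter pixels
    mask = erode3x3(dilate3x3(mask))  # closing: fills small holes in the target
    return np.where(mask, img, 0.0)
```

On a synthetic image with a bright 6x6 target and a single bright clutter pixel, the opening removes the isolated pixel while the target region survives intact; in practice one would also keep only the largest connected component near the image center, as MSTAR chips are target-centered.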