Dense connection and depthwise separable convolution based CNN for polarimetric SAR image classification


Full description

Bibliographic details
Published in: Knowledge-Based Systems, 2020-04, Vol. 194, p. 105542, Article 105542
Authors: Shang, Ronghua; He, Jianghai; Wang, Jiaming; Xu, Kaiming; Jiao, Licheng; Stolkin, Rustam
Format: Article
Language: English
Online access: Full text
Description
Abstract: Convolutional neural networks (CNNs) have achieved great success in natural image processing, where large amounts of training data are available. However, for the polarimetric synthetic aperture radar (PolSAR) image classification problem, the number of labeled training samples is typically limited. To improve the performance of CNNs on limited training data, we propose a new network, the densely connected and depthwise separable convolutional neural network (DSNet). Motivated by the characteristics of PolSAR data, DSNet replaces standard convolution with depthwise separable convolution, which extracts features independently over each channel of the PolSAR images. DSNet also introduces dense connections that directly link non-adjacent layers. With depthwise separable convolution and dense connections, DSNet avoids extracting redundant features, reuses the hierarchical feature maps of PolSAR images, and reduces the number of training parameters. Compared with a standard CNN, DSNet is more lightweight: its number of training parameters falls to less than 1/9. We compare DSNet against several popular algorithms on three different data sets and show that DSNet achieves better results while using fewer training samples.
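The parameter savings claimed above follow from simple counting: a standard convolution couples the spatial kernel with channel mixing, while a depthwise separable convolution splits them into a per-channel spatial filter plus a 1x1 pointwise mix. A minimal sketch of that arithmetic, with illustrative channel and kernel sizes (not taken from the paper):

```python
def standard_conv_params(k, c_in, c_out):
    # One k*k kernel per (input channel, output channel) pair:
    # spatial filtering and channel mixing are entangled.
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    # Depthwise step: one k*k filter per input channel (channels
    # processed independently), then a pointwise 1x1 convolution
    # that mixes channels.
    return k * k * c_in + c_in * c_out

# Illustrative layer: 3x3 kernel, 64 input and 64 output channels.
k, c_in, c_out = 3, 64, 64
std = standard_conv_params(k, c_in, c_out)        # 36864 parameters
dws = depthwise_separable_params(k, c_in, c_out)  # 4672 parameters
print(f"standard: {std}, separable: {dws}, ratio: {std / dws:.2f}x")
```

In general the separable layer uses a fraction 1/c_out + 1/k^2 of the standard layer's parameters, so for a 3x3 kernel and moderately wide layers the reduction approaches 9x per layer; the paper's overall <1/9 figure also reflects feature reuse through the dense connections.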
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2020.105542