Skip-Connected Covariance Network for Remote Sensing Scene Classification

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2020-05, Vol. 31 (5), p. 1461-1474
Main Authors: He, Nanjun, Fang, Leyuan, Li, Shutao, Plaza, Javier, Plaza, Antonio
Format: Article
Language: English
Description
Abstract: This paper proposes a novel end-to-end learning model, called skip-connected covariance (SCCov) network, for remote sensing scene classification (RSSC). The innovative contribution of this paper is to embed two novel modules into the traditional convolutional neural network (CNN) model, i.e., skip connections and covariance pooling. The advantages of the newly developed SCCov are twofold. First, by means of the skip connections, the multi-resolution feature maps produced by the CNN are combined together, which provides important benefits to address the presence of large-scale variance in RSSC data sets. Second, by using covariance pooling, we can fully exploit the second-order information contained in such multi-resolution feature maps. This allows the CNN to achieve more representative feature learning when dealing with RSSC problems. Experimental results, conducted using three large-scale benchmark data sets, demonstrate that our newly proposed SCCov network exhibits very competitive or superior classification performance when compared with the current state-of-the-art RSSC techniques, while using a much smaller number of parameters. Specifically, our SCCov only needs 10% of the parameters used by its counterparts.
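To make the two modules described in the abstract concrete, the snippet below is a minimal NumPy sketch of covariance pooling applied to channel-stacked multi-resolution feature maps. It is an illustration under stated assumptions, not the authors' SCCov implementation: the function name `covariance_pooling`, the variable `stage_maps`, the stage count, and the channel/spatial shapes are all hypothetical, and the concatenation-based fusion only approximates the paper's skip-connection scheme.

```python
import numpy as np

def covariance_pooling(feature_map):
    """Second-order (covariance) pooling of a CNN feature map.

    feature_map: array of shape (C, H, W), i.e. C channels on an HxW grid.
    Returns the upper triangle of the C x C channel covariance matrix,
    a fixed-length descriptor of second-order channel interactions.
    """
    c, h, w = feature_map.shape
    x = feature_map.reshape(c, h * w)       # one row per channel
    x = x - x.mean(axis=1, keepdims=True)   # center each channel
    cov = (x @ x.T) / (h * w - 1)           # C x C sample covariance
    iu = np.triu_indices(c)                 # covariance is symmetric, so the
    return cov[iu]                          # upper triangle carries everything

# Hypothetical multi-resolution maps from three CNN stages, assumed already
# resampled to a common spatial size; stacking them along the channel axis
# mimics skip-connection fusion before second-order pooling.
stage_maps = [np.random.randn(64, 7, 7) for _ in range(3)]
fused = np.concatenate(stage_maps, axis=0)   # shape (192, 7, 7)
descriptor = covariance_pooling(fused)       # shape (192 * 193 / 2,) = (18528,)
```

In this sketch, the pooled descriptor has a fixed length regardless of the input's spatial resolution, so a lightweight classifier head can consume it directly; this is one plausible way such a design economizes on parameters compared with wide fully connected layers over flattened feature maps.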
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2019.2920374