Adversarial Remote Sensing Scene Classification Based on Lie Group Feature Learning

Bibliographic details
Published in: Remote Sensing (Basel, Switzerland), 2023-02, Vol. 15 (4), p. 914
Authors: Xu, Chengjun; Shu, Jingqian; Zhu, Guobin
Format: Article
Language: English
Online access: Full text
Description
Abstract: Convolutional Neural Networks have been widely used in remote sensing scene classification. Since this kind of model needs a large number of training samples containing data category information, a Generative Adversarial Network (GAN) is usually used to address the lack of samples. However, a GAN mainly generates scene data samples that do not contain category information. To address this problem, a novel supervised adversarial Lie Group feature learning network is proposed. Even with limited data samples, the model can effectively generate data samples with category information. There are two main differences between our method and the traditional GAN. First, our model takes both category information and data samples as input and enforces a category-information constraint in the loss function, so that data samples containing category information can be generated. Second, an object-scale sample generation strategy is introduced, which generates data samples at different scales and ensures that the generated samples contain richer feature information. Large-scale experiments on two publicly available and challenging datasets show that our method achieves better scene classification accuracy even with limited data samples.
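
The approach described in the abstract is, in essence, a label-conditioned adversarial network with a category constraint added to the loss. The following is a minimal PyTorch sketch of that general idea, assuming an AC-GAN-style setup; the class count, image size, layer widths, and loss weighting are illustrative assumptions rather than the authors' actual architecture, and the object-scale sample generation strategy is not shown.

# Sketch of a label-conditioned adversarial network with a category term in the
# generator loss. All sizes below are assumptions for illustration only.
import math
import torch
import torch.nn as nn

NUM_CLASSES = 30          # assumed number of scene categories
LATENT_DIM = 100          # assumed noise dimension
IMG_SHAPE = (3, 64, 64)   # assumed sample size
IMG_DIM = math.prod(IMG_SHAPE)

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.label_emb = nn.Embedding(NUM_CLASSES, NUM_CLASSES)
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 512), nn.ReLU(),
            nn.Linear(512, 1024), nn.ReLU(),
            nn.Linear(1024, IMG_DIM), nn.Tanh(),
        )

    def forward(self, z, labels):
        # Concatenating the embedded label with the noise conditions the
        # generated sample on its category.
        x = torch.cat([z, self.label_emb(labels)], dim=1)
        return self.net(x).view(-1, *IMG_SHAPE)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Flatten(),
            nn.Linear(IMG_DIM, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 256), nn.LeakyReLU(0.2),
        )
        self.adv_head = nn.Linear(256, 1)            # real vs. fake score
        self.cls_head = nn.Linear(256, NUM_CLASSES)  # predicted category

    def forward(self, img):
        h = self.backbone(img)
        return self.adv_head(h), self.cls_head(h)

def generator_loss(disc, fake_imgs, labels):
    # Adversarial term (fool the discriminator) plus the category-information
    # constraint: the fake sample must be classified as its conditioning label.
    adv_out, cls_out = disc(fake_imgs)
    adv_loss = nn.functional.binary_cross_entropy_with_logits(
        adv_out, torch.ones_like(adv_out))
    cls_loss = nn.functional.cross_entropy(cls_out, labels)
    return adv_loss + cls_loss

# Example usage with a small random batch:
G, D = Generator(), Discriminator()
z = torch.randn(8, LATENT_DIM)
labels = torch.randint(0, NUM_CLASSES, (8,))
loss = generator_loss(D, G(z, labels), labels)

Because the generator is penalized whenever its output is not recognized as the conditioning class, the generated samples carry the category information that a plain, unconditioned GAN would lack.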
ISSN: 2072-4292
DOI: 10.3390/rs15040914