Open Set Few-Shot Remote Sensing Scene Classification Based on a Multiorder Graph Convolutional Network and Domain Adaptation
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-17
Main authors:
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: Few-shot scene classification aims to recognize task data, which consists of unlabeled data and a few annotated samples, given some labeled auxiliary data. The task and the auxiliary data often come from the same source with nonoverlapping categories; such tasks are defined as closed set few-shot scene classification. However, most remote sensing applications involve open set few-shot scene classification, where the categories of the task and auxiliary data often partly overlap and the data sources usually differ (i.e., different domains). We propose an open set few-shot scene classification method, the multiorder graph convolutional network (MGCN), specifically for such tasks. MGCN addresses the open set few-shot scene classification task in two ways: reducing the interdomain discrepancy with a feature dispersion degree weighting domain adaptation method, and reducing the dispersion degree of node features with the proposed multiorder graph convolution. Built on a shallow backbone and an improved graph convolutional network, MGCN improves classification performance on the open set few-shot task by exploiting information from common categories and reducing domain discrepancies. Experimental results on three public remote sensing image datasets and a Tibetan Plateau remote sensing scene dataset that we collected demonstrate the effectiveness of the proposed method on the open set few-shot scene classification task.
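The record gives no implementation details, but as a rough illustration of the "multiorder" idea mentioned in the abstract, the sketch below shows one plausible way a multi-order graph convolution layer could aggregate node features over several powers of a normalized adjacency matrix. The class name `MultiOrderGraphConv`, the per-order linear transforms, and all dimensions are assumptions for illustration and are not taken from the paper.

```python
import torch
import torch.nn as nn

class MultiOrderGraphConv(nn.Module):
    """Toy multi-order graph convolution: node features are propagated over
    several powers of a normalized adjacency matrix and the per-order
    transforms are summed (an assumption, not the paper's exact layer)."""

    def __init__(self, in_dim, out_dim, num_orders=2):
        super().__init__()
        self.num_orders = num_orders
        # one linear transform per propagation order (hypothetical design)
        self.transforms = nn.ModuleList(
            [nn.Linear(in_dim, out_dim, bias=False) for _ in range(num_orders)]
        )

    def forward(self, x, adj):
        # x:   (num_nodes, in_dim) node features from the backbone
        # adj: (num_nodes, num_nodes) symmetrically normalized adjacency
        out = torch.zeros(x.size(0), self.transforms[0].out_features)
        h = x
        for k in range(self.num_orders):
            h = adj @ h                      # one more propagation hop
            out = out + self.transforms[k](h)
        return torch.relu(out)

# Toy usage on a random episode graph of 10 nodes with 64-dim features.
x = torch.randn(10, 64)
a = torch.rand(10, 10)
a = (a + a.t()) / 2                                  # symmetric affinity matrix
deg = a.sum(dim=1)
adj = a / torch.sqrt(deg[:, None] * deg[None, :])    # D^{-1/2} A D^{-1/2}
layer = MultiOrderGraphConv(64, 32, num_orders=2)
print(layer(x, adj).shape)                           # torch.Size([10, 32])
```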
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3222449