Prototype-Guided Multitask Adversarial Network for Cross-Domain LiDAR Point Clouds Semantic Segmentation
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023, Vol. 61, pp. 1-13
Main authors: (not listed in this record)
Format: Article
Language: English
Keywords: (not listed in this record)
Online access: Order full text
Abstract: Unsupervised domain adaptation (UDA) segmentation aims to leverage labeled source data to make accurate predictions on unlabeled target data. The key is to make the segmentation network learn domain-invariant representations. In this work, we propose a prototype-guided multitask adversarial network (PMAN) to achieve this. First, we propose an intensity-aware segmentation network (IAS-Net) that leverages the private intensity information of the target data to substantially facilitate feature learning of the target domain. Second, a category-level cross-domain feature alignment strategy is introduced to avoid the side effects of global feature alignment. It employs the prototype (class centroid) and includes two essential operations: 1) build an auxiliary nonparametric classifier to evaluate the semantic alignment degree of each point based on the prediction consistency between the main and auxiliary classifiers and 2) introduce two class-conditional point-to-prototype learning objectives for better alignment. One explicitly performs category-level feature alignment in a progressive manner, and the other shapes the source feature representation to be discriminative. Extensive experiments show that our PMAN outperforms state-of-the-art methods on two benchmark datasets. (An illustrative sketch of the prototype-based alignment step follows this record.)
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3234542
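
As a rough illustration of the category-level alignment described in the abstract, the sketch below computes class prototypes (centroids) from labeled source features, uses them as an auxiliary nonparametric classifier, and pulls target points whose main and auxiliary predictions agree toward their class prototype. This is a minimal sketch under assumed interfaces, not the authors' implementation; all names (compute_prototypes, prototype_predictions, point_to_prototype_loss, the feature and label tensors) and the specific choices of cosine similarity and an MSE pull term are assumptions for illustration.

```python
# Minimal sketch (PyTorch), assuming per-point features of shape [N, D],
# source labels of shape [N], and main-classifier logits of shape [N, C].
# Names and design choices here are illustrative, not taken from the paper's code.
import torch
import torch.nn.functional as F

def compute_prototypes(feats_src, labels_src, num_classes):
    """Class prototypes = centroids of labeled source features."""
    protos = torch.zeros(num_classes, feats_src.size(1), device=feats_src.device)
    for c in range(num_classes):
        mask = labels_src == c
        if mask.any():
            protos[c] = feats_src[mask].mean(dim=0)
    return protos

def prototype_predictions(feats, prototypes):
    """Auxiliary nonparametric classifier: nearest prototype by cosine similarity."""
    sim = F.normalize(feats, dim=1) @ F.normalize(prototypes, dim=1).t()  # [N, C]
    return sim.argmax(dim=1)

def point_to_prototype_loss(feats_tgt, logits_tgt, prototypes):
    """Pull target points toward their class prototype, but only for points whose
    main-classifier and prototype-classifier predictions agree (a proxy for the
    'semantic alignment degree' used to select reliable points)."""
    main_pred = logits_tgt.argmax(dim=1)
    aux_pred = prototype_predictions(feats_tgt, prototypes)
    consistent = main_pred == aux_pred
    if not consistent.any():
        return feats_tgt.new_zeros(())
    assigned_protos = prototypes[main_pred[consistent]]  # [M, D] prototype per point
    return F.mse_loss(feats_tgt[consistent], assigned_protos)
```

In training, such a loss would be added to the usual segmentation and adversarial losses, with prototypes updated progressively (for example, as an exponential moving average over batches). The abstract's second objective, shaping source features to be discriminative, could analogously pull labeled source features toward their ground-truth class prototypes.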