CTMNet: Enhanced Open-Pit Mine Extraction and Change Detection With a Hybrid CNN-Transformer Multitask Network
| Published in: | IEEE Transactions on Geoscience and Remote Sensing, 2024, Vol. 62, pp. 1-19 |
|---|---|
| Main authors: | , , , , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
| Abstract: | Automatic open-pit mine extraction and change detection from high-resolution remote sensing images are of great importance to mineral resource management. However, the high spatial heterogeneity and spectral variations of mining-area scenarios make these tasks challenging. Motivated by the strong correlation between the two tasks and their potential mutual benefits, this article presents a hybrid convolutional neural network (CNN)-Transformer multitask network (CTMNet). Constructed in an encoder-decoder manner, CTMNet has two separate extraction paths (EPs) to localize the regions of interest in bi-temporal images, along with a change detection path (CDP) that identifies discrepancies by differencing the multiscale feature representations from the EPs. As the basic building block of the EP, a CNN-Transformer hybrid block is designed to enhance the global and local feature representation capacity. To cope with the variations between the bi-temporal images, we propose a feature alignment module for the CDP. A hard sample mining-based contrastive constraint loss is proposed to emphasize the contribution of hard samples to the training process. Experimental results on a collected open-pit mine extraction and change detection dataset (OMECSet) and two public datasets demonstrate the validity of CTMNet compared to state-of-the-art methods. OMECSet and the code of CTMNet have been made publicly available at https://figshare.com/s/80519cb980ca54456447 . |
| ISSN: | 0196-2892, 1558-0644 |
| DOI: | 10.1109/TGRS.2024.3492715 |
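The hard sample mining-based contrastive constraint loss mentioned in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under assumed conventions, not the paper's exact formulation: the margin-based contrastive form, the top-k mining of unchanged-pixel distances, and the `neg_ratio` parameter are all our assumptions.

```python
import numpy as np

def hard_mining_contrastive_loss(dist, label, margin=2.0, neg_ratio=3):
    """Hypothetical contrastive loss with hard sample mining for change detection.

    dist:  per-pixel feature distances between the bi-temporal feature maps
    label: 1 = changed pixel, 0 = unchanged pixel
    """
    dist = np.asarray(dist, dtype=float)
    label = np.asarray(label)

    pos = dist[label == 1]  # changed pixels: pushed apart up to the margin
    neg = dist[label == 0]  # unchanged pixels: pulled together

    pos_loss = np.maximum(margin - pos, 0.0) ** 2
    neg_loss = neg ** 2

    # Hard sample mining (assumed scheme): unchanged pixels usually dominate,
    # so keep only the k largest unchanged-pixel losses (the hard samples).
    k = min(neg_loss.size, max(1, neg_ratio * max(pos_loss.size, 1)))
    hard_neg = np.sort(neg_loss)[-k:]

    return float(np.concatenate([pos_loss, hard_neg]).mean())
```

With a stricter `neg_ratio`, the averaged loss concentrates on the hardest unchanged pixels, which is the intended emphasis on hard samples during training.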