Triplet-Based Semantic Relation Learning for Aerial Remote Sensing Image Change Detection
Published in: IEEE Geoscience and Remote Sensing Letters, 2019-02, Vol. 16 (2), pp. 266-270
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: This letter presents a novel supervised change detection method for optical aerial images, based on a deep siamese semantic network trained with an improved triplet loss function. The proposed framework not only extracts features directly from image pairs that capture multiscale information and are more abstract and robust, but also enhances interclass separability and intraclass inseparability by learning semantic relations: feature vectors of pixel pairs with the same label are drawn closer together, while feature vectors of pixels with different labels are pushed farther apart. The distance between the feature maps of the image pair yields a difference map on which changes are detected, and a binarized change map is obtained by simple thresholding. Experiments on an optical aerial image data set show that the proposed approach produces results comparable to, and in some cases better than, state-of-the-art methods in terms of F-measure.
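The two core ideas in the abstract, a triplet objective that pulls same-label feature vectors together and pushes different-label ones apart, and a per-pixel feature-distance map binarized by a threshold, can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the margin, and the threshold value are all illustrative assumptions.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: squared distance to the same-label
    (positive) sample should be smaller than the distance to the
    different-label (negative) sample by at least `margin`."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(d_pos - d_neg + margin, 0.0)

def change_map(feat_a, feat_b, threshold):
    """Per-pixel Euclidean distance between two feature maps of
    shape (H, W, C), binarized by a simple threshold to yield a
    0/1 change map, as the abstract describes."""
    dist = np.sqrt(np.sum((feat_a - feat_b) ** 2, axis=-1))
    return (dist > threshold).astype(np.uint8)

# Toy example: identical 4x4 feature maps with 8 channels,
# except one "changed" pixel whose features are shifted.
rng = np.random.default_rng(0)
fa = rng.normal(size=(4, 4, 8))
fb = fa.copy()
fb[0, 0] += 5.0                      # simulate a change at pixel (0, 0)
cm = change_map(fa, fb, threshold=3.0)  # only (0, 0) exceeds the threshold
```

With a well-trained network, unchanged pixels yield nearly identical feature vectors (distance near zero), so the threshold cleanly separates the changed pixel from the rest, which is why a single scalar threshold suffices for binarization.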
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2018.2869608