Explicit Change-Relation Learning for Change Detection in VHR Remote Sensing Images

Change detection is an important task in the interpretation of remote sensing images. The relationships among change features are usually mined implicitly by deep learning networks with single-branch or two-branch encoders. However, lacking a handcrafted prior design for these relationships, such networks cannot learn enough semantic information about change features, which leads to poor performance. We therefore propose a new network architecture, the explicit change-relation network (ECRNet), for explicit mining of change-relation features. Based on our study of the literature, we suggest that the change features for change detection should be divided into prechanged image features, postchanged image features, and change-relation features. To fully mine these three kinds of change features, we propose a triple-branch network that combines a transformer and a convolutional neural network (CNN) to extract and fuse them from the two perspectives of global and local information, respectively. In addition, we design a continuous change-relation (CCR) branch to further obtain continuous and detailed change-relation features and improve the change discrimination capability of the model. Experimental results show that our network outperforms existing advanced networks, with F1-score improvements of 0.66/0.37/0.70/1.09 on the very high-resolution (VHR) remote sensing datasets LEVIR-CD/SVCD/WHU-CD/SYSU-CD. Our source code is available at https://github.com/DalongZ/ECRNet .
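The abstract's central idea — dividing change features into prechanged, postchanged, and change-relation features, each with its own encoder branch — can be sketched minimally. This is a hypothetical NumPy illustration of the triple-branch decomposition, not the authors' ECRNet implementation: the `branch` function, the toy sizes `d` and `h`, and all weight names are assumptions standing in for the real CNN/transformer branches.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch(x, w):
    # Stand-in for one encoder branch (a CNN or transformer in the paper):
    # here just a linear projection followed by a nonlinearity.
    return np.tanh(x @ w)

d, h = 8, 4                              # toy feature / hidden sizes (assumed)
w_pre = rng.standard_normal((d, h))      # prechanged-image branch weights
w_post = rng.standard_normal((d, h))     # postchanged-image branch weights
w_rel = rng.standard_normal((2 * d, h))  # relation branch sees both images

x_pre = rng.standard_normal(d)           # features of the prechanged image
x_post = rng.standard_normal(d)          # features of the postchanged image

f_pre = branch(x_pre, w_pre)
f_post = branch(x_post, w_post)
# The explicit change-relation branch operates on the paired input,
# rather than leaving the relation implicit in a later fusion step.
f_rel = branch(np.concatenate([x_pre, x_post]), w_rel)

fused = np.concatenate([f_pre, f_post, f_rel])  # would feed a change decoder
print(fused.shape)  # (12,)
```

The point of the sketch is only the wiring: the third branch takes both images jointly, so relation information is extracted explicitly instead of being left to emerge from fusing two independent encodings.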

Detailed Description

Bibliographic Details
Published in: IEEE geoscience and remote sensing letters 2024, Vol.21, p.1-5
Main Authors: Zheng, Dalong, Wu, Zebin, Liu, Jia, Xu, Yang, Hung, Chih-Cheng, Wei, Zhihui
Format: Article
Language: English
Subjects:
Online Access: Order full text
container_end_page 5
container_issue
container_start_page 1
container_title IEEE geoscience and remote sensing letters
container_volume 21
creator Zheng, Dalong
Wu, Zebin
Liu, Jia
Xu, Yang
Hung, Chih-Cheng
Wei, Zhihui
description Change detection is an important task in the interpretation of remote sensing images. The relationships among change features are usually mined implicitly by deep learning networks with single-branch or two-branch encoders. However, lacking a handcrafted prior design for these relationships, such networks cannot learn enough semantic information about change features, which leads to poor performance. We therefore propose a new network architecture, the explicit change-relation network (ECRNet), for explicit mining of change-relation features. Based on our study of the literature, we suggest that the change features for change detection should be divided into prechanged image features, postchanged image features, and change-relation features. To fully mine these three kinds of change features, we propose a triple-branch network that combines a transformer and a convolutional neural network (CNN) to extract and fuse them from the two perspectives of global and local information, respectively. In addition, we design a continuous change-relation (CCR) branch to further obtain continuous and detailed change-relation features and improve the change discrimination capability of the model. Experimental results show that our network outperforms existing advanced networks, with F1-score improvements of 0.66/0.37/0.70/1.09 on the very high-resolution (VHR) remote sensing datasets LEVIR-CD/SVCD/WHU-CD/SYSU-CD. Our source code is available at https://github.com/DalongZ/ECRNet .
doi_str_mv 10.1109/LGRS.2024.3366981
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1545-598X
ispartof IEEE geoscience and remote sensing letters, 2024, Vol.21, p.1-5
issn 1545-598X
1558-0571
language eng
recordid cdi_proquest_journals_2938023609
source IEEE Electronic Library (IEL)
subjects Artificial neural networks
Change detection
change-relation features
Convolution
convolutional neural network (CNN)
Data mining
Decoding
Deep learning
Design
Detection
Feature extraction
Fuses
Machine learning
Neural networks
Remote sensing
Semantics
Source code
transformer
Transformers
title Explicit Change-Relation Learning for Change Detection in VHR Remote Sensing Images
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-24T08%3A23%3A54IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Explicit%20Change-Relation%20Learning%20for%20Change%20Detection%20in%20VHR%20Remote%20Sensing%20Images&rft.jtitle=IEEE%20geoscience%20and%20remote%20sensing%20letters&rft.au=Zheng,%20Dalong&rft.date=2024&rft.volume=21&rft.spage=1&rft.epage=5&rft.pages=1-5&rft.issn=1545-598X&rft.eissn=1558-0571&rft.coden=IGRSBY&rft_id=info:doi/10.1109/LGRS.2024.3366981&rft_dat=%3Cproquest_RIE%3E2938023609%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2938023609&rft_id=info:pmid/&rft_ieee_id=10439252&rfr_iscdi=true