Multi-Scale Dual-Domain Guidance Network for Pan-sharpening

The goal of pan-sharpening is to produce a high-spatial-resolution multi-spectral (HRMS) image from a low-spatial-resolution multi-spectral (LRMS) counterpart by super-resolving the LRMS image under the guidance of a texture-rich panchromatic (PAN) image. Existing research has concentrated on using spatial information to generate HRMS images but has neglected the frequency domain, which severely restricts performance improvement. In this work, we propose a novel pan-sharpening approach, the Multi-Scale Dual-Domain Guidance Network (MSDDN), which fully explores and exploits the distinct information in both the spatial and frequency domains. Specifically, the network adopts a multi-scale U-shaped design and is composed of two core parts: a spatial guidance sub-network that fuses local spatial information and a frequency guidance sub-network that fuses global frequency-domain information and encourages dual-domain complementary learning. In this way, the model can capture multi-scale dual-domain information that helps it generate high-quality pan-sharpening results. Applying the proposed model to different datasets, the quantitative and qualitative results demonstrate that our method performs favorably against other state-of-the-art approaches and generalizes well to real-world scenes. The source code is available at https://github.com/alexhe101/MSDDN.
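The frequency-guidance idea the abstract describes can be illustrated with a minimal Fourier amplitude/phase mixing step. This is a sketch of the general technique only, not the MSDDN architecture; the function name `frequency_guidance_fusion` and the blend weight `alpha` are hypothetical names introduced here for illustration.

```python
import numpy as np

def frequency_guidance_fusion(lrms_up, pan, alpha=0.5):
    """Fuse an upsampled LRMS cube (bands, H, W) with a PAN image (H, W)
    in the Fourier domain: borrow a fraction of the PAN amplitude
    spectrum (global texture) while keeping each band's own phase
    (spectral/structural identity). Illustrative sketch only."""
    pan_amp = np.abs(np.fft.fft2(pan))
    fused = np.empty_like(lrms_up)
    for b in range(lrms_up.shape[0]):          # per spectral band
        band_fft = np.fft.fft2(lrms_up[b])
        phase = np.angle(band_fft)
        # Convex mix of amplitudes: alpha controls how much PAN detail
        # is injected; the band's phase is preserved unchanged.
        amp = (1 - alpha) * np.abs(band_fft) + alpha * pan_amp
        fused[b] = np.real(np.fft.ifft2(amp * np.exp(1j * phase)))
    return fused
```

With `alpha = 0` the round trip reproduces the input band; larger values transplant more of the PAN image's global frequency content into every band, which is the complementary role the frequency branch plays alongside local spatial fusion.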

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-05, p. 1-1
Main authors: He, Xuanhua; Yan, Keyu; Zhang, Jie; Li, Rui; Xie, Chengjun; Zhou, Man; Hong, Danfeng
Format: Article
Language: English
DOI: 10.1109/TGRS.2023.3273334
ISSN: 0196-2892
Source: IEEE Electronic Library (IEL)
Subjects:
Convolution
Feature extraction
Fourier transforms
Frequency-domain analysis
Multiresolution analysis
Pan-sharpening
Spatial resolution
Spatial-frequency domain
Superresolution