Mun-GAN: A Multi-scale Unsupervised Network for Remote Sensing Image Pansharpening

Pansharpening is a remote sensing image fusion method that fuses panchromatic (PAN) images and multispectral (MS) images to produce high-resolution multispectral (HRMS) images. Deep learning-based pansharpening technology offers a series of advanced unsupervised algorithms.

Detailed description

Saved in:
Bibliographic details
Published in: IEEE transactions on geoscience and remote sensing 2023-06, p.1-1
Main authors: Liu, Xiaobo, Liu, Xiang, Dai, Haoran, Kan, Xudong, Plaza, Antonio, Zu, Wenjie
Format: Article
Language: eng
Subjects:
Online access: Order full text
container_end_page 1
container_issue
container_start_page 1
container_title IEEE transactions on geoscience and remote sensing
container_volume
creator Liu, Xiaobo
Liu, Xiang
Dai, Haoran
Kan, Xudong
Plaza, Antonio
Zu, Wenjie
description Pansharpening is a remote sensing image fusion method that fuses panchromatic (PAN) images and multispectral (MS) images to produce high-resolution multispectral (HRMS) images. Deep learning-based pansharpening technology offers a series of advanced unsupervised algorithms. However, there are several challenges: (1) the existing unsupervised pansharpening methods only consider the fusion of single-scale features; (2) for the fusion of MS and PAN image feature branches, the existing pansharpening methods are implemented directly by concatenation and summation, without paying attention to critical features or suppressing redundant features; (3) the semantic gap in the long skip connections of the network architecture will create unexpected results. In this paper, we design a multiscale unsupervised architecture based on generative adversarial networks (GANs) for remote sensing image pansharpening (Mun-GAN), which consists of a generator and two discriminators. The generator includes a multi-scale feature extractor (MFE), a self-adaptation weighted fusion (SWF) module, and a nest feature aggregation (NFA) module. First, the MFE is utilized to extract multiscale feature information from the input images and then pass this information to the SWF module for adaptive weight fusion. Then, multiscale features are reconstructed by the NFA module to obtain HRMS images. The two discriminators, a spectral discriminator and a spatial discriminator, are trained adversarially against the generator. Moreover, we design a hybrid loss function to aggregate the multiscale spectral and spatial feature information. Comparisons with other state-of-the-art methods on QuickBird, GaoFen-2, and WorldView-3 images demonstrate that Mun-GAN yields better fusion results.
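The paper's implementation is not reproduced in this record, but the adaptive weight fusion the abstract attributes to the SWF module can be illustrated with a minimal NumPy sketch: at each scale, two learnable scalars are normalized (here by softmax, an assumption) and used to blend the MS and PAN feature maps. All names and the example logit values are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of logits.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def swf_fuse(ms_feat, pan_feat, logits):
    # Normalize two scalars into fusion weights that sum to 1,
    # then blend the MS and PAN feature maps at a single scale.
    w = softmax(np.asarray(logits, dtype=float))
    return w[0] * ms_feat + w[1] * pan_feat

# Toy multiscale features: three levels of a 2x pyramid (hypothetical sizes).
rng = np.random.default_rng(0)
scales = [8, 16, 32]
ms_feats = [rng.standard_normal((s, s)) for s in scales]
pan_feats = [rng.standard_normal((s, s)) for s in scales]
logits_per_scale = [(0.2, -0.1), (0.0, 0.0), (-0.3, 0.5)]  # stand-ins for learned values

fused = [swf_fuse(m, p, l)
         for m, p, l in zip(ms_feats, pan_feats, logits_per_scale)]
```

In the actual network the weights would be produced per-feature by learned layers rather than fixed scalars; this sketch only shows the weighted-blend idea, as opposed to plain concatenation or summation.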
doi_str_mv 10.1109/TGRS.2023.3288073
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 0196-2892
ispartof IEEE transactions on geoscience and remote sensing, 2023-06, p.1-1
issn 0196-2892
language eng
recordid cdi_ieee_primary_10159248
source IEEE Electronic Library (IEL)
subjects Data mining
Deep learning
Feature extraction
Generative adversarial network
Generators
Image fusion
Multiscale
Pansharpening
Remote sensing
Remote sensing image
Spatial resolution
Unsupervised learning
title Mun-GAN: A Multi-scale Unsupervised Network for Remote Sensing Image Pansharpening
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-11T03%3A09%3A08IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Mun-GAN:%20A%20Multi-scale%20Unsupervised%20Network%20for%20Remote%20Sensing%20Image%20Pansharpening&rft.jtitle=IEEE%20transactions%20on%20geoscience%20and%20remote%20sensing&rft.au=Liu,%20Xiaobo&rft.date=2023-06-20&rft.spage=1&rft.epage=1&rft.pages=1-1&rft.issn=0196-2892&rft.coden=IGRSD2&rft_id=info:doi/10.1109/TGRS.2023.3288073&rft_dat=%3Cieee_RIE%3E10159248%3C/ieee_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=10159248&rfr_iscdi=true