Fanet: A deep learning framework for black and odorous water extraction
Black and odorous water (BOW) is a common issue in rapidly urbanizing developing countries. Existing methods for extracting BOW from remote sensing images focus mainly on spectral information and ignore important spatial characteristics like texture, context and orientation. Deep learning has emerged...
Saved in:
Published in: | European journal of remote sensing, 2023-12, Vol. 56 (1) |
Main authors: | Zheng, Guizhou; Zhao, Yingying; Pan, Zixuan; Chen, Zhixing; Qiu, Zhonghang; Zheng, Tingting |
Format: | Article |
Language: | eng |
Subjects: | black and odorous water; Datasets; Deep learning; Developing countries; fully convolutional network; generative adversarial network; LDCs; Pixels; Reflectance; Remote sensing; remote sensing images; Semantic segmentation; Spectral reflectance |
Online access: | Full text |
container_issue | 1 |
container_title | European journal of remote sensing |
container_volume | 56 |
creator | Zheng, Guizhou; Zhao, Yingying; Pan, Zixuan; Chen, Zhixing; Qiu, Zhonghang; Zheng, Tingting |
description | Black and odorous water (BOW) is a common issue in rapidly urbanizing developing countries. Existing methods for extracting BOW from remote sensing images focus mainly on spectral information and ignore important spatial characteristics like texture, context and orientation. Deep learning has emerged as a powerful approach for BOW extraction, but its effectiveness is hindered by the limited amount of labeled data and the small proportion of target objects in each scene. In this paper, we propose a fully convolutional adversarial network (FANet) for end-to-end pixel-level semantic segmentation of BOW. FANet combines a fully convolutional network (FCN) with a larger receptive field and perceptual loss, and employs adversarial learning to enhance training stability in the absence of sufficient data labels. The Normalized Difference BOW Index, which captures the higher spectral reflectance of BOW in the near-infrared band, is used as input to FANet together with RGB. In addition, we create a standard BOW dataset containing 5100 Gaofen-2 image patches of 224 × 224 pixels. Evaluation of FANet on the BOW dataset using intersection over union and F1-score demonstrates its superiority over popular models like FCN, U-net, and Segnet. FANet successfully preserves the integrity, continuity, and boundaries of BOW, achieving superior performance in both the quantity and quality of BOW extraction. |
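The record describes FANet's training scheme only at a high level: an FCN segmenter with a perceptual loss, plus adversarial learning to stabilize training when labels are scarce. The PyTorch sketch below illustrates that general pattern (a segmenter paired with a mask discriminator), not the authors' implementation; the layer shapes, the 0.1 adversarial weight, and the optimizer settings are all assumptions.

```python
# Minimal adversarial-segmentation sketch (illustrative, not FANet itself).
import torch
import torch.nn as nn

segmenter = nn.Sequential(               # stand-in for the FCN generator
    nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 1), nn.Sigmoid(),   # per-pixel BOW probability
)
discriminator = nn.Sequential(           # judges real vs. predicted masks
    nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1), nn.Sigmoid(),
)
bce = nn.BCELoss()
opt_g = torch.optim.Adam(segmenter.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

images = torch.rand(2, 4, 224, 224)      # RGB + index channel (assumed input)
masks = torch.randint(0, 2, (2, 1, 224, 224)).float()

# Discriminator step: push real masks toward 1, predicted masks toward 0.
pred = segmenter(images).detach()
loss_d = bce(discriminator(masks), torch.ones(2, 1)) + \
         bce(discriminator(pred), torch.zeros(2, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Segmenter step: per-pixel loss plus an adversarial term asking the
# discriminator to accept the prediction as real (weight 0.1 is assumed).
pred = segmenter(images)
loss_g = bce(pred, masks) + 0.1 * bce(discriminator(pred), torch.ones(2, 1))
opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```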
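The Normalized Difference BOW Index is supplied to the network alongside the RGB bands to exploit the higher near-infrared reflectance of BOW. The record does not give the index formula, so the NumPy sketch below assumes the standard normalized-difference form (a − b)/(a + b) and an NIR-versus-red band pairing; both the pairing and the Gaofen-2 band order are assumptions.

```python
import numpy as np

def normalized_difference(band_a, band_b, eps=1e-8):
    """Generic normalized-difference index: (a - b) / (a + b)."""
    a = band_a.astype(np.float64)
    b = band_b.astype(np.float64)
    return (a - b) / (a + b + eps)

# Hypothetical 4-band Gaofen-2 patch of 224 x 224 pixels (B, G, R, NIR order
# assumed); random values stand in for real reflectance data.
patch = np.random.rand(4, 224, 224)
blue, green, red, nir = patch
ndbow = normalized_difference(nir, red)               # assumed band pairing
network_input = np.stack([red, green, blue, ndbow])   # RGB + index, 4 channels
```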
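The reported comparison against FCN, U-net, and Segnet uses intersection over union and F1-score, both standard metrics for binary segmentation masks; a minimal NumPy version:

```python
import numpy as np

def iou_score(pred, target):
    """Intersection over union between two binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    return float(np.logical_and(pred, target).sum() / union) if union else 1.0

def f1_score(pred, target):
    """F1 = 2*TP / (2*TP + FP + FN) for binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    tp = np.logical_and(pred, target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()
    denom = 2 * tp + fp + fn
    return float(2 * tp / denom) if denom else 1.0
```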
doi_str_mv | 10.1080/22797254.2023.2234077 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 2279-7254 |
ispartof | European journal of remote sensing, 2023-12, Vol.56 (1) |
issn | 2279-7254 |
language | eng |
recordid | cdi_proquest_journals_2905421940 |
source | Taylor & Francis Open Access; DOAJ Directory of Open Access Journals |
subjects | black and odorous water; Datasets; Deep learning; Developing countries; fully convolutional network; generative adversarial network; LDCs; Pixels; Reflectance; Remote sensing; remote sensing images; Semantic segmentation; Spectral reflectance |
title | Fanet: A deep learning framework for black and odorous water extraction |