Building Footprint Extraction From Unmanned Aerial Vehicle Images Via PRU-Net: Application to Change Detection

As the manual detection of building footprints is inefficient and labor-intensive, this study proposed a method of building footprint extraction and change detection based on deep convolutional neural networks. The study modified the existing U-Net model to develop the "PRU-Net" model.

Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE journal of selected topics in applied earth observations and remote sensing, 2021, Vol. 14, p. 2236-2248
Main authors: Liu, Wei; Xu, Jiawei; Guo, Zihui; Li, Erzhu; Li, Xing; Zhang, Lianpeng; Liu, Wensong
Format: Article
Language: English
Subjects:
Online access: Full text
Description: As the manual detection of building footprints is inefficient and labor-intensive, this study proposed a method of building footprint extraction and change detection based on deep convolutional neural networks. The study modified the existing U-Net model to develop the "PRU-Net" model. PRU-Net incorporates pyramid scene parsing (PSP) for multiscale scene parsing, a residual block (RB) from ResNet for feature extraction, and focal loss to address sample imbalance. Within the proposed method, building footprint extraction proceeds as follows: 1) unmanned aerial vehicle images are cropped, denoised, and semantically labeled, and datasets are created (including training/validation and prediction datasets); 2) the training/validation and prediction datasets are input into the fully convolutional neural network PRU-Net for model training/validation and prediction. Compared with the U-Net, PSP+U-Net (PU-Net), and U-Net++ models, PRU-Net offers improved footprint extraction of buildings with a range of sizes and shapes. The large-scale experimental results demonstrated the effectiveness of the PSP module for multiscale scene analysis and the RB module for feature extraction. After demonstrating the improvements in building extraction offered by PRU-Net, the building footprint results were further processed to generate a building change map.
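The abstract mentions focal loss as the remedy for the class imbalance between building and background pixels. As an illustration only (not the authors' implementation, and with common default hyperparameters rather than values from the paper), here is a minimal NumPy sketch of the standard binary focal loss of Lin et al.:

```python
import numpy as np

def binary_focal_loss(p, y, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss: mean of -alpha_t * (1 - p_t)**gamma * log(p_t).

    p: predicted building probabilities in [0, 1]; y: 0/1 ground-truth mask.
    gamma down-weights easy, confidently classified pixels; alpha re-balances
    the two classes. gamma=2.0 and alpha=0.25 are the usual defaults, not
    values taken from this paper.
    """
    p = np.clip(p, eps, 1.0 - eps)           # guard against log(0)
    pt = np.where(y == 1, p, 1.0 - p)        # probability of the true class
    at = np.where(y == 1, alpha, 1.0 - alpha)
    return float(np.mean(-at * (1.0 - pt) ** gamma * np.log(pt)))
```

With `gamma=0` and `alpha=0.5` this reduces (up to the constant factor 0.5) to ordinary binary cross-entropy; raising `gamma` makes the abundant, easily classified background pixels contribute almost nothing to the loss, which is the imbalance fix the abstract refers to.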
DOI: 10.1109/JSTARS.2021.3052495
Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
ISSN: 1939-1404
EISSN: 2151-1535
Source: Directory of Open Access Journals; EZB Electronic Journals Library
Subjects: Artificial neural networks
Building footprint change detection
Buildings
Change detection
Data mining
Datasets
deep convolutional neural network (DCNN)
Detection
Feature extraction
Image segmentation
Labour
Licenses
Modules
Neural networks
Noise reduction
Predictions
Predictive models
Scene analysis
Semantics
Training
U-Net
unmanned aerial vehicle (UAV) image
Unmanned aerial vehicles
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-20T21%3A48%3A08IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Building%20Footprint%20Extraction%20From%20Unmanned%20Aerial%20Vehicle%20Images%20Via%20PRU-Net:%20Application%20to%20Change%20Detection&rft.jtitle=IEEE%20journal%20of%20selected%20topics%20in%20applied%20earth%20observations%20and%20remote%20sensing&rft.au=Liu,%20Wei&rft.date=2021&rft.volume=14&rft.spage=2236&rft.epage=2248&rft.pages=2236-2248&rft.issn=1939-1404&rft.eissn=2151-1535&rft.coden=IJSTHZ&rft_id=info:doi/10.1109/JSTARS.2021.3052495&rft_dat=%3Cproquest_cross%3E2488745839%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2488745839&rft_id=info:pmid/&rft_ieee_id=9327466&rft_doaj_id=oai_doaj_org_article_31be86322b0a482e8fd814e19e7eadf4&rfr_iscdi=true