BICANet: LiDAR Point Cloud Classification Network Based on Coordinate Attention and Blueprint Separation Involution Neural Network

With the advent of the era of Industry 4.0 and the continuous development of point cloud data acquisition technology, point cloud data have been widely used in the unmanned distribution of intelligent logistics. Applying deep neural networks to accurate light detection and ranging (LiDAR) point cloud classification is considerably significant for unmanned transport. This article designs a 3-D point cloud classification model, the coordinate attention blueprint separation involution neural network (BICANet), with multidimensional feature extraction. First, to extract more point cloud features, 3-D point clouds are projected onto a 2-D plane for calculating point cloud feature values, and multidimensional point cloud feature information is fused from different views. Second, an involution network is introduced to reduce the amount of redundant data in the neural network computation and improve the computational efficiency of the whole network. At the same time, to further enhance the network's feature learning capability, blueprint separation convolution is combined with coordinate attention (CA). Finally, to draw conclusions more rigorously, error analysis experiments were conducted and the generalization ability of the proposed BICANet model was tested. The overall accuracy of BICANet on the Vaihingen and GML_B datasets was experimentally demonstrated to reach 86.0% and 98.8%, respectively, which is highly competitive with currently available methods.
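The abstract names two building blocks that can be expressed compactly in code: a blueprint-separable convolution (a pointwise convolution followed by a depthwise convolution) and a coordinate attention module that re-weights features along the height and width axes of the 2-D projection. The paper itself does not publish an implementation; the Python (PyTorch) sketch below is only an illustrative reconstruction of how such a combined block could look, and all module names, channel widths, and the reduction ratio are assumptions rather than values taken from BICANet.

# Illustrative PyTorch sketch (not the authors' code): a blueprint-separable
# convolution followed by coordinate attention, as combined in BICANet's design.
import torch
import torch.nn as nn


class BlueprintSeparableConv(nn.Module):
    """Pointwise (1x1) convolution followed by a depthwise convolution (BSConv-U style)."""

    def __init__(self, in_ch: int, out_ch: int, kernel_size: int = 3, padding: int = 1):
        super().__init__()
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.depthwise = nn.Conv2d(out_ch, out_ch, kernel_size,
                                   padding=padding, groups=out_ch, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.depthwise(self.pointwise(x))


class CoordinateAttention(nn.Module):
    """Coordinate attention: pool along H and W separately, then build per-axis gates."""

    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))   # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))   # -> (B, C, 1, W)
        self.shared = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
        )
        self.attn_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.attn_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                           # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)       # (B, C, W, 1)
        y = self.shared(torch.cat([x_h, x_w], dim=2))  # (B, mid, H+W, 1)
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.attn_h(y_h))                      # (B, C, H, 1)
        a_w = torch.sigmoid(self.attn_w(y_w.permute(0, 1, 3, 2)))  # (B, C, 1, W)
        return x * a_h * a_w


class BSConvCABlock(nn.Module):
    """Hypothetical block: blueprint-separable convolution followed by coordinate attention."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = BlueprintSeparableConv(in_ch, out_ch)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)
        self.ca = CoordinateAttention(out_ch)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.ca(self.act(self.bn(self.conv(x))))


if __name__ == "__main__":
    # Toy input standing in for a multi-view 2-D projection of LiDAR point cloud features.
    features = torch.randn(2, 16, 64, 64)
    block = BSConvCABlock(16, 32)
    print(block(features).shape)  # torch.Size([2, 32, 64, 64])

The sketch omits the involution layers and the multi-view projection and fusion steps that the abstract also describes; it is intended only to make the CA-plus-blueprint-separable-convolution idea concrete.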

Bibliographic details
Published in: IEEE Sensors Journal, 2023-11, Vol. 23 (22), pp. 27720-27732
Main authors: Zhang, Guodao; Ye, Haiyang; Gao, Xiaoyun; Liu, Ruyu; Tao, Xiuting; Yang, Genfu; Zhou, Jian; Chen, Zhao-Min
Format: Article
Language: English
Subjects: Artificial neural networks; Classification; Classification algorithms; Cloud computing; Convolution; Coordinate attention blueprint separation involution neural network (BICANet); Data acquisition; Deep learning; Error analysis; Feature extraction; Industry 4.0; Laser radar; Lidar; Light detection and ranging (LiDAR) point cloud classification; Machine learning; Model accuracy; Multidimensional features; Neural networks; Point cloud compression; Sensors; Separation; Three dimensional models; Three-dimensional displays
Online access: Order full text
DOI: 10.1109/JSEN.2023.3323047
ISSN: 1530-437X
EISSN: 1558-1748
Source: IEEE Electronic Library (IEL)