Large-Scale Autonomous Flight With Real-Time Semantic SLAM Under Dense Forest Canopy

Semantic maps represent the environment using a set of semantically meaningful objects. This representation is storage-efficient, less ambiguous, and more informative, thus facilitating large-scale autonomy and the acquisition of actionable information in highly unstructured, GPS-denied environments...

Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE robotics and automation letters, 2022-04, Vol. 7 (2), p. 5512-5519
Main authors: Liu, Xu; Nardari, Guilherme V.; Cladera, Fernando; Tao, Yuezhan; Zhou, Alex; Donnelly, Thomas; Qu, Chao; Chen, Steven W.; Romero, Roseli A. F.; Taylor, Camillo J.; Kumar, Vijay
Format: Article
Language: English
Subjects:
Online access: Order full text
Description: Semantic maps represent the environment using a set of semantically meaningful objects. This representation is storage-efficient, less ambiguous, and more informative, thus facilitating large-scale autonomy and the acquisition of actionable information in highly unstructured, GPS-denied environments. In this letter, we propose an integrated system that can perform large-scale autonomous flights and real-time semantic mapping in challenging under-canopy environments. We detect and model tree trunks and ground planes from LiDAR data, which are associated across scans and used to constrain robot poses as well as tree trunk models. The autonomous navigation module utilizes a multi-level planning and mapping framework and computes dynamically feasible trajectories that lead the UAV to build a semantic map of the user-defined region of interest in a computationally and storage efficient manner. A drift-compensation mechanism is designed to minimize the odometry drift using semantic SLAM outputs in real time, while maintaining planner optimality and controller stability. This leads the UAV to execute its mission accurately and safely at scale.
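The abstract describes modeling tree trunks as compact landmarks and associating detections across LiDAR scans. As a rough illustration (not the paper's actual algorithm or data structures), a trunk can be stored as a position-plus-radius cylinder model, and per-scan detections can be matched to map landmarks by gated nearest-neighbor search; the class and function names here are hypothetical:

```python
import math
from dataclasses import dataclass

@dataclass
class TrunkLandmark:
    """Hypothetical cylinder model of a tree trunk: 2D position and radius.
    A few floats per tree is far more compact than storing raw LiDAR points."""
    x: float
    y: float
    radius: float

def associate(detections, landmarks, gate=1.0):
    """Greedy gated nearest-neighbor association of per-scan trunk detections
    to existing map landmarks. Returns (matches, new_landmarks), where each
    match is (detection, landmark_index); detections with no landmark within
    `gate` meters are treated as newly observed trees."""
    matches, new = [], []
    for d in detections:
        best, best_dist = None, gate
        for i, lm in enumerate(landmarks):
            dist = math.hypot(d.x - lm.x, d.y - lm.y)
            if dist < best_dist:
                best, best_dist = i, dist
        if best is None:
            new.append(d)
        else:
            matches.append((d, best))
    return matches, new
```

In a real pipeline the matched pairs would feed the SLAM back end as landmark constraints on the robot pose, while the unmatched detections initialize new landmarks.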
DOI: 10.1109/LRA.2022.3154047
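The drift-compensation idea — using semantic SLAM output to correct drifting odometry in real time — can be sketched in planar SE(2) under simplifying assumptions (the paper's actual mechanism also preserves planner optimality and controller stability, which this toy version does not address). Given a SLAM pose and the odometry pose at the same timestamp, one can estimate a correction transform and apply it to subsequent odometry:

```python
import math

def se2_compose(a, b):
    """Compose two SE(2) poses a ∘ b, each given as (x, y, theta)."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + math.cos(at) * bx - math.sin(at) * by,
            ay + math.sin(at) * bx + math.cos(at) * by,
            at + bt)

def se2_inverse(p):
    """Inverse of an SE(2) pose: rotation transposed, translation negated."""
    x, y, t = p
    return (-x * math.cos(t) - y * math.sin(t),
             x * math.sin(t) - y * math.cos(t),
            -t)

def drift_correction(slam_pose, odom_pose):
    """Correction mapping the drifting odometry frame into the SLAM frame:
    T_corr = T_slam ∘ T_odom⁻¹, estimated from a synchronized pose pair."""
    return se2_compose(slam_pose, se2_inverse(odom_pose))

def correct(odom_pose, correction):
    """Apply the latest correction to a raw odometry pose."""
    return se2_compose(correction, odom_pose)
```

The correction would be recomputed whenever the SLAM back end publishes an updated estimate, so the planner always consumes drift-compensated poses.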
ISSN: 2377-3766
Source: IEEE Electronic Library (IEL)
Subjects:
Aerial systems: perception and autonomy
Autonomous aerial vehicles
Autonomous navigation
Autonomy
Canopies
Computational modeling
Control stability
Data models
Drift
field robotics
Forestry
Ground plane
Mapping
Planning
Real time
Real-time systems
robotics and automation in agriculture and forestry
Semantics
Simultaneous localization and mapping
SLAM
Stability analysis
Trajectory
Unmanned aerial vehicles
Unstructured data