Plane-Edge-SLAM: Seamless Fusion of Planes and Edges for SLAM in Indoor Environments

Planes and edges are attractive features for simultaneous localization and mapping (SLAM) in indoor environments because they can be reliably extracted and are robust to illumination changes. However, it remains a challenging problem to seamlessly fuse two different kinds of features to avoid degeneracy and accurately estimate the camera motion.

Detailed Description

Bibliographic Details
Published in: IEEE transactions on automation science and engineering 2021-10, Vol.18 (4), p.2061-2075
Main Authors: Sun, Qinxuan, Yuan, Jing, Zhang, Xuebo, Duan, Feng
Format: Article
Language: eng
Subjects:
Online Access: Order full text
container_end_page 2075
container_issue 4
container_start_page 2061
container_title IEEE transactions on automation science and engineering
container_volume 18
creator Sun, Qinxuan
Yuan, Jing
Zhang, Xuebo
Duan, Feng
description Planes and edges are attractive features for simultaneous localization and mapping (SLAM) in indoor environments because they can be reliably extracted and are robust to illumination changes. However, it remains a challenging problem to seamlessly fuse two different kinds of features to avoid degeneracy and accurately estimate the camera motion. In this article, a plane-edge-SLAM system using an RGB-D sensor is developed to address the seamless fusion of planes and edges. Constraint analysis is first performed to obtain a quantitative measure of how the planes constrain the camera motion estimation. Then, using the results of the constraint analysis, an adaptive weighting algorithm is carefully designed to achieve seamless fusion. Through the fusion of planes and edges, the solution to motion estimation is fully constrained, and the problem remains well-posed in all circumstances. In addition, a probabilistic plane fitting algorithm is proposed to fit a plane model to the noisy 3-D points. By exploiting the error model of the depth sensor, the proposed plane fitting is adaptive to the various measurement noises corresponding to different depth measurements. As a result, the estimated plane parameters are more accurate and robust to points with large uncertainties. Compared with existing plane fitting methods, the proposed method clearly benefits the performance of motion estimation. The results of extensive experiments on public data sets and in real-world indoor scenes demonstrate that the plane-edge-SLAM system can achieve high accuracy and robustness.
Note to Practitioners: This article is motivated by robust localization and mapping for mobile robots. We propose a novel simultaneous localization and mapping (SLAM) approach fusing plane and edge features in indoor scenes (plane-edge-SLAM). The proposed approach works well in textureless or dark scenes and is robust to sensor noise. The experiments are carried out in various indoor scenes for mobile robots, and the results demonstrate the robustness and effectiveness of the proposed framework. In future work, we will address the fusion of other high-level features (for example, 3-D lines) and the active exploration of the environments.
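The two ideas summarized in the abstract can be illustrated with a minimal sketch. This is not the authors' published algorithm: the quadratic depth-noise constant `k` is an assumed placeholder, and the constraint measure shown (eigenvalues of the summed outer products of plane normals) is one common way to quantify how well a set of planes pins down translation; a near-zero eigenvalue signals the degeneracy that edge features must compensate for.

```python
import numpy as np

def depth_noise_std(z, k=0.0012):
    # Axial depth noise growing quadratically with range, a model
    # typical of structured-light RGB-D sensors; k is an assumed
    # constant, not a value from the paper.
    return k * z ** 2

def fit_plane_weighted(points):
    # Weighted least-squares plane fit for an Nx3 point array: each
    # point is weighted by the inverse variance implied by its depth,
    # so near (low-noise) points dominate the estimate.
    w = 1.0 / depth_noise_std(points[:, 2]) ** 2
    centroid = (w[:, None] * points).sum(axis=0) / w.sum()
    X = (points - centroid) * np.sqrt(w)[:, None]
    # The normal is the right singular vector of the smallest
    # singular value of the weighted, centered point matrix.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    n = Vt[-1]
    d = -float(n @ centroid)  # plane model: n . p + d = 0
    return n, d

def translation_constraint_strength(normals):
    # Eigenvalues (ascending) of sum(n n^T) over the observed plane
    # normals: a near-zero eigenvalue means translation is
    # unconstrained along that direction (e.g. all planes parallel).
    M = np.zeros((3, 3))
    for n in normals:
        M += np.outer(n, n)
    return np.linalg.eigvalsh(M)
```

For example, points sampled from the plane z = 2 recover the normal (0, 0, ±1), and two parallel floor-like planes yield a zero eigenvalue, flagging the degenerate configuration in which plane features alone cannot fix the camera translation.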
doi_str_mv 10.1109/TASE.2020.3032831
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1545-5955
ispartof IEEE transactions on automation science and engineering, 2021-10, Vol.18 (4), p.2061-2075
issn 1545-5955
1558-3783
language eng
recordid cdi_crossref_primary_10_1109_TASE_2020_3032831
source IEEE Electronic Library (IEL)
subjects Adaptive algorithms
Cameras
Constraints
3-D lines
Feature extraction
Image edge detection
Indoor environments
Localization
Motion estimation
Motion simulation
Noise measurement
Plane fitting
Planes
RGB-D camera
Robots
Robustness
Sensors
Simultaneous localization and mapping
six-degree-of-freedom (6-DoF) camera motion estimation
title Plane-Edge-SLAM: Seamless Fusion of Planes and Edges for SLAM in Indoor Environments
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-12T10%3A57%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Plane-Edge-SLAM:%20Seamless%20Fusion%20of%20Planes%20and%20Edges%20for%20SLAM%20in%20Indoor%20Environments&rft.jtitle=IEEE%20transactions%20on%20automation%20science%20and%20engineering&rft.au=Sun,%20Qinxuan&rft.date=2021-10-01&rft.volume=18&rft.issue=4&rft.spage=2061&rft.epage=2075&rft.pages=2061-2075&rft.issn=1545-5955&rft.eissn=1558-3783&rft.coden=ITASC7&rft_id=info:doi/10.1109/TASE.2020.3032831&rft_dat=%3Cproquest_RIE%3E2579439965%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2579439965&rft_id=info:pmid/&rft_ieee_id=9248035&rfr_iscdi=true