GNSS‐stereo‐inertial SLAM for arable farming
The accelerating pace in the automation of agricultural tasks demands highly accurate and robust localization systems for field robots. Simultaneous Localization and Mapping (SLAM) methods inevitably accumulate drift on exploratory trajectories and primarily rely on place revisiting and loop closing...
Saved in:
Published in: | Journal of field robotics 2024-10, Vol.41 (7), p.2215-2225 |
---|---|
Main authors: | Cremona, Javier; Civera, Javier; Kofman, Ernesto; Pire, Taihú |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Full text |
container_end_page | 2225 |
---|---|
container_issue | 7 |
container_start_page | 2215 |
container_title | Journal of field robotics |
container_volume | 41 |
creator | Cremona, Javier; Civera, Javier; Kofman, Ernesto; Pire, Taihú |
description | The accelerating pace in the automation of agricultural tasks demands highly accurate and robust localization systems for field robots. Simultaneous Localization and Mapping (SLAM) methods inevitably accumulate drift on exploratory trajectories and primarily rely on place revisiting and loop closing to keep a bounded global localization error. Loop closure techniques are significantly challenging in agricultural fields, as the local visual appearance of different views is very similar and might change easily due to weather effects. A suitable alternative in practice is to employ global sensor positioning systems jointly with the rest of the robot sensors. In this paper we propose and implement the fusion of global navigation satellite system (GNSS), stereo views, and inertial measurements for localization purposes. Specifically, we incorporate, in a tightly coupled manner, GNSS measurements into the stereo‐inertial ORB‐SLAM3 pipeline. We thoroughly evaluate our implementation in the sequences of the Rosario data set, recorded by an autonomous robot in soybean fields, and our own in‐house data. Our data includes measurements from a conventional GNSS, rarely included in evaluations of state‐of‐the‐art approaches. We characterize the performance of GNSS‐stereo‐inertial SLAM in this application case, reporting pose error reductions between 10% and 30% compared to visual–inertial and loosely coupled GNSS‐stereo‐inertial baselines. In addition to such analysis, we also release the code of our implementation as open source. |
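The tightly coupled fusion described in the abstract can be pictured as adding absolute GNSS position residuals to the same least-squares problem that already contains the relative, drift-prone visual and inertial terms. The sketch below is a minimal, hypothetical illustration of that idea in Python; all names, weights, and data are made up for illustration and it is not the authors' released implementation, which extends the C++ ORB‐SLAM3 pipeline.

```python
# Minimal sketch, NOT the paper's implementation: a toy 2D pose-graph-style
# least-squares problem showing how absolute GNSS position residuals can be
# combined with drift-prone relative (odometry-like) residuals.
import numpy as np
from scipy.optimize import least_squares

def residuals(flat_positions, odom_meas, gnss_meas, w_odom=1.0, w_gnss=0.5):
    """Stack weighted residuals for relative and absolute constraints."""
    p = flat_positions.reshape(-1, 2)          # (N, 2) keyframe positions
    res = []
    for i, j, delta in odom_meas:              # relative terms accumulate drift
        res.append(w_odom * ((p[j] - p[i]) - delta))
    for i, fix in gnss_meas:                   # absolute terms bound global error
        res.append(w_gnss * (p[i] - fix))
    return np.concatenate(res)

# Simulated straight-line trajectory with biased odometry and noisy GNSS fixes.
rng = np.random.default_rng(0)
N = 6
truth = np.column_stack([np.arange(N, dtype=float), np.zeros(N)])
odom = [(i, i + 1, truth[i + 1] - truth[i] + 0.05) for i in range(N - 1)]
gnss = [(i, truth[i] + rng.normal(0.0, 0.02, 2)) for i in range(0, N, 2)]

sol = least_squares(residuals, np.zeros(2 * N), args=(odom, gnss))
print(sol.x.reshape(N, 2))  # estimates stay near ground truth despite odometry bias
```

In the paper's tightly coupled setting, the relative terms correspond to reprojection and IMU preintegration residuals over keyframe poses rather than 2D translations, but the GNSS term plays the same drift-anchoring role as in this toy problem.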
doi_str_mv | 10.1002/rob.22232 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 1556-4959 |
ispartof | Journal of field robotics, 2024-10, Vol.41 (7), p.2215-2225 |
issn | 1556-4959; 1556-4967 |
language | eng |
recordid | cdi_proquest_journals_3129202148 |
source | Wiley Online Library Journals Frontfile Complete |
subjects | agricultural robotics; Arable land; Global navigation satellite system; GNSS‐stereo‐inertial SLAM; Inertial navigation; Localization; precision agriculture; Robot sensors; Simultaneous localization and mapping; Source code; Visual effects; Visual fields |
title | GNSS‐stereo‐inertial SLAM for arable farming |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-30T02%3A16%3A18IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=GNSS%E2%80%90stereo%E2%80%90inertial%20SLAM%20for%20arable%20farming&rft.jtitle=Journal%20of%20field%20robotics&rft.au=Cremona,%20Javier&rft.date=2024-10&rft.volume=41&rft.issue=7&rft.spage=2215&rft.epage=2225&rft.pages=2215-2225&rft.issn=1556-4959&rft.eissn=1556-4967&rft_id=info:doi/10.1002/rob.22232&rft_dat=%3Cproquest_cross%3E3129202148%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3129202148&rft_id=info:pmid/&rfr_iscdi=true |