Static multi-target-based auto-calibration of RGB cameras, 3D Radar, and 3D Lidar sensors
For environmental perception, autonomous vehicles and intelligent roadside infrastructure systems contain multiple sensors, i.e., radar (Radio Detection and Ranging), lidar (Light Detection and Ranging), and camera sensors, with the aim of detecting, classifying, and tracking multiple road users. Data from multiple sensors are fused to enhance the perception quality of the sensor system, because each sensor has strengths and weaknesses, e.g., resolution, distance measurement, and dependency on weather conditions. For data fusion, it is necessary to transform the data from the different sensor coordinates to a common coordinate frame. This process is referred to as multi-sensor calibration and is a challenging task that is mostly performed manually. This paper introduces a new method for auto-calibrating three-dimensional (3D) radar, 3D lidar, and red-green-blue (RGB) mono-camera sensors using a static multi-target-based system. The proposed method can be used with sensors operating at different frame rates without time synchronization. Furthermore, the described static multi-target system is cost-effective, easy to build, and applicable for short- to long-distance calibration. The experimental results for multiple sets of measurements show good results, with projection errors measured as a maximum root mean square error (RMSE) of (u, v) = (2.4, 1.8) pixels for lidar-to-camera calibration, an RMSE of (u, v) = (2.2, 3.0) pixels for 3D radar-to-camera calibration, and an RMSE of (x, y, z) = (2.6, 2.7, 14.0) centimeters for 3D radar-to-lidar calibration.
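The abstract's core idea, mapping one sensor's coordinates into another's frame and scoring the result as a pixel-space RMSE, can be illustrated with a short sketch. The paper's own implementation is not published here; the code below is a minimal illustration assuming a pinhole camera model with known intrinsics K, and all names, extrinsics, and target positions are hypothetical values chosen for the example, not data from the paper.

```python
import numpy as np

def lidar_to_pixels(points_lidar, R, t, K):
    """Project Nx3 lidar points into camera pixel coordinates.

    R (3x3) and t (3,) are the lidar-to-camera extrinsics that a
    calibration method must estimate; K is the 3x3 pinhole intrinsic
    matrix, assumed already known from intrinsic calibration.
    """
    # Rigid-body transform into the camera coordinate frame.
    points_cam = points_lidar @ R.T + t
    # Perspective projection: apply intrinsics, then divide by depth.
    uv_h = points_cam @ K.T
    return uv_h[:, :2] / uv_h[:, 2:3]

def projection_rmse(uv_pred, uv_meas):
    """Per-axis RMSE in pixels, the error metric quoted in the abstract."""
    return np.sqrt(np.mean((uv_pred - uv_meas) ** 2, axis=0))

# Illustrative values only (not from the paper): generic intrinsics,
# identity rotation, a 10 cm lateral offset, and two target centers.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])
targets_lidar = np.array([[ 2.0, 0.5, 10.0],
                          [-1.0, 0.2, 25.0]])

uv = lidar_to_pixels(targets_lidar, R, t, K)
# Compare against hypothetical camera detections offset by (2.0, 1.5) px.
print(projection_rmse(uv, uv + np.array([2.0, 1.5])))  # -> [2.0, 1.5]
```

In this framing, calibration amounts to searching for the (R, t) that minimizes such an RMSE over detected target correspondences; the radar-to-lidar case is analogous but measures the residual directly in (x, y, z) meters rather than pixels.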
Saved in:
Published in: | IEEE Sensors Journal, 2023-08, p. 1-1 |
---|---|
Main authors: | Agrawal, Shiva; Bhanderi, Savankumar; Doycheva, Kristina; Elger, Gordon |
Format: | Article |
Language: | English |
ISSN: | 1530-437X |
DOI: | 10.1109/JSEN.2023.3300957 |
Subjects: | Autonomous vehicles; Calibration; Camera; Cameras; Feature extraction; Intelligent roadside infrastructure; Intelligent sensors; Laser radar; Lidar; Radar; Sensor calibration; Sensors; Three-dimensional displays |
Source: | IEEE Electronic Library (IEL) |
Online access: | Order full text |