On the Benefits of Visual Stabilization for Frame- and Event-Based Perception

Vision-based perception systems are typically exposed to large orientation changes in different robot applications. In such conditions, their performance might be compromised due to the inherent complexity of processing data captured under challenging motion. Integration of mechanical stabilizers to compensate for the camera rotation is not always possible due to robot payload constraints. This letter presents a processing-based stabilization approach that compensates for the camera's rotational motion both on events and on frames (i.e., images). Assuming that the camera's attitude is available, we evaluate the benefits of stabilization in two perception applications: feature tracking and estimating the translation component of the camera's ego-motion. The validation is performed using synthetic data and sequences from well-known event-based vision datasets. The experiments show that stabilization can improve feature tracking and camera ego-motion estimation accuracy by 27.37% and 34.82%, respectively. Concurrently, stabilization can reduce the processing time of computing the camera's linear velocity by at least 25%.
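The core idea in the abstract — compensating a pure camera rotation in software, given the attitude — amounts to warping pixel coordinates with an infinite homography K·R·K⁻¹. The sketch below is illustrative only, not the authors' implementation: the function name and the assumption of a calibrated pinhole camera with a known rotation `R` (e.g. from an IMU attitude estimate) are ours.

```python
import numpy as np

def stabilize_events(events, K, R):
    """Rotationally warp event (or frame) pixel coordinates into a
    reference orientation.

    events : (N, 2) array of (x, y) pixel coordinates
    K      : (3, 3) pinhole intrinsic matrix
    R      : (3, 3) rotation from the current camera orientation to the
             reference orientation (e.g. gravity-aligned)
    """
    n = events.shape[0]
    # Lift pixels to homogeneous coordinates.
    pts = np.hstack([events, np.ones((n, 1))])        # (N, 3)
    # Pure-rotation warp: an infinite homography K @ R @ K^-1.
    H = K @ R @ np.linalg.inv(K)
    warped = (H @ pts.T).T
    # Project back onto the image plane.
    return warped[:, :2] / warped[:, 2:3]
```

With the identity rotation the coordinates are unchanged, and a rotation about the optical axis leaves the principal point fixed — two quick sanity checks on the warp. The same homography can stabilize a frame by resampling its pixels, which is how the letter treats events and images uniformly.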

Bibliographic Details

Published in: IEEE robotics and automation letters, 2024-10, Vol.9 (10), p.8802-8809
Authors: Rodriguez-Gomez, J.P.; Martinez-de Dios, J.R.; Ollero, A.; Gallego, G.
Format: Article
Language: English
DOI: 10.1109/LRA.2024.3450290
ISSN: 2377-3766
Source: IEEE Electronic Library (IEL)
Subjects: biologically-inspired robots; Cameras; computer vision for automation; Data processing; Estimation; Event camera; Motion simulation; Perception; Robot dynamics; Robot vision systems; Robots; sensor fusion; Stabilization; Synthetic data; Task analysis; Tracking; Vision; Visualization