Direct Visual Odometry in Low Light Using Binary Descriptors

Feature descriptors are powerful tools for photometrically and geometrically invariant image matching. To date, however, their use has been tied to sparse interest point detection, which is susceptible to noise under adverse imaging conditions. In this letter, we propose to use binary feature descriptors in a direct tracking framework without relying on sparse interest points. This novel combination of feature descriptors and direct tracking is shown to achieve robust and efficient visual odometry with applications to poorly lit subterranean environments.

Detailed Description

Saved in:
Bibliographic details
Published in: IEEE Robotics and Automation Letters, 2017-04, Vol. 2 (2), p. 444-451
Main authors: Alismail, Hatem; Kaess, Michael; Browning, Brett; Lucey, Simon
Format: Article
Language: English
Online access: Order full text
Abstract: Feature descriptors are powerful tools for photometrically and geometrically invariant image matching. To date, however, their use has been tied to sparse interest point detection, which is susceptible to noise under adverse imaging conditions. In this letter, we propose to use binary feature descriptors in a direct tracking framework without relying on sparse interest points. This novel combination of feature descriptors and direct tracking is shown to achieve robust and efficient visual odometry with applications to poorly lit subterranean environments.
DOI: 10.1109/LRA.2016.2635686
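The abstract's core idea — matching images densely with binary descriptors instead of raw intensities — can be sketched hypothetically. The snippet below is not the paper's actual descriptor or optimization; it illustrates the general principle with a census-transform-style per-pixel binary descriptor and a Hamming-distance cost, assuming NumPy:

```python
import numpy as np

def census_descriptor(img, r=1):
    """Per-pixel binary descriptor: compare each pixel to its 8
    neighbors in a (2r+1)x(2r+1) window (census transform).
    Returns an (H-2r, W-2r, 8) boolean array for the interior."""
    h, w = img.shape
    center = img[r:h - r, r:w - r]
    bits = []
    for dy in (-r, 0, r):
        for dx in (-r, 0, r):
            if dy == 0 and dx == 0:
                continue
            # Shifted view of the image aligned with the interior region.
            neighbor = img[r + dy:h - r + dy, r + dx:w - r + dx]
            bits.append(neighbor >= center)
    return np.stack(bits, axis=-1)

def hamming_cost(d0, d1):
    """Total Hamming distance between two descriptor fields:
    the number of bits that disagree."""
    return int(np.sum(d0 != d1))
```

Because the comparisons are binary, the descriptor is invariant to monotonic intensity changes (e.g. global gain or exposure shifts), which is what makes such a cost attractive under poor lighting; a direct method would minimize this kind of cost over camera pose rather than detect sparse keypoints first.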
ISSN: 2377-3766
Source: IEEE Electronic Library (IEL)
Subjects: Cameras; Hamming distance; Lighting; Low light vision; mapping; Robot vision systems; robust visual odometry; Robustness; SLAM; visual tracking; Visualization