Estimating the Driving State of Oncoming Vehicles From a Moving Platform Using Stereo Vision

A new image-based approach for fast and robust vehicle tracking from a moving platform is presented. Position, orientation, and full motion state, including velocity, acceleration, and yaw rate of a detected vehicle, are estimated from a tracked rigid 3-D point cloud. This point cloud represents a 3-D object model and is computed by analyzing image sequences in both space and time, i.e., by fusion of stereo vision and tracked image features. Starting from an automated initial vehicle hypothesis, tracking is performed by means of an extended Kalman filter. The filter combines the knowledge about the movement of the rigid point cloud's points in the world with the dynamic model of a vehicle. Radar information is used to improve the image-based object detection at far distances. The proposed system is applied to predict the driving path of other traffic participants and currently runs at 25 Hz (640 × 480 images) on our demonstrator vehicle.
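To make the filtering idea concrete, the sketch below shows an extended-Kalman-filter prediction step for a planar vehicle state (position, yaw, speed, acceleration, yaw rate), roughly the quantities the abstract says are estimated. The state ordering, the Euler-integrated motion model, and the noise values are illustrative assumptions rather than the authors' actual filter; only the 25 Hz step size is taken from the abstract.

```python
# Illustrative EKF time update for a planar vehicle state; state layout,
# Euler integration, and noise values are assumptions for this sketch only.
import numpy as np

DT = 1.0 / 25.0  # step size matching the reported 25 Hz frame rate

def predict_state(s, dt=DT):
    """Propagate s = [x, y, yaw, v, a, yaw_rate] one time step (Euler)."""
    x, y, yaw, v, a, yaw_rate = s
    return np.array([
        x + v * np.cos(yaw) * dt,  # ground-plane position
        y + v * np.sin(yaw) * dt,
        yaw + yaw_rate * dt,       # orientation
        v + a * dt,                # speed
        a,                         # acceleration, modelled as a random walk
        yaw_rate,                  # yaw rate, modelled as a random walk
    ])

def numerical_jacobian(f, s, eps=1e-6):
    """Finite-difference Jacobian of f at s for the EKF covariance update."""
    n = s.size
    J = np.zeros((n, n))
    f0 = f(s)
    for i in range(n):
        sp = s.copy()
        sp[i] += eps
        J[:, i] = (f(sp) - f0) / eps
    return J

def ekf_predict(s, P, Q):
    """Standard EKF prediction: propagate the mean and the covariance."""
    F = numerical_jacobian(predict_state, s)
    return predict_state(s), F @ P @ F.T + Q

# Example: an oncoming vehicle 30 m ahead, driving back towards the ego vehicle
s = np.array([0.0, 30.0, -np.pi / 2, 10.0, 0.0, 0.1])
P = np.eye(6) * 0.5    # initial state covariance (assumed)
Q = np.eye(6) * 0.01   # process noise (assumed)
s, P = ekf_predict(s, P, Q)
```

In the paper, the measurement update additionally constrains the tracked 3-D points to move rigidly with this vehicle state; that coupling is omitted from the sketch.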

Detailed description

Bibliographic details
Published in: IEEE Transactions on Intelligent Transportation Systems, 2009-12, Vol. 10 (4), p. 560-571
Main authors: Barth, A., Franke, U.
Format: Article
Language: English
Subjects:
Online access: Order full text
container_end_page 571
container_issue 4
container_start_page 560
container_title IEEE Transactions on Intelligent Transportation Systems
container_volume 10
creator Barth, A.
Franke, U.
description A new image-based approach for fast and robust vehicle tracking from a moving platform is presented. Position, orientation, and full motion state, including velocity, acceleration, and yaw rate of a detected vehicle, are estimated from a tracked rigid 3-D point cloud. This point cloud represents a 3-D object model and is computed by analyzing image sequences in both space and time, i.e., by fusion of stereo vision and tracked image features. Starting from an automated initial vehicle hypothesis, tracking is performed by means of an extended Kalman filter. The filter combines the knowledge about the movement of the rigid point cloud's points in the world with the dynamic model of a vehicle. Radar information is used to improve the image-based object detection at far distances. The proposed system is applied to predict the driving path of other traffic participants and currently runs at 25 Hz (640 × 480 images) on our demonstrator vehicle.
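The description above builds the rigid point cloud by fusing stereo depth with tracked image features; the short sketch below shows the standard disparity-based back-projection that such a step could rely on. The focal length, baseline, and principal point are made-up example values for a 640 × 480 image, not the calibration of the demonstrator vehicle.

```python
# Illustrative disparity-to-depth back-projection for tracked features; the
# rectified-stereo parameters below are assumed example values, not the
# calibration used in the paper.
import numpy as np

FOCAL_PX = 820.0       # focal length in pixels (assumed)
BASELINE_M = 0.30      # stereo baseline in metres (assumed)
CX, CY = 320.0, 240.0  # principal point for a 640 x 480 image (assumed)

def triangulate(u, v, disparity):
    """Back-project pixels (u, v) with disparities into camera coordinates."""
    d = np.maximum(disparity, 1e-6)   # guard against zero disparity
    Z = FOCAL_PX * BASELINE_M / d     # depth from stereo disparity
    X = (u - CX) * Z / FOCAL_PX       # lateral offset
    Y = (v - CY) * Z / FOCAL_PX       # vertical offset
    return np.stack([X, Y, Z], axis=-1)

# Example: three features tracked on an oncoming car and their disparities
u = np.array([300.0, 330.0, 360.0])
v = np.array([200.0, 205.0, 210.0])
d = np.array([8.0, 8.2, 7.9])
cloud = triangulate(u, v, d)  # one 3-D point per tracked feature
```

Accumulating such points over consecutive frames, together with their tracked image positions, yields the rigid 3-D point cloud that the filter ties to the vehicle's motion state.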
doi_str_mv 10.1109/TITS.2009.2029643
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1524-9050
ispartof IEEE Transactions on Intelligent Transportation Systems, 2009-12, Vol.10 (4), p.560-571
issn 1524-9050
1558-0016
language eng
recordid cdi_pascalfrancis_primary_22216705
source IEEE Electronic Library (IEL)
subjects Acceleration
Applied sciences
Artificial intelligence
Clouds
Computer science; control theory; systems
Control theory. Systems
Driving
Exact sciences and technology
Ground, air and sea transportation, marine construction
Image sequence analysis
Kalman filtering
Mathematical models
Motion detection
object detection
Pattern recognition. Digital image processing. Computational geometry
Platforms
Radar tracking
Robotics
Robustness
sensor data fusion
State estimation
Stereo vision
Tracking
Vehicle detection
Vehicle driving
Vehicles
Vision
title Estimating the Driving State of Oncoming Vehicles From a Moving Platform Using Stereo Vision