Airborne vision-based collision-detection system

Machine vision represents a particularly attractive solution for sensing and detecting potential collision‐course targets due to the relatively low cost, size, weight, and power requirements of vision sensors (as opposed to radar and Traffic Alert and Collision Avoidance System). This paper describes the development and evaluation of a real‐time, vision‐based collision‐detection system suitable for fixed‐wing aerial robotics. Using two fixed‐wing unmanned aerial vehicles (UAVs) to recreate various collision‐course scenarios, we were able to capture highly realistic vision (from an onboard camera perspective) of the moments leading up to a collision. This type of image data is extremely scarce and was invaluable in evaluating the detection performance of two candidate target detection approaches. Based on the collected data, our detection approaches were able to detect targets at distances ranging from 400 to about 900 m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to an advance warning of between 8 and 10 s ahead of impact, which approaches the 12.5‐s response time recommended for human pilots. We overcame the challenge of achieving real‐time computational speeds by exploiting the parallel processing architectures of graphics processing units (GPUs) found on commercial‐off‐the‐shelf graphics devices. Our chosen GPU device suitable for integration onto UAV platforms can be expected to handle real‐time processing of 1,024 × 768 pixel image frames at a rate of approximately 30 Hz. Flight trials using manned Cessna aircraft in which all processing is performed onboard will be conducted in the near future, followed by further experiments with fully autonomous UAV platforms. © 2010 Wiley Periodicals, Inc.
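The abstract's conversion from detection distance to advance warning time is simple kinematics: at a constant closing speed, time to impact is distance divided by closing speed. A minimal sketch of that arithmetic follows; the closing-speed values used in the example are illustrative assumptions, not figures reported in the paper.

```python
def warning_time_s(detection_distance_m: float, closing_speed_ms: float) -> float:
    """Time to impact for a target closing at a constant speed.

    Assumes a straight-line, constant-closing-speed encounter, which is
    the simplification behind the paper's 8-10 s warning estimate.
    """
    if closing_speed_ms <= 0:
        raise ValueError("closing speed must be positive")
    return detection_distance_m / closing_speed_ms

# Illustrative only: ~100 m/s closing speed could arise from two light
# aircraft converging head-on at roughly 50 m/s each.
print(warning_time_s(900, 100.0))  # 9.0 s, within the reported 8-10 s window
print(warning_time_s(400, 50.0))   # 8.0 s
```

Under these assumed closing speeds, the reported 400-900 m detection range does indeed map to roughly 8-10 s of warning, a few seconds short of the 12.5 s response time recommended for human pilots.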

Bibliographic Details

Published in: Journal of Field Robotics, 2011-03, Vol. 28 (2), pp. 137-157
Authors: Lai, John; Mejias, Luis; Ford, Jason J.
Format: Article
Language: English
Online access: Full text
DOI: 10.1002/rob.20359
ISSN: 1556-4959
eISSN: 1556-4967
Source: Wiley Online Library Journals Frontfile Complete
Subjects: Computation; Devices; Onboard; Platforms; Real time; Unmanned aerial vehicles; Vision