Towards semi-autonomous operation of under-actuated underwater vehicles: sensor fusion, on-line identification and visual servo control
Published in: | Autonomous robots 2011-07, Vol.31 (1), p.67-86 |
---|---|
Main authors: | Karras, George C. ; Loizou, Savvas G. ; Kyriakopoulos, Kostas J. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 86 |
---|---|
container_issue | 1 |
container_start_page | 67 |
container_title | Autonomous robots |
container_volume | 31 |
creator | Karras, George C. ; Loizou, Savvas G. ; Kyriakopoulos, Kostas J. |
description | In this paper we propose a framework for semi-autonomous operation of an under-actuated underwater vehicle. The contributions of this paper are twofold: The first contribution is a visual servoing control scheme that is designed to provide a human operator the capability to steer the vehicle without losing the target from the vision system’s field of view. It is shown that the under-actuated degree of freedom is input-to-state stable (ISS) and a shaping of the user input with stability guarantees is implemented. The resulting control scheme has formally guaranteed stability and convergence properties. The second contribution is an asynchronous Modified Dual Unscented Kalman Filter (MDUKF) for the on-line state and parameter estimation of the vehicle by fusing data from a Laser Vision System (LVS) and an Inertial Measurement Unit (IMU). The MDUKF has been developed in order to experimentally verify the performance of the proposed visual servoing control scheme. Experimental results of the visual servoing control scheme integrated with the asynchronous MDUKF indicate the feasibility and applicability of the proposed control scheme. Experiments have been carried out on a small under-actuated Remotely Operated Vehicle (ROV) in a test tank. |
doi | 10.1007/s10514-011-9231-6 |
format | Article |
identifier | ISSN: 0929-5593 |
ispartof | Autonomous robots, 2011-07, Vol.31 (1), p.67-86 |
issn | 0929-5593 (print) ; 1573-7527 (electronic) |
language | eng |
recordid | cdi_proquest_miscellaneous_919929647 |
source | SpringerLink Journals |
subjects | Artificial Intelligence ; Autonomous underwater vehicles ; Computer Imaging ; Control ; Control stability ; Control systems ; Engineering ; Field of view ; Inertial platforms ; International Space Station ; Kalman filters ; Mechatronics ; On-line systems ; Parameter estimation ; Pattern Recognition and Graphics ; Remotely operated vehicles ; Robot control ; Robotics ; Robotics and Automation ; Robots ; Servocontrol ; Stability ; Underwater vehicles ; Vehicles ; Vision ; Vision systems ; Visual |
title | Towards semi-autonomous operation of under-actuated underwater vehicles: sensor fusion, on-line identification and visual servo control |
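The description field above mentions an asynchronous filter that fuses slow Laser Vision System pose fixes with fast IMU readings. The following is a minimal sketch of that general idea only: an unscented Kalman filter that predicts the state forward to each measurement's timestamp and then applies whichever sensor model matches the arriving sample. The state vector, unicycle motion model, measurement functions (`h_lvs`, `h_imu`), sampling rates, and noise values are all invented for illustration; the paper's actual MDUKF additionally performs dual (joint state and parameter) estimation, which is not reproduced here.

```python
"""Illustrative sketch only: asynchronous UKF-style fusion of a slow pose
sensor (e.g. a laser vision system) and a fast rate sensor (e.g. an IMU gyro).
Not the MDUKF or vehicle model of Karras, Loizou and Kyriakopoulos (2011)."""
import numpy as np


class SimpleUKF:
    """Minimal unscented Kalman filter using Merwe-style scaled sigma points."""

    def __init__(self, x0, P0, Q, alpha=1.0, beta=2.0, kappa=0.0):
        self.x, self.P, self.Q = np.array(x0, float), np.array(P0, float), Q
        self.n = self.x.size
        self.lam = alpha ** 2 * (self.n + kappa) - self.n
        self.Wm = np.full(2 * self.n + 1, 1.0 / (2 * (self.n + self.lam)))
        self.Wc = self.Wm.copy()
        self.Wm[0] = self.lam / (self.n + self.lam)
        self.Wc[0] = self.Wm[0] + (1 - alpha ** 2 + beta)

    def _sigma_points(self):
        # Columns of the Cholesky factor span the +/- sigma-point offsets.
        L = np.linalg.cholesky((self.n + self.lam) * self.P)
        pts = [self.x]
        for i in range(self.n):
            pts.append(self.x + L[:, i])
            pts.append(self.x - L[:, i])
        return np.array(pts)

    def predict(self, f, dt):
        sig = np.array([f(s, dt) for s in self._sigma_points()])
        self.x = self.Wm @ sig
        d = sig - self.x
        self.P = d.T @ np.diag(self.Wc) @ d + self.Q * dt
        self._sig_f = sig                         # reused in the update step

    def update(self, h, z, R):
        Z = np.array([h(s) for s in self._sig_f])
        zbar = self.Wm @ Z
        dz, dx = Z - zbar, self._sig_f - self.x
        S = dz.T @ np.diag(self.Wc) @ dz + R      # innovation covariance
        Pxz = dx.T @ np.diag(self.Wc) @ dz        # state/measurement cross-cov.
        K = Pxz @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - zbar)
        self.P = self.P - K @ S @ K.T


def motion(s, dt):
    """Hypothetical planar unicycle model: state = [x, y, yaw, surge, yaw_rate].
    Yaw wrap-around is ignored for brevity."""
    x, y, yaw, u, r = s
    return np.array([x + u * np.cos(yaw) * dt,
                     y + u * np.sin(yaw) * dt,
                     yaw + r * dt, u, r])


def h_lvs(s):   # pose sensor model: position and yaw (slow, e.g. a few Hz)
    return s[:3]


def h_imu(s):   # gyro model: yaw rate only (fast, e.g. tens of Hz)
    return s[4:5]


R_lvs = np.diag([0.02, 0.02, 0.01])
R_imu = np.diag([0.001])
ukf = SimpleUKF(x0=np.zeros(5), P0=np.eye(5) * 0.1, Q=np.eye(5) * 1e-3)

# Asynchronous fusion: time-stamped measurements arrive out of lock-step;
# the filter predicts up to each arrival time, then uses that sensor's model.
measurements = [(0.02, "imu", [0.10]), (0.04, "imu", [0.11]),
                (0.05, "lvs", [0.01, 0.00, 0.002]), (0.06, "imu", [0.12])]
t = 0.0
for stamp, sensor, z in measurements:
    ukf.predict(motion, dt=stamp - t)
    ukf.update(h_lvs, z, R_lvs) if sensor == "lvs" else ukf.update(h_imu, z, R_imu)
    t = stamp
print("state estimate:", np.round(ukf.x, 3))
```

Running the loop prints a single fused state estimate; in a real deployment the same predict-to-timestamp pattern would run continuously as LVS and IMU samples arrive at their own rates.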