Adaptive robotic visual tracking: theory and experiments
The use of a vision sensor in the feedback loop is addressed within the controlled active vision framework. Algorithms are proposed for the solution of the robotic (eye-in-hand configuration) visual tracking and servoing problem. Visual tracking is stated as a problem of combining control with computer vision. The sum-of-squared differences optical flow is used to compute the vector of discrete displacements. The displacements are fed to an adaptive controller (self-tuning regulator) that creates commands for a robot control system. The procedure is based on the online estimation of the relative distance of the target from the camera, but only partial knowledge of the relative distance is required, obviating the need for offline calibration. Three different adaptive control schemes have been implemented, both in simulation and in experiments. The computational complexity and the experimental results demonstrate that the proposed algorithms can be implemented in real time.
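The displacement-measurement step described in the abstract can be illustrated with a minimal sketch. The Python example below shows sum-of-squared-differences (SSD) matching of a feature window between two frames, which is the kind of discrete displacement that would be handed to the adaptive controller; the window size, search range, and function name are assumptions made for illustration and are not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): SSD matching to estimate the
# discrete displacement of a feature window between consecutive frames.
import numpy as np

def ssd_displacement(prev_frame, curr_frame, point, window=7, search=10):
    """Return the (du, dv) displacement minimizing the SSD between a window
    around `point` in prev_frame and candidate windows in curr_frame."""
    half = window // 2
    u, v = point  # row, column of the tracked feature in prev_frame
    template = prev_frame[u - half:u + half + 1,
                          v - half:v + half + 1].astype(float)

    best_cost, best_disp = np.inf, (0, 0)
    for du in range(-search, search + 1):
        for dv in range(-search, search + 1):
            candidate = curr_frame[u + du - half:u + du + half + 1,
                                   v + dv - half:v + dv + half + 1].astype(float)
            if candidate.shape != template.shape:  # skip windows falling off the image
                continue
            cost = np.sum((template - candidate) ** 2)  # sum of squared differences
            if cost < best_cost:
                best_cost, best_disp = cost, (du, dv)
    return best_disp
```

In the pipeline the abstract describes, such displacement vectors feed the self-tuning regulator that generates robot commands; that control step is not sketched here.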
Saved in:
Published in: | IEEE transactions on automatic control, 1993-03, Vol.38 (3), p.429-445 |
---|---|
Main authors: | Papanikolopoulos, N.P., Khosla, P.K. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 445 |
---|---|
container_issue | 3 |
container_start_page | 429 |
container_title | IEEE transactions on automatic control |
container_volume | 38 |
creator | Papanikolopoulos, N.P.; Khosla, P.K. |
description | The use of a vision sensor in the feedback loop is addressed within the controlled active vision framework. Algorithms are proposed for the solution of the robotic (eye-in-hand configuration) visual tracking and servoing problem. Visual tracking is stated as a problem of combining control with computer vision. The sum-of-squared differences optical flow is used to compute the vector of discrete displacements. The displacements are fed to an adaptive controller (self-tuning regulator) that creates commands for a robot control system. The procedure is based on the online estimation of the relative distance of the target from the camera, but only partial knowledge of the relative distance is required, obviating the need for offline calibration. Three different adaptive control schemes have been implemented, both in simulation and in experiments. The computational complexity and the experimental results demonstrate that the proposed algorithms can be implemented in real time. |
doi_str_mv | 10.1109/9.210141 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 0018-9286 |
ispartof | IEEE transactions on automatic control, 1993-03, Vol.38 (3), p.429-445 |
issn | 0018-9286 1558-2523 |
language | eng |
recordid | cdi_proquest_miscellaneous_28230707 |
source | IEEE Electronic Library (IEL) |
subjects | Adaptive control; Applied sciences; Computer science, control theory, systems; Computer vision; Control theory. Systems; Displacement control; Exact sciences and technology; Feedback loop; Image motion analysis; Optical computing; Optical feedback; Optical sensors; Programmable control; Robot sensing systems; Robotics |
title | Adaptive robotic visual tracking: theory and experiments |