Human and Robotic Fish Interaction Controlled Using Hand Gesture Image Processing
This paper is about the control of robotic fish movement in an aquarium via human hand gestures detected by image sensors attached to the aquarium. In this study, sensors actively interact with humans and robotic fish. Image and radio frequency sensors are used to identify the position and color of robotic fish. Recently, we have studied human interactive control based on hand gesture recognition. Image sensors send the input signals of hand gestures obtained from real-time video images processed using tracking control algorithms, such as color mark, stop zone, and lead-lag tracking algorithms, to robotic fish. The movement of the robotic fish is controlled via the movement of the two hands, where the left hand indicates the fish to be controlled and the right hand controls its movement. Hand gesture recognition consists of hand feature segmentation and gesture recognition from the hand features. Our results show that interactive human control using hand gestures successfully controls the movement of robotic fish.
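The abstract describes a color-mark and stop-zone based pipeline, but this record does not spell out the implementation. The following is a minimal, hypothetical sketch of the general idea, assuming an OpenCV-style color-mark detector; the HSV thresholds, the `gesture_to_command` mapping, and the command names are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch (not the paper's implementation): detect a colored hand
# marker in each video frame and map its horizontal position to a simple
# fish-motion command. Assumes OpenCV and a webcam; the HSV range and the
# command mapping are placeholder values chosen for illustration.
import cv2
import numpy as np

LOWER_HSV = np.array([100, 120, 70])    # placeholder range for a blue marker
UPPER_HSV = np.array([130, 255, 255])

def marker_centroid(frame):
    """Return the (x, y) centroid of the color-marked region, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    m = cv2.moments(mask)
    if m["m00"] < 1e-3:                  # no marker pixels found
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def gesture_to_command(centroid, frame_width):
    """Hypothetical mapping from marker position to a steering command."""
    if centroid is None:
        return "stop"                    # loosely analogous to a 'stop zone'
    x, _ = centroid
    if x < frame_width * 0.4:
        return "turn_left"
    if x > frame_width * 0.6:
        return "turn_right"
    return "forward"

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cmd = gesture_to_command(marker_centroid(frame), frame.shape[1])
        print(cmd)                       # a real system would transmit this to the robotic fish
        cv2.imshow("frame", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

A complete system would additionally need the two-hand role split (left hand selecting the fish, right hand steering it) and the radio-frequency position feedback mentioned in the abstract.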
Saved in:
Published in: | Sensors and Materials, 2020-10-30, Vol. 32 (10), p. 3479 |
---|---|
Main authors: | Angani, Amarnathvarma; Lee, Jin-Wook; Talluri, Teressa; Lee, Jae-young; Shin, Kyoo Jae |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Color; Control algorithms; Feature recognition; Fish; Gesture recognition; Human motion; Image processing; Image segmentation; Interactive control; Object recognition; Position sensing; Robot control; Robotics; Sensors; Signal processing; Tracking control |
Online access: | Full text |
ISSN: | 0914-4935 |
DOI: | 10.18494/SAM.2020.2925 |
Publisher: | Tokyo: MYU Scientific Publishing Division |
Rights: | Copyright MYU Scientific Publishing Division 2020 |
Source: | DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; Alma/SFX Local Collection |