Smart wearable robot glasses for human visual augmentation based on human intention and scene understanding
This article focuses on the design and implementation of smart wearable robot glasses for human visual augmentation, whose role is to deliver refined visual-recognition results to the user wearing the proposed system. The proposed system consists of a glass-type wearable device with a front-looking camera, an eye-looking camera, and an earphone, together with signal-processing units. The scene-analysis process on the input image acquired by the front-view camera is supported by the eye-view camera, which monitors the user's eye position for efficient information processing and is used to infer the user's visual intention and attention in a given situation. The recognized results are converted into audio information for a user-friendly information service that does not obstruct the user's own visual information gathering and processing, and the result is finally delivered to the user's earphone. The device can be used to augment human visual capability in various settings such as museums, conferences, and meetings. To verify the feasibility of the proposed device, a series of experiments was performed, and the evaluation results are discussed in detail.
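The abstract describes a gaze-guided pipeline: the eye-looking camera estimates where the wearer is looking, the region of the front-camera image around that gaze point is analyzed, and the recognition result is spoken through the earphone. The sketch below illustrates that flow in outline only; it is not code from the paper, and every name in it (crop_gaze_roi, recognize, speak) is a hypothetical stand-in, with recognition and speech synthesis stubbed out.

```python
"""Minimal sketch (assumed, not from the paper) of a gaze-guided
recognition-to-audio loop: crop a region of interest around the gaze point,
recognize its content, and speak the result to the wearer."""

import numpy as np


def crop_gaze_roi(frame: np.ndarray, gaze_xy, roi_size: int = 200) -> np.ndarray:
    """Crop a square region of interest centered on the gaze point, clamped to the frame."""
    h, w = frame.shape[:2]
    x, y = gaze_xy
    x0 = min(max(0, x - roi_size // 2), max(0, w - roi_size))
    y0 = min(max(0, y - roi_size // 2), max(0, h - roi_size))
    return frame[y0:y0 + roi_size, x0:x0 + roi_size]


def recognize(roi: np.ndarray) -> str:
    """Placeholder for the scene/face recognizer; a real system would run a detector here."""
    return "example result: recognized face"


def speak(text: str) -> None:
    """Placeholder for text-to-speech output delivered through the earphone."""
    print(f"[earphone] {text}")


if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a front-camera frame
    gaze = (320, 240)                                # stand-in gaze estimate from the eye camera
    speak(recognize(crop_gaze_roi(frame, gaze)))
```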
Saved in:
Main authors: | Hyun-Woo Kim, Min Young Kim, Seung-Ho Yang, Kyo-Yeol Kim, Hyung-Min Son, Yun-Jung Lee |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | smart glasses; wearable robot; eye tracking; egocentric vision; first person vision; visual recognition; human augmentation |
Online access: | Order full text |
container_end_page | 5 |
---|---|
container_issue | |
container_start_page | 1 |
container_title | 2010 International Symposium on Optomechatronic Technologies |
container_volume | |
creator | Hyun-Woo Kim ; Min Young Kim ; Seung-Ho Yang ; Kyo-Yeol Kim ; Hyung-Min Son ; Yun-Jung Lee |
description | This article focuses on the design and implementation of smart wearable robot glasses for human visual augmentation, whose role is to deliver refined visual-recognition results to the user wearing the proposed system. The proposed system consists of a glass-type wearable device with a front-looking camera, an eye-looking camera, and an earphone, together with signal-processing units. The scene-analysis process on the input image acquired by the front-view camera is supported by the eye-view camera, which monitors the user's eye position for efficient information processing and is used to infer the user's visual intention and attention in a given situation. The recognized results are converted into audio information for a user-friendly information service that does not obstruct the user's own visual information gathering and processing, and the result is finally delivered to the user's earphone. The device can be used to augment human visual capability in various settings such as museums, conferences, and meetings. To verify the feasibility of the proposed device, a series of experiments was performed, and the evaluation results are discussed in detail. |
doi_str_mv | 10.1109/ISOT.2010.5687362 |
format | Conference Proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISBN: 9781424476848 |
ispartof | 2010 International Symposium on Optomechatronic Technologies, 2010, p.1-5 |
issn | |
language | eng |
recordid | cdi_ieee_primary_5687362 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Cameras ; egocentric vision ; eye tracking ; Face ; Face recognition ; first person vision ; human augmentation ; Humans ; Image recognition ; Robots ; smart glasses ; visual recognition ; Visualization ; wearable robot |
title | Smart wearable robot glasses for human visual augmentation based on human intention and scene understanding |