Non-verbal Communication and Touchless Activation of a Radio-Controlled Car via Facial Activity Recognition

Detailed Description

Bibliographic Details
Published in: International journal of precision engineering and manufacturing 2020-06, Vol.21 (6), p.1035-1046
Authors: Han, Dong Yeol; Park, Bi Oh; Kim, Jae Won; Lee, Ji Hoon; Lee, Won Gu
Format: Article
Language: English
Online access: Full text
Description: Many smart glasses technologies are being developed to improve working efficiency or quality of life in various fields. In some enterprises, these technologies help improve work quality and productivity and minimize data loss. In everyday life, smart glasses serve as entertainment devices with augmented/virtual reality or as assistive manipulators for the physically challenged. These technologies have therefore adopted various operating methods depending on usage, such as a touchpad, remote control, and voice recognition. However, conventional operating methods have limitations in non-verbal and noisy situations where people cannot use both hands. In this study, we present a method of detecting a facial signal for touchless activation using a transducer. We acquired a facial signal amplified by a lever mechanism using a load cell on the hinge of an eyewear frame. We then classified the signals and obtained the classification accuracy by calculating the confusion matrix over the classified categories using a machine learning technique, the support vector machine. We can activate an actuator, such as a radio-controlled car, through a classified facial signal by using an eyewear-type signal transducer. Overall, our operating system can be useful for activating an actuator or transmitting a message through classified facial activities in non-verbal situations and in situations where both hands cannot be used.
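The pipeline the abstract describes (classify windowed load-cell signals with a support vector machine, then score the result with a confusion matrix) can be sketched as follows. This is a minimal, hypothetical illustration using synthetic data and scikit-learn; the feature choices, class profiles, and signal shapes are assumptions for demonstration, not the authors' actual implementation.

```python
# Hypothetical sketch: SVM classification of load-cell signal windows,
# scored with a confusion matrix. All data is synthetic; the per-class
# amplitude/frequency profiles and features are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

def make_signal(label, n=64):
    """Synthetic load-cell trace: each facial-activity class gets a
    distinct amplitude and frequency (an assumption for illustration)."""
    t = np.linspace(0, 1, n)
    base = (label + 1) * np.sin(2 * np.pi * (label + 2) * t)
    return base + 0.3 * rng.standard_normal(n)

def features(sig):
    # Simple time-domain features: mean, std, peak-to-peak, energy.
    return [sig.mean(), sig.std(), np.ptp(sig), np.sum(sig ** 2)]

# Three hypothetical facial-activity classes, 40 samples each.
X = np.array([features(make_signal(c)) for c in range(3) for _ in range(40)])
y = np.repeat(np.arange(3), 40)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)       # the SVM classifier
cm = confusion_matrix(y_te, clf.predict(X_te))       # rows: true, cols: predicted
print(cm)
```

Per-class accuracy then falls out of the confusion matrix's diagonal, which is presumably how the paper reports the accuracy of each classified facial activity.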
DOI: 10.1007/s12541-019-00291-x
Publisher: Korean Society for Precision Engineering (Seoul)
ISSN: 2234-7593
EISSN: 2005-4602
Source: SpringerLink Journals
Subjects:
Activation
Activity recognition
Actuators
Data loss
Engineering
Eyewear
Face recognition
Industrial and Production Engineering
Load cells
Machine learning
Materials Science
Operating systems
Radio control
Regular Paper
Remote control
Signal classification
Support vector machines
Verbal communication
Virtual reality
Voice communication
Voice control
Voice recognition