AIR-Act2Act: Human–human interaction dataset for teaching non-verbal social behaviors to robots

Bibliographic details

Published in: The International journal of robotics research, 2021-04, Vol. 40 (4-5), pp. 691-697
Main authors: Ko, Woo-Ri; Jang, Minsu; Lee, Jaeyeon; Kim, Jaehong
Format: Article
Language: English
Publisher: SAGE Publications, London, England
DOI: 10.1177/0278364921990671
ISSN: 0278-3649
EISSN: 1741-3176
Source: Access via SAGE
Online access: Full text
Description:
To better interact with users, a social robot should understand the users’ behavior, infer their intention, and respond appropriately. Machine learning is one way of implementing robot intelligence: it allows a robot to learn and improve from experience automatically, instead of being explicitly told what to do. Social skills can also be learned by watching human–human interaction videos. However, human–human interaction datasets that cover the variety of situations in which interactions occur are relatively scarce. Moreover, we aim to use service robots in the elderly-care domain, yet no interaction dataset has been collected for this domain. For this reason, we introduce a human–human interaction dataset for teaching non-verbal social behaviors to robots. It is the only interaction dataset in which elderly people have participated as performers. We recruited 100 elderly people and 2 college students to perform 10 interactions in an indoor environment. The entire dataset has 5,000 interaction samples, each of which contains depth maps, body indexes, and 3D skeletal data captured with three Microsoft Kinect v2 sensors. In addition, we provide the joint angles of a humanoid NAO robot, converted from the human behaviors that robots need to learn. The dataset and useful Python scripts are available for download at https://github.com/ai4r/AIR-Act2Act. It can be used not only to teach social skills to robots but also to benchmark action-recognition algorithms.
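The repository's own Python scripts define the authoritative file formats; the sketch below only illustrates, under stated assumptions, how one interaction sample might be represented and inspected. The container class, field names, and array shapes are hypothetical and not the repository's actual API; the grounded facts are the modalities listed in the description, the 25-joint skeleton Kinect v2 produces, and its 512x424 depth resolution.

```python
# Minimal sketch of one AIR-Act2Act interaction sample, for orientation only.
# The shapes and names below are ASSUMPTIONS for illustration; consult the
# scripts at https://github.com/ai4r/AIR-Act2Act for the actual formats.
from dataclasses import dataclass

import numpy as np

KINECT_V2_JOINTS = 25  # Kinect v2 tracks 25 body joints per person
SPINE_BASE = 0         # index of the SpineBase joint in the Kinect v2 skeleton


@dataclass
class InteractionSample:
    """Hypothetical container for one of the 5,000 interaction samples."""
    depth_maps: np.ndarray  # (frames, 424, 512) Kinect v2 depth images
    body_index: np.ndarray  # (frames, 424, 512) per-pixel body labels
    skeletons: np.ndarray   # (frames, 2 persons, 25 joints, 3) 3D positions
    nao_angles: np.ndarray  # (frames, n_joints) converted NAO joint angles


def summarize(sample: InteractionSample) -> None:
    """Print per-sample statistics, e.g. the distance between performers."""
    frames, persons = sample.skeletons.shape[:2]
    print(f"{frames} frames, {persons} tracked persons")
    spine = sample.skeletons[:, :, SPINE_BASE, :]  # (frames, persons, 3)
    dist = np.linalg.norm(spine[:, 0] - spine[:, 1], axis=-1)
    print(f"mean inter-person distance: {dist.mean():.2f} m")


if __name__ == "__main__":
    # Synthetic stand-in data with plausible shapes, just so the sketch runs.
    rng = np.random.default_rng(0)
    summarize(InteractionSample(
        depth_maps=rng.random((90, 424, 512)),
        body_index=rng.integers(0, 3, (90, 424, 512)),
        skeletons=rng.random((90, 2, KINECT_V2_JOINTS, 3)),
        nao_angles=rng.random((90, 12)),
    ))
```

The point of the sketch is vocabulary rather than parsing: each sample bundles the three Kinect modalities (depth map, body index, skeleton) with the converted NAO joint angles the description mentions, so a learning pipeline can pair an observed human behavior with the robot pose it should imitate.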
Subjects:
Algorithms
Automation
Behavior
Colleges & universities
Datasets
Domains
Human behavior
Human relations
Humanoid
Indoor environments
Machine learning
Older people
Programming languages
Robots
Service robots
Skills
Social skills
Teaching