Gender recognition using motion data from multiple smart devices

Bibliographic Details
Published in: Expert systems with applications, 2020-06, Vol. 147, p. 113195, Article 113195
Main authors: Dong, Jianmin; Du, Youtian; Cai, Zhongmin
Format: Article
Language: English
Online access: Full text
container_start_page 113195
container_title Expert systems with applications
container_volume 147
creator Dong, Jianmin
Du, Youtian
Cai, Zhongmin
description •A thorough study of gender recognition using motion data from multiple devices.
•A methodological framework for analyzing motion data from multiple devices.
•Motion features are extracted from the time, frequency and wavelet domains.
•Using motion data from multiple devices can significantly improve the accuracy.
•A motion dataset of 56 subjects is established for gender recognition.
Using multiple smart devices simultaneously, such as a smartphone and a smartwatch, is becoming a popular lifestyle as wearables gain popularity. This multi-sensor setting provides new opportunities for enhanced user-trait analysis via the fusion of data from multiple devices. In this study, we explore the task of gender recognition using motion data collected from multiple smart devices. Specifically, motion data are collected from a smartphone and a smart band simultaneously. Motion features are extracted from the collected data in three domains: time, frequency, and wavelet. We present a feature selection method that accounts for redundancies between motion features, and gender recognition is performed using four supervised learning methods. Experimental results demonstrate that using motion data collected from multiple smart devices significantly improves the accuracy of gender recognition: on a dataset of 56 subjects, the proposed method reaches an accuracy of 98.7%, compared with 93.7% and 88.2% when using the smartphone and the smart band individually.
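To make the pipeline described in the abstract concrete, the following is a minimal sketch of multi-device feature extraction in the time, frequency, and wavelet domains, followed by a simple redundancy-based feature filter and a supervised classifier. It is illustrative only: the paper's exact window length, sampling rate, wavelet family, feature set, selection criterion, and classifiers are not given in this record, so every function name and parameter below (time_features, drop_redundant, fs=50.0, the db4 wavelet, the 0.95 correlation threshold, the SVM choice) is an assumption rather than the authors' implementation.

# Minimal, assumption-laden sketch (not the authors' code): extract simple
# time-, frequency- and wavelet-domain features from 3-axis accelerometer
# windows of a phone and a band, filter redundant features, train a classifier.
import numpy as np
import pywt                                   # PyWavelets, for the wavelet domain
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def time_features(window):
    # window: (n_samples, 3) accelerometer readings; work on the magnitude signal
    mag = np.linalg.norm(window, axis=1)
    return [mag.mean(), mag.std(), mag.min(), mag.max(),
            np.mean(np.abs(np.diff(mag)))]    # mean absolute first difference

def freq_features(window, fs=50.0):           # fs: assumed sampling rate in Hz
    mag = np.linalg.norm(window, axis=1)
    spectrum = np.abs(np.fft.rfft(mag - mag.mean()))
    freqs = np.fft.rfftfreq(mag.size, d=1.0 / fs)
    return [freqs[np.argmax(spectrum)],                     # dominant (gait) frequency
            float(np.sum(spectrum ** 2) / spectrum.size)]   # spectral energy

def wavelet_features(window, wavelet="db4", level=3):       # wavelet family assumed
    mag = np.linalg.norm(window, axis=1)
    coeffs = pywt.wavedec(mag, wavelet, level=level)
    return [float(np.sum(c ** 2)) for c in coeffs]          # energy per sub-band

def fused_features(phone_window, band_window):
    # Concatenate per-device features, mirroring the multi-device fusion idea.
    feats = []
    for w in (phone_window, band_window):
        feats += time_features(w) + freq_features(w) + wavelet_features(w)
    return np.asarray(feats)

def drop_redundant(X, threshold=0.95):
    # Greedy stand-in for redundancy-aware selection: keep a feature only if it
    # is not highly correlated with any feature already kept.
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return X[:, kept], kept

# Hypothetical usage, assuming `windows` pairs synchronized phone/band segments
# and `y` holds the gender labels:
#   X = np.vstack([fused_features(p, b) for p, b in windows])
#   X_sel, kept = drop_redundant(X)
#   print(cross_val_score(SVC(kernel="rbf"), X_sel, y, cv=5).mean())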
doi_str_mv 10.1016/j.eswa.2020.113195
format Article
publisher New York: Elsevier Ltd
fulltext fulltext
identifier ISSN: 0957-4174
ispartof Expert systems with applications, 2020-06, Vol.147, p.113195, Article 113195
issn 0957-4174
1873-6793
language eng
recordid cdi_proquest_journals_2440490678
source Elsevier ScienceDirect Journals Complete
subjects Data integration
Electronic devices
Feature extraction
Feature recognition
Gender
Gender recognition
Motion perception
Motion sensor
Multiple smart devices
Performance evaluation
Smartphones
Smartwatches
Walking behavior
title Gender recognition using motion data from multiple smart devices