Detection of driver manual distraction via image-based hand and ear recognition
Published in: | Accident analysis and prevention, 2020-03, Vol. 137, p. 105432-105432, Article 105432 |
---|---|
Main authors: | Li, Li; Zhong, Boxuan; Hutmacher, Clayton; Liang, Yulan; Horrey, William J.; Xu, Xu |
Format: | Article |
Language: | English |
Online access: | Full text |
Highlights:
•A novel deep neural network-based driving distraction detection algorithm was proposed.
•The algorithm incorporated YOLO and a multi-layer perceptron.
•Video clips of 20 drivers performing distracting tasks were collected on a driving simulator.
•The results indicated that the algorithm is effective and efficient in detecting a variety of driving distractions.
Abstract:
Driving distraction is a leading cause of fatal car accidents; almost nine people are killed in the US each day because of distracting activities. Reducing the number of distraction-affected traffic accidents therefore remains an imperative issue. This paper proposes a novel algorithm for detecting drivers' manual distraction. The algorithm consists of two modules: the first predicts bounding boxes for the driver's right hand and right ear from RGB images, and the second takes those bounding boxes as input and predicts the type of distraction. A total of 106,677 frames, extracted from videos of twenty participants in a driving simulator, were used for training (50%) and testing (50%). For distraction classification, the proposed framework detected normal driving, touchscreen use, and talking on a phone with F1-scores of 0.84, 0.69, and 0.82, respectively; for overall distraction detection it achieved an F1-score of 0.74. The whole framework ran at 28 frames per second. The algorithm achieved overall accuracy comparable to that of similar research while being more efficient than other methods. A demo video for the algorithm can be found at https://youtu.be/NKclK1bHRd4.
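The abstract describes a two-module pipeline: an object detector (YOLO) localizes the driver's right hand and right ear, and a multi-layer perceptron classifies the distraction type from those bounding boxes. The sketch below is a minimal illustration of that second stage and of per-class F1 reporting, not the authors' released code; the 8-dimensional box-feature layout, the MLP layer sizes, and the synthetic training data are assumptions made here purely for demonstration.

```python
# Minimal sketch (assumptions, not the paper's implementation) of Module 2:
# an MLP that maps hand/ear bounding-box features to a distraction class.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import f1_score

def boxes_to_features(hand_box, ear_box):
    """Concatenate two (x1, y1, x2, y2) boxes into one 8-D feature vector.
    An all-zero box stands in for a frame where the detector misses the object."""
    hand = np.asarray(hand_box if hand_box is not None else [0, 0, 0, 0], dtype=float)
    ear = np.asarray(ear_box if ear_box is not None else [0, 0, 0, 0], dtype=float)
    return np.concatenate([hand, ear])

# Synthetic stand-in for Module 1 (YOLO) output over labelled frames:
# class 0 = normal driving, 1 = touchscreen use, 2 = talking on a phone.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(600, 8))   # placeholder box features
y = rng.integers(0, 3, size=600)           # placeholder labels

# Module 2: multi-layer perceptron on bounding-box features (layer sizes illustrative).
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X[:300], y[:300])                  # 50/50 train/test split, as in the paper

# Per-class F1, mirroring how the paper reports normal driving / touchscreen / phone.
print(f1_score(y[300:], clf.predict(X[300:]), average=None))

# Classifying one hypothetical frame: hand box detected, ear box missed.
frame_feat = boxes_to_features([0.42, 0.55, 0.58, 0.80], None)
print(clf.predict(frame_feat.reshape(1, -1)))
```

In a real setup the synthetic features would be replaced by the detector's hand and ear boxes for each video frame, with labels taken from the simulator task annotations.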
DOI | 10.1016/j.aap.2020.105432 |
---|---|
ISSN | 0001-4575 |
EISSN | 1879-2057 |
PMID | 32004860 |
Publisher | Elsevier Ltd (England) |
Source | MEDLINE; Elsevier ScienceDirect Journals |
Subjects | Accidents, Traffic - prevention & control; Adult; Algorithms; Computer vision; Data Collection; Deep learning; Distracted Driving; Driving distraction; Ear - physiology; Female; Hand - physiology; Humans; Male; Multi-class classification; Neural Networks, Computer; Pattern Recognition, Automated - methods; Upper extremity kinematics |