Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications
Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization must be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users who are partially seen during training. […]
Saved in:
Published in: | IEEE Access, 2020, Vol. 8, p. 77344-77363 |
---|---|
Main authors: | Cartas, Alejandro; Radeva, Petia; Dimiccoli, Mariella |
Format: | Article |
Language: | English |
Subjects: | Activity recognition; Biomedical monitoring; Cameras; Daily activity recognition; Data acquisition; Datasets; Deep learning; domain adaptation; Domains; Machine learning; Monitoring; Object recognition; Training; visual lifelogs; Visualization; wearable cameras; Wearable technology |
Online access: | Full text |
container_end_page | 77363 |
---|---|
container_issue | |
container_start_page | 77344 |
container_title | IEEE Access |
container_volume | 8 |
creator | Cartas, Alejandro; Radeva, Petia; Dimiccoli, Mariella |
description | Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization must be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users who are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants of state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarce, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset can easily be transferred to other domains with a very small amount of labeled data. Taken together, these results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications. |
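The domain adaptation strategy summarized in the abstract (a model trained on one wearable-camera dataset, transferred to other benchmarks with very little labeled data) is, at its core, transfer learning by fine-tuning. Below is a minimal illustrative sketch in PyTorch, not the authors' published implementation: the ResNet-50 backbone, the `NUM_TARGET_CLASSES` constant, and the use of ImageNet weights as a stand-in for source-domain weights are all assumptions made for the example.

```python
# Hypothetical sketch: adapt an activity classifier trained on a source
# egocentric dataset to a new target dataset with few labeled images,
# by freezing the backbone and retraining only the classification head.
import torch
import torch.nn as nn
from torchvision import models

NUM_TARGET_CLASSES = 21  # assumption: activity classes in the target dataset

# Backbone with pretrained weights (ImageNet here, standing in for
# weights learned on the source egocentric dataset).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)

# Freeze every backbone parameter so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the final layer with a fresh head for the target classes
# (a newly created nn.Linear has requires_grad=True by default).
model.fc = nn.Linear(model.fc.in_features, NUM_TARGET_CLASSES)

# Optimize only the head's parameters.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def finetune_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One gradient step on a small batch of labeled target images."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone keeps the number of trainable parameters small, which is what makes a "very small amount of labeled data" in the target domain plausible as the abstract claims.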
doi_str_mv | 10.1109/ACCESS.2020.2990333 |
format | Article |
fullrecord | <record><control><sourceid>proquest_ieee_</sourceid><recordid>TN_cdi_ieee_primary_9078767</recordid><sourceformat>XML</sourceformat><sourcesystem>PC</sourcesystem><ieee_id>9078767</ieee_id><doaj_id>oai_doaj_org_article_624f0ebeb0c1485484a6293e80ff1d85</doaj_id><sourcerecordid>2454092246</sourcerecordid><originalsourceid>FETCH-LOGICAL-c408t-885c5c00ef2aac397a62cf088e96dc8b72350fce532d1ca5fceb955e4369df5d3</originalsourceid><addsrcrecordid>eNpNUU1L5EAQDYsLK-ov8NKw54yV_ki6vQ3xY4URQV087KGpdKqlhzg9dqLiv98eI2Jd6vGo917BK4rjChZVBeZk2bbnd3cLDhwW3BgQQvwo9nlVm1IoUe99w7-Ko3FcQx6dKdXsF_-WbgqvYQo0sujZGYbhna0ys3lk13ETpph28DUgQ_ZAmLAbiLX4RAlP2X18w9SzW8KhfIhp6Nlyux2CwynEzXhY_PQ4jHT0uQ-Kvxfn9-2fcnVzedUuV6WToKdSa-WUAyDPEZ0wDdbcedCaTN073TVcKPCOlOB95VBl2BmlSIra9F714qC4mn37iGu7TeEJ07uNGOwHEdOjxTQFN5CtufRAHXXgKqmV1DKHGUEavK96rbLX79lrm-LzC42TXceXtMnvWy6VBMO5rPOVmK9ciuOYyH-lVmB3pdi5FLsrxX6WklXHsyoQ0ZfCQKObuhH_Aa4Yh80</addsrcrecordid><sourcetype>Open Website</sourcetype><iscdi>true</iscdi><recordtype>article</recordtype><pqid>2454092246</pqid></control><display><type>article</type><title>Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications</title><source>IEEE Open Access Journals</source><source>DOAJ Directory of Open Access Journals</source><source>Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals</source><creator>Cartas, Alejandro ; Radeva, Petia ; Dimiccoli, Mariella</creator><creatorcontrib>Cartas, Alejandro ; Radeva, Petia ; Dimiccoli, Mariella</creatorcontrib><description>Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its wide-spreading use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants to state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarcely available, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset, can easily be transferred to other domains with a very small amount of labeled data. 
Taken together, those results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications.</description><identifier>ISSN: 2169-3536</identifier><identifier>EISSN: 2169-3536</identifier><identifier>DOI: 10.1109/ACCESS.2020.2990333</identifier><identifier>CODEN: IAECCG</identifier><language>eng</language><publisher>Piscataway: IEEE</publisher><subject>Activity recognition ; Biomedical monitoring ; Cameras ; Daily activity recognition ; Data acquisition ; Datasets ; Deep learning ; domain adaptation ; Domains ; Machine learning ; Monitoring ; Object recognition ; Training ; visual lifelogs ; Visualization ; wearable cameras ; Wearable technology</subject><ispartof>IEEE access, 2020, Vol.8, p.77344-77363</ispartof><rights>Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020</rights><lds50>peer_reviewed</lds50><oa>free_for_read</oa><woscitedreferencessubscribed>false</woscitedreferencessubscribed><citedby>FETCH-LOGICAL-c408t-885c5c00ef2aac397a62cf088e96dc8b72350fce532d1ca5fceb955e4369df5d3</citedby><cites>FETCH-LOGICAL-c408t-885c5c00ef2aac397a62cf088e96dc8b72350fce532d1ca5fceb955e4369df5d3</cites><orcidid>0000-0002-4440-9954</orcidid></display><links><openurl>$$Topenurl_article</openurl><openurlfulltext>$$Topenurlfull_article</openurlfulltext><thumbnail>$$Tsyndetics_thumb_exl</thumbnail><linktohtml>$$Uhttps://ieeexplore.ieee.org/document/9078767$$EHTML$$P50$$Gieee$$Hfree_for_read</linktohtml><link.rule.ids>314,776,780,860,2096,4010,27610,27900,27901,27902,54908</link.rule.ids></links><search><creatorcontrib>Cartas, Alejandro</creatorcontrib><creatorcontrib>Radeva, Petia</creatorcontrib><creatorcontrib>Dimiccoli, Mariella</creatorcontrib><title>Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications</title><title>IEEE access</title><addtitle>Access</addtitle><description>Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its wide-spreading use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants to state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarcely available, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset, can easily be transferred to other domains with a very small amount of labeled data. 
Taken together, those results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications.</description><subject>Activity recognition</subject><subject>Biomedical monitoring</subject><subject>Cameras</subject><subject>Daily activity recognition</subject><subject>Data acquisition</subject><subject>Datasets</subject><subject>Deep learning</subject><subject>domain adaptation</subject><subject>Domains</subject><subject>Machine learning</subject><subject>Monitoring</subject><subject>Object recognition</subject><subject>Training</subject><subject>visual lifelogs</subject><subject>Visualization</subject><subject>wearable cameras</subject><subject>Wearable technology</subject><issn>2169-3536</issn><issn>2169-3536</issn><fulltext>true</fulltext><rsrctype>article</rsrctype><creationdate>2020</creationdate><recordtype>article</recordtype><sourceid>ESBDL</sourceid><sourceid>RIE</sourceid><sourceid>DOA</sourceid><recordid>eNpNUU1L5EAQDYsLK-ov8NKw54yV_ki6vQ3xY4URQV087KGpdKqlhzg9dqLiv98eI2Jd6vGo917BK4rjChZVBeZk2bbnd3cLDhwW3BgQQvwo9nlVm1IoUe99w7-Ko3FcQx6dKdXsF_-WbgqvYQo0sujZGYbhna0ys3lk13ETpph28DUgQ_ZAmLAbiLX4RAlP2X18w9SzW8KhfIhp6Nlyux2CwynEzXhY_PQ4jHT0uQ-Kvxfn9-2fcnVzedUuV6WToKdSa-WUAyDPEZ0wDdbcedCaTN073TVcKPCOlOB95VBl2BmlSIra9F714qC4mn37iGu7TeEJ07uNGOwHEdOjxTQFN5CtufRAHXXgKqmV1DKHGUEavK96rbLX79lrm-LzC42TXceXtMnvWy6VBMO5rPOVmK9ciuOYyH-lVmB3pdi5FLsrxX6WklXHsyoQ0ZfCQKObuhH_Aa4Yh80</recordid><startdate>2020</startdate><enddate>2020</enddate><creator>Cartas, Alejandro</creator><creator>Radeva, Petia</creator><creator>Dimiccoli, Mariella</creator><general>IEEE</general><general>The Institute of Electrical and Electronics Engineers, Inc. (IEEE)</general><scope>97E</scope><scope>ESBDL</scope><scope>RIA</scope><scope>RIE</scope><scope>AAYXX</scope><scope>CITATION</scope><scope>7SC</scope><scope>7SP</scope><scope>7SR</scope><scope>8BQ</scope><scope>8FD</scope><scope>JG9</scope><scope>JQ2</scope><scope>L7M</scope><scope>L~C</scope><scope>L~D</scope><scope>DOA</scope><orcidid>https://orcid.org/0000-0002-4440-9954</orcidid></search><sort><creationdate>2020</creationdate><title>Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications</title><author>Cartas, Alejandro ; Radeva, Petia ; Dimiccoli, Mariella</author></sort><facets><frbrtype>5</frbrtype><frbrgroupid>cdi_FETCH-LOGICAL-c408t-885c5c00ef2aac397a62cf088e96dc8b72350fce532d1ca5fceb955e4369df5d3</frbrgroupid><rsrctype>articles</rsrctype><prefilter>articles</prefilter><language>eng</language><creationdate>2020</creationdate><topic>Activity recognition</topic><topic>Biomedical monitoring</topic><topic>Cameras</topic><topic>Daily activity recognition</topic><topic>Data acquisition</topic><topic>Datasets</topic><topic>Deep learning</topic><topic>domain adaptation</topic><topic>Domains</topic><topic>Machine learning</topic><topic>Monitoring</topic><topic>Object recognition</topic><topic>Training</topic><topic>visual lifelogs</topic><topic>Visualization</topic><topic>wearable cameras</topic><topic>Wearable technology</topic><toplevel>peer_reviewed</toplevel><toplevel>online_resources</toplevel><creatorcontrib>Cartas, Alejandro</creatorcontrib><creatorcontrib>Radeva, Petia</creatorcontrib><creatorcontrib>Dimiccoli, Mariella</creatorcontrib><collection>IEEE All-Society Periodicals Package (ASPP) 2005-present</collection><collection>IEEE Open Access Journals</collection><collection>IEEE All-Society Periodicals Package (ASPP) 1998-Present</collection><collection>IEEE Electronic 
Library (IEL)</collection><collection>CrossRef</collection><collection>Computer and Information Systems Abstracts</collection><collection>Electronics & Communications Abstracts</collection><collection>Engineered Materials Abstracts</collection><collection>METADEX</collection><collection>Technology Research Database</collection><collection>Materials Research Database</collection><collection>ProQuest Computer Science Collection</collection><collection>Advanced Technologies Database with Aerospace</collection><collection>Computer and Information Systems Abstracts Academic</collection><collection>Computer and Information Systems Abstracts Professional</collection><collection>DOAJ Directory of Open Access Journals</collection><jtitle>IEEE access</jtitle></facets><delivery><delcategory>Remote Search Resource</delcategory><fulltext>fulltext</fulltext></delivery><addata><au>Cartas, Alejandro</au><au>Radeva, Petia</au><au>Dimiccoli, Mariella</au><format>journal</format><genre>article</genre><ristype>JOUR</ristype><atitle>Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications</atitle><jtitle>IEEE access</jtitle><stitle>Access</stitle><date>2020</date><risdate>2020</risdate><volume>8</volume><spage>77344</spage><epage>77363</epage><pages>77344-77363</pages><issn>2169-3536</issn><eissn>2169-3536</eissn><coden>IAECCG</coden><abstract>Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its wide-spreading use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants to state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarcely available, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset, can easily be transferred to other domains with a very small amount of labeled data. Taken together, those results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications.</abstract><cop>Piscataway</cop><pub>IEEE</pub><doi>10.1109/ACCESS.2020.2990333</doi><tpages>20</tpages><orcidid>https://orcid.org/0000-0002-4440-9954</orcidid><oa>free_for_read</oa></addata></record> |
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE Access, 2020, Vol. 8, p. 77344-77363 |
issn | 2169-3536 (ISSN); 2169-3536 (eISSN) |
language | eng |
recordid | cdi_ieee_primary_9078767 |
source | IEEE Open Access Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek (Electronic Journals Library) - freely accessible e-journals |
subjects | Activity recognition; Biomedical monitoring; Cameras; Daily activity recognition; Data acquisition; Datasets; Deep learning; domain adaptation; Domains; Machine learning; Monitoring; Object recognition; Training; visual lifelogs; Visualization; wearable cameras; Wearable technology |
title | Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-19T00%3A44%3A06IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_ieee_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Activities%20of%20Daily%20Living%20Monitoring%20via%20a%20Wearable%20Camera:%20Toward%20Real-World%20Applications&rft.jtitle=IEEE%20access&rft.au=Cartas,%20Alejandro&rft.date=2020&rft.volume=8&rft.spage=77344&rft.epage=77363&rft.pages=77344-77363&rft.issn=2169-3536&rft.eissn=2169-3536&rft.coden=IAECCG&rft_id=info:doi/10.1109/ACCESS.2020.2990333&rft_dat=%3Cproquest_ieee_%3E2454092246%3C/proquest_ieee_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2454092246&rft_id=info:pmid/&rft_ieee_id=9078767&rft_doaj_id=oai_doaj_org_article_624f0ebeb0c1485484a6293e80ff1d85&rfr_iscdi=true |