Occupancy Measurement in Under-Actuated Zones: YOLO-based Deep Learning Approach
The challenge of accurately detecting and identifying individuals within under-actuated zones presents a relevant research problem in occupant detection. This study aims to address the challenge of occupant detection in under-actuated zones through the utilization of the You Only Look Once version 8...
Saved in:
Published in: | International journal of advanced computer science & applications, 2024, Vol. 15 (2) |
---|---|
Main authors: | Syahputra, Ade; Yaddarabullah; Azhary, Mohammad Faiz; Rahman, Aedah Binti Abd; Saad, Amna |
Format: | Article |
Language: | English |
Subjects: | Accuracy; Air conditioning; Algorithms; Computer science; Context; Cooling; Datasets; Deep learning; Energy consumption; Energy efficiency; Environmental conditions; HVAC; Indoor air quality; Object recognition; Performance evaluation; Research methodology; Ventilation |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | 2 |
container_start_page | |
container_title | International journal of advanced computer science & applications |
container_volume | 15 |
creator | Syahputra, Ade; -, Yaddarabullah; Azhary, Mohammad Faiz; Rahman, Aedah Binti Abd; Saad, Amna |
description | Accurately detecting and identifying individuals within under-actuated zones is an open problem in occupant detection. This study addresses it using the You Only Look Once version 8 (YOLO v8) object detection model. The methodology evaluates YOLO v8 across three distinct zones, assessing its precision, accuracy, and recall in identifying occupants. The quantitative results support the model's effectiveness for occupant detection in under-actuated zones: YOLO v8 achieves mean Average Precision (mAP) scores of 99.2% in Zone 1, 78.3% in Zone 2, and 96.2% in Zone 3, indicating that it localizes and identifies occupants accurately within each zone, with Zone 2 the most challenging. YOLO v8 is also fast: execution times are 0.004 seconds in Zones 1 and 3 and 0.024 seconds in the computationally heavier Zone 2. This efficiency is a key advantage, enabling rapid and effective occupant detection in under-actuated zones. (An illustrative detection sketch follows this record.) |
doi_str_mv | 10.14569/IJACSA.2024.0150277 |
format | Article |
publisher | Science and Information (SAI) Organization Limited, West Yorkshire |
rights | 2024. This work is licensed under the Creative Commons Attribution 4.0 License (http://creativecommons.org/licenses/by/4.0/). |
fulltext | fulltext |
identifier | ISSN: 2158-107X |
ispartof | International journal of advanced computer science & applications, 2024, Vol.15 (2) |
issn | 2158-107X; 2156-5570 |
language | eng |
recordid | cdi_proquest_journals_2992551171 |
source | EZB Electronic Journals Library |
subjects | Accuracy; Air conditioning; Algorithms; Computer science; Context; Cooling; Datasets; Deep learning; Energy consumption; Energy efficiency; Environmental conditions; HVAC; Indoor air quality; Object recognition; Performance evaluation; Research methodology; Ventilation |
title | Occupancy Measurement in Under-Actuated Zones: YOLO-based Deep Learning Approach |
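The abstract quotes mean Average Precision (mAP) per zone. With a single detected class such as "person", mAP reduces to that class's average precision, i.e. the area under its precision-recall curve; as commonly defined,

```latex
\mathrm{mAP} = \frac{1}{N}\sum_{i=1}^{N}\mathrm{AP}_i,
\qquad
\mathrm{AP}_i = \int_0^1 p_i(r)\,\mathrm{d}r,
```

where N is the number of classes and p_i(r) is the precision of class i at recall r (COCO-style evaluation additionally averages over IoU thresholds).

The record itself contains no code, so the following is only a minimal sketch of the kind of detection step the abstract evaluates: counting occupants in one camera frame with a pretrained YOLO v8 model via the Ultralytics Python API. The weight file (yolov8n.pt), image name (zone1.jpg), and confidence threshold are illustrative assumptions; the paper's actual model variant, training data, and zone imagery are not part of this record.

```python
# Minimal sketch (not the paper's implementation): count occupants in a single
# frame using a pretrained Ultralytics YOLOv8 model. "zone1.jpg" and the
# thresholds below are placeholders chosen for illustration.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # pretrained COCO weights; the paper's variant is not stated

# Keep only the COCO "person" class (index 0) above a 0.5 confidence threshold.
results = model.predict("zone1.jpg", classes=[0], conf=0.5)

for r in results:
    occupants = len(r.boxes)  # one bounding box per detected person
    print(f"Detected {occupants} occupant(s)")
    for box in r.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()  # box corners in pixel coordinates
        print(f"  person at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), conf={float(box.conf):.2f}")
```

The per-zone execution times quoted in the abstract (0.004 to 0.024 seconds) would correspond to inference calls of this kind, one per frame per zone.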