YOLOv5-MHSA-DS: an efficient pig detection and counting method
Accurate and efficient livestock detection and counting are crucial for agricultural intelligence. To address the obstacles created by traditional manual methods and the limitations of current vision technology, we introduce YOLOv5-MHSA-DS, a novel model that integrates the YOLOv5 framework with Multi-Head Self-Attention and DySample modules. Multi-Head Self-Attention excels at capturing diverse features, enhancing pig detection and counting accuracy. DySample, on the other hand, dynamically adjusts its sampling strategy based on the input data, allowing it to focus on the most critical parts of the image and thereby significantly improving pig detection and counting performance. To validate the generalization and robustness of our proposed model, we conducted ablation experiments. The results demonstrate that YOLOv5-MHSA-DS achieves an impressive mAP of 93.8% and counting accuracy of 95.0%, surpassing other models by significant margins of 12.2% and 19.0%, respectively.
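The abstract describes two architectural additions to YOLOv5: a Multi-Head Self-Attention block for capturing global context, and a DySample-style dynamic upsampler that learns content-aware sampling offsets. The sketch below is a minimal PyTorch illustration of how such modules might be wired into a YOLOv5-style pipeline; the class names `MHSABlock` and `DySampleUpsample`, the channel sizes, and the insertion points are assumptions for illustration, not the authors' released implementation.

```python
# Hedged sketch: plugging an MHSA block and a DySample-style upsampler
# into a YOLOv5-like feature pipeline. All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MHSABlock(nn.Module):
    """Multi-head self-attention over a CNN feature map (B, C, H, W)."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        seq = x.flatten(2).transpose(1, 2)      # (B, H*W, C) token sequence
        out, _ = self.attn(seq, seq, seq)       # global spatial attention
        out = self.norm(out + seq)              # residual connection + norm
        return out.transpose(1, 2).reshape(b, c, h, w)


class DySampleUpsample(nn.Module):
    """Content-aware 2x upsampler: predicts per-pixel sampling offsets and
    resamples the input with grid_sample (a DySample-style idea)."""

    def __init__(self, channels: int, scale: int = 2):
        super().__init__()
        self.scale = scale
        # Predict (dx, dy) offsets for each output location.
        self.offset = nn.Conv2d(channels, 2 * scale * scale, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        out_h, out_w = h * self.scale, w * self.scale
        offsets = F.pixel_shuffle(self.offset(x), self.scale)  # (B, 2, out_h, out_w)
        offsets = 0.25 * offsets.tanh()                        # keep offsets small
        # Base sampling grid in normalized [-1, 1] coordinates.
        ys = torch.linspace(-1, 1, out_h, device=x.device)
        xs = torch.linspace(-1, 1, out_w, device=x.device)
        gy, gx = torch.meshgrid(ys, xs, indexing="ij")
        grid = torch.stack((gx, gy), dim=-1).expand(b, -1, -1, -1)
        grid = grid + offsets.permute(0, 2, 3, 1)              # shift by learned offsets
        return F.grid_sample(x, grid, align_corners=True)


if __name__ == "__main__":
    feat = torch.randn(1, 256, 20, 20)   # e.g. a deep backbone feature map
    feat = MHSABlock(256)(feat)          # global context before the detection head
    feat = DySampleUpsample(256)(feat)   # content-aware 2x upsampling in the neck
    print(feat.shape)                    # torch.Size([1, 256, 40, 40])
```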
Saved in:
Published in: | Systems science & control engineering, 2024-12, Vol. 12 (1) |
---|---|
Main authors: | Hao, Wangli; Zhang, Li; Xu, Shu-ai; Han, Meng; Li, Fuzhong; Yang, Hua |
Format: | Article |
Language: | eng |
Subjects: | Ablation; Accuracy; Computer vision; Critical components; DySample; multi-head self-attention; pig detection and counting; YOLOv5-MHSA-DS |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | 1 |
container_start_page | |
container_title | Systems science & control engineering |
container_volume | 12 |
creator | Hao, Wangli; Zhang, Li; Xu, Shu-ai; Han, Meng; Li, Fuzhong; Yang, Hua |
description | Accurate and efficient livestock detection and counting are crucial for agricultural intelligence. To address the obstacles created by traditional manual methods and the limitations of current vision technology, we introduce YOLOv5-MHSA-DS, a novel model that integrates the YOLOv5 framework with Multi-Head Self-Attention and DySample modules. Multi-Head Self-Attention excels at capturing diverse features, enhancing pig detection and counting accuracy. DySample, on the other hand, dynamically adjusts its sampling strategy based on the input data, allowing it to focus on the most critical parts of the image and thereby significantly improving pig detection and counting performance. To validate the generalization and robustness of our proposed model, we conducted ablation experiments. The results demonstrate that YOLOv5-MHSA-DS achieves an impressive mAP of 93.8% and counting accuracy of 95.0%, surpassing other models by significant margins of 12.2% and 19.0%, respectively. |
doi_str_mv | 10.1080/21642583.2024.2394428 |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2164-2583 |
ispartof | Systems science & control engineering, 2024-12, Vol.12 (1) |
issn | 2164-2583 |
language | eng |
recordid | cdi_proquest_journals_3145933311 |
source | DOAJ Directory of Open Access Journals; Access via Taylor & Francis (Open Access Collection); EZB-FREE-00999 freely available EZB journals |
subjects | Ablation; Accuracy; Computer vision; Critical components; DySample; multi-head self-attention; pig detection and counting; YOLOv5-MHSA-DS |
title | YOLOv5-MHSA-DS: an efficient pig detection and counting method |