A real-time remote surveillance system for fruit flies of economic importance: sensitivity and image analysis
Timely detection of an invasion event or a pest outbreak is a challenging operation of major importance for implementing management action toward eradication and/or containment. Fruit flies (FF; Diptera: Tephritidae) comprise important invasive and quarantine species that threaten the world's fruit and vegetable production. This manuscript introduces a recently developed McPhail-type electronic trap (e-trap) and provides data on its field performance in surveilling three major invasive FF species (Ceratitis capitata, Bactrocera dorsalis and B. zonata). Using FF male lures, the e-trap attracts the flies and retains them on a sticky surface inside the trap. The e-trap captures frames of the trapped adults and automatically uploads the images to a remote server, where identification is performed by a novel deep-learning algorithm. Both the e-trap and the developed code were field-tested in Greece, Austria, Italy, South Africa and Israel. The FF classification code was initially trained with a machine-learning algorithm on FF images derived from laboratory colonies of two of the species (C. capitata and B. zonata). Field tests were then conducted to investigate the electronic, communication and attractive performance of the e-trap, as well as the model's accuracy in classifying FFs. The results demonstrated relatively good communication, electronic performance and trapping efficacy of the e-trap. The classification model achieved average precision of 93–95% for the three target FF species on images uploaded remotely from e-traps deployed under field conditions. The developed and field-tested e-trap system complies with the attributes suggested for an advanced camera-based smart trap.
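The abstract describes a pipeline in which trap images are uploaded to a remote server and classified by a deep-learning model. The record does not include the authors' code, so the following is only a minimal, hypothetical sketch of such a server-side classification step, assuming a fine-tuned convolutional network that outputs one confidence score per species; the architecture (ResNet-18), weights file, image path and label order are illustrative placeholders, not taken from the study.

```python
# Hypothetical sketch (not the authors' code): server-side classification of an
# uploaded e-trap image with a fine-tuned CNN. The architecture, weights file,
# image path and label order are illustrative assumptions.
import torch
from torchvision import models, transforms
from PIL import Image

CLASSES = ["C. capitata", "B. dorsalis", "B. zonata"]  # assumed label order

# A ResNet-18 with a three-class head stands in for the paper's (unspecified) network.
model = models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
model.load_state_dict(torch.load("fruitfly_classifier.pt", map_location="cpu"))  # hypothetical weights
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify(image_path: str) -> dict:
    """Return a per-species confidence score for one uploaded trap image."""
    img = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        logits = model(preprocess(img).unsqueeze(0))
        probs = torch.softmax(logits, dim=1).squeeze(0)
    return {name: float(p) for name, p in zip(CLASSES, probs)}

print(classify("uploaded_frame.jpg"))  # e.g. {'C. capitata': 0.94, ...}
```

The published system's actual model, preprocessing and deployment details differ; see the full text via the DOI below.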
Saved in:
Published in: | Journal of pest science, 2023-03, Vol. 96 (2), pp. 611-622 |
---|---|
Main authors: | Diller, Yoshua; Shamsian, Aviv; Shaked, Ben; Altman, Yam; Danziger, Bat-Chen; Manrakhan, Aruna; Serfontein, Leani; Bali, Elma; Wernicke, Matthias; Egartner, Alois; Colacci, Marco; Sciarretta, Andrea; Chechik, Gal; Alchanatis, Victor; Papadopoulos, Nikos T.; Nestel, David |
Format: | Article |
Language: | English |
Keywords: | Agricultural research; Agriculture; Algorithms; Automation; Biomedical and Life Sciences; Classification; Decision making; Deep learning; Ecology; Economic analysis; Economic importance; Entomology; Field tests; Forestry; Fruit flies; Fruits; Image analysis; Image processing; Invasive species; Laboratories; Life Sciences; Machine learning; Model accuracy; Original Paper; Pest outbreaks; Plant Pathology; Plant Sciences; Sensors; Surveillance; Surveillance systems; Zoology |
Online access: | Full text |
DOI: | 10.1007/s10340-022-01528-x |
Publisher: | Springer Berlin Heidelberg (Berlin/Heidelberg) |
Rights: | The Author(s) 2022; published open access under a CC BY 4.0 license |
ISSN: | 1612-4758 |
EISSN: | 1612-4766 |
Source: | Springer journals |
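The abstract reports average precision of 93–95% per target species. As a purely illustrative sketch (not the paper's evaluation code), the snippet below shows one standard way such per-class average precision could be computed with scikit-learn, assuming ground-truth species labels and per-species confidence scores are available for a set of trap images; the toy arrays and class names are placeholders.

```python
# Illustrative only (not the authors' evaluation code): per-species average
# precision for a three-class fruit-fly classifier. Labels and confidence
# scores below are toy placeholders.
import numpy as np
from sklearn.metrics import average_precision_score
from sklearn.preprocessing import label_binarize

CLASSES = ["C. capitata", "B. dorsalis", "B. zonata"]  # assumed label order

# y_true: true class index per trap image; y_score: model confidence per class.
y_true = np.array([0, 0, 1, 2, 1, 2, 0, 2])
y_score = np.array([
    [0.90, 0.05, 0.05],
    [0.80, 0.15, 0.05],
    [0.10, 0.85, 0.05],
    [0.05, 0.10, 0.85],
    [0.20, 0.70, 0.10],
    [0.15, 0.05, 0.80],
    [0.70, 0.20, 0.10],
    [0.10, 0.20, 0.70],
])

# One-vs-rest binarization lets each species be scored separately.
y_true_bin = label_binarize(y_true, classes=[0, 1, 2])

aps = []
for i, name in enumerate(CLASSES):
    ap = average_precision_score(y_true_bin[:, i], y_score[:, i])
    aps.append(ap)
    print(f"AP({name}) = {ap:.3f}")

print(f"mean AP = {np.mean(aps):.3f}")
```

Average precision summarizes the precision-recall trade-off for each species, which suits the detection of rare invasive captures better than plain accuracy.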