Towards sustainable demersal fisheries: NepCon image acquisition system for automatic Nephrops norvegicus detection

Bibliographic Details
Published in: PloS one 2021-06, Vol. 16 (6), p. e0252824-e0252824
Main authors: Sokolova, Maria; Thompson, Fletcher; Mariani, Patrizio; Krag, Ludvig Ahm
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page e0252824
container_issue 6
container_start_page e0252824
container_title PloS one
container_volume 16
creator Sokolova, Maria
Thompson, Fletcher
Mariani, Patrizio
Krag, Ludvig Ahm
description Underwater video monitoring systems are widely used in fisheries to investigate fish behavior in relation to fishing gear and gear performance during fishing, and such systems can also be used to evaluate catch composition. In demersal trawl fisheries, however, their applicability can be challenged by low light conditions, mobilized sediment and scattering in murky waters. In this study, we introduce a novel observation system (called NepCon) which aims to reduce current limitations by combining an optimized image acquisition setup with tailored image analysis software. The NepCon system includes a high-contrast background to enhance the visibility of the target objects, a compact camera and an artificial light source. The image analysis software includes a machine learning algorithm which is evaluated here for automatic detection and counting of Norway lobster (Nephrops norvegicus). NepCon is specifically designed for applications in demersal trawls, and this first phase aims at increasing the accuracy of N. norvegicus detection at the data acquisition level. To find the most contrasting background, we compared the output of four image segmentation methods applied to static images of N. norvegicus fixed in front of four test background colors. The best-performing background color was then used to evaluate computer vision and deep learning approaches for automatic detection, tracking and counting of N. norvegicus in the videos. In this initial phase we tested the system in an experimental setting to assess its feasibility for future implementation in real demersal fishing conditions. The N. norvegicus-directed trawl fishery typically operates without assistance from underwater observation technology and is therefore largely conducted blindly. The demonstrated perception system achieves 76% accuracy (F-score) in automatic detection and counting of N. norvegicus, a substantial improvement over the current benchmark.
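The abstract describes selecting the most contrasting background by comparing segmentation output across four background colors, and reports a 76% F-score for automatic detection and counting. The short Python sketch below illustrates that style of evaluation on synthetic frames: a simple Otsu-threshold segmentation (one plausible stand-in, since the record does not name the four methods the study compared) is scored against a known target mask with a pixel-level F-score for several hypothetical background intensities.

# Illustrative sketch only (not the authors' code): score how well a simple
# Otsu-threshold segmentation recovers a bright target placed on backgrounds
# of different intensities, using a pixel-level F-score against a known mask.
# The synthetic frames, the four background intensities and the choice of
# Otsu thresholding are assumptions made for illustration.
import numpy as np
import cv2


def synthetic_frame(bg_level, target_level=200, size=240):
    """Grayscale frame with an elliptical 'target' on a uniform background."""
    frame = np.full((size, size), bg_level, dtype=np.uint8)
    mask = np.zeros_like(frame)
    cv2.ellipse(mask, (size // 2, size // 2), (70, 25), 30, 0, 360, 255, -1)
    frame[mask > 0] = target_level
    # Mild noise so the segmentation task is not trivial.
    noise = np.random.default_rng(0).integers(-15, 16, frame.shape)
    return np.clip(frame.astype(np.int16) + noise, 0, 255).astype(np.uint8), mask > 0


def f_score(pred, truth):
    """Pixel-level F-score (harmonic mean of precision and recall)."""
    tp = np.logical_and(pred, truth).sum()
    fp = np.logical_and(pred, ~truth).sum()
    fn = np.logical_and(~pred, truth).sum()
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0


for bg in (30, 90, 150, 190):  # four hypothetical background intensities
    frame, truth = synthetic_frame(bg)
    _, seg = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    print(f"background intensity {bg:3d}: F-score = {f_score(seg > 0, truth):.2f}")

As expected under these assumptions, the F-score drops as the background intensity approaches that of the target, which is the motivation for the high-contrast background in the NepCon setup.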
doi 10.1371/journal.pone.0252824
format Article
publisher Public Library of Science, San Francisco
pmid 34133448
contributor Gadekallu, Thippa Reddy
rights 2021 Sokolova et al. Open access article distributed under the Creative Commons Attribution License (CC BY 4.0)
orcid 0000-0002-6380-4052
0000-0002-0639-9871
fulltext fulltext
identifier ISSN: 1932-6203
ispartof PloS one, 2021-06, Vol.16 (6), p.e0252824-e0252824
issn 1932-6203
1932-6203
language eng
recordid cdi_plos_journals_2541759206
source DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; Public Library of Science (PLoS); PubMed Central; Free Full-Text Journals in Chemistry
subjects Analysis
Animal behavior
Automation
Biology and Life Sciences
Bycatch
Cameras
Commercial fishing
Computer and Information Sciences
Crustaceans
Data collection
Decision support systems
Deep learning
Earth Sciences
Engineering and Technology
Evaluation
Fish
Fish industry
Fisheries
Fishing
Fishing equipment
Image acquisition
Image analysis
Image processing
Methods
Monitoring
Motivation
Nephrops norvegicus
Ocean floor
Physical Sciences
Research and Analysis Methods
Sediments
Sound detecting and ranging
Survival
Sustainable fisheries
Underwater
title Towards sustainable demersal fisheries: NepCon image acquisition system for automatic Nephrops norvegicus detection
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-05T14%3A19%3A11IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Towards%20sustainable%20demersal%20fisheries:%20NepCon%20image%20acquisition%20system%20for%20automatic%20Nephrops%20norvegicus%20detection&rft.jtitle=PloS%20one&rft.au=Sokolova,%20Maria&rft.date=2021-06-16&rft.volume=16&rft.issue=6&rft.spage=e0252824&rft.epage=e0252824&rft.pages=e0252824-e0252824&rft.issn=1932-6203&rft.eissn=1932-6203&rft_id=info:doi/10.1371/journal.pone.0252824&rft_dat=%3Cgale_plos_%3EA665563830%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2541759206&rft_id=info:pmid/34133448&rft_galeid=A665563830&rft_doaj_id=oai_doaj_org_article_5ada4e32d6cc41dface0860224d446d0&rfr_iscdi=true