Robust cell particle detection to dense regions and subjective training samples based on prediction of particle center using convolutional neural network
Saved in:
Published in: | PloS one 2018-10, Vol.13 (10), p.e0203646-e0203646 |
---|---|
Main authors: | Nishida, Kenshiro; Hotta, Kazuhiro |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Full text |
container_end_page | e0203646 |
---|---|
container_issue | 10 |
container_start_page | e0203646 |
container_title | PloS one |
container_volume | 13 |
creator | Nishida, Kenshiro; Hotta, Kazuhiro |
description | In recent years, observation of cell images has come to be expected as a way to find the causes of disease. In this paper, we propose a method for detecting cell particles in cell images. Particle detection in cell images poses two main problems. The first is that cell images have different properties from the standard images used in computer vision research: the edges of cell particles are ambiguous, and particles frequently overlap in dense regions, so a simple detection method based on a binary classifier has difficulty detecting them. The second is the ground truth produced by cell biologists: the number of training samples available for training a classifier is limited, and incorrect samples are introduced by the subjectivity of the annotators. Against this background, we propose a cell particle detection method that addresses both problems. In our method, a convolutional neural network predicts the center of a cell particle from its peripheral regions, and the predictions are combined by voting. Because not all edges of overlapping particles are ambiguous, using the clearly visible peripheral edges lets us detect overlapped cell particles robustly, and voting from multiple peripheral views makes detection reliable. Moreover, the method is practical because many training samples can be prepared from a single cell particle. In experiments, we evaluate our detector on two cell detection datasets. On a challenging synthetic-cell dataset our method achieved state-of-the-art performance, and on a real dataset of lipid droplets it outperformed a conventional CNN detector with binary particle/non-particle outputs. |
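The description above outlines the core mechanism: a convolutional neural network predicts, from each peripheral region, where the particle center lies, the per-region predictions are accumulated by voting, and vote peaks are taken as detections. The following is a minimal sketch of only that voting-and-peak-finding step, not the authors' implementation; the function name, the `offsets` and `mask` inputs standing in for the CNN output, and all parameter values are assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_centers_by_voting(offsets, mask, vote_sigma=2.0, min_votes=0.5):
    """Accumulate predicted-center votes and return peak locations.

    offsets : (H, W, 2) float array of predicted (dy, dx) from each pixel
              to the center of the particle it belongs to (CNN output stand-in).
    mask    : (H, W) bool array of pixels allowed to vote (e.g. peripheral/edge pixels).
    Returns : (N, 2) array of (y, x) coordinates of detected particle centers.
    """
    h, w, _ = offsets.shape
    votes = np.zeros((h, w), dtype=np.float32)

    # Each voting pixel casts one vote at the location it predicts as the center.
    ys, xs = np.nonzero(mask)
    cy = np.clip(np.round(ys + offsets[ys, xs, 0]).astype(int), 0, h - 1)
    cx = np.clip(np.round(xs + offsets[ys, xs, 1]).astype(int), 0, w - 1)
    np.add.at(votes, (cy, cx), 1.0)

    # Smooth the vote map so nearby votes reinforce each other, then keep
    # local maxima with enough support as detections (thresholds are tunable).
    votes = gaussian_filter(votes, sigma=vote_sigma)
    peaks = (votes == maximum_filter(votes, size=5)) & (votes > min_votes)
    return np.argwhere(peaks)
```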
doi_str_mv | 10.1371/journal.pone.0203646 |
format | Article |
contributor | Wang, Long |
publisher | United States: Public Library of Science |
date | 2018-10-10 |
pmid | 30303957 |
orcidid | https://orcid.org/0000-0001-5305-4684 |
rights | 2018 Nishida, Hotta. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/). |
fulltext | fulltext |
identifier | ISSN: 1932-6203 |
ispartof | PloS one, 2018-10, Vol.13 (10), p.e0203646-e0203646 |
issn | 1932-6203 1932-6203 |
language | eng |
recordid | cdi_plos_journals_2118227764 |
source | MEDLINE; DOAJ Directory of Open Access Journals; Public Library of Science (PLoS) Journals Open Access; EZB-FREE-00999 freely available EZB journals; PubMed Central; Free Full-Text Journals in Chemistry |
subjects | Accuracy; Architectural engineering; Artificial neural networks; Biology and Life Sciences; Cell Tracking - methods; Cell-Derived Microparticles - chemistry; Cells (Biology); Classification; Classifiers; Computer and Information Sciences; Computer vision; Datasets; Ground truth; Image detection; Integer programming; Lipids; Lipids - chemistry; Machine Learning; Medicine and Health Sciences; Methods; Neural networks; Neural Networks (Computer); Optical Imaging - methods; Pathogenesis; Pattern recognition; People and Places; Physical Sciences; Research and Analysis Methods; Science Policy; Training |
title | Robust cell particle detection to dense regions and subjective training samples based on prediction of particle center using convolutional neural network |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-29T03%3A00%3A33IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Robust%20cell%20particle%20detection%20to%20dense%20regions%20and%20subjective%20training%20samples%20based%20on%20prediction%20of%20particle%20center%20using%20convolutional%20neural%20network&rft.jtitle=PloS%20one&rft.au=Nishida,%20Kenshiro&rft.date=2018-10-10&rft.volume=13&rft.issue=10&rft.spage=e0203646&rft.epage=e0203646&rft.pages=e0203646-e0203646&rft.issn=1932-6203&rft.eissn=1932-6203&rft_id=info:doi/10.1371/journal.pone.0203646&rft_dat=%3Cgale_plos_%3EA557679674%3C/gale_plos_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2118227764&rft_id=info:pmid/30303957&rft_galeid=A557679674&rft_doaj_id=oai_doaj_org_article_6f8069f2407540bbac9a32ea1450b228&rfr_iscdi=true |