High-precision automated reconstruction of neurons with flood-filling networks
Reconstruction of neural circuits from volume electron microscopy data requires the tracing of cells in their entirety, including all their neurites. Automated approaches have been developed for tracing, but their error rates are too high to generate reliable circuit diagrams without extensive human proofreading.
Published in: | Nature methods 2018-08, Vol.15 (8), p.605-610 |
Main authors: | Januszewski, Michał; Kornfeld, Jörgen; Li, Peter H.; Pope, Art; Blakely, Tim; Lindsey, Larry; Maitin-Shepard, Jeremy; Tyka, Mike; Denk, Winfried; Jain, Viren |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 610 |
container_issue | 8 |
container_start_page | 605 |
container_title | Nature methods |
container_volume | 15 |
creator | Januszewski, Michał Kornfeld, Jörgen Li, Peter H. Pope, Art Blakely, Tim Lindsey, Larry Maitin-Shepard, Jeremy Tyka, Mike Denk, Winfried Jain, Viren |
description | Reconstruction of neural circuits from volume electron microscopy data requires the tracing of cells in their entirety, including all their neurites. Automated approaches have been developed for tracing, but their error rates are too high to generate reliable circuit diagrams without extensive human proofreading. We present flood-filling networks, a method for automated segmentation that, similar to most previous efforts, uses convolutional neural networks, but contains in addition a recurrent pathway that allows the iterative optimization and extension of individual neuronal processes. We used flood-filling networks to trace neurons in a dataset obtained by serial block-face electron microscopy of a zebra finch brain. Using our method, we achieved a mean error-free neurite path length of 1.1 mm, and we observed only four mergers in a test set with a path length of 97 mm. The performance of flood-filling networks was an order of magnitude better than that of previous approaches applied to this dataset, although with substantially increased computational costs.
Flood-filling networks are a deep-learning-based pipeline for reconstruction of neurons from electron microscopy datasets. The approach results in exceptionally low error rates, thereby reducing the need for extensive human proofreading. |
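The iterative fill described above can be caricatured as a breadth-first expansion in which a learned model replaces a fixed fill criterion: starting from a seed location, the network predicts object membership around the current field of view, confident voxels join the mask, and the field of view moves to each newly added voxel. The sketch below is a minimal 2D toy, not the authors' implementation; `predict` is a hypothetical stand-in for the paper's recurrent convolutional network, and the grid, seed, and threshold are invented for illustration.

```python
from collections import deque


def flood_fill_segment(image, seed, predict, threshold=0.9):
    """Toy flood-filling inference loop.

    Starting from `seed`, repeatedly ask `predict` (a stand-in for the
    recurrent CNN) for the probability that a neighboring voxel belongs
    to the same object; confident voxels join the mask and become new
    field-of-view positions to expand from.
    """
    mask = {seed}
    frontier = deque([seed])  # positions the "field of view" still has to visit
    while frontier:
        r, c = frontier.popleft()
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if nb in mask:
                continue
            if not (0 <= nb[0] < len(image) and 0 <= nb[1] < len(image[0])):
                continue  # stay inside the volume
            # The model sees the image, the candidate position, and the
            # current mask (the recurrent pathway in the real network).
            if predict(image, nb, mask) >= threshold:
                mask.add(nb)
                frontier.append(nb)  # move the field of view here next
    return mask


# Toy usage: a binary "image" where the model is just the pixel intensity.
image = [[1, 1, 0],
         [0, 1, 0],
         [0, 1, 1]]
segment = flood_fill_segment(image, (0, 0),
                             lambda img, p, m: img[p[0]][p[1]])
```

In the real method the mask prediction is revisited iteratively over overlapping fields of view rather than decided once per voxel, which is what lets the network correct earlier mistakes; this sketch keeps only the seed-and-expand skeleton.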
doi_str_mv | 10.1038/s41592-018-0049-4 |
format | Article |
publisher | New York: Nature Publishing Group US |
pmid | 30013046 |
rights | The Author(s) 2018; Copyright Nature Publishing Group Aug 2018 |
fulltext | fulltext |
identifier | ISSN: 1548-7091 |
ispartof | Nature methods, 2018-08, Vol.15 (8), p.605-610 |
issn | 1548-7091; 1548-7105 |
language | eng |
recordid | cdi_proquest_miscellaneous_2071563612 |
source | Nature; SpringerNature Journals |
subjects | 631/1647/794; 631/378/116; Artificial neural networks; Automation; Axons; Bioinformatics; Biological Microscopy; Biological Techniques; Biomedical and Life Sciences; Biomedical Engineering/Biotechnology; Brain; Circuit diagrams; Computational neuroscience; Electron microscopy; Floods; Iterative methods; Life Sciences; Microscopy; Neural networks; Neurons; Proofreading; Proteomics; Reconstruction; Segmentation |
title | High-precision automated reconstruction of neurons with flood-filling networks |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-03T04%3A13%3A18IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=High-precision%20automated%20reconstruction%20of%20neurons%20with%20flood-filling%20networks&rft.jtitle=Nature%20methods&rft.au=Januszewski,%20Micha%C5%82&rft.date=2018-08-01&rft.volume=15&rft.issue=8&rft.spage=605&rft.epage=610&rft.pages=605-610&rft.issn=1548-7091&rft.eissn=1548-7105&rft_id=info:doi/10.1038/s41592-018-0049-4&rft_dat=%3Cproquest_cross%3E2071563612%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2080782959&rft_id=info:pmid/30013046&rfr_iscdi=true |