nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation
Biomedical imaging is a driver of scientific discovery and a core component of medical care and is being stimulated by the field of deep learning. While semantic segmentation algorithms enable image analysis and quantification in many applications, the design of respective specialized solutions is non-trivial and highly dependent on dataset properties and hardware conditions.
Saved in:
Published in: | Nature methods 2021-02, Vol.18 (2), p.203-211 |
---|---|
Main authors: | Isensee, Fabian; Jaeger, Paul F.; Kohl, Simon A. A.; Petersen, Jens; Maier-Hein, Klaus H. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 211 |
---|---|
container_issue | 2 |
container_start_page | 203 |
container_title | Nature methods |
container_volume | 18 |
creator | Isensee, Fabian; Jaeger, Paul F.; Kohl, Simon A. A.; Petersen, Jens; Maier-Hein, Klaus H. |
description | Biomedical imaging is a driver of scientific discovery and a core component of medical care and is being stimulated by the field of deep learning. While semantic segmentation algorithms enable image analysis and quantification in many applications, the design of respective specialized solutions is non-trivial and highly dependent on dataset properties and hardware conditions. We developed nnU-Net, a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training and post-processing for any new task. The key design choices in this process are modeled as a set of fixed parameters, interdependent rules and empirical decisions. Without manual intervention, nnU-Net surpasses most existing approaches, including highly specialized solutions on 23 public datasets used in international biomedical segmentation competitions. We make nnU-Net publicly available as an out-of-the-box tool, rendering state-of-the-art segmentation accessible to a broad audience by requiring neither expert knowledge nor computing resources beyond standard network training.
nnU-Net is a deep learning-based image segmentation method that automatically configures itself for diverse biological and medical image segmentation tasks. nnU-Net offers state-of-the-art performance as an out-of-the-box tool. |
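The description above says nnU-Net models its key design choices as "a set of fixed parameters, interdependent rules and empirical decisions" derived from the dataset. The toy sketch below illustrates that general idea only; the voxel budget, the axis-halving rule, and the `Fingerprint` fields are invented for illustration and are not nnU-Net's actual heuristics.

```python
# Illustrative sketch (NOT nnU-Net's real configuration logic): deriving
# interdependent settings from a dataset "fingerprint" under a GPU budget.
from dataclasses import dataclass


@dataclass
class Fingerprint:
    median_shape: tuple  # median image shape (z, y, x), a dataset property
    num_classes: int     # number of segmentation labels


def configure(fp: Fingerprint, gpu_voxel_budget: int = 2_000_000) -> dict:
    """Rule-based configuration: the patch starts at the median image shape
    and the largest axis is halved until the patch fits the voxel budget;
    batch size then follows from the remaining budget (at least 2)."""
    patch = list(fp.median_shape)

    def voxels(p):
        v = 1
        for s in p:
            v *= s
        return v

    # Interdependent rule: patch size depends on both dataset and hardware.
    while voxels(patch) > gpu_voxel_budget:
        patch[patch.index(max(patch))] //= 2

    batch = max(2, gpu_voxel_budget // voxels(patch))  # fixed lower bound
    return {
        "patch_size": tuple(patch),
        "batch_size": batch,
        "out_channels": fp.num_classes,
    }


cfg = configure(Fingerprint(median_shape=(482, 512, 512), num_classes=3))
print(cfg)  # e.g. a (120, 128, 128) patch at batch size 2
```

The point is not the specific numbers but the structure: every value is computed from dataset properties and a hardware constraint rather than hand-tuned per task, which is what makes the method self-configuring.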
doi_str_mv | 10.1038/s41592-020-01008-z |
format | Article |
publisher | Nature Publishing Group US (New York) |
pmid | 33288961 |
orcidid | https://orcid.org/0000-0002-6626-2463 |
fulltext | fulltext |
identifier | ISSN: 1548-7091 |
ispartof | Nature methods, 2021-02, Vol.18 (2), p.203-211 |
issn | 1548-7091; 1548-7105 |
language | eng |
recordid | cdi_proquest_miscellaneous_2468341213 |
source | MEDLINE; Nature Journals Online; SpringerLink Journals - AutoHoldings |
subjects | 631/114/1564; 692/308/575; Algorithms; Bioinformatics; Biological Microscopy; Biological Techniques; Biomedical and Life Sciences; Biomedical Engineering/Biotechnology; Computer architecture; Datasets; Deep Learning; Diagnostic imaging; Empirical analysis; Health services; Image analysis; Image processing; Image Processing, Computer-Assisted - methods; Image segmentation; Life Sciences; Machine learning; Medical imaging; Methods; Neural Networks, Computer; Post-production processing; Proteomics; Training |
title | nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-05T16%3A51%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_proqu&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=nnU-Net:%20a%20self-configuring%20method%20for%20deep%20learning-based%20biomedical%20image%20segmentation&rft.jtitle=Nature%20methods&rft.au=Isensee,%20Fabian&rft.date=2021-02-01&rft.volume=18&rft.issue=2&rft.spage=203&rft.epage=211&rft.pages=203-211&rft.issn=1548-7091&rft.eissn=1548-7105&rft_id=info:doi/10.1038/s41592-020-01008-z&rft_dat=%3Cgale_proqu%3EA655717282%3C/gale_proqu%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2486620604&rft_id=info:pmid/33288961&rft_galeid=A655717282&rfr_iscdi=true |