A convolutional neural network segments yeast microscopy images with high accuracy
Published in: Nature Communications, 2020-11, Vol. 11 (1), Article 5723
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: The identification of cell borders (‘segmentation’) in microscopy images constitutes a bottleneck for large-scale experiments. For the model organism Saccharomyces cerevisiae, current segmentation methods face challenges when cells bud, crowd, or exhibit irregular features. We present a convolutional neural network (CNN) named YeaZ, the underlying training set of high-quality segmented yeast images (>10 000 cells) including mutants, stressed cells, and time courses, as well as a graphical user interface and a web application (www.quantsysbio.com/data-and-software) to efficiently employ, test, and expand the system. A key feature is a cell-cell boundary test which avoids the need for fluorescent markers. Our CNN is highly accurate, including for buds, and outperforms existing methods on benchmark images, indicating it transfers well to other conditions. To demonstrate how efficient large-scale image processing uncovers new biology, we analyze the geometries of ≈2200 wild-type and cyclin mutant cells and find that morphogenesis control occurs unexpectedly early and gradually.
Current cell segmentation methods for Saccharomyces cerevisiae face challenges under a variety of standard experimental and imaging conditions. Here the authors develop a convolutional neural network for accurate, label-free cell segmentation.
ISSN: 2041-1723
DOI: 10.1038/s41467-020-19557-4