Accurate Morphology Preserving Segmentation of Overlapping Cells based on Active Contours

Bibliographic Details
Published in: Scientific Reports, 2016-08, Vol. 6 (1), Article 32412
Authors: Molnar, Csaba; Jermyn, Ian H.; Kato, Zoltan; Rahkama, Vesa; Östling, Päivi; Mikkonen, Piia; Pietiäinen, Vilja; Horvath, Peter
Format: Article
Language: English
Online access: Full text
Description
Abstract: The identification of fluorescently stained cell nuclei is the basis of cell detection, segmentation, and feature extraction in high content microscopy experiments. The nuclear morphology of single cells is also one of the essential indicators of phenotypic variation. However, the cells used in experiments can lose their contact inhibition, and can therefore pile up on top of each other, making the detection of single cells extremely challenging using current segmentation methods. The model we present here can detect cell nuclei and their morphology even in high-confluency cell cultures with many overlapping cell nuclei. We combine the “gas of near circles” active contour model, which favors circular shapes but allows slight variations around them, with a new data model. This captures a common property of many microscopic imaging techniques: the intensities from superposed nuclei are additive, so that two overlapping nuclei, for example, have a total intensity that is approximately double the intensity of a single nucleus. We demonstrate the power of our method on microscopic images of cells, comparing the results with those obtained from a widely used approach, and with manual image segmentations by experts.
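To make the additive-intensity idea concrete, the following minimal Python sketch evaluates such a data term. It is a simplified squared-error stand-in, not the energy functional from the paper, and it omits the "gas of near circles" shape prior; the per-nucleus intensity, background level, and binary-mask representation of each contour's interior are assumptions introduced for illustration.

import numpy as np

def additive_data_energy(image, masks, nucleus_intensity, background=0.0):
    """Illustrative data term for an additive intensity model: pixels
    covered by k overlapping nuclei are expected to be roughly k times
    as bright as a single nucleus (plus background).

    image             : 2-D array of observed intensities
    masks             : list of 2-D binary arrays, one per hypothesized nucleus
    nucleus_intensity : assumed mean intensity contributed by one nucleus
    background        : assumed background intensity level
    """
    # Number of nuclei covering each pixel (0, 1, 2, ...).
    if masks:
        coverage = np.sum(np.stack(masks, axis=0), axis=0)
    else:
        coverage = np.zeros_like(image)
    # Predicted image under the additive model.
    predicted = background + nucleus_intensity * coverage
    # Squared-error fidelity between prediction and observation.
    return float(np.sum((image - predicted) ** 2))

In a full active contour framework, a term like this would be minimized jointly with the shape prior, so that overlapping configurations are favored exactly where the observed intensity is correspondingly higher.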
ISSN: 2045-2322
DOI: 10.1038/srep32412