RootPainter: deep learning segmentation of biological images with corrective annotation

Bibliographic details
Published in: The New Phytologist, 2022-10, Vol. 236 (2), p. 774-791
Main authors: Smith, Abraham George; Han, Eusun; Petersen, Jens; Olsen, Niels Alvin Faircloth; Giese, Christian; Athmann, Miriam; Dresbøll, Dorte Bodin; Thorup‐Kristensen, Kristian
Format: Article
Language: English
Online access: Full text

Description
Abstract: Convolutional neural networks (CNNs) are a powerful tool for plant image analysis, but challenges remain in making them more accessible to researchers without a machine‐learning background. We present RootPainter, an open‐source, graphical user interface‐based software tool for the rapid training of deep neural networks for use in biological image analysis. We evaluate RootPainter by training models for root length extraction from chicory (Cichorium intybus L.) roots in soil, biopore counting, and root nodule counting. We also compare dense annotations with corrective ones, which are added during the training process based on the weaknesses of the current model. In five out of six cases, models trained using RootPainter with corrective annotations created within 2 h produced measurements strongly correlating with manual measurements. Model accuracy had a significant correlation with annotation duration, indicating that further improvements could be obtained with extended annotation. Our results show that a deep‐learning model can be trained to a high accuracy for the three respective datasets of varying target objects, background, and image quality with […]
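
The corrective‐annotation workflow described in the abstract can be illustrated with a minimal sketch: the annotator reviews the current model's prediction for each image and marks only its mistakes, which then drive the next training update. The Python sketch below is an illustration only; the names predict, annotate_corrections, and train_step are hypothetical placeholders, not RootPainter's actual API.

    # Minimal sketch of training with corrective annotation.
    # All names here (predict, annotate_corrections, train_step) are
    # hypothetical placeholders, not RootPainter's actual interface.
    def corrective_training(model, images, annotate_corrections, train_step):
        """Refine the model one image at a time from human corrections only.

        annotate_corrections(image, prediction) returns a sparse mask
        marking just the pixels where the prediction is wrong (or None
        if the prediction is accepted as-is); train_step updates the
        model from that sparse supervision.
        """
        for image in images:
            prediction = model.predict(image)      # current model's guess
            corrections = annotate_corrections(image, prediction)
            if corrections is None:                # nothing to fix
                continue
            model = train_step(model, image, corrections)
        return model
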
ISSN: 0028-646X (print); 1469-8137 (online)
DOI: 10.1111/nph.18387