Nuclei-Based Features for Uterine Cervical Cancer Histology Image Analysis With Fusion-Based Classification

Bibliographic Details
Published in: IEEE Journal of Biomedical and Health Informatics, 2016-11, Vol. 20 (6), p. 1595-1607
Authors: Guo, Peng; Banerjee, Koyel; Joe Stanley, R.; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R.; Moss, Randy H.; Stoecker, William V.
Format: Article
Language: English
Description
Abstract: Cervical cancer, the second most common cancer affecting women worldwide, can be cured if detected early and treated properly. Routinely, expert pathologists visually examine histology slides to assess cervix tissue abnormality. In previous research, we investigated an automated, localized, fusion-based approach for classifying squamous epithelium into Normal, CIN1, CIN2, and CIN3 grades of cervical intraepithelial neoplasia (CIN), based on image analysis of 61 digitized histology images. This paper introduces novel acellular and atypical cell concentration features, computed from vertical segment partitions of the epithelium region within digitized histology images, to quantify the relative increase in nuclei numbers as the CIN grade increases. Based on the CIN grade assessments from two expert pathologists, image-based epithelium classification is investigated with voting fusion of vertical segments using support vector machine and linear discriminant analysis approaches. Leave-one-out cross-validation is used for training and testing in CIN classification, achieving an exact grade labeling accuracy as high as 88.5%.
ISSN: 2168-2194, 2168-2208
DOI: 10.1109/JBHI.2015.2483318
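
The abstract above describes segment-level classification with voting fusion of vertical epithelium segments and leave-one-out evaluation. The following is a minimal, hypothetical sketch of that general scheme using scikit-learn; it assumes per-segment feature vectors (e.g., the acellular and atypical-cell concentration features) have already been extracted, and all function names, array shapes, and the linear-kernel choice are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: segment-level voting fusion with leave-one-out evaluation.
# Assumes features are precomputed; shapes and classifier settings are illustrative.
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

GRADES = ["Normal", "CIN1", "CIN2", "CIN3"]  # integer labels 0..3 assumed below


def classify_image(train_segments, train_labels, test_segments, use_lda=False):
    """Label one epithelium image by majority vote over its vertical segments.

    train_segments : (n_train_segments, n_features) feature matrix
    train_labels   : (n_train_segments,) integer CIN grade per training segment
    test_segments  : (n_test_segments, n_features) segments of the held-out image
    """
    clf = LinearDiscriminantAnalysis() if use_lda else SVC(kernel="linear")
    clf.fit(train_segments, train_labels)
    seg_votes = clf.predict(test_segments)            # one predicted grade per segment
    return np.bincount(seg_votes, minlength=len(GRADES)).argmax()


def leave_one_out_accuracy(images, labels, use_lda=False):
    """images: list of (n_segments_i, n_features) arrays; labels: integer grade per image."""
    correct = 0
    for i, test_segments in enumerate(images):
        # Train on all segments from every other image, test on the held-out image.
        train_X = np.vstack([img for j, img in enumerate(images) if j != i])
        train_y = np.concatenate(
            [[labels[j]] * len(img) for j, img in enumerate(images) if j != i])
        if classify_image(train_X, train_y, test_segments, use_lda) == labels[i]:
            correct += 1
    return correct / len(images)
```

In this sketch each image inherits its expert-assigned grade for every one of its segments during training, and the held-out image's grade is taken as the most common segment-level prediction; swapping `use_lda` toggles between the SVM and LDA variants mentioned in the abstract.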