Multimodality computerized diagnosis of breast lesions using mammography and sonography

Bibliographic details
Published in: Academic Radiology 2005-08, Vol. 12 (8), p. 970-979
Main authors: Drukker, Karen; Horsch, Karla; Giger, Maryellen L.
Format: Article
Language: English
Online access: Full text
Description
Abstract: The purpose of this study is to investigate the use of computer-extracted features of lesions imaged by means of two modalities, mammography and breast ultrasound, in the computerized classification of breast lesions. We performed computerized analysis on a database of 97 patients with a total of 100 lesions (40 malignant, 40 benign solid, and 20 cystic lesions). Mammograms and ultrasound images were available for these breast lesions. There was an average of three mammographic images and two ultrasound images per lesion. Based on seed points indicated by a radiologist, the computer automatically segmented lesions from the parenchymal background and automatically extracted a set of characteristic features for each lesion. For each feature, its value averaged over all images pertaining to a given lesion was input to a Bayesian neural network for classification. We also investigated different approaches to combining image-based features into this by-lesion analysis. In that analysis, mean, maximum, and minimum feature values were considered for all images representing a lesion. We assessed performance by using a leave-one-lesion-out approach, based on image features from mammography alone (two to five features), ultrasound alone (three to four features), and a combination of features from both modalities (three to five features total). For the classification task of distinguishing cancer from other abnormalities in a lesion-based analysis by using a single modality, areas under the receiver operating characteristic curves (A(z) values) increased significantly when the computer selected the manner (mean, minimum, or maximum) in which image-based features were combined into lesion-based features. The highest performance was found for lesion-based analysis and automated feature selection from mean, maximum, and minimum values of features from both modalities (resulting in a total of four features being used). That A(z) value for the task of distinguishing cancer was 0.92, showing a statistically significant increase over that achieved with features from either mammography or ultrasound alone. Computerized classification of cancer significantly improved when lesion features from both modalities were combined. Classification performance depended on specific methods for combining features from multiple images per lesion. These results are encouraging and warrant further exploration of computerized methods for multimodality imaging.
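The evaluation pipeline described in the abstract (per-image feature extraction, combination into lesion-based mean, minimum, and maximum values, and leave-one-lesion-out classification scored by the area under the ROC curve, A(z)) can be sketched briefly. The sketch below is illustrative only: it uses synthetic feature values, substitutes a plain logistic-regression classifier for the paper's Bayesian neural network, and its names (per_image_features, lesion_features) are hypothetical, not from the study.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the study's database: 100 lesions (40 cancer,
# 60 other), each imaged 2-5 times, with 4 computer-extracted features
# per image. Real feature values are not reproduced here.
n_lesions, n_features = 100, 4
labels = np.array([1] * 40 + [0] * 60)
per_image_features = [
    rng.normal(loc=0.5 * y, size=(rng.integers(2, 6), n_features))
    for y in labels
]

def lesion_features(images):
    # Combine image-based features into lesion-based features by taking
    # the mean, minimum, and maximum over all images of the lesion.
    return np.concatenate(
        [images.mean(axis=0), images.min(axis=0), images.max(axis=0)]
    )

X = np.array([lesion_features(imgs) for imgs in per_image_features])

# Leave-one-lesion-out evaluation: train on all other lesions, score the
# held-out lesion, then compute the area under the ROC curve over the
# collected held-out scores (analogous to A(z)).
scores = np.empty(n_lesions)
for i in range(n_lesions):
    train = np.delete(np.arange(n_lesions), i)
    clf = LogisticRegression(max_iter=1000).fit(X[train], labels[train])
    scores[i] = clf.predict_proba(X[i : i + 1])[0, 1]

print("A_z estimate:", roc_auc_score(labels, scores))

Note that this mirrors only the evaluation protocol; the reported A(z) of 0.92 came from the study's Bayesian neural network with automated feature selection over mean, minimum, and maximum combinations from both modalities.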
ISSN: 1076-6332
DOI: 10.1016/j.acra.2005.04.014