An improved deep learning approach for detection of thyroid papillary cancer in ultrasound images
Published in: Scientific Reports, 2018-04, Vol. 8 (1), p. 6600-12, Article 6600
Main authors: , , , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Unlike everyday photographic images, ultrasound images are usually monochrome and low-resolution. In ultrasound images, cancer regions typically appear blurred, with vague margins and irregular shapes. Moreover, the features of cancer regions are very similar to those of normal or benign tissue. Therefore, directly training an original Convolutional Neural Network (CNN) on ultrasound images is not satisfactory. In our study, inspired by the state-of-the-art object detection network Faster R-CNN, we develop a detector that is better suited to thyroid papillary carcinoma detection in ultrasound images. To improve detection accuracy, we add a spatial constrained layer to the CNN so that the detector can extract features of the surrounding region in which the cancer regions reside. In addition, by concatenating the shallow and deep layers of the CNN, the detector can detect blurrier or smaller cancer regions. The experiments demonstrate that this new methodology has the potential to reduce the workload of pathologists and increase the objectivity of diagnoses. We find that 93.5% of papillary thyroid carcinoma regions could be detected automatically, while 81.5% of benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. (A sketch of the shallow-deep feature concatenation idea follows this record.)
ISSN: 2045-2322
DOI: 10.1038/s41598-018-25005-7
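
The abstract mentions concatenating shallow and deep CNN layers so that blurry or small cancer regions remain detectable. The paper's own code is not reproduced here, so what follows is only a minimal, hypothetical PyTorch sketch of that general fusion idea: upsample a deep, semantically rich feature map to the resolution of a shallow, detail-rich one and concatenate the two. The class name `ShallowDeepFusion`, the channel sizes, and the conv3/conv5 analogy are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of shallow/deep feature concatenation,
# NOT the authors' actual detector code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ShallowDeepFusion(nn.Module):
    """Fuse a shallow (high-resolution, low-level) feature map with a
    deep (low-resolution, semantic) one so fine detail is preserved."""

    def __init__(self, shallow_channels=256, deep_channels=512, out_channels=512):
        super().__init__()
        # 1x1 convolutions balance the channel counts before concatenation
        self.reduce_shallow = nn.Conv2d(shallow_channels, out_channels // 2, kernel_size=1)
        self.reduce_deep = nn.Conv2d(deep_channels, out_channels // 2, kernel_size=1)

    def forward(self, shallow, deep):
        # Upsample the deep map to the shallow map's spatial size
        deep = F.interpolate(deep, size=shallow.shape[2:],
                             mode="bilinear", align_corners=False)
        # Concatenate along the channel dimension
        return torch.cat([self.reduce_shallow(shallow),
                          self.reduce_deep(deep)], dim=1)


# Example: feature maps as they might come from an early and a late
# backbone stage (shapes are made up for illustration)
shallow = torch.randn(1, 256, 80, 80)  # high resolution, fine detail
deep = torch.randn(1, 512, 20, 20)     # low resolution, high-level semantics
fused = ShallowDeepFusion()(shallow, deep)
print(fused.shape)  # torch.Size([1, 512, 80, 80])
```

Feeding such a fused map into the region-proposal stage is one plausible placement in a Faster R-CNN-style pipeline; variants differ in exactly where they tap the backbone, and the paper's spatial constrained layer is a separate, paper-specific component not modeled here.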