Leveraging Model Scaling and Butterfly Network in the Bone Scan Image Segmentation

Bibliographic Details
Published in: International Journal of Computational Intelligence Systems, 2024-04, Vol. 17 (1), p. 1-18, Article 92
Main authors: Rachmawati, E., Sulistiyo, M. D., Nugraha, D. B.
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Cancer is one of the leading causes of death worldwide, which is why regular screenings and health checks are necessary to detect cancer lesions early. Since bone scan images have become the primary means of detecting the emergence of cancer lesions on bone, high segmentation accuracy is essential for modeling the predefined regions of bone scan images in which cancer metastasis is predicted to appear. Consequently, robust localization and identification of specific regions in bone scan images are required for automated metastasis detection. To this end, we propose Efficient-BtrflyNet, a new deep learning-based architecture for skeleton segmentation of whole-body bone scan images. The proposed architecture exploits the benefits of EfficientNet's model scaling and the encoder–decoder design of butterfly-type networks. We use EfficientNetB7 in the encoder to obtain more specific features. The architecture processes anterior and posterior whole-body bone scan images simultaneously. We evaluated the proposed skeleton segmentation system on 37 bone scan images using the Dice score. Efficient-BtrflyNet achieves superior segmentation performance compared to the existing representative method.
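
For a concrete picture of the two ideas named in the abstract, below is a minimal, hedged PyTorch sketch of a butterfly-style network with separate anterior and posterior encoders fused at a shared bottleneck, plus the Dice score used for evaluation. The encoder depth, channel widths, fusion point, class count, and the plain convolutional blocks are illustrative assumptions only; the paper's Efficient-BtrflyNet uses an EfficientNetB7 encoder, and its exact design is not reproduced here.

# Minimal, illustrative sketch -- NOT the authors' implementation.
# Channel widths, depth, fusion point, and class count are assumptions;
# Efficient-BtrflyNet uses an EfficientNetB7 encoder, not these conv blocks.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions, each followed by batch norm and ReLU.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class ButterflySketch(nn.Module):
    """Two encoders (anterior/posterior views) fused at a shared bottleneck,
    then two decoders producing one segmentation map per view."""

    def __init__(self, in_ch=1, num_classes=2):  # class count is a placeholder
        super().__init__()
        self.enc_a1, self.enc_a2 = conv_block(in_ch, 32), conv_block(32, 64)
        self.enc_p1, self.enc_p2 = conv_block(in_ch, 32), conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.bottleneck = conv_block(128, 128)  # the butterfly "body" where views fuse
        self.dec_a2, self.dec_a1 = conv_block(128 + 64, 64), conv_block(64 + 32, 32)
        self.dec_p2, self.dec_p1 = conv_block(128 + 64, 64), conv_block(64 + 32, 32)
        self.head_a = nn.Conv2d(32, num_classes, 1)
        self.head_p = nn.Conv2d(32, num_classes, 1)

    def forward(self, ant, post):
        a1, p1 = self.enc_a1(ant), self.enc_p1(post)                      # full resolution
        a2, p2 = self.enc_a2(self.pool(a1)), self.enc_p2(self.pool(p1))   # 1/2 resolution
        # Concatenate the two streams at the deepest level and fuse them.
        body = self.bottleneck(torch.cat([self.pool(a2), self.pool(p2)], dim=1))
        # Decode each view with its own skip connections.
        ua = self.dec_a2(torch.cat([self.up(body), a2], dim=1))
        da = self.dec_a1(torch.cat([self.up(ua), a1], dim=1))
        up = self.dec_p2(torch.cat([self.up(body), p2], dim=1))
        dp = self.dec_p1(torch.cat([self.up(up), p1], dim=1))
        return self.head_a(da), self.head_p(dp)                           # per-pixel logits


def dice_score(pred, target, eps=1e-6):
    # Dice coefficient 2*|A intersect B| / (|A| + |B|) for binary masks.
    pred, target = pred.float().flatten(), target.float().flatten()
    inter = (pred * target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)


if __name__ == "__main__":
    model = ButterflySketch()
    ant = torch.randn(1, 1, 256, 128)   # anterior view (H, W divisible by 4)
    post = torch.randn(1, 1, 256, 128)  # posterior view
    out_a, out_p = model(ant, post)
    print(out_a.shape, out_p.shape)     # torch.Size([1, 2, 256, 128]) each

Fusing the two views at the bottleneck lets shared features inform both decoders, which is the intuition behind the butterfly-type encoder–decoder the abstract describes; the reported Dice evaluation on 37 images refers to the paper's own implementation, not to this sketch.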
ISSN: 1875-6883
DOI: 10.1007/s44196-024-00453-4