GL-FusionNet: Fusing global and local features to classify deep and superficial partial thickness burn

Bibliographic details
Published in: Mathematical Biosciences and Engineering: MBE, 2023-01, Vol. 20 (6), p. 10153-10173
Authors: Li, Zhiwei; Huang, Jie; Tong, Xirui; Zhang, Chenbei; Lu, Jianyu; Zhang, Wei; Song, Anping; Ji, Shizhao
Format: Article
Language: English
Online access: Full text
Abstract: Burns are among the most common injuries in the world, and they can be very painful for the patient. In particular, inexperienced clinicians are easily confused when judging between superficial partial thickness burns and deep partial thickness burns. Therefore, to make burn depth classification automated as well as accurate, we introduce a deep learning method. The methodology uses a U-Net to segment burn wounds. On this basis, a new burn thickness classification model that fuses global and local features (GL-FusionNet) is proposed. For the classification model, we use a ResNet50 to extract local features and a ResNet101 to extract global features, and we finally apply additive fusion to combine the two and obtain the deep partial or superficial partial thickness burn classification result. Burn images were collected clinically and were segmented and labeled by professional physicians. Among the segmentation methods, the U-Net achieved a Dice score of 85.352 and an IoU score of 83.916, the best results among all of the comparative experiments. For the classification model, different existing classification networks were compared, and the fusion strategy and feature extraction method were adjusted in the experiments; the proposed fusion network model also achieved the best results, yielding an accuracy of 93.523, a recall of 93.67, a precision of 93.51, and an F1-score of 93.513. In addition, the proposed method can quickly complete the auxiliary diagnosis of the wound in the clinic, which can greatly improve the efficiency of the initial diagnosis of burns and the nursing care provided by clinical medical staff.
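The abstract describes a two-branch classifier but this record carries no code, so the following PyTorch snippet is only a rough sketch of the stated design: a ResNet50 branch for local features, a ResNet101 branch for global features, additive ("add") fusion, and a two-class head. The use of torchvision backbones, the 224x224 inputs, and the placement of the fusion after global average pooling are assumptions not specified in this record.

    # Minimal sketch of the GL-FusionNet idea described in the abstract (assumptions noted above).
    import torch
    import torch.nn as nn
    from torchvision import models

    class GLFusionNetSketch(nn.Module):
        def __init__(self, num_classes: int = 2):
            super().__init__()
            # Local branch: ResNet50 backbone (e.g. fed with the cropped wound region).
            self.local_branch = models.resnet50(weights=None)
            # Global branch: ResNet101 backbone (e.g. fed with the whole image).
            self.global_branch = models.resnet101(weights=None)
            # Both backbones produce a 2048-d vector before their fc layers,
            # so element-wise addition of the two feature vectors is dimensionally valid.
            self.local_branch.fc = nn.Identity()
            self.global_branch.fc = nn.Identity()
            self.classifier = nn.Linear(2048, num_classes)

        def forward(self, local_crop: torch.Tensor, full_image: torch.Tensor) -> torch.Tensor:
            local_feat = self.local_branch(local_crop)    # (B, 2048)
            global_feat = self.global_branch(full_image)  # (B, 2048)
            fused = local_feat + global_feat              # additive feature fusion
            return self.classifier(fused)                 # deep vs. superficial partial thickness

    # Usage example: two 224x224 RGB inputs, batch size 1.
    if __name__ == "__main__":
        model = GLFusionNetSketch()
        logits = model(torch.randn(1, 3, 224, 224), torch.randn(1, 3, 224, 224))
        print(logits.shape)  # torch.Size([1, 2])

Additive fusion is used here (rather than concatenation) because both branches end in feature vectors of the same width, which matches the "add method" mentioned in the abstract; the paper's exact pre-fusion processing and training setup are not given in this record.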
ISSN: 1551-0018
DOI: 10.3934/mbe.2023445