ELU-Net: An Efficient and Lightweight U-Net for Medical Image Segmentation

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, p. 35932-35941
Authors: Deng, Yunjiao; Hou, Yulei; Yan, Jiangtao; Zeng, Daxing
Format: Article
Language: English
Online access: Full text
Description
Abstract: Recent years have witnessed growing interest in U-Net and its improved variants. U-Net is one of the classic semantic segmentation networks, built on an encoder-decoder architecture, and is widely used in medical image segmentation. Among the successive versions of U-Net, U-Net++ improves on U-Net with an architecture of nested and dense skip connections, and U-Net 3+ improves on U-Net++ by exploiting full-scale skip connections and deep supervision on full-scale aggregated feature maps. Each architecture has its own advantages in how the encoder and decoder are used. In this paper, we propose an efficient and lightweight U-Net (ELU-Net) with deep skip connections. The deep skip connections include same-scale and large-scale skip connections from the encoder, so that the encoder features are fully extracted. In addition, the proposed ELU-Net is discussed with different loss functions to improve the learning of brain tumor subregions, including the whole tumor (WT), tumor core (TC), and enhancing tumor (ET), and a new loss function, DFK, is designed. The effectiveness of the proposed method is demonstrated on the brain tumor dataset of the BraTS 2018 Challenge and the liver dataset of the ISBI LiTS 2017 Challenge.
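
To give a concrete picture of the "deep skip connections" described in the abstract, the sketch below shows one hypothetical decoder stage in PyTorch that fuses the same-scale encoder feature map with large-scale (higher-resolution) encoder maps pooled down to the current scale. This is not the authors' code: the module name DeepSkipDecoderStage, the channel counts, and the use of max pooling for the large-scale connections are assumptions made purely for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvBlock(nn.Module):
    """Two 3x3 conv + BN + ReLU layers, the usual U-Net building block."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        )
    def forward(self, x):
        return self.block(x)

class DeepSkipDecoderStage(nn.Module):
    """Fuses the upsampled decoder feature with encoder features from the same
    scale and from all larger scales (pooled down to the current resolution)."""
    def __init__(self, dec_ch, enc_chs, out_ch):
        # enc_chs: channel counts of the same-scale encoder map followed by the
        # larger-scale encoder maps (ordering assumed for this sketch).
        super().__init__()
        self.fuse = ConvBlock(dec_ch + sum(enc_chs), out_ch)

    def forward(self, dec_feat, enc_feats):
        # Upsample the feature coming from the deeper decoder stage.
        dec_feat = F.interpolate(dec_feat, scale_factor=2, mode="bilinear", align_corners=False)
        target = dec_feat.shape[-2:]
        pooled = []
        for f in enc_feats:
            if f.shape[-2:] != target:
                # Large-scale skip: pool the higher-resolution encoder map down.
                f = F.adaptive_max_pool2d(f, target)
            pooled.append(f)
        return self.fuse(torch.cat([dec_feat] + pooled, dim=1))

# Example: decoder stage at 1/4 resolution receiving the 1/4-scale encoder map
# plus the 1/2- and full-scale encoder maps.
dec = torch.randn(1, 256, 32, 32)            # from the deeper decoder stage (1/8 scale)
e1 = torch.randn(1, 64, 256, 256)            # full-scale encoder map
e2 = torch.randn(1, 128, 128, 128)           # 1/2-scale encoder map
e3 = torch.randn(1, 256, 64, 64)             # 1/4-scale (same-scale) encoder map
stage = DeepSkipDecoderStage(dec_ch=256, enc_chs=[256, 128, 64], out_ch=128)
out = stage(dec, [e3, e2, e1])
print(out.shape)                             # torch.Size([1, 128, 64, 64])

A full network in this spirit would stack such a stage at every decoder level; the pool-then-concatenate pattern is one plausible reading of how same- and large-scale encoder features could be combined, analogous to the full-scale connections of U-Net 3+ mentioned in the abstract.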
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3163711