Res2-UNeXt: a novel deep learning framework for few-shot cell image segmentation
Saved in:
Published in: | Multimedia tools and applications 2022-04, Vol.81 (10), p.13275-13288 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | Recently, developing more accurate and more efficient deep learning algorithms for medical image segmentation has attracted increasing attention from researchers. Most methods increase the depth of the network as a substitute for acquiring multi-scale information, and the cost of annotating training images by hand is high. In this paper, we propose a multi-scale, higher-performing deep architecture for medical image segmentation, named Res2-UNeXt. Our architecture is an encoder-decoder network built from Res2XBlocks, which are designed to capture multi-scale information in images more effectively. To complement Res2-UNeXt, we put forward a simple and efficient data augmentation method. The method, which is based on the process of cell movement and deformation, has a biological interpretation. We evaluate Res2-UNeXt against recent variants of U-Net (UNet++, CE-Net and LadderNet) and against methods that differ from the U-Net architecture (FCN and DFANet) on four cell image datasets from the ISBI Cell Tracking Challenge 2019. The experimental results demonstrate that Res2-UNeXt achieves better performance than both the recent U-Net variants and the non-U-Net methods. In addition, the effectiveness of the proposed architecture and the data augmentation method is confirmed by ablation experiments. |
ISSN: | 1380-7501 1573-7721 |
DOI: | 10.1007/s11042-021-10536-5 |
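The record above only summarizes the architecture; the paper's exact Res2XBlock design is not reproduced here. As a rough, illustrative sketch of the general idea named in the abstract (a Res2Net-style residual block with hierarchical multi-scale convolutions, here combined with ResNeXt-style grouped 3x3 convolutions as an assumed detail), a PyTorch block might look as follows. All class and parameter names (`Res2StyleBlock`, `scales`, `groups`) are hypothetical and not taken from the paper.

```python
# Minimal sketch of a Res2Net-style multi-scale residual block in PyTorch.
# NOTE: this is NOT the paper's Res2XBlock; channel splits, group counts and
# layer ordering are assumptions made purely for illustration.
import torch
import torch.nn as nn


class Res2StyleBlock(nn.Module):
    """Splits channels into `scales` groups and chains 3x3 convolutions across
    the groups, so later groups see progressively larger receptive fields."""

    def __init__(self, channels: int, scales: int = 4, groups: int = 8):
        super().__init__()
        assert channels % scales == 0, "channels must be divisible by scales"
        self.scales = scales
        self.width = channels // scales
        # One 3x3 conv per split except the first, which is passed through.
        self.convs = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(self.width, self.width, kernel_size=3, padding=1,
                          groups=groups if self.width % groups == 0 else 1,
                          bias=False),
                nn.BatchNorm2d(self.width),
                nn.ReLU(inplace=True),
            )
            for _ in range(scales - 1)
        ])
        # 1x1 convs mix channels before and after the multi-scale stage.
        self.reduce = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.expand = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        identity = x
        out = self.reduce(x)
        splits = torch.split(out, self.width, dim=1)
        outputs = [splits[0]]          # first split is passed through unchanged
        prev = None
        for i, conv in enumerate(self.convs):
            # Each split also receives the previous group's output,
            # which is what stacks receptive fields hierarchically.
            inp = splits[i + 1] if prev is None else splits[i + 1] + prev
            prev = conv(inp)
            outputs.append(prev)
        out = self.expand(torch.cat(outputs, dim=1))
        return self.relu(out + identity)   # residual connection


if __name__ == "__main__":
    block = Res2StyleBlock(channels=64, scales=4, groups=8)
    y = block(torch.randn(1, 64, 128, 128))
    print(y.shape)  # torch.Size([1, 64, 128, 128])
```

The design intuition behind such blocks is that splitting the channels and reusing each group's output in the next group yields several effective receptive-field sizes inside a single block, which is one way to obtain the multi-scale information the abstract attributes to Res2XBlocks without simply making the network deeper.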