SAA-Net: U-shaped network with Scale-Axis-Attention for liver tumor segmentation
Saved in:

Published in: Biomedical Signal Processing and Control, 2022-03, Vol. 73, p. 103460, Article 103460
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract:
• We propose the Scale Attention mechanism, which addresses the multi-scale problem in liver tumor segmentation.
• We improve self-attention and propose the Axis Attention mechanism, which models spatial information globally in an efficient and effective way.
• We combine the Scale Attention and Axis Attention mechanisms, through a form of adaptive global pooling, in SAA-Net; this preserves fine-grained information during global pooling and benefits the final segmentation.

In liver tumor segmentation, multi-scale targets and global spatial modeling significantly affect segmentation accuracy. For multi-scale feature extraction, we propose a dynamic Scale Attention mechanism that assigns adaptive weights to multi-scale convolutions; by fusing receptive fields from multiple scales, Scale Attention benefits the segmentation of multi-scale targets. For global modeling of spatial information, we propose Axis Attention, which improves both the computational efficiency of self-attention and the attentive effect of convolutional attention, modeling spatial long-range dependencies effectively and efficiently. Scale Attention and Axis Attention are combined through adaptive global pooling into a composite mechanism called Scale-Axis-Attention (SAA). We incorporate SAA into a U-shaped network, termed SAA-Net, to improve liver tumor segmentation. Our method not only uses far fewer computational resources than self-attention, but also incorporates scale and spatial attention simultaneously for better performance. Extensive qualitative and quantitative experiments show that SAA-Net improves model capability and generalization, and also demonstrate its effectiveness in segmenting small tumors.
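The abstract describes Scale Attention as an adaptive weighting of parallel multi-scale convolutions and Axis Attention as a cheaper, axis-wise form of self-attention. The PyTorch sketch below illustrates only these two general ideas; the kernel sizes, the pooled gating head, and the row/column attention layout are illustrative assumptions, not the published SAA-Net design.

```python
# Illustrative sketch of the two ideas named in the abstract (not the authors' code):
# Scale Attention = input-dependent weighting of multi-scale convolution branches;
# Axis Attention = self-attention applied along one spatial axis at a time.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScaleAttention(nn.Module):
    """Fuse parallel convolutions of different kernel sizes with adaptive, input-dependent weights."""
    def __init__(self, channels, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernel_sizes]
        )
        # Gating head (assumption): global average pooling followed by a 1x1 conv
        # that produces one weight per scale branch.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, len(kernel_sizes), 1),
        )

    def forward(self, x):
        feats = torch.stack([b(x) for b in self.branches], dim=1)  # (B, S, C, H, W)
        weights = F.softmax(self.gate(x), dim=1)                   # (B, S, 1, 1)
        weights = weights.unsqueeze(2)                              # (B, S, 1, 1, 1)
        return (weights * feats).sum(dim=1)                         # weighted fusion over scales


class AxisAttention(nn.Module):
    """Self-attention applied along the width axis, then the height axis, instead of over all H*W positions."""
    def __init__(self, channels, heads=4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(channels, heads, batch_first=True)

    def forward(self, x):
        b, c, h, w = x.shape
        # Width axis: each row is an independent sequence of length W.
        rows = x.permute(0, 2, 3, 1).reshape(b * h, w, c)
        rows, _ = self.row_attn(rows, rows, rows)
        x = rows.reshape(b, h, w, c).permute(0, 3, 1, 2)
        # Height axis: each column is an independent sequence of length H.
        cols = x.permute(0, 3, 2, 1).reshape(b * w, h, c)
        cols, _ = self.col_attn(cols, cols, cols)
        return cols.reshape(b, w, h, c).permute(0, 3, 2, 1)


if __name__ == "__main__":
    x = torch.randn(2, 32, 64, 64)
    y = AxisAttention(32)(ScaleAttention(32)(x))
    print(y.shape)  # torch.Size([2, 32, 64, 64])
```

Attending over rows and columns separately reduces the attention cost from O((HW)^2) to roughly O(HW(H+W)) per layer, which is the efficiency argument the abstract makes for Axis Attention over full self-attention.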
ISSN: 1746-8094, 1746-8108
DOI: 10.1016/j.bspc.2021.103460