Fuzzy Attention-Based Border Rendering Orthogonal Network for Lung Organ Segmentation
Published in: IEEE Transactions on Fuzzy Systems, 2024-10, Vol. 32 (10), pp. 5462-5476
Format: Article
Language: English
Online access: Order full text
Abstract: Automatic lung organ segmentation in computed tomography images is crucial for lung disease diagnosis. However, the unbounded voxel values and the class imbalance of lung organs lead to false-negative/positive and leakage issues in many state-of-the-art methods. In addition, small lung organs such as bronchioles and arterioles are easily lost during repeated down-/up-sampling, which can cause severe discontinuity issues. Motivated by these observations, this article introduces an effective lung organ segmentation method called the fuzzy attention-based border rendering feature orthogonal network, which 1) integrates an efficient transformer-like fuzzy-attention module into deep networks to cope with uncertainty in feature representations; 2) decouples and represents the lung organ regions as cube-trees by focusing only on the border points vulnerable to repeated re-sampling, rendering the severely discontinuous, false-negative/positive organ regions with two novel global-local cube-tree fusion and sparse patched feature orthogonal modules; and 3) develops a multiscale self-knowledge guidance module to improve model performance and robustness. We demonstrate the efficacy of the proposed method on five challenging lung organ (airway and artery) segmentation datasets. All experimental results show that our method achieves favorable performance.
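The record gives only this high-level description of the transformer-like fuzzy-attention module, without equations. The sketch below is one possible reading, assuming the softmax similarity of standard attention is replaced by a Gaussian fuzzy-membership weighting over query-key distances; the class name `FuzzyAttentionSketch`, the `sigma` parameter, and the membership choice are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a "fuzzy attention" layer, assuming a Gaussian membership
# function over query-key distances in place of softmax similarity.
import torch
import torch.nn as nn


class FuzzyAttentionSketch(nn.Module):
    def __init__(self, dim: int, sigma: float = 1.0):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.sigma = sigma  # width of the Gaussian membership function (assumed)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g. flattened CT feature patches
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Squared Euclidean distance between every query and key token
        dist = torch.cdist(q, k, p=2) ** 2                      # (B, N, N)
        # Gaussian membership: nearby tokens get weight near 1, distant near 0
        membership = torch.exp(-dist / (2 * self.sigma ** 2))   # (B, N, N)
        # Normalize so each query's weights sum to 1
        weights = membership / membership.sum(dim=-1, keepdim=True).clamp_min(1e-8)
        return weights @ v                                       # (B, N, dim)


if __name__ == "__main__":
    x = torch.randn(2, 16, 32)            # 2 volumes, 16 tokens, 32 channels
    out = FuzzyAttentionSketch(dim=32)(x)
    print(out.shape)                      # torch.Size([2, 16, 32])
```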
ISSN: 1063-6706, 1941-0034
DOI: 10.1109/TFUZZ.2024.3433506