Polarized Attention Weakly Supervised Semantic Segmentation Network
Published in: IEEE Access, 2024-01, Vol. 12, p. 1-1
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Weakly supervised semantic segmentation methods based on image-level annotation often rely on pseudo pixel masks generated from seed regions. However, seed-region growth is stochastic, and when targets in the image are occluded or overlap and no additional reference information is available, the segmentation network may miss objects or segment them incorrectly. To address this problem, this paper proposes a polarized attention mechanism for weakly supervised semantic segmentation networks. The mechanism consists of a semantic perception branch and a boundary detection branch: the semantic perception branch helps the network distinguish the category of each pixel in the image, and the boundary detection branch lets the seed region expand naturally toward the target boundary. The resulting pseudo pixel masks cover the target area more completely and improve the performance of the segmentation network. On the PASCAL VOC 2012 dataset, the method achieved a mean Intersection over Union (mIoU) of 72.9% on the test set and 73.2% on the validation set. The experimental results indicate that the proposed attention mechanism effectively improves segmentation performance when objects in the image are occluded or overlap.
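The abstract's two-branch design (a semantic branch that sharpens per-pixel class evidence, and a boundary branch that keeps seed growth from crossing object edges) can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function name `polarized_refine`, the channel-softmax weighting, and the gradient-magnitude edge term are all illustrative assumptions standing in for the actual branches described in the article.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def polarized_refine(features, seed):
    """Hypothetical two-branch refinement of a coarse seed mask.

    features: (C, H, W) feature maps; seed: (H, W) seed activations in [0, 1].
    Returns a refined (H, W) mask in [0, 1].
    """
    # Semantic branch (channel-wise attention): weight each channel by its
    # global response, collapsing features into a per-pixel class-evidence map.
    channel_w = softmax(features.mean(axis=(1, 2)))        # (C,)
    semantic = np.tensordot(channel_w, features, axes=1)   # (H, W)

    # Boundary branch (spatial): gradient magnitude of the semantic map,
    # used to attenuate seed expansion at strong edges.
    gy, gx = np.gradient(semantic)
    edges = np.sqrt(gx ** 2 + gy ** 2)
    edges = edges / (edges.max() + 1e-8)

    # Grow the seed where semantic evidence is strong, damped at boundaries.
    return np.clip(seed + semantic * (1.0 - edges), 0.0, 1.0)

rng = np.random.default_rng(0)
refined = polarized_refine(rng.random((4, 8, 8)), np.zeros((8, 8)))
```

The edge-damping term is the key design choice here: it lets high semantic evidence expand the seed in object interiors while leaving pixels near strong gradients largely untouched, which mirrors the abstract's claim that the seed region expands "naturally toward the target boundary" rather than bleeding across it.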
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3344098