COSNet: A Novel Semantic Segmentation Network using Enhanced Boundaries in Cluttered Scenes
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Automated waste recycling aims to efficiently separate recyclable objects from waste by employing vision-based systems. However, the presence of objects of varying shapes and material types makes this a challenging problem, especially in cluttered environments. Existing segmentation methods perform reasonably well on many semantic segmentation datasets by employing multi-contextual representations; however, their performance degrades when they are applied to waste object segmentation in cluttered scenarios. In addition, plastic objects further increase the complexity of the problem due to their translucent nature. To address these limitations, we introduce an effective segmentation network, named COSNet, that uses boundary cues along with multi-contextual information to accurately segment objects in cluttered scenes. COSNet introduces novel components, including a feature sharpening block (FSB) and a boundary enhancement module (BEM), for enhancing the features and highlighting the boundary information of irregular waste objects in cluttered environments. Extensive experiments on three challenging datasets, ZeroWaste-f, SpectralWaste, and ADE20K, demonstrate the effectiveness of the proposed method. COSNet achieves significant gains of 1.8% and 2.1% mIoU on the ZeroWaste-f and SpectralWaste datasets, respectively.
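The abstract does not detail the internals of the feature sharpening block (FSB) or the boundary enhancement module (BEM). Purely as an illustration of the general idea of injecting boundary cues into feature maps (not the paper's actual design), one common pattern applies a fixed edge operator, such as a depthwise Laplacian filter, to intermediate features and fuses the response back as a residual. The module name `BoundaryEmphasis` and all shapes and hyperparameters below are assumptions.

```python
# Illustrative sketch only: a generic boundary-emphasis residual block.
# This is NOT the BEM/FSB from the paper; the design, names, and
# hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BoundaryEmphasis(nn.Module):
    """Adds a Laplacian edge response of the features back as a residual."""

    def __init__(self, channels: int):
        super().__init__()
        # Fixed 3x3 Laplacian kernel applied depthwise to every channel.
        lap = torch.tensor([[0., 1., 0.],
                            [1., -4., 1.],
                            [0., 1., 0.]])
        self.register_buffer("kernel", lap.view(1, 1, 3, 3).repeat(channels, 1, 1, 1))
        self.channels = channels
        # Learnable 1x1 projection to fuse the edge response with the input.
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Depthwise convolution: one Laplacian filter per channel.
        edges = F.conv2d(x, self.kernel, padding=1, groups=self.channels)
        # Residual fusion keeps the original features and highlights boundaries.
        return x + self.fuse(edges)


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)   # dummy feature map
    out = BoundaryEmphasis(64)(feats)
    print(out.shape)                      # torch.Size([2, 64, 32, 32])
```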
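The reported gains are in mean intersection-over-union (mIoU), the standard semantic segmentation metric: per-class IoU is computed from predicted and ground-truth label maps and averaged over the classes present. A minimal reference computation is sketched below; the function name `mean_iou` and the tiny example labels are mine, not from the paper.

```python
# Minimal mIoU reference computation from integer label maps.
import numpy as np


def mean_iou(pred: np.ndarray, target: np.ndarray, num_classes: int) -> float:
    """Mean IoU over classes, skipping classes absent from both pred and target."""
    ious = []
    for c in range(num_classes):
        pred_c = pred == c
        target_c = target == c
        union = np.logical_or(pred_c, target_c).sum()
        if union == 0:  # class not present anywhere; skip it in the average
            continue
        inter = np.logical_and(pred_c, target_c).sum()
        ious.append(inter / union)
    return float(np.mean(ious))


# Example: 2 classes on a tiny 2x2 label map.
pred = np.array([[0, 1], [1, 1]])
gt = np.array([[0, 0], [1, 1]])
print(mean_iou(pred, gt, num_classes=2))  # (1/2 + 2/3) / 2 ≈ 0.583
```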
DOI: 10.48550/arxiv.2410.24139