DMINet: dense multi-scale inference network for salient object detection


Bibliographic Details
Published in: The Visual Computer 2022-09, Vol. 38 (9-10), p. 3059-3072
Authors: Xia, Chenxing; Sun, Yanguang; Gao, Xiuju; Ge, Bin; Duan, Songsong
Format: Article
Language: English
Description
Abstract: Although salient object detection (SOD) methods based on fully convolutional networks have achieved remarkable results, accurately detecting salient objects with complicated structures in cluttered real-world scenes remains a challenge, because existing methods rarely consider the effectiveness of and correlation among the captured multi-scale contexts, or how to efficiently exchange complementary information. Motivated by this, this paper proposes a novel Dense Multi-scale Inference Network (DMINet) for accurate SOD, which mainly consists of a dual-stream multi-receptive-field module and a residual multi-mode interaction strategy. The former uses well-designed convolution operations with different receptive fields and dense guidance connections to efficiently capture and exploit multi-scale contextual features for better salient object inference, while the latter adopts diverse interaction manners to adequately exchange complementary information among multi-level features, generating powerful feature representations for predicting high-quality saliency maps. Quantitative and qualitative comparisons on five SOD datasets convincingly demonstrate that DMINet performs favorably against 17 state-of-the-art SOD methods under different evaluation metrics.
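The abstract's core idea, parallel branches with different receptive fields whose outputs densely guide later branches, can be sketched in plain numpy. This is an illustrative assumption of how such a module might work, not the paper's actual design: the function names, the "sum of previous branches" guidance rule, and the averaging fusion are all hypothetical simplifications (the paper's fusion and interaction strategy are richer).

```python
import numpy as np

def dilated_conv2d(x, kernel, dilation):
    """Naive 'same'-padded 2D dilated convolution on a single-channel map.

    Larger dilation rates enlarge the effective receptive field of the
    same small kernel, which is one common way to capture multi-scale context.
    """
    kh, kw = kernel.shape
    eff_h = (kh - 1) * dilation + 1  # effective kernel extent
    eff_w = (kw - 1) * dilation + 1
    ph, pw = eff_h // 2, eff_w // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * xp[i * dilation:i * dilation + x.shape[0],
                                     j * dilation:j * dilation + x.shape[1]]
    return out

def multi_receptive_field(x, kernels, dilations):
    """Hypothetical multi-receptive-field block with dense guidance.

    Each branch sees the input plus the accumulated outputs of all earlier
    branches (the 'dense guidance connections'); branch outputs are then
    fused by a simple average.
    """
    outputs = []
    guidance = np.zeros_like(x, dtype=float)
    for kernel, dilation in zip(kernels, dilations):
        branch = dilated_conv2d(x + guidance, kernel, dilation)
        outputs.append(branch)
        guidance = guidance + branch  # densely pass context to later branches
    return sum(outputs) / len(outputs)
```

With an identity kernel (a single center tap), each branch reproduces its input, so two densely-guided branches yield 1.5x the input; with real learned kernels the branches would instead contribute context at different scales.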
ISSN: 0178-2789, 1432-2315
DOI: 10.1007/s00371-022-02561-8