Salient Object Detection Network With Selection Mechanism for Consumer Electronics Imaging Systems


Bibliographic Details
Published in: IEEE Transactions on Consumer Electronics, 2024-02, Vol. 70 (1), pp. 3403-3413
Authors: Jia, Ning; Liu, Xianhui; Sun, Yougang
Format: Article
Language: English
Description
Abstract: Salient object detection aims to identify the most attentive regions of an image, and as an image processing technology, it is widely used in the imaging systems of many consumer electronics. With the development of deep learning, a series of notable results has been achieved in the salient object detection field. However, two intractable challenges remain to be studied: 1) model lightweighting and 2) model robustness. To address these two problems, we propose a feature filtering and fusion module based on a novel selection mechanism (SM), which can adaptively discover and fuse multi-scale features. Compared to other methods, it reduces the number of features that need to be processed, greatly reducing the complexity of the salient object detection model. We also explored the roles of global contextual information and global multi-scale features in the saliency detection task and validated their contributions. Based on the SM block, we propose a lightweight salient object detection network named SMNet. Extensive experimental results on six datasets show that our model achieves satisfactory results while greatly saving storage space and improving the data transmission speed of image acquisition equipment.
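The abstract describes the SM block only at a high level: a lightweight gate that adaptively selects and fuses multi-scale features so fewer feature maps need further processing. Below is a minimal PyTorch sketch of one plausible reading of such a selection-based fusion step; the module name `SelectionFusion`, the branch count, and the pooled-descriptor gate are illustrative assumptions and not the authors' actual SMNet implementation.

```python
# Hypothetical sketch of a selection-style multi-scale fusion block (not the
# paper's exact SM module): a pooled global descriptor produces per-branch
# weights that softly select which scales contribute to the fused output.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelectionFusion(nn.Module):
    """Adaptively weights multi-scale feature maps and fuses the selected ones."""

    def __init__(self, channels: int, num_branches: int = 3, reduction: int = 4):
        super().__init__()
        hidden = max(channels // reduction, 8)
        # Lightweight gate: global descriptor -> one selection score per branch.
        self.gate = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, num_branches),
        )

    def forward(self, features):
        # features: list of [B, C, Hi, Wi] maps taken from different scales.
        target_size = features[0].shape[-2:]
        aligned = [
            f if f.shape[-2:] == target_size
            else F.interpolate(f, size=target_size, mode="bilinear", align_corners=False)
            for f in features
        ]
        stacked = torch.stack(aligned, dim=1)             # [B, K, C, H, W]
        descriptor = stacked.sum(dim=1).mean(dim=(2, 3))  # [B, C] global context
        weights = torch.softmax(self.gate(descriptor), dim=1)  # [B, K]
        # Suppress uninformative scales so only the selected features are fused.
        fused = (stacked * weights.view(*weights.shape, 1, 1, 1)).sum(dim=1)
        return fused


if __name__ == "__main__":
    feats = [torch.randn(2, 64, 56, 56),
             torch.randn(2, 64, 28, 28),
             torch.randn(2, 64, 14, 14)]
    out = SelectionFusion(channels=64)(feats)
    print(out.shape)  # torch.Size([2, 64, 56, 56])
```

Because the gate collapses the multi-scale stack into a single fused map, downstream layers process one tensor instead of several, which is consistent with the abstract's claim of reduced model complexity.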
ISSN: 0098-3063, 1558-4127
DOI: 10.1109/TCE.2023.3345939