Saliency Boosting: a novel framework to refine salient object detection
Published in: Artificial Intelligence Review, 2020-06, Vol. 53 (5), pp. 3731–3772
Format: Article
Language: English
Online access: Full text
Abstract: Salient object detection is a challenging research area and various methods have been proposed in the literature. However, these methods usually focus on detecting salient objects in particular types of images only and fail when exposed to a variety of images. Here, we address this problem by proposing a novel framework called Saliency Boosting for refining saliency maps. In particular, the framework trains an Artificial Neural Network regressor to refine initial saliency measures obtained from existing saliency methods. Extensive experiments on seven publicly available datasets, namely MSRA10K-test, DUT-OMRON-test, ECSSD, PASCAL-S, SED2, THUR15K, and HKU-IS, have been performed to determine the effectiveness of the proposed framework. Performance is measured in terms of Precision, Recall, F-Measure, the Precision–Recall curve, Overlapping Ratio, Area Under the Curve, and the Receiver Operating Characteristic curve. The proposed framework is compared with 20 state-of-the-art methods, including the best-performing methods of the last decade. The framework performs better than each individual saliency detection method used within it, and it outperforms or is comparable with the 20 state-of-the-art methods on all datasets in terms of the aforementioned measures.
ISSN: 0269-2821, 1573-7462
DOI: 10.1007/s10462-019-09777-6
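As described in the abstract, Saliency Boosting feeds the initial saliency measures produced by several existing detectors into an Artificial Neural Network regressor that predicts a refined per-pixel saliency value. The following is a minimal sketch of that idea, assuming pixel-wise features stacked from K initial saliency maps and scikit-learn's MLPRegressor standing in for the ANN; the array names, network size, and placeholder data are illustrative assumptions, not the authors' implementation.

```python
# Sketch of refining saliency maps with an ANN regressor (assumed setup, not the paper's code).
import numpy as np
from sklearn.neural_network import MLPRegressor

def stack_pixels(maps):
    """Stack K saliency maps of shape (H, W) into an (H*W, K) per-pixel feature matrix."""
    return np.stack([m.ravel() for m in maps], axis=1)

# Placeholder training data: K = 3 initial saliency maps (values in [0, 1]) for one
# training image, plus its binary ground-truth mask. In practice these would come
# from existing saliency methods and an annotated dataset.
train_maps = [np.random.rand(64, 64) for _ in range(3)]
train_gt = (np.random.rand(64, 64) > 0.5).astype(float)

X_train = stack_pixels(train_maps)
y_train = train_gt.ravel()

# Train the regressor that maps initial saliency measures to a refined saliency value.
regressor = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=300, random_state=0)
regressor.fit(X_train, y_train)

# Refine a new image: predict per-pixel saliency from its K initial maps.
test_maps = [np.random.rand(64, 64) for _ in range(3)]
refined = regressor.predict(stack_pixels(test_maps)).reshape(64, 64)
refined = np.clip(refined, 0.0, 1.0)  # keep the refined saliency map in [0, 1]
```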