Focusing intermediate pixels loss for salient object segmentation


Bibliographic details
Published in: Multimedia Tools and Applications, 2024-02, Vol. 83 (7), pp. 19747-19766
Authors: Chen, Lei; Cao, Tieyong; Zheng, Yunfei; Fang, Zheng; Wang, Yang; Fu, Bingyang; Wang, Yekui; Han, Tong
Format: Article
Language: English
Online access: Full text
Description
Abstract: To improve network performance in salient object segmentation, many researchers have modified loss functions and assigned weights to per-pixel losses. However, these loss functions pay little attention to intermediate pixels, whose predicted probabilities lie in the intermediate region between correct and incorrect classification. To solve this problem, a focusing intermediate pixels loss is proposed. Firstly, the foreground and background are each divided into correctly and incorrectly classified sets to identify intermediate pixels whose category is difficult to determine. Secondly, the intermediate pixels receive greater attention according to their predicted probability. Finally, misclassified pixels are strengthened dynamically over the course of training epochs. The proposed method can 1) make the model focus on intermediate pixels, which carry more uncertainty, and 2) solve the vanishing gradient problem of Focal Loss for well-classified pixels. Experimental results on six public datasets and two different types of network structures show that the proposed method outperforms other state-of-the-art weighted loss functions, and the average F_β increases by about 2.7% compared with the typical cross-entropy loss.
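The abstract describes the loss only at a high level: a base cross-entropy term, an extra weight on "intermediate" pixels whose predicted probability sits near the decision boundary, and a dynamic boost on misclassified pixels that grows with the training epoch. A minimal NumPy sketch under assumed forms for those weights (a parabolic weight peaking at p = 0.5 and a linear epoch schedule, both hypothetical choices, not the paper's exact formulation) could look like:

```python
import numpy as np

def focusing_intermediate_pixels_loss(pred, target, epoch, max_epoch, eps=1e-7):
    """Hypothetical sketch of the described loss: weighted binary
    cross entropy that emphasizes uncertain ("intermediate") pixels
    and progressively up-weights misclassified ones."""
    pred = np.clip(pred, eps, 1.0 - eps)
    # Per-pixel binary cross entropy.
    bce = -(target * np.log(pred) + (1.0 - target) * np.log(1.0 - pred))
    # Intermediate weight: peaks at pred = 0.5 (most uncertain), assumed
    # parabolic here; ranges over [1, 2].
    w_intermediate = 1.0 + 4.0 * pred * (1.0 - pred)
    # Misclassified pixels (probability on the wrong side of 0.5) get an
    # extra factor that grows linearly with training progress (assumed).
    misclassified = (pred > 0.5) != (target > 0.5)
    w_dynamic = np.where(misclassified, 1.0 + epoch / max_epoch, 1.0)
    return float(np.mean(w_intermediate * w_dynamic * bce))
```

Unlike Focal Loss, whose (1 - p_t)^γ factor drives the gradient toward zero for well-classified pixels, a weight of this shape stays at least 1 everywhere, which matches the abstract's claim of avoiding that vanishing-gradient behavior.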
ISSN: 1573-7721 (electronic); 1380-7501 (print)
DOI:10.1007/s11042-023-15873-1