Label Noise Robust Crowd Counting with Loss Filtering Factor

Bibliographic Details
Published in: Applied Artificial Intelligence, 2024-12, Vol. 38 (1)
Authors: Xu, Zhengmeng; Lin, Hai; Chen, Yufeng; Li, Yanli
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: Crowd counting, a crucial computer vision task, aims at estimating the number of individuals in various environments. Each person in crowd counting datasets is typically annotated by a point at the center of the head. However, challenges like dense crowds, diverse scenarios, significant occlusion, and low resolution lead to inevitable label noise, which adversely impacts model performance. Driven by the need to enhance model robustness in noisy environments and improve accuracy, we propose the Loss Filtering Factor (LFF) and the corresponding Label Noise Robust Crowd Counting (LNRCC) training scheme. LFF filters out losses caused by label noise during training, enabling models to focus on accurately annotated data and thereby increasing reliability. Our extensive experiments demonstrate the effectiveness of LNRCC, which consistently improves performance across all models and datasets, with an average improvement of 3.68% in Mean Absolute Error (MAE), 6.7% in Mean Squared Error (MSE), and 4.68% in Grid Average Mean Absolute Error (GAME). The universal applicability of this approach, coupled with its ease of integration into any neural network architecture, marks a significant advancement in the field of computer vision, particularly in addressing the pivotal issue of accuracy in crowd counting under challenging conditions.
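Note: The abstract describes LFF only at a high level (filtering out losses attributable to noisy point annotations during training), and this record does not reproduce the paper's exact rule. The sketch below is a minimal, hypothetical illustration assuming a small-loss-style filter over a density-map regression loss; the function name filtered_density_loss and the parameter filtering_factor are illustrative, not the authors' API.

# Minimal sketch (PyTorch): keep only the fraction `filtering_factor` of
# smallest per-pixel losses; the largest losses are assumed to stem from
# label noise and are excluded from the gradient. This is an assumption
# about how a loss-filtering scheme could work, not the paper's method.
import torch

def filtered_density_loss(pred, target, filtering_factor=0.9):
    """Pixel-wise MSE between predicted and ground-truth density maps,
    averaged over only the smallest losses."""
    per_pixel = (pred - target) ** 2              # unreduced loss map
    flat = per_pixel.flatten()
    k = max(1, int(filtering_factor * flat.numel()))
    kept, _ = torch.topk(flat, k, largest=False)  # drop the largest losses
    return kept.mean()

# Usage: drop-in replacement for the counting loss in any training loop.
pred = torch.rand(1, 1, 64, 64, requires_grad=True)
target = torch.rand(1, 1, 64, 64)
loss = filtered_density_loss(pred, target, filtering_factor=0.9)
loss.backward()
print(f"filtered loss: {loss.item():.4f}")

Because the filter only selects which loss terms enter the mean, it adds no trainable parameters, which is consistent with the abstract's claim that the scheme integrates easily into any network architecture.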
ISSN: 0883-9514
eISSN: 1087-6545
DOI: 10.1080/08839514.2024.2329859