Scale-Context Perceptive Network for Crowd Counting and Localization in Smart City System


Detailed Description

Bibliographic Details
Published in: IEEE Internet of Things Journal, 2023-11, Vol. 10 (21), p. 1-1
Main authors: Zhai, Wenzhe; Gao, Mingliang; Guo, Xiangyu; Li, Qilei
Format: Article
Language: English
Online access: Order full text
Description
Abstract: The task of crowd counting and localization is to predict the count and positions of people in a crowd, a practical and essential sub-task in crowd analysis and smart city systems. However, the inherent problems of scale variation and background disturbance restrain performance on both tasks. While recent studies focus on counting and localization independently, only a few works are capable of executing both tasks simultaneously. To this end, we propose a Scale-Context Perceptive Network (SCPNet) to jointly tackle the crowd counting and localization tasks in a unified framework. Specifically, a scale perceptive (SP) module with a local-global branch schema is designed to capture multiscale information. Meanwhile, a context perceptive (CP) module, built on a channel-spatial self-attention mechanism, is derived to suppress background disturbance. Furthermore, a novel hierarchical scale loss function that combines the Euclidean loss and the structural similarity loss is designed to drive the proposed model to perform counting and localization simultaneously. Extensive experiments on challenging crowd datasets demonstrate the superiority of the proposed SCPNet over state-of-the-art competitors in both objective and subjective evaluations.
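The abstract mentions a loss combining a Euclidean term with a structural similarity (SSIM) term. The paper's exact formulation is not given here, so the following is only a minimal sketch under stated assumptions: a single-window (global) SSIM rather than the usual Gaussian-windowed version, applied at one scale rather than hierarchically, and a hypothetical weighting factor `alpha`.

```python
import numpy as np

def ssim_global(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    """Simplified SSIM computed over the whole density map
    (one global window; stabilizing constants c1, c2)."""
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

def combined_loss(pred, gt, alpha=1e-3):
    """Euclidean (MSE) term plus an SSIM-dissimilarity term.
    `alpha` balances the two terms and is an assumption here."""
    mse = ((pred - gt) ** 2).mean()
    return mse + alpha * (1.0 - ssim_global(pred, gt))
```

A hierarchical variant, as the abstract suggests, would apply this combined loss to predicted density maps at several resolutions and sum the per-scale terms; the sketch above shows a single scale only.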
ISSN: 2327-4662
DOI:10.1109/JIOT.2023.3268226