Real-time detection and monitoring of pedestrians with unmanned aerial vehicles


Detailed description

Saved in:
Bibliographic details
Main authors: Takawade, Siddhi, Mishra, Aditya, Mulani, Amaan, Bhandwalkar, Payal, Patil, Shashikant, Benslimane, Abderrahim, Perez, Julio
Format: Conference proceedings
Language: English
Subjects:
Online access: Full text
Description
Summary: The exponential increase in the transmission rate of the COVID-19 pandemic has affected humanity in every way. Transmission can be limited by avoiding physical contact between individuals. Because of the emergence of new variants and the asymptomatic nature of the disease, monitoring social distancing proves to be the most effective countermeasure. The aim of this work is therefore to enforce social distancing from this perspective. The system uses the YOLOv3 (You Only Look Once) object detection model to detect people in video frames. Coloured bounding boxes indicate the risk level: green boxes mark a safe zone, yellow boxes mark low risk, and red boxes mark a violation. Using the Euclidean distance between detections, people are tracked across video sequences so that an individual who crosses the social distance threshold continues to be followed. The system operates on video input obtained from an Unmanned Aerial Vehicle (UAV), which provides greater mobility and flexibility compared with fixed surveillance systems such as CCTV.
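The risk classification described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the pixel thresholds and the box format (x_min, y_min, x_max, y_max) are assumptions, and a real system would take the boxes from a YOLOv3 detector and calibrate thresholds to the UAV's altitude and camera geometry.

```python
import math

# Illustrative pixel thresholds (assumed, not from the paper).
VIOLATION_PX = 100  # closer than this: violation (red)
SAFE_PX = 150       # between the two: low risk (yellow); beyond: safe (green)

def centroid(box):
    """Centroid of a bounding box given as (x_min, y_min, x_max, y_max)."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def classify_risks(boxes):
    """Label each detected person 'green', 'yellow', or 'red' based on the
    Euclidean distance between bounding-box centroids."""
    centers = [centroid(b) for b in boxes]
    labels = ["green"] * len(boxes)
    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            d = math.dist(centers[i], centers[j])
            if d < VIOLATION_PX:
                labels[i] = labels[j] = "red"
            elif d < SAFE_PX:
                # Do not downgrade a person already flagged as a violator.
                if labels[i] != "red":
                    labels[i] = "yellow"
                if labels[j] != "red":
                    labels[j] = "yellow"
    return labels
```

For example, two people 20 px apart are both flagged red, while a third person 300 px away stays green; the colours then drive the box drawing on each video frame.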
ISSN:0094-243X
1551-7616
DOI:10.1063/5.0234352