SafeFac: Video-based smart safety monitoring for preventing industrial work accidents

Bibliographic Details
Published in: Expert Systems with Applications, 2023-04, Vol. 215, p. 119397, Article 119397
Main Authors: Ahn, Jungmo, Park, JaeYeon, Lee, Sung Sik, Lee, Kyu-Hyuk, Do, Heesung, Ko, JeongGil
Format: Article
Language: English
Online Access: Full text
Description
Summary: This work presents SafeFac, an intelligent camera-based system for managing the safety of factory environments. In SafeFac, a set of cameras installed along the assembly line captures images of workers who approach the machinery in hazardous situations, so that system managers can be alerted and the line halted if needed. Given a challenging set of practical, application-level requirements such as multi-camera support and low response latency, SafeFac exploits a lightweight YOLOv3-based human object detector. To address the latency–accuracy tradeoff, SafeFac incorporates a set of algorithms as pre- and post-processing modules, together with a novel adaptive camera scheduling scheme. Our evaluation on a video dataset containing more than 113,000 frames of real assembly line activity shows that SafeFac achieves high precision (99.93%) and recall (96.44%) and satisfies these challenging requirements as a ready-for-deployment system for safe factory management.

Highlights:
• We design an intelligent vision-based safety monitoring system for factory environments.
• We compile a set of industrial-level system requirements for factory safety monitoring.
• We validate our system on videos from a real assembly line, demonstrating high accuracy.
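The abstract describes a pipeline in which per-camera frames are passed through a lightweight YOLOv3 person detector and an alert is raised (with the option to halt the line) when a worker enters a hazardous region. The sketch below illustrates that general flow in Python using OpenCV's DNN module. It is not the authors' SafeFac code: the model file paths, confidence thresholds, hazard-zone geometry, and the naive round-robin camera loop (a stand-in for the paper's adaptive camera scheduling scheme) are all assumptions for illustration.

# Illustrative sketch only; not the SafeFac implementation from the paper.
# Assumes YOLOv3 Darknet files ("yolov3.cfg", "yolov3.weights") are available
# locally; thresholds and the hazard zone are hypothetical placeholders.
import cv2
import numpy as np

CONF_THRESH = 0.5  # hypothetical detection-confidence cutoff
NMS_THRESH = 0.4   # hypothetical non-maximum-suppression cutoff

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
out_layers = net.getUnconnectedOutLayersNames()

def detect_people(frame):
    """Return [x, y, w, h] boxes for persons (COCO class 0) in one frame."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, scores = [], []
    for output in net.forward(out_layers):
        for det in output:
            class_scores = det[5:]
            if np.argmax(class_scores) == 0 and class_scores[0] > CONF_THRESH:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                scores.append(float(class_scores[0]))
    keep = cv2.dnn.NMSBoxes(boxes, scores, CONF_THRESH, NMS_THRESH)
    return [boxes[i] for i in np.array(keep).flatten()]

def overlaps(box, zone):
    """Axis-aligned rectangle intersection test."""
    x, y, w, h = box
    zx, zy, zw, zh = zone
    return x < zx + zw and zx < x + w and y < zy + zh and zy < y + h

def monitor(cameras, hazard_zone):
    """Poll cameras round-robin (a stand-in for SafeFac's adaptive scheduler)
    and alert when a detected worker overlaps the hazardous region."""
    while True:
        for cam in cameras:  # e.g., [cv2.VideoCapture(i) for i in range(4)]
            ok, frame = cam.read()
            if not ok:
                continue
            if any(overlaps(b, hazard_zone) for b in detect_people(frame)):
                print("ALERT: worker near machinery; halt the line")

The round-robin loop above is deliberately naive: per the abstract, it is SafeFac's adaptive camera scheduling scheme, together with the pre- and post-processing modules, that addresses the latency–accuracy tradeoff across multiple cameras; the details of that scheduler are in the article itself.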
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2022.119397