Distributed Inference in Resource-Constrained IoT for Real-Time Video Surveillance



Bibliographic Details
Published in: IEEE Systems Journal, March 2023, Vol. 17 (1), pp. 1512-1523
Authors: Khan, Muhammad Asif; Hamila, Ridha; Erbad, Aiman; Gabbouj, Moncef
Format: Article
Language: English
Description
Abstract: Advances in communication technologies and in the computational capabilities of Internet of Things (IoT) devices enable a range of complex applications that require ever-increasing processing of sensor data. An illustrative example is real-time video surveillance, which captures videos of target scenes and processes them to detect anomalies using deep learning (DL). Running deep learning models demands substantial processing and incurs high computation delay and energy consumption on resource-constrained IoT devices. In this article, we introduce methods for distributed inference across IoT devices and an edge server. Two distinct algorithms are proposed to split the computation of deep neural network layers between an IoT device and an edge server: the early split strategy (ESS) for battery-powered IoT devices and the late split strategy (LSS) for IoT devices connected to a regular power source. The evaluation shows that both the ESS and LSS schemes achieve the target inference delay deadline when tested on the VGG16 and MobileNet_V2 CNN models. In terms of computational load, the ESS scheme achieves a reduction of nearly 15-20%, whereas the LSS scheme achieves a reduction of up to 60%. The gains in energy saving of IoT devices for the ESS and LSS schemes are nearly 18% and 52%, respectively.
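The split-computation idea summarized in the abstract can be illustrated with a minimal sketch. This is not the paper's ESS or LSS algorithm; it is a generic split-point search in which the first k layers run on the device, the layer's output is transmitted, and the remaining layers run on the edge server. All layer timings, output sizes, the raw-input size, the bandwidth, and the server speedup below are hypothetical numbers chosen for illustration.

```python
# Hedged sketch (not the paper's ESS/LSS): pick a DNN split point that
# minimizes end-to-end inference delay when layers[0:k] run on the IoT
# device and layers[k:] run on the edge server. All numbers are made up.

def split_delay(layers, k, bandwidth, server_speedup):
    """Total delay for a split at layer k.

    layers: list of (device_time_s, output_bytes) per layer.
    k = 0 sends the raw input; k = len(layers) computes fully on-device.
    """
    device = sum(t for t, _ in layers[:k])
    # Bytes sent over the link: output of the last on-device layer
    # (assume a fixed raw-input size when nothing runs on-device).
    tx_bytes = layers[k - 1][1] if k > 0 else 150_000
    server = sum(t for t, _ in layers[k:]) / server_speedup
    return device + tx_bytes / bandwidth + server

def best_split(layers, bandwidth, server_speedup):
    """Return (k, delay) for the split point with minimum total delay."""
    delay, k = min((split_delay(layers, k, bandwidth, server_speedup), k)
                   for k in range(len(layers) + 1))
    return k, delay

# Toy 5-layer profile: early layers are cheap but emit large feature
# maps; later layers cost more but their outputs shrink.
layers = [(0.010, 400_000), (0.020, 200_000), (0.040, 80_000),
          (0.080, 20_000), (0.120, 4_000)]
k, delay = best_split(layers, bandwidth=1_000_000, server_speedup=10.0)
print(f"split after layer {k}, delay {delay:.3f} s")
```

With this toy profile the search favors an intermediate split: shallow splits pay a large transmission cost for big feature maps, while deep splits pay the device's slow computation, mirroring the trade-off the ESS and LSS strategies navigate for battery-powered and mains-powered devices, respectively.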
ISSN:1932-8184
1937-9234
DOI:10.1109/JSYST.2022.3198711