A deep learning fusion approach to retrieve images of People's unsafe behavior from construction sites
Published in: Developments in the built environment, 2022-12, Vol. 12, p. 100085, Article 100085
Format: Article
Language: English
Online access: Full text
Abstract: Retrieving images of unsafe behaviours from an existing digital database can provide managers and other stakeholders with the information needed to put strategies in place to improve safety in construction. Prevailing studies have focused on developing content-based image retrieval (CBIR) approaches (e.g., color-based) to retrieve objects and materials obtained from construction sites. While CBIR approaches are effective at extracting low-level features from digital images, they are unable to accurately retrieve images of unsafe behaviours from existing databases. To address this limitation, we develop an improved CBIR approach that retrieves unsafe-behaviour images more accurately and automatically by combining features extracted from different models. We use a digital database developed by Huazhong University of Science and Technology to validate the feasibility of the proposed approach. Our research demonstrates that the fusion of ResNet-101 and VGG-19 achieves higher Top-K recall and outperforms single feature extraction methods.
• A deep learning fusion approach is proposed to retrieve images of people's unsafe behavior from construction sites.
• The developed method obtains higher accuracy than single feature extraction methods.
• Examples are used to illustrate the feasibility of the proposed approach.
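The abstract describes fusing features extracted by ResNet-101 and VGG-19 and evaluating retrieval with Top-K recall, but does not spell out the fusion or similarity details. The sketch below is only an illustrative reading of that idea in PyTorch/torchvision, not the authors' pipeline: the choice of pooled ResNet-101 features, VGG-19 fc7 features, concatenation after L2 normalisation, cosine similarity, and the helper names `fused_feature` and `top_k` are all assumptions made for the example.

```python
import torch
import torch.nn.functional as F
from torchvision import models, transforms

# Standard ImageNet preprocessing (assumed; the paper's preprocessing is not given here).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# ResNet-101 backbone: replace the classifier head to expose 2048-d pooled features.
resnet = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)
resnet.fc = torch.nn.Identity()
resnet.eval()

# VGG-19 backbone: drop the final classification layer to keep the 4096-d fc7 features.
vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1)
vgg.classifier = vgg.classifier[:-1]
vgg.eval()

@torch.no_grad()
def fused_feature(image):
    """Concatenate L2-normalised ResNet-101 and VGG-19 descriptors for one PIL image."""
    x = preprocess(image).unsqueeze(0)
    f_res = F.normalize(resnet(x), dim=1)      # shape (1, 2048)
    f_vgg = F.normalize(vgg(x), dim=1)         # shape (1, 4096)
    return torch.cat([f_res, f_vgg], dim=1)    # shape (1, 6144) fused descriptor

@torch.no_grad()
def top_k(query_feat, db_feats, k=10):
    """Return indices of the k database descriptors most similar to the query."""
    k = min(k, db_feats.shape[0])
    sims = F.cosine_similarity(query_feat, db_feats)   # one score per database image
    return torch.topk(sims, k).indices
```

In this sketch the Top-K retrieval set is simply the k nearest database descriptors under cosine similarity; Top-K recall would then be computed by checking how often a relevant unsafe-behaviour image appears among those k results.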
ISSN: 2666-1659
DOI: 10.1016/j.dibe.2022.100085