Developing an explainable hybrid deep learning model in digital transformation: an empirical study



Bibliographic Details
Published in: Journal of intelligent manufacturing 2024-04, Vol.35 (4), p.1793-1810
Main authors: Chiu, Ming-Chuan, Chiang, Yu-Hsiang, Chiu, Jing-Er
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Automated inspection is an important component of digital transformation. However, most deep learning models widely applied in automated inspection cannot objectively explain their results. This low interpretability makes it difficult to find the root cause of errors and to improve model accuracy. This research proposes an integrative method that combines a deep learning object detection model, a clustering algorithm, and a similarity algorithm to achieve an explainable automated detection process. An electronic embroidery case study demonstrates that the explainable method can be debugged quickly to enhance accuracy. The results show a testing accuracy of 97.58%, with inspection time reduced by 25.93%. The proposed method resolves several challenges involved in automated inspection and digital transformation. Academically, the automated detection deep learning model proposed in this study offers high accuracy along with good interpretability and debuggability. In practice, the process speeds up inspection while saving human effort.
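The abstract outlines a pipeline that clusters features of detected defects and then explains each new detection by its similarity to the nearest cluster. As a loose, hypothetical illustration only (the article's actual detection model, feature representation, and algorithm choices are not given in this record), the cluster-then-explain step might be sketched with a toy k-means and cosine similarity; all function names, parameters, and the specific algorithms here are assumptions:

```python
import math
import random

def dist(a, b):
    # Euclidean distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mean(points):
    # Component-wise mean of a non-empty list of vectors
    return [sum(xs) / len(points) for xs in zip(*points)]

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means: group defect feature vectors into k clusters."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: dist(p, centroids[c]))
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster is empty
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def cosine_similarity(a, b):
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def explain(feature, centroids):
    """Attach a rationale to a detection: the index of the most
    similar defect cluster and the similarity score supporting it."""
    sims = [cosine_similarity(feature, c) for c in centroids]
    best = max(range(len(sims)), key=lambda i: sims[i])
    return best, sims[best]

# Example: two obvious defect groups in a 2-D feature space
feats = [[1.0, 0.0], [1.0, 0.1], [0.0, 1.0], [0.1, 1.0]]
cents = kmeans(feats, k=2)
cluster, sim = explain([0.9, 0.05], cents)
```

The similarity score gives a human-checkable rationale for each assignment, which is the kind of interpretability the abstract claims; the paper's actual method should be consulted via the full-text link for the real components.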
ISSN:0956-5515
1572-8145
DOI:10.1007/s10845-023-02127-y