Is human-like decision making explainable? Towards an explainable artificial intelligence for autonomous vehicles

Bibliographic details
Published in: Transportation Research Interdisciplinary Perspectives, 2025-01, Vol. 29, p. 101278, Article 101278
Authors: Xie, Jiming; Zhang, Yan; Qin, Yaqin; Wang, Bijun; Dong, Shuai; Li, Ke; Xia, Yulan
Format: Article
Language: English
Online access: Full text
Description
Abstract:
Highlights:
• Constructed a Width Human-like Neural Network (WNN) model for driving intention analysis.
• Developed a framework to explain the WNN's black-box nature.
• Demonstrated the critical role of vehicle parameters in human-like intention analysis.
• These findings support the potential of human-like neural network models in performing complex driving tasks.

To achieve trustworthy human-like decisions for autonomous vehicles (AVs), this paper proposes a new explainable framework for personalized human-like driving intention analysis. In the first stage, we adopt a spectral clustering method for driving style characterization and introduce a misclassification cost matrix to describe different driving needs. Based on the parallelism in the complex neural network of the human brain, we construct a Width Human-like Neural Network (WNN) model for personalized cognition and human-like driving intention decision making. In the second stage, we draw inspiration from the field of brain-like trusted AI to construct a robust, in-depth, and unbiased evaluation and interpretability framework involving three dimensions: Permutation Importance (PI) analysis, Partial Dependence Plot (PDP) analysis, and model complexity analysis. An empirical investigation using real driving trajectory data from Kunming, China, confirms the ability of our approach to predict potential driving decisions with high accuracy while providing the rationale behind implicit AV decisions. These findings have the potential to inform ongoing research on brain-like neural learning and could function as a catalyst for developing swifter and more potent algorithmic solutions in the realm of intelligent transportation.
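As a rough illustration of the two-stage pipeline described in the abstract, the following is a minimal sketch, not the authors' implementation: scikit-learn's SpectralClustering stands in for the driving style characterization, a generic MLP classifier stands in for the WNN (the paper's width architecture and misclassification cost matrix are not reproduced), and the interpretability stage is approximated with sklearn.inspection's permutation_importance and partial_dependence. The feature names and synthetic data are hypothetical.

```python
# Minimal sketch (not the authors' code) of the two-stage idea:
# (1) spectral clustering to characterize driving styles,
# (2) a stand-in classifier in place of the paper's WNN,
# (3) Permutation Importance (PI) and Partial Dependence (PDP) probes.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance, partial_dependence

rng = np.random.default_rng(0)
feature_names = ["speed", "acceleration", "headway"]  # hypothetical kinematic features
X = rng.normal(size=(300, 3))                         # synthetic trajectory-level features

# Stage 1: spectral clustering as the driving style characterization step
# (e.g., conservative / moderate / aggressive).
styles = SpectralClustering(
    n_clusters=3, affinity="nearest_neighbors", n_neighbors=10, random_state=0
).fit_predict(X)

# Stand-in for the WNN: a generic MLP that predicts the style/intention label.
X_tr, X_te, y_tr, y_te = train_test_split(X, styles, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)

# Stage 2a: Permutation Importance - accuracy drop when one feature is shuffled.
pi = permutation_importance(clf, X_te, y_te, n_repeats=20, random_state=0)
for name, imp in zip(feature_names, pi.importances_mean):
    print(f"PI({name}) = {imp:.3f}")

# Stage 2b: Partial Dependence - average predicted probability as one feature varies
# (the first row of 'average' is the curve for the first class).
pdp = partial_dependence(clf, X_te, features=[0], grid_resolution=20)
print("PDP(speed):", np.round(pdp["average"][0], 3))
```

In this reading, PI quantifies how much held-out accuracy relies on each input feature, while the PDP curve exposes the model's average response as a single feature varies; together they approximate the kind of rationale the paper attaches to the WNN's decisions.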
ISSN: 2590-1982
DOI: 10.1016/j.trip.2024.101278