Interpretability analysis for thermal sensation machine learning models: An exploration based on the SHAP approach
Saved in:
| Published in: | Indoor Air, 2022-02, Vol. 32 (2), p. e12984-n/a |
| --- | --- |
| Main authors: | , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| Summary: | Machine learning models have been widely used to study thermal sensation. However, their black-box nature limits model transparency, and existing explanations of thermal sensation models are generally flawed from the perspective of interpretable methods. In this study, we perform an interpretability analysis of thermal sensation machine learning models using "SHapley Additive exPlanation" (SHAP), a method from game theory. The effects of different features on thermal sensation and typical decision routes within the models are investigated from both local and global perspectives, and the correlations between features and thermal sensation, as well as the decision routes inside the models, are summarized. The differences in feature effects across samples show that a feature's influence on thermal sensation is reflected not only in the magnitude of its effect but also in how that effect varies between samples. Feature effects typically appear as combinations of two to four features, which determine the final thermal sensation in most cases. Therefore, the neutral environment may in fact be a dynamic high-dimensional space formed by certain combinations of features within certain ranges, with a changing shape. |
| ISSN: | 0905-6947, 1600-0668 |
| DOI: | 10.1111/ina.12984 |
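
The record itself contains no code, but the local and global SHAP analysis described in the summary can be illustrated with a short sketch. The example below is hypothetical: it assumes the Python `shap` package, scikit-learn, a random-forest regressor, synthetic data, and illustrative PMV-style feature names; none of these specifics (model type, feature set, or library) are stated in the record or the paper.

```python
# Minimal sketch of a SHAP interpretability analysis for a thermal sensation model.
# The model, features, and data below are placeholders, not taken from the paper.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

# Hypothetical thermal comfort features (PMV-style variables).
features = ["air_temperature", "radiant_temperature", "relative_humidity",
            "air_velocity", "clothing_insulation", "metabolic_rate"]

# Synthetic stand-in data; real studies would use measured comfort records.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.uniform(size=(500, len(features))), columns=features)
# Placeholder thermal sensation votes on the ASHRAE 7-point scale (-3 .. +3).
y = rng.integers(-3, 4, size=500).astype(float)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer returns one SHAP value per feature per sample (local view).
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)          # shape: (n_samples, n_features)

# Local perspective: additive feature contributions to a single prediction.
print(dict(zip(features, shap_values[0].round(3))))

# Global perspective: mean absolute SHAP value ranks overall feature importance.
global_importance = np.abs(shap_values).mean(axis=0)
print(dict(zip(features, global_importance.round(3))))
```

Swapping the synthetic data for measured thermal comfort records and plotting the values (for example with `shap.summary_plot`) would yield the kind of per-sample and dataset-wide feature-effect analysis the summary describes.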