A Human-Grounded Evaluation of SHAP for Alert Processing
Main authors: Hilde Weerts, Werner van Ipenburg, Mykola Pechenizkiy
Format: Article
Language: English
Abstract: In recent years, many new explanation methods have been proposed to make machine learning predictions interpretable. However, the utility of these methods in practical applications has not been researched extensively. In this paper we present the results of a human-grounded evaluation of SHAP, an explanation method that has been well received in the XAI and related communities. In particular, we study whether this local, model-agnostic explanation method can help human domain experts assess the correctness of positive predictions, i.e. alerts generated by a classifier. We conducted experiments with three different groups of participants (159 in total), all of whom had basic knowledge of explainable machine learning. We performed a qualitative analysis of the recorded reflections of participants processing alerts with and without SHAP information. The results suggest that SHAP explanations do influence the decision-making process, although the model's confidence score remains the leading source of evidence. We statistically test whether there is a significant difference in task utility metrics between tasks for which an explanation was available and tasks for which it was not. Contrary to common intuition, we did not find a significant difference in alert processing performance when a SHAP explanation is available compared to when it is not.
DOI: 10.48550/arxiv.1907.03324
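The abstract describes two technical steps: generating local SHAP explanations for alerts (positive predictions of a classifier) and statistically comparing task-utility metrics between conditions with and without explanations. The sketch below is a minimal, hypothetical illustration of such a pipeline, not the authors' code: the data, model, alert threshold, placeholder scores, and the choice of a Mann-Whitney U test are all assumptions made for demonstration.

```python
# Hypothetical sketch of the pipeline described in the abstract.
# Not the authors' code: data, model, alert threshold, and the choice
# of statistical test are illustrative assumptions.
import numpy as np
import shap
from scipy.stats import mannwhitneyu
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Train a classifier whose positive predictions act as "alerts".
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Alerts: test instances with predicted positive-class probability > 0.5.
proba = model.predict_proba(X_test)[:, 1]
alerts = X_test[proba > 0.5]

# Local, model-agnostic SHAP explanations of individual alerts.
# KernelExplainer is slow, so summarize the background data with k-means
# and explain only a handful of alerts here.
background = shap.kmeans(X_train, 10)
explainer = shap.KernelExplainer(lambda x: model.predict_proba(x)[:, 1], background)
shap_values = explainer.shap_values(alerts[:5])  # one row of attributions per alert

# Compare a task-utility metric (e.g., per-participant alert-processing
# accuracy) between with-SHAP and without-SHAP conditions. The scores are
# placeholders; a Mann-Whitney U test is one plausible choice of test.
scores_with_shap = np.array([0.80, 0.74, 0.91, 0.68, 0.77])
scores_without = np.array([0.78, 0.75, 0.89, 0.70, 0.76])
stat, p = mannwhitneyu(scores_with_shap, scores_without, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```

Each row of `shap_values` attributes the alert's predicted probability to individual features, which is the kind of information participants in the study would see alongside the model's confidence score.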