Is Conversational XAI All You Need? Human-AI Decision Making With a Conversational XAI Assistant

Explainable artificial intelligence (XAI) methods are being proposed to help interpret and understand how AI systems reach specific predictions. Inspired by prior work on conversational user interfaces, we argue that augmenting existing XAI methods with conversational user interfaces can increase user engagement and boost user understanding of the AI system. In this paper, we explored the impact of a conversational XAI interface on users' understanding of the AI system, their trust, and reliance on the AI system. In comparison to an XAI dashboard, we found that the conversational XAI interface can bring about a better understanding of the AI system among users and higher user trust. However, users of both the XAI dashboard and conversational XAI interfaces showed clear overreliance on the AI system. Enhanced conversations powered by large language model (LLM) agents amplified overreliance. Based on our findings, we reason that the potential cause of such overreliance is the illusion of explanatory depth that is concomitant with both XAI interfaces. Our findings have important implications for designing effective conversational XAI interfaces to facilitate appropriate reliance and improve human-AI collaboration. Code can be found at https://github.com/delftcrowd/IUI2025_ConvXAI
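To make the idea concrete, the sketch below is a minimal, hypothetical illustration (not taken from the paper or its repository) of a conversational layer over an XAI method's output. It assumes feature attributions have already been computed by some explanation method such as SHAP or LIME, and it maps a few user questions to templated explanations; the feature names and scores are made up for the example. A full conversational XAI assistant, as studied in the paper, would use an LLM agent and richer XAI outputs.

from typing import Dict

def answer(question: str, prediction: str, attributions: Dict[str, float]) -> str:
    """Map a free-text question to a canned explanation built from feature attributions."""
    q = question.lower()
    # Rank features by absolute attribution, i.e., by how strongly they influenced the prediction.
    ranked = sorted(attributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    if "why" in q or "important" in q:
        top = ", ".join(f"{name} ({weight:+.2f})" for name, weight in ranked[:3])
        return f"The prediction '{prediction}' was driven most strongly by: {top}."
    if "confident" in q or "trust" in q:
        return "Attribution magnitudes indicate influence on this prediction, not calibrated confidence."
    return "Try asking why the model decided this, or which features mattered most."

if __name__ == "__main__":
    # Hypothetical loan-approval example with made-up attribution scores.
    scores = {"income": 0.42, "credit_history": 0.31, "loan_amount": -0.18, "age": 0.05}
    print(answer("Why did the model approve this applicant?", "approve", scores))

The only point of the sketch is the separation of concerns it implies: an explanation method produces attributions for a single prediction, and a dialogue layer decides how to surface them in response to user questions.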

Bibliographic Details
Main authors: He, Gaole; Aishwarya, Nilay; Gadiraju, Ujwal
Format: Article
Language: English
Subjects: Computer Science - Artificial Intelligence; Computer Science - Human-Computer Interaction
Online access: Order full text
creator He, Gaole; Aishwarya, Nilay; Gadiraju, Ujwal
description Explainable artificial intelligence (XAI) methods are being proposed to help interpret and understand how AI systems reach specific predictions. Inspired by prior work on conversational user interfaces, we argue that augmenting existing XAI methods with conversational user interfaces can increase user engagement and boost user understanding of the AI system. In this paper, we explored the impact of a conversational XAI interface on users' understanding of the AI system, their trust, and reliance on the AI system. In comparison to an XAI dashboard, we found that the conversational XAI interface can bring about a better understanding of the AI system among users and higher user trust. However, users of both the XAI dashboard and conversational XAI interfaces showed clear overreliance on the AI system. Enhanced conversations powered by large language model (LLM) agents amplified overreliance. Based on our findings, we reason that the potential cause of such overreliance is the illusion of explanatory depth that is concomitant with both XAI interfaces. Our findings have important implications for designing effective conversational XAI interfaces to facilitate appropriate reliance and improve human-AI collaboration. Code can be found at https://github.com/delftcrowd/IUI2025_ConvXAI
doi_str_mv 10.48550/arxiv.2501.17546
format Article
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.2501.17546
language eng
recordid cdi_arxiv_primary_2501_17546
source arXiv.org
subjects Computer Science - Artificial Intelligence; Computer Science - Human-Computer Interaction
title Is Conversational XAI All You Need? Human-AI Decision Making With a Conversational XAI Assistant
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-13T22%3A10%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Is%20Conversational%20XAI%20All%20You%20Need?%20Human-AI%20Decision%20Making%20With%20a%20Conversational%20XAI%20Assistant&rft.au=He,%20Gaole&rft.date=2025-01-29&rft_id=info:doi/10.48550/arxiv.2501.17546&rft_dat=%3Carxiv_GOX%3E2501_17546%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true