Role of Trust in AI-Driven Healthcare Systems: Discussion from the Perspective of Patient Safety

Bibliographic Details
Published in: Proceedings of the International Symposium of Human Factors and Ergonomics in Healthcare, 2022-09, Vol. 11 (1), pp. 129-134
Main authors: Bilal Unver, Mehmet; Asan, Onur
Format: Article
Language: eng
Online access: Full text
Description
Summary: In the field of healthcare, enhancing patient safety depends on several interconnected factors (e.g., regulation, technology, care quality, physical environment, human factors). Artificial Intelligence (AI), with its expanding realm of use, functions as a component of the overall healthcare system from a multi-agent systems viewpoint. Far from being a stand-alone agent, AI cannot be held liable for flawed decisions in healthcare. Nor does AI have the capacity to be trusted according to the most prevalent definitions of trust, because it neither possesses emotive states nor can be held responsible for its actions. A positive experience of relying on AI therefore comes to be indicative of ‘trustworthiness’ rather than ‘trust’, which has further consequences for patient safety. From a multi-agent systems viewpoint, ‘trust’ requires all environmental, psychological, and technical conditions to be responsive to patient safety. It is cultivated across the overall system, in which ‘responsibility’, ‘accountability’, ‘privacy’, ‘transparency’, and ‘fairness’ need to be secured for all parties involved in AI-driven healthcare, given the ethical and legal concerns and the threat they pose to trust.
ISSN: 2327-8595
DOI: 10.1177/2327857922111026