Tools to foster responsibility in digital solutions that operate with or without artificial intelligence: A scoping review for health and innovation policymakers
Published in: International journal of medical informatics (Shannon, Ireland), 2023-02, Vol. 170, p. 104933, Article 104933
Main authors: , , , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Summary:
• Tools to guide responsible D/AI practices in health are numerous and heterogeneous.
• Tools rely mostly on a normative approach and target multiple groups.
• 73% of the principles occurred in tools published before 2018.
• Principles vary greatly across tools and tool creators.
• Very few tools were developed using a rigorous method.
Digital health solutions that operate with or without artificial intelligence (D/AI) raise several responsibility challenges. Though many frameworks and tools have been developed, determining which principles should be translated into practice remains under debate. This scoping review aims to provide policymakers with a rigorous body of knowledge by asking: 1) What kinds of practice-oriented tools are available? 2) On what principles do they predominantly rely? 3) What are their limitations?
We searched six academic and three grey literature databases for practice-oriented tools, defined as frameworks and/or sets of principles with clear operational explanations, published in English or French from 2015 to 2021. Characteristics of the tools were qualitatively coded, and variations across the dataset were identified through descriptive statistics and a network analysis.
A total of 56 tools met our inclusion criteria: 19 health-specific tools (33.9%) and 37 generic tools (66.1%). They adopt a normative (57.1%), reflective (35.7%), operational (3.6%), or mixed approach (3.6%) to guide developers (14.3%), managers (16.1%), end users (10.7%), policymakers (5.4%) or multiple groups (53.6%). The frequency of 40 principles varies greatly across tools (from 0% for ‘environmental sustainability’ to 83.8% for ‘transparency’). While 50% or more of the generic tools promote up to 19 principles, 50% or more of the health-specific tools promote 10 principles, and 50% or more of all tools disregard 21 principles. In contrast to the scattered network of principles proposed by academia, the business sector emphasizes closely connected principles. Few tools rely on a formal methodology (17.9%).
Despite a lack of consensus, there is a solid knowledge base for policymakers to anchor their role in such a dynamic field. Because several tools lack rigour and ignore key social, economic, and environmental issues, an integrated and methodologically sound approach to responsibility in D/AI solutions is warranted.
ISSN: 1386-5056, 1872-8243
DOI: 10.1016/j.ijmedinf.2022.104933