From everyday life predictions to suicide prevention: Clinical and ethical considerations in suicide predictive analytic tools

Bibliographic details
Published in: Journal of Clinical Psychology, 2022-02, Vol. 78 (2), pp. 137-148
Authors: Luk, Jeremy W., Pruitt, Larry D., Smolenski, Derek J., Tucker, Jennifer, Workman, Don E., Belsher, Bradley E.
Format: Article
Language: English
Online access: Full text
Abstract: Advances in artificial intelligence and machine learning have fueled growing interest in the application of predictive analytics to identify high‐risk suicidal patients. Such application will require the aggregation of large‐scale, sensitive patient data to help inform complex and potentially stigmatizing health care decisions. This paper describes how suicide prediction is uniquely difficult by comparing it to nonmedical predictions (weather and traffic forecasting) and medical predictions (cancer and human immunodeficiency virus risk), followed by clinical and ethical challenges presented within a risk‐benefit conceptual framework. Because the misidentification of suicide risk may be associated with unintended negative consequences, clinicians and policymakers need to carefully weigh the risks and benefits of using suicide predictive analytics across health care populations. Practical recommendations are provided to strengthen the protection of patient rights and enhance the clinical utility of suicide predictive analytics tools.
ISSN: 0021-9762, 1097-4679
DOI: 10.1002/jclp.23202