A Picture is Worth a Thousand Words: The Role of Survey Training Materials in Stated-Preference Studies


Detailed Description

Bibliographic Details
Published in: The Patient: Patient-Centered Outcomes Research, 2020-04, Vol. 13 (2), pp. 163-173
Main Authors: Vass, Caroline M., Davison, Niall J., Vander Stichele, Geert, Payne, Katherine
Format: Article
Language: English
Online Access: Full text
Description
Abstract:
Background: Online survey-based methods are increasingly used to elicit preferences for healthcare. This digitization creates an opportunity for interactive survey elements, potentially improving respondents' understanding and/or engagement.
Objective: Our objective was to understand whether, and how, training materials in a survey influenced stated preferences.
Methods: An online discrete-choice experiment (DCE) was designed to elicit public preferences for a new targeted approach to prescribing biologics ("biologic calculator") for rheumatoid arthritis (RA) compared with conventional prescribing. The DCE presented three alternatives, two biologic calculators and a conventional approach (opt out), described by five attributes: delay to treatment, positive predictive value, negative predictive value, infection risk, and cost saving to the national health service. Respondents were randomized to receive training materials as plain text or an animated storyline. Training materials contained information about RA and approaches to treatment and described the biologic calculator. Background questions included sociodemographics and self-reported measures of task difficulty and attribute non-attendance. DCE data were analyzed using conditional and heteroskedastic conditional logit (HCL) models.
Results: In total, 300 respondents completed the DCE, receiving either plain text (n = 158) or the animated storyline (n = 142). The HCL showed that the estimated coefficients for all attributes aligned with a priori expectations and were statistically significant. The scale term was statistically significant, indicating that respondents who received plain-text materials made more random choices. Further tests suggested preference homogeneity after accounting for differences in scale.
Conclusions: Using animated training materials did not change respondents' stated preferences, but it appeared to improve choice consistency, potentially allowing researchers to include more complex designs with increased numbers of attributes, levels, alternatives, or choice sets.
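The analysis described in the abstract rests on the conditional logit model and its heteroskedastic extension, in which the error variance (scale) is allowed to differ between the two training-material arms. The following minimal sketch, written for illustration only and using simulated data (all variable names, sample sizes, and coefficient values are assumptions, not the authors' code or data), shows how such models can be estimated by maximum likelihood:

```python
# Illustrative sketch only: simulated discrete-choice data analysed with a
# conditional logit and a heteroskedastic conditional logit (HCL) in which
# the scale differs by training-material arm. Not the authors' code.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_tasks, n_alts, n_attrs = 1000, 3, 5            # choice sets, alternatives, attributes

# Hypothetical attribute levels (e.g. delay, PPV, NPV, infection risk, cost saving)
X = rng.normal(size=(n_tasks, n_alts, n_attrs))
arm = rng.integers(0, 2, size=n_tasks)           # 0 = plain text, 1 = animation (assumed coding)

true_beta = np.array([-0.5, 0.8, 0.6, -0.7, 0.4])
true_gamma = 0.4                                 # animation arm assumed less noisy (higher scale)
scale = np.exp(true_gamma * arm)[:, None]
utility = scale * (X @ true_beta) + rng.gumbel(size=(n_tasks, n_alts))
choice = utility.argmax(axis=1)                  # chosen alternative per choice set

def neg_loglik(params, heteroskedastic):
    """Negative log-likelihood; gamma rescales utility for arm == 1."""
    beta = params[:n_attrs]
    gamma = params[n_attrs] if heteroskedastic else 0.0
    v = (X @ beta) * np.exp(gamma * arm)[:, None]    # scaled systematic utility
    v -= v.max(axis=1, keepdims=True)                # numerical stability
    log_p = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -log_p[np.arange(n_tasks), choice].sum()

cl = minimize(neg_loglik, np.zeros(n_attrs), args=(False,), method="BFGS")
hcl = minimize(neg_loglik, np.zeros(n_attrs + 1), args=(True,), method="BFGS")
print("Conditional logit coefficients:", cl.x.round(2))
print("HCL coefficients and scale term:", hcl.x.round(2))
```

In the heteroskedastic specification, the scale of one arm is normalised to one, so the estimated scale term for the other arm captures relative choice consistency; a significant scale term with stable preference coefficients is the pattern the abstract reports.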
ISSN: 1178-1653; 1178-1661
DOI: 10.1007/s40271-019-00391-w