Systematic prioritization of the STARE-HI reporting items. An application to short conference papers on health informatics evaluation
Published in: Methods of Information in Medicine, 2012, Vol. 51 (2), p. 104-111
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: We previously devised and published STARE-HI, a guideline for reporting health informatics evaluation studies, which is formally endorsed by IMIA and EFMI.
The objectives were to develop a prioritization framework of ranked reporting items to assist authors when reporting health informatics evaluation studies in space-restricted conference papers, and to apply this framework to measure the quality of recent health informatics conference papers on evaluation studies.
We deconstructed the STARE-HI guideline to identify its reporting items. In a web-based survey, we invited 111 authors of health informatics evaluation studies and reviewers and editors of health informatics conference proceedings to score those reporting items on a scale ranging from "0 - not necessary in a conference paper" to "10 - essential in a conference paper". From the responses we derived a mean priority score for each item. All evaluation papers published in the proceedings of MIE2006, Medinfo2007, MIE2008 and AMIA2008 were rated on these items by two reviewers, and from these ratings a priority-adjusted completeness score was computed for each paper (see the sketch after the abstract).
We identified 104 reporting items in the STARE-HI guideline. The response rate for the survey was 59% (66 out of 111). The most important reporting items (mean score ≥9) were "Interpret the data and give an answer to the study question - (in Discussion)", "Whether it is a laboratory, simulation or field study - (in Methods-study design)" and "Description of the outcome measure/evaluation criteria - (in Methods-study design)". Within each reporting area, the statistically significantly more important reporting items were distinguished from the less important ones. Four reporting items had a mean score ≤6. The mean priority-adjusted completeness of evaluation papers from recent health informatics conferences was 48% (range 14-78%).
We produced a ranked list of STARE-HI reporting items according to their prioritized relevance for inclusion in space-limited conference papers. The priority-adjusted completeness scores showed room for improvement in the analyzed conference papers. We believe that this prioritization framework is an aid to improving the quality and utility of conference papers on health informatics evaluation studies.
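The abstract does not spell out how the priority-adjusted completeness score is computed. A minimal sketch in Python, assuming it is the priority-weighted fraction of STARE-HI items that a paper reports; the function name, item identifiers and numeric values below are illustrative and not taken from the study:

    # Assumption: completeness = sum of mean priority scores of the items a paper
    # reports, divided by the sum of mean priority scores of all 104 STARE-HI items.
    def priority_adjusted_completeness(reported_items, priority_scores):
        total = sum(priority_scores.values())
        achieved = sum(score for item, score in priority_scores.items()
                       if item in reported_items)
        return achieved / total if total else 0.0

    # Toy illustration only (values are invented, not from the survey):
    priorities = {"interpret_data": 9.2, "study_type": 9.1,
                  "outcome_measure": 9.0, "funding_statement": 5.8}
    paper_reports = {"interpret_data", "outcome_measure"}
    print(f"{priority_adjusted_completeness(paper_reports, priorities):.0%}")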
ISSN: 0026-1270, 2511-705X
DOI: 10.3414/ME10-01-0072