Authors arbitrarily used methodological approaches to analyze the quality of reporting in research reports: a meta-research study

Bibliographic details
Published in: Journal of Clinical Epidemiology, June 2023, Vol. 158, p. 53-61
Main authors: Plenkovic, Mia; Civljak, Marta; Puljak, Livia
Format: Article
Language: English
Online access: Full text
Description
Abstract: Many authors have used reporting checklists as an assessment tool to analyze the reporting quality of diverse types of evidence. We aimed to analyze the methodological approaches used by researchers assessing the reporting quality of evidence in randomized controlled trials, systematic reviews, and observational studies. We analyzed articles, published up to 18 July 2021, that assessed the reporting quality of evidence with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), CONsolidated Standards of Reporting Trials (CONSORT), or Strengthening the Reporting of Observational studies in Epidemiology (STROBE) checklists. We analyzed the methods used for assessing reporting quality. Among the 356 analyzed articles, 293 (88%) investigated a specific thematic field. The CONSORT checklist (N = 225; 67%) was used most often, whether in its original, modified, or partial form, or as an extension. Numerical scores were given for adherence to checklist items in 252 articles (75%), of which 36 articles (11%) applied various reporting quality thresholds. In 158 articles (47%), predictors of adherence to the reporting checklist were analyzed. The most studied factor associated with adherence to a reporting checklist was the year of article publication (N = 82; 52%). The methodology used for assessing the reporting quality of evidence varied considerably. The research community needs a consensus on a consistent methodology for assessing the quality of reporting.
ISSN: 0895-4356, 1878-5921
DOI: 10.1016/j.jclinepi.2023.03.008