Prospective comparison of live evaluation and video review in the evaluation of operator performance in a pediatric emergency airway simulation


Bibliographic Details
Published in: Journal of graduate medical education, 2012-09, Vol. 4 (3), p. 312-316
Authors: House, Joseph B; Dooley-Hash, Suzanne; Kowalenko, Terry; Sikavitsas, Athina; Seeyave, Desiree M; Younger, John G; Hamstra, Stanley J; Nypaver, Michele M
Format: Article
Language: English
Online access: Full text
Abstract: Real-time assessment of operator performance during procedural simulation is a common practice that requires the undivided attention of 1 or more reviewers, potentially over many repetitions of the same case. The objectives were to determine whether reviewers display better interrater agreement on procedural competency when observing recorded, rather than live, performance, and to develop an assessment tool for pediatric rapid sequence intubation (pRSI). The framework of a previously established Objective Structured Assessment of Technical Skills (OSATS) tool was modified for pRSI. Emergency medicine residents (postgraduate years 1-4) were prospectively enrolled in a pRSI simulation scenario and evaluated by 2 live raters using the modified tool. Sessions were videotaped and reviewed by the same raters at least 4 months later; raters were blinded to their initial ratings. Interrater agreement was determined using the Krippendorff generalized concordance method. Overall interrater agreement was 0.75 (95% confidence interval [CI], 0.72-0.78) for live review and 0.79 (95% CI, 0.73-0.82) for video review. Live review was significantly superior to video review in only 1 of the OSATS domains (Preparation) and was equivalent in the other domains. Intrarater agreement between live and video evaluation was very good, exceeding 0.75 for all raters, with a mean of 0.81 (95% CI, 0.76-0.85). The modified OSATS assessment tool demonstrated some evidence of validity in discriminating among levels of resident experience, along with high interreviewer reliability. With this tool, intrareviewer reliability was high between live review and video review delayed by at least 4 months, which supports the feasibility of delayed video review in resident assessment.
ISSN: 1949-8349, 1949-8357
DOI: 10.4300/JGME-D-11-00123.1