Development and Validation of an Assessment Tool for Competency in Critical Care Ultrasound


Bibliographic Details
Published in: Journal of graduate medical education 2015-12, Vol. 7 (4), p. 567-573
Authors: Patrawalla, Paru; Eisen, Lewis Ari; Shiloh, Ariel; Shah, Brijen J; Savenkov, Oleksandr; Wise, Wendy; Evans, Laura; Mayo, Paul; Szyld, Demian
Format: Article
Language: English
Online access: Full text
Description
Abstract: Point-of-care ultrasound is an emerging technology in critical care medicine. Despite requirements for critical care medicine fellowship programs to demonstrate knowledge and competency in point-of-care ultrasound, tools to guide competency-based training are lacking. We describe the development and validity arguments of a competency assessment tool for critical care ultrasound. A modified Delphi method was used to develop behaviorally anchored checklists for 2 ultrasound applications: "Perform deep venous thrombosis study (DVT)" and "Qualify left ventricular function using parasternal long axis (PSLA) and parasternal short axis (PSSA) views (Echo)." One live rater and 1 video rater evaluated the performance of 28 fellows; a second video rater evaluated a subset of 10 fellows. Validity evidence for content, response process, and internal consistency was assessed. An expert panel finalized the checklists after 2 rounds of the modified Delphi method. The DVT checklist consisted of 13 items, including 1 global rating step (GRS). The Echo checklist consisted of 14 items and included 1 GRS for each of the 2 views. Interrater reliability between the live and video rater, evaluated with Cohen's kappa, was 1.00 for the DVT GRS, 0.44 for the PSLA GRS, and 0.58 for the PSSA GRS. Cronbach's α was 0.85 for DVT and 0.92 for Echo. The findings offer preliminary evidence for the validity of competency assessment tools for 2 applications of critical care ultrasound, as well as data comparing live and video raters.
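The abstract reports two standard reliability statistics: Cohen's kappa (chance-corrected agreement between two raters) and Cronbach's α (internal consistency across checklist items). As a minimal illustration of how these statistics are defined (the function names and sample data below are hypothetical, not taken from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same items."""
    n = len(rater_a)
    # Observed proportion of exact agreement
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal category frequencies
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in set(rater_a) | set(rater_b))
    return (po - pe) / (1 - pe)

def cronbach_alpha(item_scores):
    """item_scores: one list of scores per checklist item, indexed by subject."""
    k = len(item_scores)            # number of items
    n = len(item_scores[0])         # number of subjects
    def var(xs):                    # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in item_scores) / var(totals))

# Two raters in perfect agreement yield kappa = 1.0; agreement at chance level yields 0.0
print(cohens_kappa([1, 0, 1, 1], [1, 0, 1, 1]))  # → 1.0
```

A kappa of 1.00 for the DVT GRS therefore means the live and video raters never disagreed on that item, while 0.44 and 0.58 for the PSLA and PSSA views indicate only moderate agreement beyond chance.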
ISSN: 1949-8349, 1949-8357
DOI:10.4300/JGME-D-14-00613.1