Standardizing evaluation of on-line continuing medical education: Physician knowledge, attitudes, and reflection on practice

Bibliographic Details
Published in: The Journal of Continuing Education in the Health Professions, 2004, Vol. 24(2), pp. 68-75
Main authors: Casebeer, Linda; Kristofco, Robert E.; Strasser, Sheryl; Reilly, Michael; Krishnamoorthy, Periyakaruppan; Rabin, Andrew; Zheng, Shimin; Karp, Simone; Myers, Lloyd
Format: Article
Language: English
Online access: Full text
Description
Abstract: Introduction: Physicians increasingly earn continuing medical education (CME) credits through on-line courses, but there have been few rigorous evaluations to determine their effects. The present study explores the feasibility of implementing standardized evaluation templates and tests them to evaluate 30 on-line CME courses. Methods: A time series design was used to compare the knowledge, attitudes, and reported changes in practice of physician participants who completed any of 30 on-line CME courses hosted on an academic CME Web site and a CME Web portal from August 1, 2002, through March 31, 2003. Data were collected at baseline, at course completion, and 4 weeks later. Paired t tests were used to compare the means of responses across time. Results: U.S. physicians completed 720 post-tests. Quality of content was the characteristic of most importance to participants; too little interaction was the largest source of dissatisfaction. Overall mean knowledge scores increased from 58.1% to 75.6% at post-test and then decreased to 68.2% at 4 weeks following the course. Effect sizes of increased knowledge immediately following the course were larger for case-based than for text-based courses. Nearly all physicians reported making changes in practice following course completion, although reported changes differed from expected changes. Conclusions: Increases in physician knowledge and knowledge retention were demonstrated following participation in on-line CME courses. The implementation of standardized evaluation tests proved to be feasible and allowed longitudinal evaluation analyses across CME providers and content areas.
ISSN: 0894-1912 (print), 1554-558X (electronic)
DOI: 10.1002/chp.1340240203