Examiner fatigue in communication skills objective structured clinical examinations


Bibliographic details
Published in: Medical Education 2001-05, Vol. 35 (5), p. 444-449
Authors: Humphris, G M, Kaney, S
Format: Article
Language: English
Online access: Full text
Abstract
Context: The assessment of undergraduates’ communication skills by means of objective structured clinical examinations (OSCEs) is a demanding task for examiners. Tiredness over the course of an examining session may introduce systematic error. In addition, unsystematic error may also be present which changes over the duration of the OSCE session.
Aim: To determine the strength of some sources of systematic and unsystematic error in the assessment of communication skills over the duration of an examination schedule.
Methods: Undergraduate first‐year medical students completing their initial summative assessment of communication skills (a four‐station OSCE) comprised the study population. Students from three cohorts were included (1996–98 intake). In all 3 years the OSCE was carried out identically. All stations lasted 5 minutes with a simulated patient. Students were assessed using an examiner (content expert) and a simulated‐patient evaluation tool: the Liverpool Communication Skills Assessment Scale (LCSAS) and the Global Simulated‐Patient Rating Scale (GSPRS), respectively. Each student was assigned a time slot ranging from 1 to 24, where 1 denotes that the student entered the examination first and 24 denotes the final slot. The number of students who failed the examination was noted for each of the 24 time slots. A control set of marks from a communication skills written examination was also used to explore a possible link with time slot. Analysis was conducted using graphical display, covariate analysis and logistic regression.
Results: No significant relationship was found between the schedule point at which the student entered the OSCE and their performance. The reliability of the content-expert and simulated‐patient assessments was stable throughout the session.
Conclusion: No evidence could be found that duration of examining in a communication OSCE influenced examiners and the marks they awarded. Checks of this nature are recommended for routine inspection to confirm a lack of bias.
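The kind of check the abstract describes — regressing pass/fail outcome on examination time slot — can be sketched as follows. This is an illustrative sketch only: the data are synthetic, and the gradient-descent fitting routine and all names are assumptions for demonstration, not the authors' actual analysis or data.

```python
# Illustrative sketch: does OSCE time slot (1-24) predict failure?
# Synthetic data generated under the "no fatigue effect" null,
# which is the hypothesis the study could not reject.
import math
import random

def fit_logistic(xs, ys, lr=0.01, epochs=5000):
    """Fit P(fail) = sigmoid(a + b*x) by gradient descent on log-loss."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += p - y          # gradient w.r.t. intercept
            gb += (p - y) * x    # gradient w.r.t. slope
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b

random.seed(0)
# 24 time slots, 10 students per slot; failures drawn independently
# of slot (about 10% base rate). Slots are centred for conditioning.
slots = [s - 12.5 for s in range(1, 25) for _ in range(10)]
fails = [1 if random.random() < 0.1 else 0 for _ in slots]
a, b = fit_logistic(slots, fails)
print(f"slope per slot: {b:.3f}")  # near zero: slot does not predict failure
```

A slope estimate near zero, as here, corresponds to the study's finding that entry position in the examination schedule had no significant relationship with performance.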
ISSN:0308-0110
1365-2923
DOI:10.1046/j.1365-2923.2001.00893.x