Using video-based examiner score comparison and adjustment


Bibliographic details
Published in: BMC medical education, 2023-10, Vol. 23 (1)
Main authors: Yeates, Peter; Maluf, Adriano; Cope, Natalie; McCray, Gareth; McBain, Stuart; Beardow, Dominic; Fuller, Richard; McKinley, Robert Bob
Format: Article
Language: English
Subjects: Clinical competence; Comparative analysis; Evaluation; Grading and marking (Students); Influence; Medical students
Online access: Full text
description Purpose: Ensuring equivalence of examiners' judgements within distributed objective structured clinical exams (OSCEs) is key to both fairness and validity, but is hampered by the lack of cross-over in the performances which different groups of examiners observe. This study develops a novel method called Video-based Examiner Score Comparison and Adjustment (VESCA), using it to compare examiners' scoring from different OSCE sites for the first time.
Materials/methods: Within a summative 16-station OSCE, volunteer students were videoed on each station and all examiners were invited to score station-specific comparator videos in addition to their usual student scoring. The linkage provided through the video scores enabled the use of Many Facet Rasch Modelling (MFRM) to compare (1) examiner-cohort and (2) site effects on students' scores.
Results: Examiner-cohorts varied by 6.9% in the overall score allocated to students of the same ability. Whilst only a tiny difference was apparent between sites, examiner-cohort variability was greater in one site than the other. Adjusting student scores produced a median change in rank position of 6 places (0.48 deciles); however, 26.9% of students changed their rank position by at least 1 decile. By contrast, only 1 student's pass/fail classification was altered by score adjustment.
Conclusions: Whilst comparatively limited examiner participation rates may limit interpretation of score adjustment in this instance, this study demonstrates the feasibility of using VESCA for quality assurance purposes in large-scale distributed OSCEs.
Keywords: OSCE, Assessment, Equivalence, Examiner-Cohorts, Distributed Assessment
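Because the abstract relies on Many Facet Rasch Modelling without stating the model, the following is a minimal sketch of the standard many-facet rating-scale formulation (after Linacre) on which such analyses are typically built; the exact facet structure the authors used (for example, whether examiner-cohort and site enter as separate facets or in separate runs) is an assumption here and is not taken from this record.

% Standard many-facet Rasch (rating-scale) model: the log-odds of student n
% receiving category k rather than k-1 on station i from examiner-cohort j.
\[
  \ln\!\frac{P_{nijk}}{P_{nij(k-1)}} \;=\; \theta_n \;-\; \delta_i \;-\; \gamma_j \;-\; \tau_k
\]
% \theta_n : ability of student n
% \delta_i : difficulty of station i
% \gamma_j : stringency of examiner-cohort j (the facet that VESCA links across
%            otherwise separate examiner groups via the shared comparator videos)
% \tau_k   : threshold between rating categories k-1 and k

In Facets-style analyses, adjusted scores are then usually reported as "fair averages", i.e. each student's expected score with all facets other than ability held at their mean; an adjustment of this kind is what underlies the rank and decile shifts reported in the Results.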
doi 10.1186/s12909-023-04774-4
format Article
identifier ISSN: 1472-6920
ispartof BMC medical education, 2023-10, Vol.23 (1)
issn 1472-6920
eissn 1472-6920
language eng
recordid cdi_gale_infotracmisc_A770475371
source Springer Nature - Complete Springer Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; PubMed Central; PubMed Central Open Access; Springer Nature OA Free Journals
subjects Clinical competence
Comparative analysis
Evaluation
Grading and marking (Students)
Influence
Medical students
title Using video-based examiner score comparison and adjustment