Evaluating procedural skills competence: inter-rater reliability of expert and non-expert observers

To examine the inter-rater reliability of expert and non-expert observers when they used objective structured checklists to evaluate candidates' performances on three simulated medical procedures. Simulations and structured checklists were developed for three medical procedures: endotracheal intubation, application of a forearm cast, and suturing of a simple skin laceration. Groups of two expert and two non-expert observers scored 101 performances of these procedures by 38 medical trainees and practitioners of varying skill levels. Inter-rater reliability was assessed using Pearson correlation coefficients. Inter-rater reliability was good for expert/expert, expert/non-expert, and non-expert/non-expert pairings in all three skills simulations. Both expert and non-expert observers demonstrated good inter-rater reliability when using structured checklists to assess procedural skills. Further study is required to determine whether this conclusion can be extrapolated to other study groups or procedures.
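
The abstract reports that inter-rater reliability was summarized with Pearson correlation coefficients. As a minimal illustration, the Python sketch below computes a Pearson r between two observers' total checklist scores for the same set of performances; the observer roles, score values, and variable names are hypothetical examples and are not data or code from the study.

```python
# Illustrative sketch only (not from the article): summarizing agreement between
# two raters' structured-checklist totals with a Pearson correlation coefficient.
import numpy as np

# Hypothetical total checklist scores assigned by two observers to the same
# eight candidate performances (one value per performance).
expert_scores = np.array([18, 15, 20, 12, 17, 19, 14, 16])
non_expert_scores = np.array([17, 14, 20, 13, 18, 19, 15, 15])

# Pearson r between the two observers' scores; values near 1.0 indicate that
# the raters rank and space the performances similarly.
r = np.corrcoef(expert_scores, non_expert_scores)[0, 1]
print(f"Inter-rater Pearson correlation: r = {r:.2f}")
```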

Bibliographic details
Published in: Academic medicine : journal of the Association of American Medical Colleges, 1999-01, Vol. 74 (1), pp. 76-78
Authors: Bullock, G; Kovacs, G; Macdonald, K; Story, B A
Format: Article
Language: English
Subjects: Clinical Competence; Evaluation Studies as Topic; Humans; Observer Variation; Patient Simulation
ISSN: 1040-2446
DOI: 10.1097/00001888-199901001-00023
PMID: 9934301
Source: MEDLINE; Journals@Ovid LWW Legacy Archive; Alma/SFX Local Collection
Online access: Full text