Interrater Reliability in Content Analysis of Healthcare Service Quality Using Montreal's Conceptual Framework


Bibliographic Details
Published in: Canadian Journal of Program Evaluation, 2009-10, Vol. 24 (2), p. 81-102
Main Authors: Leclerc, Bernard-Simon; Dassa, Clement
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: This study examines the usefulness of the Montreal Service Concept framework of service quality measurement when used as a predefined set of codes in content analysis of patients' responses. The study also quantifies the interrater agreement of the coded data. Two raters independently reviewed each of the responses from a mail survey of ambulatory patients about the quality of care and recorded whether or not a patient expressed each concern. Interrater agreement was measured in three ways: percent crude agreement, Cohen's kappa, and the generalizability theory coefficient. We found all levels of interrater code-specific agreement to be over 96%. All kappa values were above 0.80, except for four codes associated with rarely observed characteristics. A coefficient of generalizability equal to 0.93 was obtained. All indices consistently revealed substantial agreement. We empirically showed that the content categories of the Montreal Service Concept were exhaustive and reliable in a well-defined content-analysis procedure. (Contains 1 figure and 3 tables.)
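To illustrate two of the three agreement indices named in the abstract, here is a minimal sketch computing percent crude agreement and Cohen's kappa for two raters' binary present/absent codes. The rater data below are hypothetical, invented for illustration only; they are not taken from the study, and the generalizability coefficient (which requires a variance-components model) is not shown.

```python
def percent_agreement(r1, r2):
    """Proportion of items on which the two raters record the same code."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)  # observed agreement
    # Expected chance agreement from each rater's marginal coding rates.
    p1 = sum(r1) / n  # rater 1's rate of coding a concern as present
    p2 = sum(r2) / n  # rater 2's rate
    p_e = p1 * p2 + (1 - p1) * (1 - p2)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes (1 = concern expressed, 0 = not) for ten responses.
rater1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
rater2 = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]

print(percent_agreement(rater1, rater2))  # 0.9
print(cohens_kappa(rater1, rater2))       # 0.8
```

Note the motivation for reporting kappa alongside crude agreement, as the study does: for rarely observed codes, two raters can agree on nearly every item simply by both coding "absent", so chance-corrected kappa can be low even when percent agreement is high.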
ISSN:0834-1516
1496-7308
DOI:10.3138/cjpe.24.004