Validation of ultrasound examinations performed by general practitioners

Bibliographic details
Published in: Scandinavian journal of primary health care, 2017-09, Vol. 35 (3), p. 256-261
Authors: Lindgaard, Karsten; Riisgaard, Lars
Format: Article
Language: English
Online access: Full text
Description
Abstract: Objective: The aim of this study was to evaluate diagnostic agreement when a general practitioner and subsequently a specialist (radiologist/gynecologist) performed point-of-care ultrasound examinations for certain abdominal and gynecological conditions of low to moderate complexity. Design: A prospective study of inter-rater reliability and agreement. Setting: Patients were recruited and initially scanned in general practice; the validation examinations were conducted in a hospital setting. Subjects: A convenience sample of 114 patients presenting with abdominal pain or discomfort, possible pregnancy, or known risk factors for abdominal aortic aneurysm was included. Main outcome measures: Inter-rater agreement (Kappa statistic and percentage agreement) between ultrasound examinations by general practitioner and specialist for the following conditions: gallstones, ascites, abdominal aorta >5 cm, intrauterine pregnancy, and gestational age. Results: An overall Kappa value of 0.93 (95% confidence interval (CI): 0.87-0.98) was obtained. Ascites, abdominal aortic diameter >5 cm, and intrauterine pregnancy showed Kappa values of 1. Conclusion: Our study showed that general practitioners performing point-of-care ultrasound examinations of low-to-moderate complexity had a very high rate of inter-rater agreement with specialists.
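The two agreement measures reported above, percentage agreement and Cohen's Kappa (agreement corrected for chance), can be sketched in plain Python. The paired ratings below are invented for illustration only and are not data from the study:

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of paired ratings that match exactly."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: (observed - expected) / (1 - expected) agreement."""
    n = len(a)
    po = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    # Expected agreement if the raters labeled independently
    # at their observed marginal rates.
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical ratings: 1 = finding present, 0 = absent
gp         = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
specialist = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]
print(percent_agreement(gp, specialist))  # 0.9
print(cohens_kappa(gp, specialist))       # 0.8
```

Kappa is lower than raw agreement because some matches would occur by chance alone; a value of 0.93, as in the study, indicates near-perfect chance-corrected agreement on conventional benchmarks.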
ISSN: 0281-3432, 1502-7724
DOI:10.1080/02813432.2017.1358437