Assessing the impact of clinical information-retrieval technology in a family practice residency

Bibliographic Details
Published in: Journal of Evaluation in Clinical Practice, 2005-12, Vol. 11 (6), p. 576-586
Authors: Grad, Roland M., Pluye, Pierre, Meng, Yuejing, Segal, Bernard, Tamblyn, Robyn
Format: Article
Language: English
Online access: Full text
Description
Abstract:
Rationale and objective: Evidence-based sources of information do not integrate self-assessment tools to assess the impact of a user's search for clinical information. We present a method to evaluate evidence-based sources of information by systematically assessing the impact of searches for clinical information in everyday practice.
Methods: We integrated an information management tool (InfoRetriever 2003) with an educational intervention in a cohort of 26 family medicine residents. An electronic impact assessment scale was used by these doctors to report the perceived impact of each item of information (each hit) retrieved on a hand-held computer. We compared the types of impact associated with hits in two distinct categories: clinical decision support systems (CDSS) vs. clinical information-retrieval technology (CIRT). Information hits in CDSS were defined as any hit in the following InfoRetriever databases: Clinical Prediction Rules, History and Physical Exam diagnostic calculator and Diagnostic Test calculator. CIRT information hits were defined as any hit in: abstracts of Cochrane Reviews, InfoPOEMs, evidence-based practice guideline summaries and the Griffith's 5 Minute Clinical Consult.
Results: The impact assessment questionnaire was linked to 5160 information hits. 4946 impact assessment questionnaires were answered (95.9%), and 2495 contained reports of impact (48.4%). Reports of positive impact on doctors were most frequent in the areas of learning and practice improvement. In comparison to CDSS, CIRT hits were more frequently associated with learning and recall. CDSS hits were more frequently associated with reports of practice improvement.
Conclusions: Our new method permits systematic and comparative assessment of the impact associated with distinct categories of information.
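The figures in the Results follow from classifying each hit by its source database and counting questionnaires over all hits (4946/5160 = 95.9% answered; 2495/5160 = 48.4% with a reported impact). Below is a minimal illustrative sketch of that tabulation, not the authors' software; the per-hit record fields ("source", "answered", "impact_reported") are assumptions introduced here for demonstration.

# Illustrative sketch only; field names and data layout are assumed, not from the paper.
from collections import Counter

# Source databases grouped into the two categories defined in the abstract.
CDSS_SOURCES = {
    "Clinical Prediction Rules",
    "History and Physical Exam diagnostic calculator",
    "Diagnostic Test calculator",
}
CIRT_SOURCES = {
    "Abstracts of Cochrane Reviews",
    "InfoPOEMs",
    "Evidence-based practice guideline summaries",
    "Griffith's 5 Minute Clinical Consult",
}

def categorize(source):
    """Map an InfoRetriever database name to CDSS, CIRT, or other."""
    if source in CDSS_SOURCES:
        return "CDSS"
    if source in CIRT_SOURCES:
        return "CIRT"
    return "other"

def summarize(hits):
    """Compute answered/impact proportions and hit counts per category.

    Each hit is assumed to be a dict like:
    {"source": str, "answered": bool, "impact_reported": bool}
    """
    total = len(hits)
    answered = sum(h["answered"] for h in hits)
    with_impact = sum(h["impact_reported"] for h in hits)
    per_category = Counter(categorize(h["source"]) for h in hits)
    return {
        "total_hits": total,
        "answered_pct": 100 * answered / total,
        "impact_pct": 100 * with_impact / total,
        "hits_per_category": dict(per_category),
    }

# With the abstract's totals (5160 hits, 4946 answered, 2495 with impact),
# summarize() would return answered_pct = 95.9 and impact_pct = 48.4.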
ISSN: 1356-1294, 1365-2753
DOI: 10.1111/j.1365-2753.2005.00594.x