A workstation-integrated peer review quality assurance program: pilot study.

Margaret M O'Keeffe, Todd M Davis, Kerry Siminoski
Author Information
  1. Margaret M O'Keeffe: Department of Radiology and Diagnostic Imaging, University of Alberta, and Medical Imaging Consultants, 11010-101 Street, Edmonton, AB T5H 4B9, Canada.

Abstract

BACKGROUND: Consistency of assessments between radiologists has become the accepted surrogate indicator of radiological excellence, and peer review has become the standard technique for evaluating concordance. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice.
METHODS: Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if the reviewing radiologist and the original interpreting radiologist had not exceeded review targets, the case was scored using the modified RADPEER system.
RESULTS: There were 2,241 cases randomly assigned for peer review. Of the selected cases, 1,705 (76%) were interpreted. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%), and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found the scores appropriate, and 65% felt the feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable.
CONCLUSIONS: The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.
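The concordance and discrepancy rates reported above follow directly from tallying modified RADPEER scores, where scores 0 to 2 count as concordant and scores 3 or 4 as clinically significant discrepancies. A minimal sketch of that tally, assuming this grouping of the score categories (the split of scores 1 versus 2 below is illustrative, not taken from the study):

```python
from collections import Counter

# Modified RADPEER groupings as used in the abstract (assumed):
# 0 = positive feedback, 0-2 = concordant, 3-4 = significant discrepancy.
CONCORDANT = {0, 1, 2}
SIGNIFICANT = {3, 4}

def summarize_reviews(scores):
    """Tally peer-review scores and return rates as percentages."""
    counts = Counter(scores)
    total = len(scores)
    concordant = sum(counts[s] for s in CONCORDANT)
    significant = sum(counts[s] for s in SIGNIFICANT)
    return {
        "total": total,
        "concordance_pct": round(100 * concordant / total, 1),
        "significant_pct": round(100 * significant / total, 1),
    }

# Illustrative distribution for the 1,705 completed reviews: 3 cases
# scored 0 and 10 scored 3-4 (both per the abstract); the remaining
# 1,692 are split between scores 1 and 2 arbitrarily for this example.
scores = [0] * 3 + [1] * 1682 + [2] * 10 + [3] * 7 + [4] * 3
print(summarize_reviews(scores))
```

Running this reproduces the 99.4% concordance and 0.6% significant-discrepancy figures reported in the RESULTS.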


MeSH Terms

Canada
Diagnostic Imaging
Peer Review
Pilot Projects
Professional Competence
Quality Assurance, Health Care
Radiology
Systems Integration
User-Computer Interface
