Standards for
Learning from Discrepancies meetings
Contents

Foreword 3
1. Recommended standards 4
2. Introduction 5
3. Definition of a reporting discrepancy 5
4. Causes of a reporting discrepancy 6
5. Running LDMs 6
   Convener 6
   Case collection 6
   Preparation for the meeting 7
   Conduct of the meeting 7
6. Outcomes of LDMs 8
   Learning 8
   Categorising discrepancies 8
   Clinical significance 8
7. Discrepancies and incident reporting 9
8. Annual radiological discrepancy meeting report 10
References 11
Appendix 1. LDM template 14
Appendix 2. Biases 15
   Sampling bias 15
   Selection bias 15
   Presentation bias 15
   Information bias 15
   Hindsight bias 15
   Outcome bias 15
   Attendance bias 15
   Variation 15
   Commercial bias 15
Foreword
As radiologists, we are constantly striving to improve the standards
of service we provide to patients with a culture of learning, self-reflection
and personal development.
Humans will always make errors and radiologists are no different.
As part of the reporting process, we are constantly having to give an
opinion under conditions of uncertainty. With hindsight, often combined
with additional information, it is inevitable that discrepancies will
sometimes be identified in the original interpretation of a study. It is
important to understand that not all discrepancies are ‘errors’, to
manage them so that harm or potential harm is minimised, and to have a
learning system in place in an attempt to avoid repetition.
Reviewing and learning from discrepancies and adverse events can
provide evidence of reflective practice and, if performed in a supportive
learning environment, can contribute to the evidence for providers and
users of a service as to its safety. Structuring the learning to help identify
contributing factors can also help inform the organisation of potential
trends that can be addressed to mitigate against recurrence and
contribute to the enhancement of patient safety.
The Royal College of Radiologists (RCR) has produced this document
to set standards and give guidance on how shared learning may be
used. It replaces the previously published document Standards for
Radiology Discrepancy Meetings, which has now been withdrawn.
The document emphasises the educational role of the learning from
discrepancies meeting (LDM) and how such meetings should be part of
a radiology quality assurance (QA) programme. The document should
be read alongside the RCR documents Quality Assurance in Radiology
Reporting: Peer Feedback and Cancer multidisciplinary team meetings
– standards for clinical radiologists, Second edition.2,3
Dr Pete Cavanagh
Vice-President, Clinical Radiology
The Royal College of Radiologists
…should make it easy for individuals to submit cases anonymously, so that fear of retribution does not discourage submission.

A secure and anonymous system for case collection may comprise locked boxes situated in appropriate reporting areas, together with short standardised ‘case submission forms’ available (and regularly replenished) next to the collection box. These forms need only list the essential information on the case for the meeting, and the person entering the case should be anonymous. Any case with learning potential (including false-positives) should be entered. Alternatively, electronic methods, such as discrepancy files on picture archiving and communication systems (PACS), may also work well as long as security is maintained.

The convener should make clinicians aware of these meetings so that they can refer cases when appropriate.

As a guide, approximately five to ten cases for a 1–1.5 hour meeting held once a month will usually be sufficient for worthwhile educational discussion. To some extent, the numbers will depend on the size of the radiology department.

It is recommended that departments should schedule at least one meeting per year to review and reflect on the cases published in Radiology Errors and Discrepancies (READ, www.rcr.ac.uk/READ).48

Preparation for the meeting

The convener will need to obtain the images together with the original request details before the meeting so that each case can be presented with the information that was available to the reporter. Clinical follow-up, outcome and/or case notes may also be required.

For maximum efficiency, it may be helpful if the relevant information for each case is entered onto a standardised learning from discrepancy form (Appendix 1). The sections of this form to be completed prior to submission/presentation could include:

• The date of original report
• The imaging modality
• The date and circumstances of detection of reporting discrepancy, for example, MDTM
• Whether the clinician in charge is aware of the discrepancy
• The reason for entering the case.

The LDM form should also include a section for recording the outcomes (as discussed below) which will be completed at the meeting.

To foster shared learning, consideration should be given to the inclusion of ‘near misses’ and ‘good catches’ in the case selection.

Conduct of the meeting

There are various different ways in which meetings may be conducted that can be tailored to local circumstances, but the emphasis on shared learning and maintenance of anonymity during the presentation is essential. The cases should be presented with the information and images that were available at the time of reporting, accepting that it is never possible to recreate the original reporting conditions.

Attendees can commit their opinion on paper without discussion, but cases often contain several facets and this method can be time-consuming. As the process is inevitably artificial, honest, consensus-aimed discussion can be more informative and is more likely to emphasise the learning rather than judgemental aspects if conducted in a non-blaming, anonymous manner. Further clinical information may be required during the discussion, and having the clinical notes to hand may be helpful. All attendees should be encouraged to contribute to the discussion and the convener should prevent the discussion from being dominated by a few individuals.

Standards

• All radiologists should regularly participate in radiology LDMs. Individuals should achieve at least a 50% attendance rate, and the attendance record should be made available to individual radiologists and the clinical director.
• The minimum frequency of meetings should be every two months.
• There should be a formal process for electing a convener for a fixed term (renewable by agreement).
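The standardised case form and the attendance standard described above lend themselves to a simple structured record. The sketch below is purely illustrative and not part of the RCR standard: the names `LDMCase` and `attendance_rate`, the field names, and the example values are all assumptions introduced here to show how the form's pre-submission fields and meeting outcomes might be captured electronically.

```python
# Illustrative sketch only: the guidance describes a paper or PACS-based
# form; this class and these field names are assumptions, not RCR-defined.
from dataclasses import dataclass, field
from typing import List


@dataclass
class LDMCase:
    """One anonymised case submitted for a learning from discrepancies meeting."""
    report_date: str            # date of the original report
    modality: str               # imaging modality
    detection_context: str      # date/circumstances of detection, e.g. at MDTM
    clinician_aware: bool       # is the clinician in charge aware of the discrepancy?
    reason_for_entry: str       # why the case was submitted
    # Completed at the meeting (the 'outcomes' section of the form):
    learning_points: List[str] = field(default_factory=list)


def attendance_rate(meetings_attended: int, meetings_held: int) -> float:
    """Fraction of LDMs attended; the guidance asks for at least 50%."""
    if meetings_held == 0:
        return 0.0
    return meetings_attended / meetings_held


# Hypothetical example case, anonymised as the guidance requires.
case = LDMCase(
    report_date="2014-03-02",
    modality="Chest radiograph",
    detection_context="Detected at MDTM, 2014-04-10",
    clinician_aware=True,
    reason_for_entry="Perceptual miss with learning potential",
)
case.learning_points.append(
    "Review the retrocardiac region on every frontal chest radiograph"
)
```

A department could use such records to produce the attendance figures and case summaries that the standards ask to be made available to radiologists and the clinical director; the data structure itself is a local implementation choice.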
6. Outcomes of LDMs

The main outcome of the LDM is learning. However, radiologists within the meeting have a professional duty of care to ensure that the discrepancy being discussed has not been the cause of patient harm. The General Medical Council (GMC) guidance Raising and acting on concerns about patient safety (2012) sets out the expectation that all doctors will, whatever their role, take appropriate action to raise and act on concerns about patient care, dignity and safety.49

Learning

The convener should guide the meeting to an agreed learning point. After the meeting a summary of each case should be prepared along with the learning point(s). This could take the form of a single PowerPoint slide which can be easily made available to all radiologists to encourage reflection and learning.

Confidential feedback to the person who reported the original investigation, using a standardised feedback form, is required even if the individual does not work in that hospital (for example, teleradiologists, rotating specialist registrars, locums, reporting radiographers and so on). The standard feedback form should include a short summary of the discussion at the discrepancy meeting, including the agreed learning point(s). This will allow individual reflection which can be used as evidence of audited practice in the radiologist’s appraisal/revalidation portfolio.50

All discrepancies discussed should be considered for submission to READ to promote national shared learning.48

Categorising discrepancies

The main purpose of categorising the cases discussed is to help direct learning towards understanding the likely cause of the discrepancy, to assess the need for clinical action, and to help the reporter and/or clinical director to identify trends that may need to be addressed to mitigate against it happening again. A structured learning matrix is included in the LDM form in Appendix 1.

LDMs are important for learning rather than individual performance assessment. Adopting a grading or scoring system to decide if an error has occurred is unreliable and subjective as there is often poor agreement between scorers.5–8 Simply allocating a score to a discrepancy is of questionable value as it is unlikely, on its own, to lead to a specific outcome or action.

A scoring culture can fuel a blame culture with less collective learning from discrepancies/near misses. This has adverse risks and consequences for patients, team working and service improvement.4,8,9,14,51–61

Clinical significance

Judging the clinical significance for the patient of a false-negative (or false-positive) imaging report is much more problematic. Potential impact and actual impact on patient management are different, may depend on the clinical setting of the report, and are often difficult to judge at a radiology LDM. The situation where clinical management is solely determined by the imaging report is very different from the setting where many different clinical factors (including other forms of imaging) feed into the clinical decision-making process.

If an error has significantly adversely affected their care, patients have a right to this information. However, communication with the patient must be undertaken in a sensitive manner following discussions between the radiologist and the clinical team. There must be no fraudulent concealment.

Standards

• There should be a formal process for recording the outcome of LDMs. This should include:
   – Consensus-aimed discussion of each case
   – Learning points and action points where appropriate
   – Whether the clinician in charge of the patient is aware of the discrepancy.
• A summary of all cases discussed should be available to all radiologists in the department.
References

1. The Mid Staffordshire NHS Foundation Trust Public Inquiry. Chaired by Robert Francis QC. Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry. London: The Stationery Office, 2013.
2. The Royal College of Radiologists. Quality assurance in radiology reporting: peer feedback. London: The Royal College of Radiologists, 2014.
3. The Royal College of Radiologists. Cancer multidisciplinary team meetings – standards for clinical radiologists, Second edition. London: The Royal College of Radiologists, 2014.
4. Prowse SJ, Pinkey B, Etherington B. Discrepancies in discrepancies meetings. Results of the UK National Discrepancy Meeting Survey. Clin Radiol 2014; 69(1): 18–22.
5. Mucci B, Murray H, Downie A, Osborne K. Inter-rater variation in scoring radiological discrepancies. Br J Radiol 2013; 86(1028): 20130245.
6. Bender LC, Linnau KF, Meier EN, Anzai Y, Gunn ML. Interrater agreement in the evaluation of discrepant imaging findings with the Radpeer system. AJR Am J Roentgenol 2012; 199(6): 1320–1327.
7. Semelka RC, Ryan AF, Yonkers S, Braga L. Objective determination of standard of care: use of blind readings by external radiologists. AJR Am J Roentgenol 2010; 195(2): 429–431.
8. Larson DB, Nance JJ. Rethinking peer review: what aviation can teach radiology about performance improvement. Radiology 2011; 259(3): 626–632.
9. National Advisory Group on the Safety of Patients in England. A promise to learn – a commitment to act. Improving the safety of patients in England. London: NHS England, 2013.
10. Iyer RS, Swanson JO, Otto RK, Weinberger E. Peer review comments augment diagnostic error characterization and departmental quality assurance: 1-year experience from a children’s hospital. AJR Am J Roentgenol 2013; 200(1): 132–137.
11. McCoubrie P, FitzGerald R. Commentary on discrepancies in discrepancy meetings. Clin Radiol 2014; 69(1): 11–12.
12. The Royal College of Radiologists. Personal reflection on discrepancies and adverse events. London: The Royal College of Radiologists, 2010.
13. FitzGerald R. Error in radiology. Clin Radiol 2001; 56(12): 938–946.
14. FitzGerald R. Radiological error: analysis, standard setting, targeted instruction and teamworking. Eur Radiol 2005; 15(8): 1760–1767.
15. Berlin L. Accuracy of diagnostic procedures; has it improved over the past five decades? AJR Am J Roentgenol 2007; 188(5): 1173–1178.
16. Berlin L. Radiological errors and malpractice: a blurry distinction. AJR Am J Roentgenol 2007; 189(3): 517–522.
17. Kim YW, Mansfield LT. Fool me twice: Delayed diagnosis in radiology with emphasis on perpetuated errors. AJR Am J Roentgenol 2014; 202(4): 465–470.
18. Jones DN, Thomas MJ, Mandel CJ et al. Where failures occur in the imaging care cycle: lessons from the Radiology Events Register. J Am Coll Radiol 2010; 7(8): 593–602.
19. Lee CS, Nagy PG, Weaver SJ, Newman-Toker DE. Cognitive and system factors contributing to diagnostic errors in radiology. AJR Am J Roentgenol 2013; 201(3): 611–617.
20. Forman HP, Larson DB, Kazerooni EA et al. Masters of radiology panel discussion: hyperefficient radiology – can we maintain the pace? AJR Am J Roentgenol 2012; 199(4): 838–843.
21. The Royal College of Radiologists. Standards for the communication of critical, urgent and unexpected significant radiological findings, second edition. London: The Royal College of Radiologists, 2012.
22. Dalla Palma L, Stacul F, Meduri S, Geitung JT. Relationships between radiologists and clinicians; results from three surveys. Clin Radiol 2000; 55(8): 602–605.
23. Quekel LG, Kessels AG, Goei R, van Engelshoven JM. Miss rate of lung cancer on the chest radiograph in clinical practice. Chest 1999; 115(3): 720–724.
24. Shimal AK, Jones R, Vohrah A. Reporting discrepancies in chest radiographs at Coventry Hospitals, UK: a pictorial review of six years experience. European Congress of Radiology, Vienna, 2006 (Poster C-166).
25. Armato SG 3rd, Roberts RY, Kocherginsky M et al. Assessment of radiologist performance in the detection of lung nodules: Dependence on the definition of ‘truth’. Acad Radiol 2009; 16(1): 28–29.
26. Lim KY, Kligerman SJ, Lin CT, White CS. Missed pulmonary embolism on abdominal CT. AJR Am J Roentgenol 2014; 202(3): 738–743.
27. Abujudeh HH, Hani H, Boland G et al. Abdominal and pelvic computed tomography (CT): discrepancy rates among experienced radiologists. Eur Radiol 2010; 20(8): 1952–1957.
28. Horton KM, Johnson PT, Fishman EK. MDCT of the abdomen. Common misdiagnoses at a busy academic center. AJR Am J Roentgenol 2010; 194(3): 660–667.
29. Levine CD, Aizenstein O, Lehavi O, Blachar A. Why we miss the diagnosis of appendicitis on abdominal CT: Evaluation of imaging features of appendicitis incorrectly diagnosed on CT. AJR Am J Roentgenol 2005; 184(3): 855–859.
30. Blackmore CC, Terasawa T. Optimizing the interpretation of CT for appendicitis: modelling health utilities for clinical practice. J Am Coll Radiol 2006; 3(2): 115–121.
31. Gangi S, Fletcher JG, Nathan MA et al. Time interval between abnormalities seen on CT and the clinical diagnosis of pancreatic cancer: retrospective review of CT scans obtained before diagnosis. AJR Am J Roentgenol 2004; 182(4): 897–903.
32. Briggs RH, Rowbotham E, Johnstone AL, Chalmers AG. Provisional reporting of polytrauma CT by on-call radiology registrars. Is it safe? Clin Radiol 2010; 65(8): 616–622.
33. Zan E, Yousem DM, Carone M, Lewin JS. Second-opinion consultations in neuroradiology. Radiology 2010; 255(1): 135–141.
34. Jordan YJ, Jordan JE, Lightfoote JB, Ragland KD. Quality outcomes of reinterpretation of brain CT studies by subspeciality experts in stroke imaging. AJR Am J Roentgenol 2012; 199(6): 1365–1370.
35. Erly WK, Ashdown BC, Lucio RW 2nd, Carmody RF, Seeger JF, Alcala JN. Evaluation of emergency CT scans of the head: is there a community standard? AJR Am J Roentgenol 2003; 180(6): 1727–1730.
36. Jarvik JG, Deyo RA. Moderate versus mediocre: The reliability of spine MRI data interpretations. Radiology 2009; 250(1): 15–17.
37. Modic MT, Obuchowski NA, Ross JS et al. Acute low back pain and radiculopathy: MR imaging findings and their prognostic role and effect on outcome. Radiology 2005; 237(2): 597–604.
38. van Rijn JC, Klemetsö N, Reitsma JB et al. Observer variation in MRI evaluation of patients suspected of lumbar disk herniation. AJR Am J Roentgenol 2005; 184(1): 299–303.
39. Offiah AC, Moon L, Hall CM, Todd-Pokropek A. Diagnostic accuracy of fracture detection in suspected non-accidental injury: the effect of edge enhancement and digital display on observer performance. Clin Radiol 2006; 61(2): 163–173.
40. Krief OP, Huguet D. Shoulder pain and disability: comparison with MR findings. AJR Am J Roentgenol 2006; 186(5): 1234–1239.
41. Zanetti M, Pfirrmann CW, Schmid MR, Romero J, Seifert B, Hodler J. Patients with suspected meniscal tears: prevalence of abnormalities seen on MRI of 100 symptomatic and 100 contralateral asymptomatic knees. AJR Am J Roentgenol 2003; 181(3): 635–641.
42. McCreadie G, Oliver TB. Eight CT lessons that we learned the hard way: an analysis of current patterns of radiological error and discrepancy with particular emphasis on CT. Clin Radiol 2009; 64(5): 491–499.
43. Smidt N, Rutjes AW, van der Windt DA et al. Quality of reporting of diagnostic accuracy studies. Radiology 2005; 235(2): 347–353.
44. Shewhart WA. Economic control of quality of manufactured product. New York: D Van Nostrand Company, 1931. (Reprinted by ASQC Quality Press, 1980).
45. Soffa DJ, Lewis RS, Sunshine JH, Bhargavan M. Disagreement in interpretation: a method for the development of benchmarks for quality assurance in imaging. J Am Coll Radiol 2004; 1(3): 212–217.
46. Borgstede JP, Lewis RS, Bhargavan M, Sunshine JH. RADPEER quality assurance program: A multifacility study of interpretative disagreement rates. J Am Coll Radiol 2004; 1(1): 59–65.
47. Wu MZ, McInnes MDF, Macdonald DB, Kielar AZ, Duigenan S. CT in adults: Systematic review and meta-analysis of interpretation discrepancy rates. Radiology 2014; 270(3): 717–735.
48. www.rcr.ac.uk/READ (last accessed 21/08/2014)
49. General Medical Council. Raising and acting on concerns about patient safety. London: General Medical Council, 2012.
50. The Royal College of Radiologists. Personal reflection on discrepancies and adverse events. London: The Royal College of Radiologists, 2010.
51. Hussain S, Hussain JS, Karam A, Vijayaraghavan G. Focused peer review: the end game of peer review. J Am Coll Radiol 2012; 9(6): 430–433.
52. Eisenberg RL, Cunningham ML, Siewert B, Kruskal JB. Survey of faculty perceptions regarding a peer review system. J Am Coll Radiol 2014; 11(4): 397–401.
53. Alkasab TK, Harvey HB, Gowda V, Thrall JH, Rosenthal DI, Gazelle GS. Consensus-oriented group peer review: a new process to review radiologist work output. J Am Coll Radiol 2014; 11(2): 131–138.
Case to be discussed:
Name:    DOB:    Ref no:
Imaging studies:
Date of study:
Discrepancy to be discussed:
The Royal College of Radiologists. Standards for Learning from Discrepancies meetings. London: The Royal College of Radiologists, 2014.

Ref No. BFCR(14)11
© The Royal College of Radiologists, October 2014.

For permission to reproduce any of the content contained herein, please email: permissions@rcr.ac.uk

This material has been produced by The Royal College of Radiologists (RCR) for use internally within the specialties of clinical oncology and clinical radiology in the United Kingdom. It is provided for use by appropriately qualified professionals, and the making of any decision regarding the applicability and suitability of the material in any particular circumstance is subject to the user’s professional judgement.