www.rcr.ac.uk

Standards for
Learning from Discrepancies meetings

Faculty of Clinical Radiology



Contents

Foreword
1. Recommended standards
2. Introduction
3. Definition of a reporting discrepancy
4. Causes of a reporting discrepancy
5. Running LDMs
   Convener
   Case collection
   Preparation for the meeting
   Conduct of the meeting
6. Outcomes of LDMs
   Learning
   Categorising discrepancies
   Clinical significance
7. Discrepancies and incident reporting
8. Bi-annual radiological LDM report
References
Appendix 1. LDM template
Appendix 2. Biases
   Sampling bias
   Selection bias
   Presentation bias
   Information bias
   Hindsight bias
   Outcome bias
   Attendance bias
   Variation
   Commercial bias

RCR Standards

The RCR, a registered charity, exists to advance the science and practice of radiology and oncology. It undertakes to produce standards documents to provide guidance to radiologists and others involved in the delivery of radiological services with the aim of defining good practice, advancing the practice of radiology and improving the service for the benefit of patients.

The standards documents cover a wide range of topics. All have undergone an extensive consultation process to ensure a broad consensus, underpinned by published evidence, where applicable. Each is subject to review three years after publication or earlier, if appropriate.

The standards are not regulations governing practice, but attempt to define the aspects of radiological services and care which promote the provision of a high-quality service to patients.

Specific cancer standards are issued separately by the Department of Health, the Welsh Assembly Government, the Scottish Executive, and the Northern Ireland Government (Appendix 1). These RCR standards will therefore need to be interpreted in the light of separate standards issued by the separate national governments of the United Kingdom.

The RCR has committed to reviewing all relevant publications in line with the recommendations of the Francis report and, where appropriate, applying the category of standard defined by Francis (fundamental, enhanced or developmental).1 This document contains standards that fall within the enhanced category.

Current standards documents

Standards for radiofrequency ablation (RFA), Second edition
Standards for patient confidentiality and PACS and RIS
Standards for the communication of critical, urgent and unexpected significant radiological findings, Second edition
Standards for patient consent particular to radiology, Second edition
Standards of practice and guidance for trauma radiology in severely injured patients
Standards and recommendations for the reporting and interpretation of imaging investigations by non-radiologist medically qualified practitioners and teleradiologists
Standards for the NPSA and RCR safety checklist for radiological interventions
Standards for Self-assessment of Performance
Standards for the Reporting and Interpretation of Imaging investigations
Standards for Ultrasound Equipment
Standards for intravascular contrast agent administration to adult patients, Second edition
Standards for providing a 24-hour diagnostic radiology service
Standards for providing a 24-hour interventional radiology service
Standards for the provision of teleradiology within the United Kingdom
Standards for the recording of second opinions or reviews in radiology departments
Standards for a results acknowledgement system

Foreword
As radiologists, we are constantly striving to improve the standards
of service we provide to patients with a culture of learning, self-reflection
and personal development.
Humans will always make errors and radiologists are no different.
As part of the reporting process, we are constantly having to give an
opinion under conditions of uncertainty. With hindsight, often combined
with additional information, it is inevitable that discrepancies will be
acknowledged in the original interpretation of a study. It is important
to understand that not all discrepancies are ‘errors’, to manage them
so that harm or potential harm is minimised, and to have a learning
system in place in an attempt to avoid repetition.
Reviewing and learning from discrepancies and adverse events can
provide evidence of reflective practice and, if performed in a supportive
learning environment, can contribute to the evidence for providers and
users of a service as to its safety. Structuring the learning to help identify
contributing factors can also help inform the organisation of potential
trends that can be addressed to mitigate against recurrence and
contribute to the enhancement of patient safety.
The Royal College of Radiologists (RCR) has produced this document
to set standards and give guidance on how shared learning may be
used. It replaces the previously published document Standards for
Radiology Discrepancy Meetings, which has now been withdrawn.
The document emphasises the educational role of the learning from
discrepancies meeting (LDM) and how such meetings should be part of
a radiology quality assurance (QA) programme. The document should
be read alongside the RCR documents Quality Assurance in Radiology
Reporting: Peer Feedback and Cancer multidisciplinary team meetings
– standards for clinical radiologists, Second edition.2,3

Dr Pete Cavanagh
Vice-President, Clinical Radiology
The Royal College of Radiologists

1. Recommended standards

Standard 1
All radiologists should regularly participate in radiology LDMs. Individuals should achieve at least a 50% attendance rate, and the attendance record should be made available to individual radiologists and the clinical director.

Standard 2
The minimum frequency of meetings should be every two months.

Standard 3
There should be a formal process for recording the outcome of LDMs. This should include:
• Consensus-aimed discussion of each case
• Learning points and action points where appropriate
• Whether the clinician in charge of the patient is aware of the discrepancy.

Standard 4
A summary of all cases discussed should be available to all radiologists in the department.

Standard 5
There should be a formal process for confidential feedback.

Standard 6
The convener should produce a formal bi-annual report documenting key learning and action points, including any recurrent patterns of error, to demonstrate a departmental process for learning from mistakes.

Standard 7
There should be a formal process for electing a convener for a fixed term (renewable by agreement).

2. Introduction

Since the publication of Standards for Radiology Discrepancy Meetings by the RCR in 2007, regular discrepancy meetings have been almost universally adopted by radiology departments in the UK.4

The RCR recognises that learning is the main outcome following review and discussion of reporting discrepancies, and it is recommended that the title of the meetings should be changed to learning from discrepancies meetings (LDMs).

Whereas, in the past, scoring has been a feature of these meetings, this is no longer considered valid.5–11 A greater emphasis on understanding error to improve radiologist performance is encouraged through the categorisation of discrepancies.

The LDM plays a crucial role in clinical governance. Alongside other inter-related processes, and as part of a QA programme, the LDM will facilitate an improvement in the quality of service provided, and is an important source of shared learning, significantly contributing to patient safety.

Attendance at the LDM and personal reflection on discrepancies are both categories of evidence which form part of an enhanced appraisal portfolio for revalidation.12

Every radiologist has a duty of candour as defined in the Francis report.1 Reporting and learning from discrepancies cannot be undertaken in isolation from the concept of patient harm, and the LDM must be integrated into the standard of practice for all individuals who provide reports on diagnostic images. The key principles should be:

• To accept that discrepancies will occur
• To mitigate against discrepancies through QA programmes
• To have processes in place to minimise any potential patient harm
• To have systems in place for shared learning from discrepancies within a blame-free culture.

3. Definition of a reporting discrepancy

A reporting discrepancy occurs when a retrospective review, or subsequent information about patient outcome, leads to an opinion different from that expressed in the original report.

4. Causes of a reporting discrepancy

It is well recognised that radiology discrepancies occur.13–17 Causes can be usefully categorised as individual or system related.18–21

Radiologist-specific causes include:

• Cognitive: the finding was appreciated but attributed to the wrong cause. This may be due to a lack of knowledge
• Perceptual:
  – Observational: the finding is identifiable but was missed
  – Satisfaction of search: detection of one abnormality on a study results in premature termination of the search, allowing for the possibility of missing other related or unrelated abnormalities
• Ambiguity of wording or summary of report.

System-related causes include:

• Inadequate, misleading or incorrect clinical information: the clinical diagnosis has been shown to change in 50% of cases following communication between the clinician and the radiologist22
• Poor imaging technique
• Excessive workload or poor working conditions.

There are no objective benchmarks for acceptable levels of observation, interpretation or ambiguity discrepancies. There is published literature with radiological reporting discrepancy rates varying from 3–30%. Case-mix, selection bias, imaging modality and inter- and intra-observer variability render standard setting very difficult.5–7,14,23–47

5. Running LDMs

There is no prescriptive way of running the LDM. A successful meeting will, however, make a significant contribution to patient safety by:

• Focusing on shared learning
• Encouraging constructive discussion of contributing factors
• Producing a consensus on structured learning outcomes, learning points and follow-up actions
• Recognising professional responsibilities to consider the potential for patient harm.

Convener

The success of the meetings will depend, to a large extent, on the convener(s), who should be elected by, and have the confidence of, their peers. For some departments it may be more suitable to have two conveners, and other departments may prefer a rota to encourage team working and load sharing and to minimise bias.

The convener(s) should have time in their job plan to prepare for the meeting, summarise and distribute the outcomes and submit a bi-annual report to the clinical management team.

The convener(s) will have specific key roles to ensure a successful meeting.

• The convener must avoid a blame culture at all costs, and always stress the shared learning aspects of the meetings. LDMs must not be abused or seen as an opportunity for harassment or bullying.
• The convener will need to maintain the anonymity of both the person who entered the case for discussion and also the person who issued the imaging report in question.
• The convener must encourage constructive discussion involving as many of the attendees as possible, and summarise the learning points of each case.
• The convener must remain impartial and prevent any one person from dominating the meeting by specifically asking for the opinions of other attendees. Everyone is entitled to an opinion, and honest, consensus-aimed discussion is vital when trying to ascertain if a discrepancy is actually an error.

Case collection

Identifying discrepancies usually occurs in one of three ways:

• Systematic review as part of the department QA programme
• Double reporting/second look at multidisciplinary team meetings (MDTMs). This is a particularly rich source of learning material as the radiologist will have access to the full range of clinical information including outcomes of other investigations
• Ad hoc when undertaking first reporting through review of previous films.

Case collection will always be prone to sampling bias since it is not possible to collect absolutely every discrepancy, minor difference of opinion or unexpected outcome that occurs. A robust method of case collection is, however, an essential prerequisite for successful meetings and is the responsibility of the convener. The method chosen

should make it easy for individuals to submit cases anonymously, so that fear of retribution does not discourage submission.

A secure and anonymous system for case collection may comprise locked boxes situated in appropriate reporting areas, together with short standardised ‘case submission forms’ available (and regularly replenished) next to the collection box. These forms need only list the essential information on the case for the meeting, and the person entering the case should be anonymous. Any case with learning potential (including false-positives) should be entered. Alternatively, electronic methods, such as discrepancy files on picture archiving and communication systems (PACS), may also work well as long as security is maintained.

The convener should make clinicians aware of these meetings so that they can refer cases when appropriate.

As a guide, approximately five to ten cases for a 1–1.5 hour meeting held once a month will usually be sufficient for worthwhile educational discussion. To some extent, the numbers will depend on the size of the radiology department.

It is recommended that departments should schedule at least one meeting per year to review and reflect on the cases published in Radiology Errors and Discrepancies (READ, www.rcr.ac.uk/READ).48

Preparation for the meeting

The convener will need to obtain the images together with the original request details before the meeting so that each case can be presented with the information that was available to the reporter. Clinical follow-up, outcome and/or case notes may also be required.

For maximum efficiency, it may be helpful if the relevant information for each case is entered onto a standardised learning from discrepancy form (Appendix 1). The sections of this form to be completed prior to submission/presentation could include:

• The date of original report
• The imaging modality
• The date and circumstances of detection of reporting discrepancy, for example, MDTM
• Whether the clinician in charge is aware of the discrepancy
• The reason for entering the case.

The LDM form should also include a section for recording the outcomes (as discussed below) which will be completed at the meeting.

To foster shared learning, consideration should be given to the inclusion of ‘near misses’ and ‘good catches’ in the case selection.

Conduct of the meeting

There are various ways in which meetings may be conducted that can be tailored to local circumstances, but the emphasis on shared learning and maintenance of anonymity during the presentation is essential. The cases should be presented with the information and images that were available at the time of reporting, accepting that it is never possible to recreate the original reporting conditions.

Attendees can commit their opinion on paper without discussion, but cases often contain several facets and this method can be time-consuming. As the process is inevitably artificial, honest, consensus-aimed discussion can be more informative and is more likely to emphasise the learning rather than judgemental aspects if conducted in a non-blaming, anonymous manner. Further clinical information may be required during the discussion, and having the clinical notes to hand may be helpful. All attendees should be encouraged to contribute to the discussion and the convener should prevent the discussion from being dominated by a few individuals.

Standards

All radiologists should regularly participate in radiology LDMs. Individuals should achieve at least a 50% attendance rate, and the attendance record should be made available to individual radiologists and the clinical director.

The minimum frequency of meetings should be every two months.

There should be a formal process for electing a convener for a fixed term (renewable by agreement).
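The submission fields listed under ‘Preparation for the meeting’ map naturally onto a structured record. A minimal sketch in Python follows; the class and field names are illustrative only, not part of the standard, and departments should adapt them to their own template (Appendix 1).

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LDMCaseSubmission:
    """One entry on a standardised LDM case submission form.

    Field names are illustrative; the actual template is in Appendix 1.
    """
    original_report_date: date       # the date of the original report
    modality: str                    # the imaging modality, e.g. "CT"
    detection_date: date             # when the discrepancy was detected
    detection_circumstances: str     # circumstances of detection, e.g. "MDTM"
    clinician_aware: bool            # is the clinician in charge aware?
    reason_for_entry: str            # the reason for entering the case
    # Outcomes section, completed at the meeting rather than on submission
    agreed_learning_points: list[str] = field(default_factory=list)

case = LDMCaseSubmission(
    original_report_date=date(2014, 3, 1),
    modality="CT",
    detection_date=date(2014, 3, 20),
    detection_circumstances="MDTM",
    clinician_aware=True,
    reason_for_entry="Case with learning potential",
)
```

Keeping the outcomes section separate from the submission fields mirrors the form's two stages: anonymous entry before the meeting, agreed learning points recorded at it.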

6. Outcomes of LDMs

The main outcome of the LDM is learning. However, radiologists within the meeting have a professional duty of care to ensure that the discrepancy being discussed has not been the cause of patient harm. The General Medical Council (GMC) guidance Raising and acting on concerns about patient safety (2012) sets out the expectation that all doctors will, whatever their role, take appropriate action to raise and act on concerns about patient care, dignity and safety.49

Learning

The convener should guide the meeting to an agreed learning point. After the meeting a summary of each case should be prepared along with the learning point(s). This could take the form of a single PowerPoint slide which can be easily made available to all radiologists to encourage reflection and learning.

Confidential feedback to the person who reported the original investigation using a standardised feedback form is required, even if the individual does not work in that hospital (for example, teleradiologists, rotating specialist registrars, locums, reporting radiographers and so on). The standard feedback form should include a short summary of the discussion at the discrepancy meeting including the agreed learning point(s). This will allow individual reflection which can be used as evidence of audited practice in the radiologist’s appraisal/revalidation portfolio.50

All discrepancies discussed should be considered for submission to READ to promote national shared learning.48

Categorising discrepancies

The main purpose of categorising the cases discussed is to help direct learning towards understanding the likely cause of the discrepancy, to assess the need for clinical action and to help the reporter and/or clinical director identify trends that may need to be addressed to mitigate against it happening again. A structured learning matrix is included in the LDM form in Appendix 1.

LDMs are important for learning rather than individual performance assessment. Adopting a grading or scoring system to decide if an error has occurred is unreliable and subjective as there is often poor agreement between scorers.5–8 Simply allocating a score to a discrepancy is of questionable value as it is unlikely, on its own, to lead to a specific outcome or action.

A scoring culture can fuel a blame culture with less collective learning from discrepancies/near misses. This has adverse risks and consequences for patients, team working and service improvement.4,8,9,14,51–61

Clinical significance

Judging the clinical significance for the patient of a false-negative (or false-positive) imaging report is much more problematic. Potential impact and actual impact on patient management are different, may depend on the clinical setting of the report and are often difficult to judge at a radiology LDM. The situation where clinical management is solely determined by the imaging report is very different from the setting where many different clinical factors (including other forms of imaging) feed into the clinical decision-making process. If an error has significantly adversely affected their care, patients have a right to this information. However, communication with the patient must be undertaken in a sensitive manner following discussions between the radiologist and the clinical team. There must be no fraudulent concealment.

Standards

There should be a formal process for recording the outcome of LDMs. This should include:
• Consensus-aimed discussion of each case
• Learning points and action points where appropriate
• Whether the clinician in charge of the patient is aware of the discrepancy.

A summary of all cases discussed should be available to all radiologists in the department.
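To surface the recurrent trends that categorisation is meant to reveal, categorised cases can be tallied over a reporting period. A hedged sketch follows, using cause categories drawn from section 4; the structured learning matrix in Appendix 1 may use a different taxonomy, and the recurrence threshold is illustrative, not one set by the RCR.

```python
from collections import Counter

# Cause categories drawn from section 4 of this document; the structured
# learning matrix in Appendix 1 may use a different taxonomy.
CATEGORIES = {
    "cognitive", "observational", "satisfaction_of_search",
    "ambiguous_wording", "clinical_information", "technique", "workload",
}

def recurrent_patterns(case_categories, threshold=3):
    """Tally categorised discrepancies for a period and return the
    categories recurring at least `threshold` times (illustrative cut-off)."""
    counts = Counter(case_categories)
    unknown = set(counts) - CATEGORIES
    if unknown:
        raise ValueError(f"uncategorised labels: {sorted(unknown)}")
    return {cat: n for cat, n in counts.items() if n >= threshold}

cases = ["observational", "cognitive", "observational",
         "workload", "observational", "technique"]
print(recurrent_patterns(cases))  # {'observational': 3}
```

A tally of this kind could feed directly into the bi-annual report described in section 8.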

7. Discrepancies and incident reporting

Incident reporting is a well-established national tool to encourage learning. Every radiologist has a duty of candour. This is defined by Robert Francis as, ‘The volunteering of all relevant information to persons who have or may have been harmed by the provision of services, whether or not the information has been requested and whether or not a complaint or a report about that provision has been made’.1 It is recognised that one of the barriers to reporting incidents is the lack of clear definitions, leading to confusion as to what to report and when.

In the majority of cases discussed at the LDM, the discrepancy will already be known to the clinical team looking after the patient. Discussion between the radiologist and clinician will establish whether the discrepancy constitutes an incident. It should be decided after such discussion whether the responsibility lies with the radiologist or the clinician to escalate this through the organisation’s incident reporting system.

Discrepancies submitted to the LDM which are not known to the clinical team can be assessed by the radiologists with a view to deciding whether the nature of the discrepancy is likely to constitute a risk to patient care. Where there is uncertainty, further discussion with the referring clinician should take place (Figure 1).

Figure 1. Discrepancy recognised

Is the clinician in charge of the patient aware of the discrepancy?
• Yes → Is the discrepancy of clinical significance?
  – No → Forward to LDM for shared learning
  – Yes/possible/not sure → Consider incident report
• No → Is the discrepancy likely to be important?
  – Yes → Discuss with referring clinician
  – No → Forward to LDM for shared learning
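The decision flow in Figure 1 can be sketched as a small function. This is an illustrative reading of the flowchart, not part of the standard; the parameter names and outcome strings are assumptions introduced here.

```python
def triage_discrepancy(clinician_aware, clinically_significant=None,
                       likely_important=None):
    """Illustrative sketch of the Figure 1 decision flow.

    clinically_significant -- "yes", "possible", "not sure" or "no";
        relevant only when the clinician in charge is already aware.
    likely_important -- True/False; relevant only when the clinician
        is not aware of the discrepancy.
    """
    if clinician_aware:
        if clinically_significant == "no":
            return "forward to LDM for shared learning"
        # covers "yes", "possible" and "not sure"
        return "consider incident report"
    if likely_important:
        return "discuss with referring clinician"
    return "forward to LDM for shared learning"

print(triage_discrepancy(True, clinically_significant="possible"))
# consider incident report
```

Whichever branch is taken, the text above makes clear that uncertain cases warrant further discussion with the referring clinician rather than a silent decision.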

8. Bi-annual radiological LDM report

Recurrent patterns of discrepancies may only become apparent when reviewing the cases discussed during the year. Consequently, the production of an anonymised bi-annual LDM report is an important way of discovering recurrent department-wide discrepancies and alerting colleagues to be particularly vigilant for these sources of error. Through the use of such reports, important changes in practice can be achieved, including addressing issues such as standardisation of technique, equipment and training requirements.

It is important to bear in mind, when reviewing the bi-annual report, that the cases submitted to, and decisions made at, LDMs may be subject to a variety of biases, as outlined in Appendix 2.

The bi-annual report should go to all radiologists attending the LDM and also to the clinical director. It should also feed into the trust’s clinical governance process.

The percentage attendance record at LDMs during the year for department members should be made available to individuals as it is important for appraisal and revalidation purposes.

Standards

The convener should produce a formal bi-annual report documenting key learning and action points, including any recurrent patterns in discrepancies, to demonstrate a departmental process for learning from mistakes.
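The percentage attendance record is simple to compute. A minimal sketch follows; the 50% threshold comes from Standard 1 and the six-meeting floor from Standard 2, while the radiologist names and counts are purely illustrative.

```python
def attendance_rate(attended, held):
    """Percentage of LDMs attended by one radiologist over the period."""
    if held <= 0:
        raise ValueError("no meetings held in the period")
    return round(100.0 * attended / held, 1)

# Standard 1 asks for at least 50% attendance; with the Standard 2 minimum
# of one meeting every two months, six meetings a year is the floor.
# The names and counts below are illustrative.
meetings_held = 6
attended = {"radiologist_A": 5, "radiologist_B": 2}
report = {name: attendance_rate(n, meetings_held) for name, n in attended.items()}
below_standard = [name for name, pct in report.items() if pct < 50.0]
print(report)          # {'radiologist_A': 83.3, 'radiologist_B': 33.3}
print(below_standard)  # ['radiologist_B']
```

A record in this form can be handed to individuals for their appraisal and revalidation portfolios, as the section above requires.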

References 10. Iyer RS, Swanson JO, Otto RK, 20. Forman HP, Larson DB, Kazerooni EA
Weinberger E. Peer review comments et al. Masters of radiology panel
1. The Mid Staffordshire NHS augment diagnostic error discussion: hyperefficient radiology –
Foundation Trust Public Inquiry. characterization and departmental can we maintain the pace? AJR Am J
Chaired by Robert Francis QC. Report quality assurance: 1-year experience Roentgenol 2012; 199(4): 838–843.
of the Mid Staffordshire NHS from a children’s hospital. AJR Am J
Foundation Trust Public Inquiry. 21. The Royal College of Radiologists.
Roentgenol 2013; 200(1): 132–137.
London: The Stationery Office, 2013. Standards for the communication of
11. McCoubrie P, FitzGerald R. critical, urgent and unexpected
2. The Royal College of Radiologists. Commentary on discrepancies in significant radiological findings,
Quality assurance in radiology discrepancy meetings. Clin Radiol second edition. London: The Royal
reporting: peer feedback. London: The 2014; 69(1): 11–12. College of Radiologists, 2012.
Royal College of Radiologists, 2014.
12. The Royal College of Radiologists. 22. Dalla Palma L, Stacul F, Meduri S,
3. The Royal College of Radiologists. Personal reflection on discrepancies Geitung JT. Relationships between
Cancer multidisciplinary team and adverse events. London: The radiologists and clinicians; results from
meetings – standards for clinical Royal College of Radiologists, 2010. three surveys. Clin Radiol 2000; 55(8):
radiologists, Second edition. London: 602–605.
The Royal College of Radiologists, 13. FitzGerald R. Error in radiology.
2014. Clin Radiol 2001; 56(12): 938–946. 23. Quekel LG, Kessels AG, Goei R, van
Engelshoven JM. Miss rate of lung
4. Prowse SJ, Pinkey B, Etherington B. 14. FitzGerald R. Radiological error:
cancer on the chest radiograph in
Discrepancies in discrepancies analysis, standard setting, targeted
clinical practice. Chest 1999; 115(3):
meetings. Results of the UK National instruction and teamworking.
720–724.
Discrepancy Meeting Survey. Eur Radiol 2005; 15(8): 1760–1767.
Clin Radiol 2014; 69(1): 18–22. 24. Shimal AK, Jones R, Vohrah A.
15. Berlin L. Accuracy of diagnostic
Reporting discrepancies in chest
5. Mucci B, Murray H, Downie A, procedures; has it improved over
radiographs at Coventry Hospitals,
Osborne K. Inter-rater variation in the past five decades? AJR Am J
UK: a pictorial review of six years
scoring radiological discrepancies. Roentgenol 2007; 188(5): 1173–1178.
experience. European Congress of
Br J Radiol 2013; 86(1028): 20130245. 16. Berlin L. Radiological errors and Radiology, Vienna, 2006 (Poster C-166).
6. Bender LC, Linnau KF, Meier EN, Anzai malpractice: a blurry distinction.
25. Armato SG 3rd, Roberts RY,
Y, Gunn ML. Interrater agreement in AJR Am J Roentgenol 2007; 189(3):
Kocherginsky M et al. Assessment
the evaluation of discrepant imaging 517–522.
of radiologist performance in the
findings with the Radpeer system. 17. Kim YW, Mansfield LT. Fool me twice: detection of lung nodules:
AJR Am J Roentgenol 2012; 199(6): Delayed diagnosis in radiology with Dependence on the definition of
1320–1327. emphasis on perpetuated errors. ‘truth’. Acad Radiol 2009; 16(1): 28–29.
7. Semelka RC, Ryan AF, Yonkers S, AJR Am J Roentgenol 2014; 202(4):
26. Lim KY, Kligerman SJ, Lin CT,
Braga L. Objective determination of 465–470.
White CS. Missed pulmonary
standard of care: use of blind readings 18. Jones DN, Thomas MJ, Mandel CJ embolism on abdominal CT. AJR Am
by external radiologists. AJR Am J et al. Where failures occur in the J Roentgenol 2014; 202(3): 738–743.
Roentgenol 2010; 195(2): 429–431. imaging care cycle: lessons from the
27. Abujudeh HH, Hani H, Boland G et al.
8. Larson DB, Nance JJ. Rethinking peer Radiology Events Register. J Am Coll
Abdominal and pelvic computed
review: what aviation can teach Radiol 2010; 7(8): 593–602.
tomography (CT): discrepancy rates
radiology about performance 19. Lee CS, Nagy PG, Weaver SJ, among experienced radiologists.
improvement.Radiology 2011; 259(3): Newman-Toker DE. Cognitive and Eur Radiol 2010; 20(8): 1952–1957.
626–632. system factors contributing to
28. Horton KM, Johnson PT, Fishman EK.
9. National Advisory Group on diagnostic errors in radiology.
MDCT of the abdomen. Common
the Safety of Patients in AJR Am J Roentgenol 2013; 201(3):
misdiagnoses at a busy academic
England. A promise to learn – 611–617.
center. AJR Am J Roentgenol 2010;
a commitment to act. Improving the 194(3): 660–667.
safety of patients in England. London:
NHS England, 2013.
Standards for Learning from
12 www.rcr.ac.uk Discrepancies meetings

29. Levine CD, Aizenstein O, Lehavi O, 37. Modic MT, Obuchowski NA, Ross JS 45. Soffa DJ, Lewis RS, Sunshine JH,
Blachar A. Why we miss the diagnosis et al. Acute low back pain and Bhargavan M. Disagreement in
of appendicitis on abdominal CT: radiculopathy: MR imaging findings interpretation: a method for the
Evaluation of imaging features of and their prognostic role and effect on development of benchmarks for
appendicitis incorrectly diagnosed on outcome. Radiology 2005; 237(2): quality assurance in imaging. J Am
CT. AJR Am J Roentgenol 2005; 184(3): 597–604. Coll Radiol 2004; 1(3): 212–217.
855–859.
38. van Rijn JC, Klemetsö N, Reitsma JB 46. Borgstede JP, Lewis RS, Bhargavan M,
30. Blackmore CC, Terasawa T. Optimizing et al. Observer variation in MRI Sunshine JH. RADPEER quality
the interpretation of CT for evaluation of patients suspected of assurance program: A multifacility
appendicitis: modelling health utilities lumbar disk herniation. AJR Am J study of interpretative disagreement
for clinical practice. J Am Coll Radiol Roentgenol 2005; 184(1): 299–303. rates. J Am Coll Radiol 2004; 1(1):
2006; 3(2): 115–121. 59–65.
39. Offiah AC, Moon L, Hall CM,
31. Gangi S, Fletcher JG, Nathan MA et al. Todd-Pokropek A. Diagnostic 47. Wu MZ, McInnes MDF, Macdonald
Time interval between abnormalities accuracy of fracture detection in DB, Kielar AZ, Duigenan S. CT in
seen on CT and the clinical diagnosis suspected non-accidental injury: the adults: Systematic review and
of pancreatic cancer: retrospective effect of edge enhancement and meta-analysis of interpretation
review of CT scans obtained before digital display on observer discrepancy rates. Radiology 2014;
diagnosis. AJR Am J Roentgenol 2004; 182(4): 897–903.

performance. Clin Radiol 2006; 61(2): 163–173.

270(3): 717–735.

32. Briggs RH, Rowbotham E, Johnstone AL, Chalmers AG. Provisional reporting of polytrauma CT by on-call radiology registrars. Is it safe? Clin Radiol 2010; 65(8): 616–622.

33. Zan E, Yousem DM, Carone M, Lewin JS. Second-opinion consultations in neuroradiology. Radiology 2010; 255(1): 135–141.

34. Jordan YJ, Jordan JE, Lightfoote JB, Ragland KD. Quality outcomes of reinterpretation of brain CT studies by subspeciality experts in stroke imaging. AJR Am J Roentgenol 2012; 199(6): 1365–1370.

35. Erly WK, Ashdown BC, Lucio RW 2nd, Carmody RF, Seeger JF, Alcala JN. Evaluation of emergency CT scans of the head: is there a community standard? AJR Am J Roentgenol 2003; 180(6): 1727–1730.

36. Jarvik JG, Deyo RA. Moderate versus mediocre: the reliability of spine MRI data interpretations. Radiology 2009; 250(1): 15–17.

40. Krief OP, Huguet D. Shoulder pain and disability: comparison with MR findings. AJR Am J Roentgenol 2006; 186(5): 1234–1239.

41. Zanetti M, Pfirrmann CW, Schmid MR, Romero J, Seifert B, Hodler J. Patients with suspected meniscal tears: prevalence of abnormalities seen on MRI of 100 symptomatic and 100 contralateral asymptomatic knees. AJR Am J Roentgenol 2003; 181(3): 635–641.

42. McCreadie G, Oliver TB. Eight CT lessons that we learned the hard way: an analysis of current patterns of radiological error and discrepancy with particular emphasis on CT. Clin Radiol 2009; 64(5): 491–499.

43. Smidt N, Rutjes AW, van der Windt DA et al. Quality of reporting of diagnostic accuracy studies. Radiology 2005; 235(2): 347–353.

44. Shewhart WA. Economic control of quality of manufactured product. New York: D Van Nostrand Company, 1931. (Reprinted by ASQC Quality Press, 1980.)

48. www.rcr.ac.uk/READ (last accessed 21/08/2014)

49. General Medical Council. Raising and acting on concerns about patient safety. London: General Medical Council, 2012.

50. The Royal College of Radiologists. Personal reflection on discrepancies and adverse events. London: The Royal College of Radiologists, 2010.

51. Hussain S, Hussain JS, Karam A, Vijayaraghavan G. Focused peer review: the end game of peer review. J Am Coll Radiol 2012; 9(6): 430–433.

52. Eisenberg RL, Cunningham ML, Siewert B, Kruskal JB. Survey of faculty perceptions regarding a peer review system. J Am Coll Radiol 2014; 11(4): 397–401.

53. Alkasab TK, Harvey HB, Gowda V, Thrall JH, Rosenthal DI, Gazelle GS. Consensus-oriented group peer review: a new process to review radiologist work output. J Am Coll Radiol 2014; 11(2): 131–138.
54. Gerada C, Jones R, Wessely A. Young female doctors, mental health and the NHS working environment. London: British Medical Journal Careers, 2014.

55. Esmail A. The prejudices of good people. BMJ 2004; 328(7454): 1448–1449.

56. Rosenthal MM. Promise and reality: professional self-regulation and problem colleagues. In: Lens P, van der Wal G (eds). Problem doctors: a conspiracy of silence. Amsterdam: IOS Press, 1997.

57. FitzGerald R. Audit, GMC and Serious Professional Misconduct. Newsletter of The Royal College of Radiologists 2005; 83: 11–12.

58. Ullström S, Andreen Sachs M, Hansson J, Ovretveit J, Brommels M. Suffering in silence: a qualitative study of second victims of adverse events. BMJ Qual Saf 2014; 23(4): 325–331.

59. No authors listed. After Mid Staffs: the NHS must do more to care for the health of its staff. BMJ 2013; 346: f1503.

60. Dyer C. GMC and vulnerable doctors: too blunt an instrument. BMJ 2013; 347: f6230.

61. Gunderman R, Chan S. Knowledge sharing in radiology. Radiology 2003; 229(2): 314–317.

62. Berlin L. Hindsight bias. AJR Am J Roentgenol 2000; 175(3): 597–601.

63. Berlin L. Outcome bias. AJR Am J Roentgenol 2004; 183(3): 557–560.
Appendix 1. Learning from discrepancies template

Case to be discussed:
Name: DOB: Ref no:

Imaging studies:

Date of study:

Discrepancy to be discussed:

Clinical information provided at the time of request:

Is the clinical team aware of the discrepancy? Yes No

Assessment of discrepancy: learning and outcome

Discrepancy: Yes / No

Reporting discrepancy: Perceptual / Cognitive

System discrepancy: Effective communication / Clinical information / Poor imaging/patient factors / Working conditions

Agreed learning points:

Agreed outcome/further action:



Appendix 2. Biases

Sampling bias

It is not possible to uncover all radiology discrepancies, and meetings will review only a percentage of them.4 This sampling bias means that LDMs cannot be used to derive error rates for individual radiologists.

Selection bias

Selection bias can arise in different ways. If only one radiologist interprets a particular type of examination, there is potential for their discrepancies to remain undiscovered. Ultrasound discrepancies also tend to be under-represented in LDMs compared with more easily demonstrated plain film, CT and MR images. If two radiologists have identical accuracy but one reports far more examinations than the other, the discrepancies of the more productive radiologist are more available for selection. It is also feasible that some may be reluctant to enter a discrepancy of their own or of their close colleagues, yet have a lower threshold for entering apparent discrepancies of a colleague with whom there is friction.52,55,56

Presentation bias

Presentation bias is difficult to avoid, as it is frequently necessary to select or focus the review: unabridged reviews of large image data sets would be lengthy, cumbersome and tedious, and would impact adversely on the learning process.

Information bias

Information bias may be minimised by giving only the clinical information that was available at the time of reporting.

Hindsight bias

Hindsight bias is an inevitable result of the fact that the review of cases takes place in the setting of an LDM rather than the setting in which the original report was issued.62

Outcome bias

There is a recognised tendency to attribute blame more readily when the clinical outcome is serious. This may be reduced by withholding information on the subsequent clinical course of the patient when coming to a consensus decision on the degree of error.63

Attendance bias

Poor attendance at meetings may result in an inability to reach a reasoned consensus on whether a discrepancy has occurred, and on its severity, because of the lack of a critical mass of individuals who carry out the same type of work.

Variation

All processes are subject to variation in performance over time; this is referred to as common cause variation. Sometimes that variation is greater than expected, suggesting that there is a specific cause for performance falling outside the usual range. This is referred to as special cause variation. When identified, this should lead to all steps in the process being examined to see whether a specific cause for the exceptionally poor (or good) performance can be pinpointed, allowing appropriate action to be taken.

In summary, variation cannot be eliminated, and the important difference between common cause and special cause variation needs to be recognised. As common cause variation is inherent in a process, its reduction can only be brought about by fundamental changes to the process itself. In contrast, special cause variation is due to factors which are extraneous to the process. Efforts to reduce special cause variation need to identify such factors so that they can be addressed without radically altering the whole process.44

Commercial bias

Commercial bias arises when the perception of commercial gain or loss for a group or company in competition distorts the fairness of review.
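The distinction between common cause and special cause variation can be sketched numerically with a Shewhart-style individuals chart, in the spirit of Shewhart's original work:44 points falling outside control limits of the process mean ± 3 sigma (with sigma estimated from the average moving range) are candidates for a special cause. This is a minimal illustrative sketch only; the monthly discrepancy counts and function names below are invented for the example, not drawn from this document.

```python
# Illustrative sketch of a Shewhart-style individuals chart check.
# The monthly figures below are invented examples, not data from the document.

def control_limits(values, k=3.0):
    """Control limits at mean +/- k*sigma, with sigma estimated from the
    average moving range (divided by the d2 constant 1.128 for n=2)."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean, mean - k * sigma, mean + k * sigma

def special_cause_points(values, k=3.0):
    """Indices of points outside the control limits: candidates for a
    special (assignable) cause; everything inside is common cause noise."""
    _, lower, upper = control_limits(values, k)
    return [i for i, v in enumerate(values) if v < lower or v > upper]

# Hypothetical monthly discrepancy counts for a department
monthly_counts = [4, 5, 3, 6, 4, 5, 4, 17, 5, 4]
print(special_cause_points(monthly_counts))  # → [7]: the outlying month
```

On these invented figures, the month with 17 discrepancies falls above the upper limit and would prompt a search for an extraneous cause; the remaining scatter lies within the limits and reflects the common cause variation inherent in the process.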
Citation details

The Royal College of Radiologists. Standards for Learning from Discrepancies meetings. London: The Royal College of Radiologists, 2014.

Ref No. BFCR(14)11
© The Royal College of Radiologists, October 2014.

For permission to reproduce any of the content contained herein, please email: permissions@rcr.ac.uk

This material has been produced by The Royal College of Radiologists (RCR) for use internally within the specialties of clinical oncology and clinical radiology in the United Kingdom. It is provided for use by appropriately qualified professionals, and the making of any decision regarding the applicability and suitability of the material in any particular circumstance is subject to the user’s professional judgement.

While every reasonable care has been taken to ensure the accuracy of the material, RCR cannot accept any responsibility for any action taken, or not taken, on the basis of it. As publisher, RCR shall not be liable to any person for any loss or damage, which may arise from the use of any of the material. The RCR does not exclude or limit liability for death or personal injury to the extent only that the same arises as a result of the negligence of RCR, its employees, Officers, members and Fellows, or any other person contributing to the formulation of the material.

The Royal College of Radiologists
63 Lincoln’s Inn Fields, London WC2A 3JW
Tel: +44 (0)20 7405 1282
Email: enquiries@rcr.ac.uk
www.rcr.ac.uk

A Charity registered with the Charity Commission No. 211540