
COMMENTARIES

The Institute of Medicine report on medical error: Overview and implications for pharmacy

LINDA T. KOHN

Am J Health-Syst Pharm. 2001; 58:63-6

In October 1996, a tragic medication error involving a newborn occurred.1 Shortly after the birth of a healthy baby boy, the hospital's staff learned that the mother had previously had syphilis. There were some language problems, and it was not certain whether the mother had been treated. A physician wrote an order for penicillin G benzathine 150,000 units to be administered intramuscularly to the infant for possible congenital syphilis. The medication was administered by two nurses intravenously, and the baby died.
Multiple details complicate this case. The pharmacist dispensed a 10-fold overdose (1.5 million units) but did not catch the error because she was not a pediatrics pharmacist and penicillin G benzathine was a nonformulary drug. When the physician wrote the order for 150,000 units, the abbreviation "U" was used in place of the word "units." The pharmacist misread the dosage in written drug references and in the prescription and dispensed the overdose. Because the amount of medication that came from the pharmacy would have required five intramuscular injections for the baby, the nurses looked for a way to avoid multiple injections and the attendant discomfort to the baby. They changed the route of administration after not finding any specific warnings against doing so in the references they checked.


The two prefilled 1.2-million-unit syringes dispensed by the pharmacy had been labeled "I.M. use only" by the manufacturer. Unfortunately, the warning could not be seen because it was not prominently placed on the syringe; the syringe had to be rotated 180° away from the drug name for the warning to be read. Also, once the prefilled syringe was assembled for use, the plunger partially obscured the warning and the "M." in "I.M.," so it might have been viewed as saying "I.V. use only."
There is no simple answer to the
question of what caused this tragic
error. The nurses played a role, as did
the pharmacist, the prescribing physician, and the manufacturer. Even the hospital's administration played a role in not having in place automated order-entry systems to avoid handwritten prescriptions and to flag prescriptions that exceeded the maximum dose. Staff members lacked the tools they needed to do their jobs safely.
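The kind of automated dose-range check the hospital lacked can be sketched in a few lines. The 150,000-unit order and the 1.5-million-unit dispensed amount come from the case narrative; the limit table, function name, and interface below are purely illustrative assumptions, not clinical guidance or a description of any real order-entry product.

```python
# Illustrative sketch of a dose-range check an automated order-entry
# system might run. The ceiling value is hypothetical, taken only from
# the ordered dose in the case narrative.
MAX_DOSE_UNITS = {
    "penicillin G benzathine (neonate)": 150_000,  # assumed ceiling for this sketch
}

def screen_order(drug: str, ordered_units: int) -> bool:
    """Return True if the order passes; False if it should be flagged."""
    limit = MAX_DOSE_UNITS.get(drug)
    if limit is None:
        # Unknown or nonformulary drug: flag for pharmacist review.
        return False
    return ordered_units <= limit

drug = "penicillin G benzathine (neonate)"
assert screen_order(drug, 150_000) is True        # the written order passes
assert screen_order(drug, 1_500_000) is False     # the dispensed 10-fold overdose is flagged
```

Even this trivial check would have interrupted the chain of events at the dispensing step, which is the point the case makes: safety comes from tools built into the system, not from exhorting individuals.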
The written analysis of this case
found over 50 things that went wrong.1

LINDA T. KOHN is Senior Program Officer, Division of Healthcare Services, Institute of Medicine, 2101 Constitution Avenue, NW, FO3113, Washington, DC 20418.
Presented at the ASHP Annual Meeting, Philadelphia, PA, June 5, 2000.

Each of those 50 items contributed in part to the error. No single action or single person was responsible; it was many things. If only one of them had been caught, the chain of events that led to the baby's death might have been broken. Nobody wanted the death to happen, but it did.
System theory
Medical errors occur because of
failures in the medical system. To reduce the chance of an error, we need
to make the system safer. Often when
an error happens, the first response is
to blame the person who made the
mistake. For example, if a medication
error occurs, the nurse who administered the medication gets blamed. But
the nurse is just the last link in a chain
of events that gets the medication to a
patient. The chain also includes prescribing, dispensing, and monitoring.
And each one of those parts of the process has many subparts.
It does not help to admonish people to be more careful. Most health
professionals are already careful; they
are trained to be careful. Errors happen because of the convergence of a
number of fault points embedded in
the system. Blaming the last person in
the chain of events is not going to
solve much because things may have
started going wrong long before. The
real cause of the error is latent; it is
embedded in the system. To understand what went wrong, we have to
dig deeper and go further back into
the process of care.
Copyright © 2001, American Society of Health-System Pharmacists, Inc. All rights reserved. 1079-2082/01/0101-0063$06.00.

What fault points are embedded in health care systems? Examples are flaws in scheduling, training, team functioning, communication, equipment, information, procedures, hand-offs, and management structures that add steps and layers and make things more complicated.
System theory tells us that it is not
any one event that produces an error
but rather the culmination and interaction of multiple events. The key is
the interdependencies within the system. Each component of an airplane
can work perfectly, but it is how all
the parts work together that determines whether the airplane will fly.
The problem is not good or bad
practitioners; the problem is that
we are human. Until the results of the
Human Genome Project can be applied and we can build a better human being, we are going to have to
build better systems. That is where
the improvements must be made.
Quality of Health Care
in America Project
The Institute of Medicine (IOM)
report To Err Is Human: Building a
Safer Health System2 is the first in a
series of reports to be produced as a
part of a larger undertaking called the
Quality of Health Care in America
Project. This project was begun in
1998 to identify strategies for achieving a threshold improvement in quality over the next 10 years.
The project was begun in the area
of patient safety for several reasons.
First, errors are responsible for an
immense burden of patient injury,
suffering, and death. Second, whereas
some aspects of quality are difficult to
communicate about, errors are readily
understandable to both the general
public and providers. Third, we know
from research that many errors are
avoidable. Finally, there is a sizable
body of knowledge and successful experience in other industries that can be
drawn upon in tackling the issue.
Scope of the problem
It has been estimated that at least 44,000 people die each year as a result of medical errors. That is more than the number of people who die from breast cancer, AIDS, or motor vehicle accidents.
Adverse events are injuries due to
medical treatment. Not all adverse
events are caused by errors; some result from differences among patients
and their responses to treatment.
However, about half of all adverse
events are believed to be preventable.
Errors are very costly. The annual
cost to this nation of preventable adverse events has been estimated at
$17 billion, of which about half consists of direct health care costs. This
counts only the measurable costs and
does not reflect the loss of trust
among patients or of morale among
health professionals.
Several studies suggest that the
most common type of error involves
medications. That is probably because drug therapy is the most common medical intervention. An estimated 2% of people admitted to hospitals are subjected to preventable
adverse drug events, which adds
some $4700 to the cost of hospitalizing each of those patients.
Definitions
One of the first steps in preparing the IOM report was to define some basic terms, such as "error" and "safety." In the report an error is defined as the failure of a planned action to be completed as intended (i.e., error in execution) or the use of a wrong plan to achieve an aim (i.e., error in planning). An error in execution, for example, would be if a physician meant to write "1 mg" on a prescription and instead wrote "10 mg." The correct medication was chosen for the patient's condition, but the action did not proceed as intended. However, if a physician selected the wrong drug because the diagnosis was wrong, that would be an error in planning. Either way, you do not end up where you want to be. Not all adverse events are due to errors.
Some adverse events will still occur, and patients may still experience bad outcomes. But as processes of care are improved and made more reliable, adverse events resulting from errors should be reduced.
Safety is a broader concept than error. In the IOM report, safety was defined as "freedom from accidental injury." This definition takes a more consumer-oriented perspective. It is believed that, from the consumer's perspective, the highest-priority safety goal is freedom from accidental injury.
A four-part plan to reduce errors
in health care
The IOM report contains a four-part plan for reducing errors in
health care. The first part is to provide leadership and a research focus
on patient safety through the establishment of a center for patient safety.
This undertaking could be modeled
on efforts in other industries. For example, aviation and occupational
health place a heavy emphasis on
safety and have strong research and
development activities related to
safety. The second part of the plan is
to develop reporting mechanisms for
learning about and from errors.
Gathering information about errors
is central to understanding them and
improving performance. The third
part is to establish safety-related performance standards that are explicit
and known by everyone who provides health care or uses it. Such standards can help set expectations for
safety by communicating what is important. The fourth part is ensuring
that health care organizations implement safety improvements within
themselves. The bottom line of all the
initiatives is to improve how health
care is delivered.
Part 1: A center for patient safety.
The IOM report recommends creating a center for patient safety at the
federal Agency for Healthcare Research and Quality. The center
should set national goals for patient
safety, track progress in meeting
those goals, issue an annual report to
the president and Congress on patient safety, and foster an understanding of errors in health care by developing a research agenda, funding centers of excellence, evaluating methods for identifying and preventing errors, and communicating the findings.
Although this recommendation
received less attention than others,
this is one of the most important recommendations in the report because
it provides a framework for advancing the science of safety in health care
and getting the message out to the
industry and the public.
This recommendation has a couple of implications for pharmacy.
ASHP and other pharmacy organizations should make sure that improved understanding of medication
errors is part of a research agenda
and that resources are allocated for
that. Also, if there is an annual report
to Congress, it is likely that medication errors will be a part of it since
they are believed to be among the
most common types of errors in
health care. Pharmacists should have
input into any goals that may be established for medication safety. Such
goals could be useful benchmarks in pharmacists' own work, but only if the goals (and their measures) are relevant and valid.
Part 2: Error-reporting systems.
How can we learn about and from
errors when they happen? In the context of the IOM report, error-reporting systems refer to the reporting of
errors to a body outside the health
care organization.
Reporting programs can have several purposes. One of the core assets
for improvement in quality is information about errors. Reporting programs provide that information.
Also, reporting programs can provide an incentive to health care organizations to set up good internal systems for detecting and preventing errors. By encouraging all health care
organizations to make a minimum
investment in patient safety, a more
level playing field is created.
Reporting systems can offer some level of assurance to the public that the most serious concerns are being reported and investigated. Reporting systems can communicate to everybody that errors should be identified, analyzed, and understood.
The IOM report recommends a
two-part approach to reporting.
First, there should be a nationwide
program, implemented at the state
level, that requires reporting of the
most serious errors, those that result in serious harm or death. Second, voluntary, confidential reporting systems should be encouraged to focus on errors that do not result in patient harm (also called "near misses") or that cause less serious harm.
The goal is to gather information on
hazards in the system that have the
potential to cause serious harm.
Many people believe that this is one
of the most promising areas for reducing errors.
Voluntary and confidential external reporting systems already exist in
the medication area. Most pharmacists already know about some of
them, and they are there to be used.
Also, reporting systems inside one's own health care organization can be used to create a learning environment and can contribute to the cultural change that says it is acceptable to identify errors.
Part 3: Setting performance standards. The IOM report recommends
setting expectations and standards
for safety that apply to health care
organizations, health professionals,
and drugs and medical devices. In
terms of health care organizations,
the report examines how licensing
and accrediting bodies address safety
issues and how those efforts can be
enhanced. In terms of health care
professionals, the report considers
how licensing bodies and professional associations address safety. Specifically, licensing bodies should find
better and faster ways for identifying
the small number of unsafe practitioners. Professional societies should
pursue activities to highlight patient
safety through their training programs, conferences, and journals and through collaboration with other societies and groups.
The report recommends actions
by FDA in the area of drug product
premarketing and postmarketing.
Specifically, the report discusses the design of packaging and labeling, testing drug names to reduce confusion among sound-alike and look-alike drugs, and needed improvements in postmarketing surveillance.
Part 4: Creating a culture of safety in health care organizations. Health care organizations need to create a culture of safety, just as other industries, like aviation, have done very successfully. The report offers a few examples of what it means to build a culture of safety.
Provide leadership for safety. Yes, safety is everyone's responsibility, but everyone takes his or her responsibilities more seriously when there is a clear message from the top of the organization that safety is important.
This includes making patient safety a
corporate priority and giving specific
assignments for safety oversight. Providing leadership also means devoting budgetary resources to safety improvement.
Design jobs with safety in mind.
There are some basic design principles, such as simplification and standardization, that may improve safety.
For example, if processes can be made simpler, with fewer steps and fewer people, there is usually less chance of an error happening. Processes of care should be reviewed to see what unnecessary steps can be removed.
Another basic design principle is
to design jobs to avoid reliance on
memory. A study by Chassin3 in 1998
looked at the volume of reports on
randomized clinical drug trials that
were published in the peer-reviewed
literature. In 1966 about 100 articles
were published; in 1995 there were
about 10,000. Some 5000 brand-name, generic, and nonprescription
drugs crowd the marketplace today,
with half of these appearing since
1990. For those who lack the world's greatest memory, tools are needed to cope with the overflow of drug information. Computer technology offers solutions, including decision-support and order-entry systems.
Promote team functioning. Medical care is complex. A single patient
receives care from a whole team of
people, even if the people are not in
the same room at the same time. Interdisciplinary training programs are
needed so that people who work together are trained together. The airlines spend a good amount of time looking at crew resource management and how people in the cockpit work together so that each person knows what the others are doing. There are probably some lessons there that can be applied to health care.
Expect the unexpected and design
for recovery. The system should have
built-in procedures governing what
should happen if an error occurs. Errors must be spotted quickly and interventions made either to prevent them from reaching the patient or to deal with the consequences if they do.
Create a learning environment. An environment is needed in the workplace in which it is acceptable to identify and talk about errors. This environment should include mechanisms of continuous feedback, since, although it is important to talk about errors, the talking will stop unless people know that things can change. People need to know that change is possible.
Challenge to pharmacists
I challenge pharmacists to be
agents of change in their own organizations and to extend to others their
knowledge about how to reduce errors. It is incumbent on those with
effective approaches for reducing errors to share them throughout organizations. Also, pharmacists cannot create safe environments for patients by staying within the pharmacy's walls. There are effective approaches to be taken within pharmacy, but pharmacy is only one step in the continuum of care.


While pharmacists can be proud
of the improvements in safety that
have already been achieved through
the efforts of organizations like
ASHP, more is needed. The profession must continue to demonstrate
how systems can be improved to reduce errors today and, if possible, to
eliminate them tomorrow. Pharmacists know the chain of events in ways
that many others do not, and therein
lies their opportunity to effect real
and lasting improvements.
References
1. Smetzer JL, Cohen MR. Lesson from the Denver medication error/criminal negligence case: look beyond blaming individuals. Hosp Pharm. 1998; 33:640-2, 645-6, 654-7.
2. Kohn LT, Corrigan JM, Donaldson MS,
eds. To err is human: building a safer
health system. Washington, DC: National
Academy Press; 1999. Available at
www.nap.edu/catalog/9728.html.
3. Chassin MR. Is health care ready for six
sigma quality? Milbank Q. 1998; 76:565-91.
