
So, What’s on Your Dashboard?

Prepared for the 2007 ICSI/IHI Colloquium
Redesigning for Results: Quantum Leaps in Health Care

Prepared and Presented by
Robert Lloyd, Ph.D.
Executive Director, Performance Improvement
Institute for Healthcare Improvement

May 17, 2007


“When you can measure what you are speaking
about and express it in numbers, you know
something about it; but when you cannot
measure it, when you cannot express it in
numbers, your knowledge is of a meagre and
unsatisfactory kind.”
Lord Kelvin, May 3, 1883

“In God we trust. All others bring data.”
W. E. Deming
All Health Systems need…
…processes to continuously measure
clinical quality and value as well as
business performance, at all levels of
a health system, to guide planning,
management and improvement.
Adapted from Dr. Gene Nelson, Dartmouth-Hitchcock Medical Center
The Specifics of a
Measurement System
1. Process and outcome measures based on evidence and patient perceptions
• NQF, JCAHO, CMS

2. Ways to improve quality and take out costs
• CMS, Leapfrog

3. Transparent quality and cost measures for the public to use
• IOM, CMS, AHA, Dartmouth Atlas, state data commissions

4. High-performance measures at all levels of the organization to improve quality, value and operating margin while attracting a highly engaged workforce
• Boards, bond raters (Moody's, Standard & Poor's)

Adapted from Dr. Gene Nelson, Dartmouth-Hitchcock Medical Center


Do you have a plan to guide
your performance
measurement journey?
[Cartoon: signposts reading “This Way!”, “No, this Way!”, “The Wrong Way!”]

If you don’t direct your journey, someone else will! The choice is really quite simple!
The Central Question…

How do you identify and move your dots?

How Good by When?
Identifying your dots…

[Figure: a scatter of dots, each labeled either “My dot” or “Not My Dot”]
A good starting point
Developing a Measurement Philosophy
• Responsible leadership demands that we know our
data better than anyone else.
• It further requires that we have processes in place
to accurately and consistently collect a balanced
set of measures that monitor clinical outcomes,
customer satisfaction, functional status and
resource utilization.
• Finally, we must use data to develop improvement strategies and then take action to make these strategies a reality.
“In spite of a general
agreement by most senior
leaders of the critical need
for a strategic measurement
set, some organizations stop
short of establishing
quantifiable measures of all
dimensions of their
strategies, except financial.
They would do well to mimic
the same logic they follow in
their financial accounting
system for their strategic
requirements.”
Chip Caldwell, Mentoring Strategic Change in Health Care, ASQ Press, 1995, p. 97
Categories of Measures
IOM Report Dimensions for Improvement
• Safe - as safe in health care as in our homes
• Effective - matching care to science; avoiding overuse of
ineffective care and underuse of effective care
• Patient-centered - honoring the individual, and respecting
choice
• Timely - less waiting for both patients and those who give
care
• Efficient - reducing waste
• Equitable - closing racial and ethnic gaps in health status

All measures should be connected to your strategic objectives


Every concept can have many measures
Concept: Potential Measures
• Patient Falls Prevention: percent falls; fall rate; number of falls
• C-Sections: percent C-sections; number of C-sections; C-section rate
• Employee Evaluations: percent of evaluations completed on time; number of evaluations completed; variance from due date
The Quality Measurement Journey
AIM (Why are you measuring?)

Concept
Measures (S + P = O)
Operational Definitions
Data Collection Plan
Data Collection
Analysis → ACTION

S + P = O
Structure + Process = Outcome

Source: Donabedian, A. Explorations in Quality Assessment and Monitoring. Volume I: The Definition of Quality and Approaches to its Assessment. Ann Arbor, MI: Health Administration Press, 1980.
The Quality Measurement Journey
AIM – freedom from harm
Concept – reduce patient falls
Measure – IP falls rate (falls per 1000 patient days)
Operational Definition – (# falls / inpatient days) × 1000
Data Collection Plan – monthly; no sampling; all IP units
Data Collection – unit submits data to RM; RM assembles and sends to QM for analysis
Analysis – control chart → Tests of Change
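To make the operational definition above concrete, here is a minimal Python sketch of the falls-rate calculation. The monthly counts are illustrative, not data from the presentation.

```python
# Falls rate per 1000 patient days: (# falls / inpatient days) x 1000.
# Counts below are hypothetical examples.

def falls_rate_per_1000_patient_days(n_falls: int, patient_days: int) -> float:
    """Apply the operational definition from the measurement journey example."""
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return 1000.0 * n_falls / patient_days

# Example: a unit reports 7 falls over 3,412 patient days in one month.
print(round(falls_rate_per_1000_patient_days(7, 3412), 2))  # 2.05
```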
Report Cards, Instrument Panels
and Dashboards: Who Needs What?
Report Card Image
Judgmental . . . Static . . . Anger/Fear

• Who gets them? Students


• Who gives them? Teachers
• Focus? Past Performance
• Who wins? The A’s
• Who loses? Everybody else
• What’s learned? I’m above average, average or
below average
Adapted from “Report Cards or Instrument Panels: Who Needs What?” by Eugene Nelson, et al., The Joint Commission Journal on Quality Improvement, 21(4), April 1995.
Instrument Panel or Dashboard Image

Decision making . . . Dynamic . . . Empowered

• Who uses them? Cockpit crew (pilot, copilot, navigator)


• Who interprets? Cockpit crew
• Focus? Present and future
• Utility? Real-time monitoring, predicting the future and taking
action

The instrument panel or dashboard metaphor has an entirely different aura


from that of the report card. It has vitality, timeliness, and a clear-cut utility
that is absent from report card thinking. A key feature is providing critical,
real-time information to the user to prompt wise decisions and, if need be,
make rapid midcourse corrections.

Adapted from “Report Cards or Instrument Panels: Who Needs What?” by Eugene Nelson, et al., The Joint Commission Journal on Quality Improvement, 21(4), April 1995.
What are the benefits of developing and
using a Dashboard of Strategic Indicators?
• It brings together, in a single management report, many of the
seemingly disparate elements of an organization’s strategic agenda.
• It helps to reduce information overload, by focusing on the “vital few”
indicators.
• It helps to guard against suboptimization by forcing senior managers
to consider all the important measures together and lets them see
whether improvement in one area may be achieved at the expense of
another.
• It puts strategy and vision, rather than control, at the center of an
organization’s effort.
• It is based on an understanding of interrelationships between
functions, not on the performance of individual functions or units.
• It provides an opportunity for organizational learning at the executive
level.
Focus on the Vital Few!
There are many things in life that are interesting to know. It is far more important, however, to work on those things that are essential to quality than to spend time working on what is merely interesting!

The challenge, therefore, is to be disciplined enough to focus on the essential (or vital few) things and set aside those things that might be interesting but trivial!
Cascading Measures

Top Down
Or
Bottom Up?
Traditional View of a Dashboard (top down)

Measures are determined by the senior management team and cascade down to the frontline staff:

Macro Level Metrics
• Board & CEO
• Service Lines
• Units, Wards & Departments
• Care Givers, Patients & Families
Micro Metrics
System Levels Example
• Macrosystem – Nursing Services
• Mesosystem – Nursing Divisions
• Microsystem – Frontline Nursing Units

Source: Henriks & Bojestig, Jönköping County Council, Sweden
Levels of Health System:
IOM Chasm Report Chain of Effect
1. Patient
2. Physician
3. Clinical Unit / Microsystem
4. Clinical Service Line / Mesosystem
5. Health System / Macrosystem
Information System Design Principle: Capture data at the lowest level and aggregate up to higher levels to create a dashboard of metrics throughout the system.
The Big Picture: Inverted Pyramid

[Figure: an inverted pyramid linking the three levels. Micro: frontline units (ED, CATH, CCU, 4-East, 1-N, T1, T2), where Qm1 + Qm2 + Qm3 + Qm4 = QHS. Meso: the AMI evidence base (A–F) mapped to AMI quality metrics (1–6). Macro: IOM Chasm report, NQF metrics, IHI 100K, local competition, pay for performance, JCAHO, CMS, NCQA, and the IHI whole system metrics.]

©2005, Trustees of Dartmouth College, Nelson, January
[Figure: the same micro/meso/macro pyramid annotated for value improvement (quality + costs): patient perceptions and evidence-based quality metrics at the meso level; transparent outcomes, margins and ROI at the macro level.]

©2005, Trustees of Dartmouth College, Nelson, January
Building an Integrated System of Measures

• Micro: Patient & Physician
• Micro: Clinical Units
• Meso: Service Lines
• Macro

Adapted from Lloyd & Caldwell
A Cascading Set of Strategic Measures

Micro: Order med → Prepare med → Dispense med → Administer med
Meso: Medication administration; Promptness/TLC
Macro: Percent of patients bragging

Adapted from C. Caldwell, Mentoring Strategic Change in Health Care, ASQ Press, 1995.
Creating Dashboards
and Measuring for Change:
Examples from
The US
Sweden
Scotland
Norway
The Netherlands
The IHI Dashboard of
Whole System
Measures
Aim
To identify a core set of quality
measures that form a system of
measures that can be used to drive
quality improvement in health systems
What are we measuring?
IOM Dimension Measures
Safe • Adverse drug events
• Work days lost
Effective • Hospital Standardized Mortality Ratio (HSMR)
• Functional outcomes
Patient-Centered • Patient satisfaction
• Percent patients dying in hospital
Timely • 3rd next available appointment
Efficient • Costs per capita
• Hospital specific standardized reimbursement
Care Continuum & Measures
Locus of Measure and Frequency

System Measure                                  Outpatient | Inpatient | Region
Adverse Drug Events                                  X           X
Work Days Lost                                       X           X
HSMR                                                             X
Functional Outcomes: SF-6                            X           X
Patient Satisfaction                                 X           X
Percent Patients Dying in Hospital                                          X
3rd Available Appointment                            X
Costs Per Capita                                                            X
Hospital Specific Standardized Reimbursement                     X
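The “timely” measure lends itself to a worked example: the delay to the third next available appointment is the number of days from today until the third open slot, counted per clinician. A minimal Python sketch follows; the schedule dates and clinician names are hypothetical.

```python
# Third next available appointment: days from today to the 3rd open slot.
from datetime import date

open_slots = {  # clinician -> dates of open appointment slots (hypothetical)
    "Clinician A": [date(2007, 5, 18), date(2007, 5, 21), date(2007, 5, 22), date(2007, 5, 29)],
    "Clinician B": [date(2007, 5, 17), date(2007, 5, 17), date(2007, 5, 24)],
}
today = date(2007, 5, 17)

for clinician, slots in open_slots.items():
    third_next = sorted(slots)[2]  # the 3rd open slot in date order
    print(f"{clinician}: 3rd next available in {(third_next - today).days} days")
```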
System View & Measures

[Figure 1: Health System and Selected Measures – Key Outcomes. A flow from “Patients with Health Need” through the “Process of Providing Services” to “Patients with Health Needs Met”, with measures shown as ovals along the way: access (3rd available appointment), safety (ADE rate), work days lost, costs (standardized reimbursement, costs per capita), mortality and clinical outcomes (HSMR), functional outcomes (SF-6), patient satisfaction, and end of life (percent dying in hospital).]
WORKING DRAFT – Overall Measures
Sites: Geisinger Medical Ctrs, Thedacare, St John, McLeod, Clark, Primary Care Appleton, Sweden-R, Sweden-V, Norfolk + N Devon + R Devon + Bradford

SAFE
• ADEs/1000 doses (rate per 1000 doses; scale 0–3)

EFFECTIVE
• HSMR for US (rate; expected = 100; scale 70–130)
• HSMR for UK (rate; expected = 100; scale 70–110)
• HSMR for Sweden (rate; expected = 100; scale 60–100)

PATIENT-CENTERED
• Inpatient satisfaction (Press Ganey; score 0–100; scale 60–100)
• Percent of patients in region dying in hospital, age 65+ years (Dartmouth Atlas; scale 0–50)

TIMELY
• Access: 3rd next available appointment for primary care (# of days; scale 0–10)

EFFICIENT
• Hospital costs per discharge (US dollars; scale $0–$9,000)
• Cost per capita, age 65+ years (Dartmouth Atlas; scale $4,000–$6,000)

STAFF TURNOVER
• % voluntary staff turnover per month (scale 0–3%)

[Each measure was plotted on its scale with markers (✪, ✬) locating individual sites.]
Creating Dashboards
and Measuring for Change
in Sweden
Göran Henriks
Jönköping County Council, Sweden
Ref: Nilsson, Bojestig, Edvinsson, Henriks, Berger
Why dashboards?
• Arena for dialogue
• Transparency
• Realizing human capital
• Develop microsystem ability to improve
• Execution
• Develop capability
• Business case
System measures for the County of Jönköping
(Actual year: December 2005)

[Figure: the whole-system view adapted for Jönköping, from “Patient with Health Need” through the “Process of Providing Services” to “Patient with Health Need Met”, with measures attached: worker measures (sick leave, staff turnover), access (3rd available appointment), safety (ADE reports to PSR), mortality and clinical outcomes (HSMR, mortality within 30 days), functional outcomes (SF-6), satisfaction (inpatient satisfaction), end of life (numbers dying in hospital, number of care days during the last 6 months of life), and costs (reimbursement, costs per discharge, costs per capita).]
Hospital Standardised Mortality Ratios (HSMRs with 95% CIs), Swedish Counties, 1998–2001

[Figure: HSMRs with 95% CIs for 1998–2001, plotted by Swedish county on a scale of 0–140, alongside percent raw mortality.]
Antal avlidna per vårdtillfällen för Landstinget i Jönköpings län
(Number of deaths per care episode, Jönköping County Council, 2002–2006)

[Figure: u-chart of deaths per care episode by month, 2002–2006, with centerline U = 0.01927, UCL = 0.02511 and LCL = 0.01343. Tests performed with unequal sample sizes.]

• 2002 mean: 21 deaths/1000 discharges
• 2003 mean: 20 deaths/1000 discharges
• 2004 mean: 20 deaths/1000 discharges
• 2005 mean: 18 deaths/1000 discharges

Datakälla (data source): Master, Ebba, Spas
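The chart’s note about unequal sample sizes reflects how u-chart limits are computed: the centerline is total events over total opportunities, but the 3-sigma limits widen or narrow with each month’s denominator. A minimal Python sketch, with hypothetical monthly counts rather than the actual Jönköping data:

```python
import math

deaths = [62, 55, 58, 49, 60, 51]                  # events per month (hypothetical)
discharges = [3100, 2800, 3000, 2600, 3050, 2700]  # opportunities per month

u_bar = sum(deaths) / sum(discharges)  # centerline: total events / total opportunities

for d, n in zip(deaths, discharges):
    sigma = math.sqrt(u_bar / n)       # limits vary with each subgroup size n
    ucl = u_bar + 3 * sigma
    lcl = max(0.0, u_bar - 3 * sigma)
    u = d / n
    flag = "  <- special cause" if not (lcl <= u <= ucl) else ""
    print(f"u={u:.5f}  LCL={lcl:.5f}  UCL={ucl:.5f}{flag}")
```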
Andelen palliativa patienter som dör på sjukhus, 2004–2006
(Proportion of palliative care patients who die in hospital, 2004–2006)

[Figure: p-chart by month, 2004–2006, with centerline P = 0.0774, UCL = 0.1596 and LCL = 0. One site is missing, so the 2006 numbers are unreliable. Tests performed with unequal sample sizes.]

Percent = number of palliative care patients who died in hospital (numerator) divided by the number of palliative care patients who died that month and are in our register (denominator).

Datakälla (data source): reporting from the palliative care teams via Britt-Louise Ekholm, Göran Runesson and Catrin Fernholm
Outcomes – Process Level
Decrease in hospitalization for pediatric asthma per 10,000 children

[Figure: hospitalizations per 10,000 children, 1994–2004, comparing the US national average (about 30), Sweden (about 20; values for 1997 and 2003 missing) and Jönköping, against the goal of fewer than 10 hospitalizations per 10,000 children. (1) Special cause: RS-virus epidemic among children aged < 3 years.]

Andersson-Gäre, Oldaeus -04
[Figure 8: Situation of Jönköping County Council – a multi-panel dashboard linking QI work to patient results, with panels for inpatients, occupancy (beläggning, %), waiting time (days), length of stay (days), number of deaths, patient satisfaction, contact with coordinator (number), percentage of deaths, readmissions within 14 days (%), and staff satisfaction (%).]

Quality Management in Health Care submission: “Improvements for patients? Findings from an independent case study of the Jönköping improvement program.”
John Øvretveit, Director of Research, Medical Management Centre, The Karolinska Institutet, Stockholm, and Professor of Health Management, Faculty of Medicine, Bergen University, Norway.
Anthony Staines, MBA, MHA, MPA, researcher, IFROSS, University Lyon III, France; Vice-Chairman of sanaCERT, Accreditation Body for the Swiss Hospitals.
Kaiser Permanente’s
Big Q
Quality Performance
Dashboard
Matt Stiefel
Kaiser Permanente
Oakland, CA
Why Dashboards?
• The various domains of quality, including clinical effectiveness, safety, risk management, service and resource stewardship, were in silos
• Too many measures, not enough focus
– 300 page binders of quality performance
data for the Board
– Inspired by IHI to develop “Big Dots”
[Figure: Big “Q” at the center, surrounded by its five domains: clinical quality, service, safety, risk management and resource stewardship.]
Background: Charge and Goal
Charge
– The KFH/HP Board of Directors Quality and Health
Improvement Committee (QHIC) requested that the data and
information utilized in the oversight process for quality be
streamlined

Goal
– The goal is to develop balanced measures of system
performance to provide QHIC with data that:
• present a top-level, “big dot” view of overall quality performance
• enable drill-down by geography and measures
• show trends for important outcomes over time
• provide comparative metrics for benchmarking results across different organizations
• serve as input to strategic quality improvement planning
How?
• Assembled (and introduced) a multi-disciplinary team from the various quality domains
• Developed top-level metrics (Big Dots) in each
domain
– Developed indexes where single measures
didn’t exist
• Reviewed with various stakeholder groups
• Developed preliminary design, reviewed with
dashboard design consultant (yes, they exist –
see reference at end)
• Continuous design and content improvements
through widespread review
• Ongoing training for users
Dashboard Design Tip:
Parsimony
• Highlight a small number of the most
important measures
• Everything has a purpose; no needless
information, precision, detail, decoration,
graphics or clutter
– Eloquence through simplicity
• Use color only to make a point
• Stick to 1 screen, with drill-down
A Note on Indexing
• Top-level index
• Subscales
• Individual measures
– scaled to a common metric
– weighted on the basis of population health impact
• Ability to drill down from level to level
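As a rough illustration of this scheme, the sketch below scales hypothetical measures to a common 0–100 metric, weights them into subscales, and rolls the subscales up into a top-level index. All names, values and weights are invented for the example.

```python
# Scale each measure to 0-100 (worst -> 0, best -> 100), combine into weighted
# subscales, then into a weighted top-level index. All inputs are invented.

def scale_0_100(value: float, worst: float, best: float) -> float:
    """Rescale a raw measure so that `worst` maps to 0 and `best` to 100."""
    return 100.0 * (value - worst) / (best - worst)

# Each measure: (label, raw value, worst, best, weight within its subscale).
subscales = {
    "Safety": [
        ("ADEs per 1000 doses", 1.2, 3.0, 0.0, 0.6),
        ("Falls per 1000 patient days", 2.1, 5.0, 0.0, 0.4),
    ],
    "Effectiveness": [
        ("HSMR", 92.0, 130.0, 70.0, 1.0),
    ],
}
subscale_weights = {"Safety": 0.5, "Effectiveness": 0.5}  # e.g., by health impact

top_level = 0.0
for name, measures in subscales.items():
    score = sum(w * scale_0_100(v, worst, best)
                for _, v, worst, best, w in measures)
    print(f"{name} subscale: {score:.1f}")
    top_level += subscale_weights[name] * score

print(f"Top-level index: {top_level:.1f}")
```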
[Figure: indexing is common, with applications from sports to economic forecasting. The Consumer Price Index, for example, rolls subscales such as food, energy and housing up into one number. Individual measures can be organized into a variety of subscales to meet the needs of diverse users, or to address special needs or interests: by chronic condition (e.g., asthma, CHF, diabetes), by unit or clinic (inpatient, outpatient, clinical program, Units 1–3), or by type of measure (HEDIS measures, inpatient mortality, patient satisfaction, safety, outcomes, process, weighted).]
What’s Next?
• Online dashboard (powered by
Cognos)
• Ability to drill-down by measure
detail or geography
• Broadly available
Creating Dashboards
and Measuring for Change
in Scotland
Pat O’Connor
NHS Tayside, Scotland
The Leadership Challenge
• We have become good at making
improvement happen for one
condition, on one unit, for a while.

• We haven’t learned how to get


measured results, quickly, across
many conditions for the whole
organization.
System Levels Example
• Macrosystem – Nursing Services
• Mesosystem – Nursing Divisions
• Microsystem – Frontline Nursing Units

Source: Bojestig, Jönköping County Council, Sweden
Patient Safety Dashboards

• Leadership
• Peri-operative
• General ward
• Intensive Care
• Medicines Management
Patient Safety Peri-operative
Departmental Level Measures
• Number of MRSA Bloodstream Infections (Surgery): total number of MRSA bloodstream infections per month
• Department Handwashing Compliance: mean % handwashing compliance – total observed hand hygiene mean scores divided by the maximum score which could be achieved, multiplied by 100
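A minimal sketch of that compliance formula, with invented observation scores:

```python
# Compliance = (sum of observed hand-hygiene scores / maximum achievable) x 100.
# The observation scores are invented for illustration.

observed_scores = [4, 5, 3, 5, 4]  # score awarded per observed opportunity
max_score_per_opportunity = 5      # maximum achievable per opportunity

max_total = max_score_per_opportunity * len(observed_scores)
compliance = 100.0 * sum(observed_scores) / max_total
print(f"Handwashing compliance: {compliance:.1f}%")  # 84.0%
```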
Individual Unit Level Measures

Overall Dashboard (see handout)
• Narrative summary
• Table of weekly activity by department, on and off target
• Reason for delay
• Overall organisational average off target

Weekly executive update
Creating Dashboards
and Measuring for Change
in Norway
Ove Kjell Andersen
Oslo, Norway
If we are to make our health care system fundamentally better…

• We must transparently monitor and evaluate services!
• We need dashboards!

If possible...
– Measure results of the process
– Measure in the process
– Balanced measurement
Measures on our Dashboards
National Quality Indicators
• Hospital-wide infection rate
• Cancelled surgery
• Patients' evaluation of the information from caregivers
• Real waiting time to the first clinical consultation (psych.)
• Patients' evaluation of waiting time

Dashboard for the chief of surgery
• Real waiting time for elective surgery
• Patients' evaluation of waiting time
• Cancelled surgery
• Infection rate
Dialogue Conference
Eastern Norway Regional Health Authority
• Written discharge report to the patient at the time of discharge (n=?)
• Discharge conversation where doctor and patient go through the report (n=?)
• Discharge report to doctors
– Medications
– Follow-up: what and how
– Sick note (score?)
I-chart on quality of the written report, 2002/2003, for patients at Aker University Hospital
Breakthrough in psychiatry (mood disorders), 2002/2003

[Figure: I-chart of mean score (0–10) with UCL and LCL, plotted by date from October 2002 to April 2003, in blocks of 21 days before and after changes.]

Diagram: Jan Vegard Bakali
Breakthrough psychiatry (ADHD), 2004
I-chart for assessment time in days (prior to, and after, change)

[Figure: assessment time in days (0–600) by case (Sak 1–40), with mean, UCL and LCL shown before and after the change.]

Diagram: Andrea Melø
I-chart: Waiting time (referral => admittance), patients 15–20 years
Psych. Team, Asker og Bærum

[Figure: waiting time in days (0–240) by patient number (1–32), before and after change.]

Before change: mean 62.29, UCL 241.18. After change: mean 16.67, UCL 42.13.
Measure: Elise Gustavsen / Supervisor Finn Holm
I-chart: Waiting time, patients > 20 years of age
Psych. Team, Asker og Bærum

[Figure: waiting time in days (0–220) by patient number (1–51), before and after change.]

Before change: mean 97.88, UCL 216.12. After change: mean 60.55, UCL 166.81.
Measure: Elise Gustavsen / Supervisor Finn Holm
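The means and UCLs quoted on these two charts are standard individuals-chart (XmR) statistics: the limits sit 2.66 average moving ranges from the mean. A minimal Python sketch with hypothetical before/after waiting times, not the Asker og Bærum data:

```python
# Individuals (XmR) chart: limits are the mean +/- 2.66 x the average moving range.

def i_chart_limits(values: list[float]) -> tuple[float, float, float]:
    """Return (mean, LCL, UCL) for an individuals chart."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # LCL is clipped at 0 because a waiting time cannot be negative.
    return mean, max(0.0, mean - 2.66 * mr_bar), mean + 2.66 * mr_bar

before = [45, 90, 30, 110, 55, 70, 25, 95]  # days, pre-change (hypothetical)
after = [20, 12, 18, 10, 22, 15, 19, 14]    # days, post-change (hypothetical)

for label, series in (("Before", before), ("After", after)):
    mean, lcl, ucl = i_chart_limits(series)
    print(f"{label}: mean={mean:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")
```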
Hospital Standardized
Mortality Ratio in the
Netherlands
Laurens Touwen
Reinier de Graaf Group,
The Netherlands
A new instrument
• Real Time Monitoring of
- mortality
- re-operations
- nursing days
- etc
• Modeled after the British institute
Dr Foster, by Sir Brian Jarman
• In Netherland introduced by
”De Praktijkindex”, based on LMR
HSMR
• Do you know how many patients
died in your hospital last year?
• How many deaths could have been
delayed?
• What interventions could improve
the situation?
• How can you measure the results?
[Figure: hospital standardised death rate (standardised for age, sex, race, payer, admission source and admission type vs. age and diagnosis) plotted against standardised charge per admission ($0–$25,000), AHRQ 1997 data; standardised death rates range from 0 to 180.]

Dutch hospital standardised mortality ratios (HSMRs), 2001–2003, vs. hospital
(Standardised for age, sex, urgency/readmission and LOS within the 50 CCS groups accounting for 80% of all deaths; excluding small hospitals and those with poor data recording; using the year 2000 standard.)

[Figure: HSMRs with 95% CIs, 2001–2003, by hospital number (assigned by BJ), ranging from 72 to 114 – a difference of 42%.]
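An HSMR itself is straightforward to compute once a risk model supplies expected deaths: observed deaths divided by expected deaths, times 100, with a Poisson-based confidence interval. A minimal sketch with invented counts, using Byar's approximation for the 95% CI:

```python
# HSMR = 100 x observed deaths / expected deaths, where "expected" comes from a
# case-mix adjustment model. Counts below are invented for illustration.
import math

observed = 412    # deaths observed at one hospital over the period
expected = 480.5  # deaths expected given the hospital's case mix

hsmr = 100.0 * observed / expected

z = 1.96  # Byar's approximation for a Poisson 95% CI on the observed count
lower = observed * (1 - 1 / (9 * observed) - z / (3 * math.sqrt(observed))) ** 3
upper = (observed + 1) * (1 - 1 / (9 * (observed + 1)) + z / (3 * math.sqrt(observed + 1))) ** 3

print(f"HSMR = {hsmr:.0f} (95% CI {100 * lower / expected:.0f}-{100 * upper / expected:.0f})")
```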


How to improve?
• We analyzed 50 consecutive deaths in internal medicine and 50 in surgery.
• Adverse events in 6 groups:
0 – no effect
1 – temporary harm, intervention needed
2 – temporary harm, longer hospital stay
3 – contributed to permanent harm
4 – intervention needed to keep patient alive
5 – contributed to death
Step by step in the Netherlands
• 2005: Presentation to country –
boards of umbrella organizations
• 2006: Pilot with 10 hospitals in
analyzing methods and determining
interventions
• 2007: Report “Avoidable deaths?”
• 2007: Introduction of RTM
• 2008: Presenting HSMR per hospital
“Quality begins with
intent, which is fixed
by management.”
W. E. Deming, Out of the Crisis, p.5
Framework: Leadership for Improvement

1. Set Direction: Mission, Vision and Strategy
PUSH: Make the status quo uncomfortable. PULL: Make the future attractive.

2. Establish the Foundation
• Reframe Operating Values • Prepare Personally • Build Relationships
• Build Improvement Capability • Choose and Align the Senior Team • Develop Future Leaders

3. Build Will
• Plan for Improvement • Set Aims/Allocate Resources • Measure System Performance
• Provide Encouragement • Make Financial Linkages • Learn Subject Matter

4. Generate Ideas
• Understand Organization as a System • Read and Scan Widely, Learning from Other Industries & Disciplines
• Benchmark to Find Ideas • Listen to Patients • Invest in Research & Development • Manage Knowledge

5. Execute Change
• Use Model for Improvement for Design and Redesign • Review and Guide Key Initiatives
• Spread Ideas • Communicate Results • Sustain Improved Levels of Performance
Resources & References
1. Kaplan, Norton. The Balanced Scorecard: Translating Strategy into Action. Harvard Business School Press, 1996.
2. Lloyd. Quality Health Care: A Guide to Developing and Using Indicators. Jones and Bartlett Publishers, 2004.
3. Lloyd, Martin, Nelson. IHI Whole System Measures Toolkit, Version 2.0. IHI, Boston, 2006.
4. Nelson, Batalden, Ryer. The Clinical Improvement Action Guide. JCAHO Press, 1998.
5. Nelson, Mohr, Batalden, Plume. “Improving Health Care, Part 1: The Clinical Value Compass.” The Joint Commission Journal on Quality Improvement, 22(4):243-258, April 1996.
6. Few, Stephen. Information Dashboard Design: The Effective Visual Communication of Data. O'Reilly Media Inc., 2006.
