Dashboard?
Prepared for the
2007 ICSI/IHI Colloquium
Redesigning for Results:
Quantum leaps in Health Care
Prepared and Presented by
Robert Lloyd, Ph.D.
Executive Director Performance Improvement
Institute for Healthcare Improvement
How good by when?
How do you identify and move your dots?
Identifying your …
[Diagram: a field of scattered dots labelled "My dot" and "Not My Dot"]
A good starting point
Developing a Measurement Philosophy
• Responsible leadership demands that we know our
data better than anyone else.
• It further requires that we have processes in place
to accurately and consistently collect a balanced
set of measures that monitor clinical outcomes,
customer satisfaction, functional status and
resource utilization.
• Finally, we must use data to develop improvement strategies and then take action to make these strategies a reality.
“In spite of a general
agreement by most senior
leaders of the critical need
for a strategic measurement
set, some organizations stop
short of establishing
quantifiable measures of all
dimensions of their
strategies, except financial.
They would do well to mimic
the same logic they follow in
their financial accounting
system for their strategic
requirements.”
Chip Caldwell, Mentoring Strategic Change in Health Care, ASQ Press, 1995, p. 97
Categories of Measures
IOM Report Dimensions for Improvement
• Safe - as safe in health care as in our homes
• Effective - matching care to science; avoiding overuse of
ineffective care and underuse of effective care
• Patient-centered - honoring the individual, and respecting
choice
• Timely - less waiting for both patients and those who give
care
• Efficient - reducing waste
• Equitable - closing racial and ethnic gaps in health status
Concept
Measures (S + P = O)
Operational Definitions
Data Collection Plan
Data Collection
Analysis
ACTION

S + P = O
Structure + Process = Outcome
Top Down
Or
Bottom Up?
Traditional View of a Dashboard
(top down)
Measures are determined by the senior management team and cascade down to the frontline staff.
[Pyramid: Macro Level Metrics at the top (Board & CEO), then Service Lines, then Units, Wards & Departments]
Mesosystem: Nursing Divisions
Microsystem: Frontline Nursing Units
[Diagram: building from Micro (teams T1, T2; units A-F, 1-6) to Meso (AMI Evidence Base; AMI Quality Metrics) to Macro (IOM Chasm report, NQF Metrics, IHI 100K Lives; Local Competition, Pay for Performance, JCAHO, CMS, NCQA)]
[Diagram: Value Improvement (Quality + Costs). Meso: Patient Perceptions, Evidence-Based Quality Metrics. Macro: Transparent Outcomes; Performance Margins, ROI]
©2005, Trustees of Dartmouth College, Nelson, January
Building an Integrated System of Measures
Micro (Patient & Physician): Medication administration
Meso: Promptness/TLC
Macro: Percent of patients bragging
Adapted from C. Caldwell, Mentoring Strategic Change in Health Care, ASQ Press, 1995.
Creating Dashboards
and Measuring for Change:
Examples from
The US
Sweden
Scotland
Norway
The Netherlands
The IHI Dashboard of
Whole System
Measures
Aim
To identify a core set of quality
measures that form a system of
measures that can be used to drive
quality improvement in health systems
What are we measuring?
IOM Dimension: Measures
Safe: Adverse drug events; Work days lost
Effective: Hospital Standardized Mortality Ratio (HSMR); Functional outcomes
Patient-Centered: Patient satisfaction; Percent of patients dying in hospital
Timely: 3rd next available appointment
Efficient: Costs per capita; Hospital-specific standardized reimbursement
Care Continuum & Measures
Locus of Measure and Frequency
System Measure Outpatient Inpatient Region
Adverse Drug Events X X
Work Days Lost X X
HSMR X
Functional Outcomes: SF-6 X X
Patient Satisfaction X X
Percent Patients Dying in Hospital X
3rd Available Appointment X
Costs Per Capita X
Hospital Specific Standardized Reimbursement X
System View & Measures
Figure 1. Health System and Selected Measures
[Diagram: patient with health need → process of providing services → patient with health need met. Key outcomes mapped to dimensions. Effective: mortality (HSMR), functional outcomes (SF-6™), clinical outcomes. Patient-Centered: patient satisfaction; end of life (percent dying in hospital). Timely: access (3rd available appointment). Efficient: costs per capita, standardized reimbursement. Safe: ADE rate. Staff: work days lost, staff turnover]
[Bar chart: HSMRs by hospital, 1998-2001, across Swedish counties; vertical axis 20-140]
[P-charts: Antal avlidna/vårdtillfällen (deaths per care episode) by month. 2002-2004 chart: UCL = 0.02511. 2004-2006 chart: mean P = 0.0774, UCL = 0.1596, LCL = 0]
[Run chart: hospitalisations per 10,000 children, 1994-2004. US national average shown for comparison; the value for Sweden 2003 is missing; Jönköping plotted. Goal: fewer than 10 hospitalisations per 10,000 children]
[Run charts: Antal (counts), dagar (days), and andel (%) over time]
Staff satisfaction
[Run chart: staff satisfaction, 70-100%]
Jönköping QI work: Big "Q"
[Diagram: Patient results, Service, Safety, Risk Management]
Background: Charge and Goal
Charge
– The KFH/HP Board of Directors Quality and Health
Improvement Committee (QHIC) requested that the data and
information utilized in the oversight process for quality be
streamlined
Goal
– The goal is to develop balanced measures of system
performance to provide QHIC with data that:
• present a top-level, “big dot” view of overall quality
performance
• enable drill-down by geography and measures
• show trends for important outcomes over time
• provide comparative metrics for benchmarking results across different organizations
• serve as input to strategic quality improvement planning
How?
• Assembled (and introduced) multi-disciplinary
team from the various quality domains
• Developed top-level metrics (Big Dots) in each
domain
– Developed indexes where single measures
didn’t exist
• Reviewed with various stakeholder groups
• Developed preliminary design, reviewed with
dashboard design consultant (yes, they exist –
see reference at end)
• Continuous design and content improvements
through widespread review
• Ongoing training for users
Dashboard Design Tip:
Parsimony
• Highlight a small number of the most
important measures
• Everything has a purpose; no needless
information, precision, detail, decoration,
graphics or clutter
– Eloquence through simplicity
• Use color only to make a point
• Stick to 1 screen, with drill-down
A Note on Indexing
• Top-level index
• Subscales
• Individual measures
– scaled to a common metric
– weighted on the basis of population health impact
• Ability to drill down from level to level
[Diagram: Top-Level Index → Subscales → Individual Measures]
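The scheme above (individual measures scaled to a common metric, weighted by population health impact, then rolled up into subscales and a top-level index) can be sketched as follows. This is a minimal illustration, not the KFH/HP implementation; all measure names, raw values, and weights are hypothetical.

```python
def scale(value, worst, best):
    """Scale a raw measure onto a common 0-100 metric (100 = best).

    Works whether lower or higher raw values are better, since 'worst'
    and 'best' anchor the direction.
    """
    return 100.0 * (value - worst) / (best - worst)

def weighted_index(items):
    """items: (scaled_score, weight) pairs; returns the weighted average."""
    total_w = sum(w for _, w in items)
    return sum(s * w for s, w in items) / total_w

# Hypothetical subscale: two condition measures with assumed impact weights.
asthma = (scale(12, worst=40, best=0), 0.6)   # admissions/10,000 (lower is better)
chf = (scale(85, worst=50, best=100), 0.4)    # % on evidence-based therapy
effective = weighted_index([asthma, chf])

# The top-level index rolls up subscales the same way (weights assumed).
top_level = weighted_index([(effective, 0.5), (90.0, 0.5)])
```

Because every measure lands on the same 0-100 scale before weighting, any level of the hierarchy can be drilled into without changing the arithmetic.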
Indexing is common, with applications from sports to economic forecasting (e.g., the CPI). Individual measures are organized into a variety of subscales to meet the needs of diverse users, or other subscales for diverse uses, or yet other subscales to address special needs or interests.
[Diagram: example subscales by condition (Asthma, CHF, Diabetes, etc.); CPI categories (Energy, Housing, Food); by unit (Unit 1, Unit 2, Inpatient, Outpatient Clinic); by type of measure (Outcomes, Process, Safety, mortality, patient satisfaction, weighted HEDIS measures)]
What’s Next?
• Online dashboard (powered by
Cognos)
• Ability to drill down by measure detail or geography
• Broadly available
Creating Dashboards
and Measuring for Change
in Scotland
Pat O’Connor
NHS Tayside, Scotland
The Leadership Challenge
• We have become good at making
improvement happen for one
condition, on one unit, for a while.
Mesosystem: Nursing Divisions
Microsystem: Frontline Nursing Units
• Leadership
• Peri-operative
• General ward
• Intensive Care
• Medicines Management
Patient Safety Peri-operative
Departmental Level Measures
Number of MRSA Bloodstream Infections: total number of MRSA bloodstream infections per month.
Department Handwashing Compliance: surgery mean % handwashing compliance, i.e., total observed hand hygiene scores divided by the maximum score that could be achieved, multiplied by 100.
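The compliance definition above is a simple ratio; a minimal sketch, assuming a per-observation scoring scale (the 0-5 scale and the audit data here are hypothetical):

```python
def handwashing_compliance(observed_scores, max_score_per_observation):
    """Mean % compliance: total observed scores / maximum achievable * 100."""
    max_total = len(observed_scores) * max_score_per_observation
    return 100.0 * sum(observed_scores) / max_total

# Hypothetical audit: 20 hand hygiene observations, each scored 0-5.
scores = [5, 4, 5, 3, 5] * 4
pct = handwashing_compliance(scores, max_score_per_observation=5)  # 88.0
```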
Individual Unit Level Measures
Overall Dashboard
See handout
Narrative Summary
Table of weekly activity by
department on and off
target
Reason for delay
Overall organisational
average off target
We must transparently monitor and evaluate services!
We need dashboards!
If possible...
– Measure Results of the Process
– Measure in the Process
– Balanced measurement
Measures on our Dashboards
National Quality Indicators
• Hospital wide infection rate
• Cancelled surgery
• Patients' evaluation of the information from caregivers
• Real waiting time to the first clinical consultation (psych.)
• Patients' evaluation of waiting time
Dashboard for chief of surgery
[Control chart: mean score (0-10) with UCL and LCL, October 2002 to April 2003]
Breakthrough psychiatry (ADHD) 2004
[Control chart: days (0-600) by case (Sak 1 to Sak 40), with mean, UCL, and LCL. Diagram: Andrea Melø]
I-chart for assessment time in days. (Prior to, and after change)
I-chart.: Waiting time (referral =>admittance) Pat.15 - 20 years
Psych. Team, Asker og Bærum
[I-chart: days (0-240) by patient number 1-32, results before and after change]
Mean: 62.29 UCL: 241.18 | Mean: 16.67 UCL: 42.13
Measure: Elise Gustavsen/Supervisor Finn Holm
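The mean, UCL, and LCL on I-charts like these are conventionally derived from the individual values and their moving ranges (limits at mean ± 2.66 × average moving range, the standard XmR construction). A minimal sketch with hypothetical waiting times, not the Asker og Bærum data:

```python
def i_chart_limits(values):
    """Individuals (XmR) chart: center line and 3-sigma control limits.

    UCL/LCL = mean +/- 2.66 * average moving range; the LCL is floored
    at 0 because a waiting time cannot be negative.
    """
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = mean + 2.66 * mr_bar
    lcl = max(0.0, mean - 2.66 * mr_bar)
    return mean, ucl, lcl

# Hypothetical waiting times in days:
waits = [62, 75, 40, 58, 90, 55, 60]
mean, ucl, lcl = i_chart_limits(waits)
```

The before-and-after segments on these slides would each get their own limits computed this way, which is why two mean/UCL pairs are reported.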
I-chart: Waiting time (patients > 20 years of age)
Psych. Team, Asker og Bærum
[I-chart: days (0-220) by patient number 1-51, before and after change]
Mean: 97.88 UCL: 216.12 | Mean: 60.55 UCL: 166.81
Measure Elise Gustavsen/Supervisor Finn Holm
Hospital Standardized
Mortality Ratio in the
Netherlands
Laurens Touwen
Reinier de Graaf Group,
The Netherlands
A new instrument
• Real Time Monitoring of
- mortality
- re-operations
- nursing days
- etc
• Modeled after the British institute Dr Foster and the work of Sir Brian Jarman
• Introduced in the Netherlands by "De Praktijkindex", based on the LMR
HSMR
• Do you know how many patients
died in your hospital last year?
• How many deaths could have been
delayed?
• What interventions could improve
the situation?
• How can you measure the results?
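The HSMR behind these questions is conventionally the ratio of observed to expected deaths, times 100, where expected deaths come from a case-mix adjustment model (as in the Jarman methodology). A minimal sketch; the risk model itself is out of scope, and the counts below are hypothetical:

```python
def hsmr(observed_deaths, expected_death_probs):
    """HSMR = 100 * observed / expected deaths.

    100 means mortality is as expected given case mix;
    above 100 means more deaths than expected.
    """
    return 100.0 * observed_deaths / sum(expected_death_probs)

# Hypothetical: 5,000 admissions, each with a modelled death probability.
expected = [0.02] * 5000     # sums to ~100 expected deaths
ratio = hsmr(114, expected)  # approximately 114
```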
[Scatter plot: hospital standardised death rate (standardised for age, sex, race, payer, admission source, admission type) vs. standardised charge per admission (standardised for age, diagnosis); AHRQ 1997 data. Death rate 0-180; charge 0-25,000]
[Chart: HSMRs with 95% CIs, 2001-2003, by hospital; highest 114 vs. lowest 72, a difference of 42%]