NHS Plymouth (Plymouth Primary Care Trust), Building One, Brest Road, Plymouth PL6 5QZ, UK
Health Economics Unit, Public Health Building, University of Birmingham, B15 2TT, UK
Article history: Available online 2 October 2013

Abstract
In England from 2002 to 2013, Primary Care Trusts (PCTs) were responsible for commissioning
healthcare for their local populations. The NHS has recently undergone rapid organisational change
whereby clinicians have assumed responsibility for local commissioning decisions. This change in
commissioning arrangements alongside the current financial pressures facing the NHS provides an impetus for considering the use of technical prioritisation methods to enable the identification of savings without having a detrimental effect on the health of the population. This paper reports on the design and implementation of a technical prioritisation method termed PBMA applied within NHS Plymouth, an English PCT responsible for commissioning services for a population of approximately 270,000. We evaluated the effectiveness of the process, the extent to which it was appropriate for local healthcare commissioning, and whether it identified budget savings. Using qualitative research methodology, we found the process produced clear strategic and operational priorities for 2010/11, providing staff with focus and structure, and delivered a substantial planned reduction in hospital activity levels. Participants expressed satisfaction with the process. NHS Plymouth adhered to the PBMA process, although concerns were raised about the evidence for some priorities, decibel rationing, and a lack of robust challenge at priority-setting meetings. Further work is required to enhance participants' understanding of marginal analysis. Participants highlighted several external benefits, particularly in terms of cultural change, and felt the process should encompass the whole local health and social care community. This evaluation indicates that the prioritisation method was effective in producing priorities for NHS Plymouth, and that PBMA provides an appropriate method for allocating resources at a local level. In order for PBMA to identify savings, cultural and structural barriers to disinvestment must be addressed. These findings will interest other healthcare commissioners in developing their own approaches to priority-setting.

© 2013 Elsevier Ltd. All rights reserved.
Keywords:
Programme budgeting
Marginal analysis
Priority-setting
Resource allocation
Commissioning
Introduction
The recent changes to the NHS in England, whereby clinicians
have assumed responsibility for local commissioning decisions,
provide an opportunity to improve the methods used for allocating
healthcare resources. Prior to this, Primary Care Trusts (PCTs) commissioned healthcare for their local populations. PCTs traditionally based resource allocation decisions on historical data,
Methods

What is PBMA?

The first stage of the PBMA process involves drawing up a programme budget. This comprises a map of existing activity and expenditure across all programme areas (e.g. cancer, obstetrics) and provides an understanding of the existing deployment of resources. Using this knowledge, a multi-disciplinary panel made up of managers, clinicians, and other stakeholders devises a list of options for change to the existing pattern of resource allocation. There are three types of option: service redesign to provide the same output for fewer resources (improving technical efficiency); service improvements requiring additional resources; and disinvestments, i.e. services that could be scaled back or discontinued. These options are then scored and ranked against a predetermined set of criteria, which reflect the aims and values of the organisation. The ranked list is then used to trade off options that require additional investment against those that will yield a release of resources, substituting items with least benefit to fund items with the most benefit (thus improving allocative efficiency). Through this marginal analysis, the resources available to the healthcare organisation are shifted towards programmes that contribute the most to the organisation's strategic objectives (Mitton & Donaldson, 2004).

Although this paper focuses primarily on reporting the evaluation of PBMA, a great deal of resource was invested in adapting the process to fit with the unique features of NHS Plymouth as a local organisation. The first part of this section describes this adaptation process; the research methods developed for the evaluation of PBMA then follow.

Design methods

Mitton and Donaldson's seven-step approach to PBMA informed the design of the process (see Appendix 1) (Mitton & Donaldson, 2004). In a novel development of this approach, we split the process into two stages to generate two levels of priorities: high-level Strategic Improvement Priorities (the SIPs), and more detailed priorities for changes to specific services (the initiatives). This aimed to improve the alignment between the PCT's strategic and operational planning. The process is summarised in Fig. 1.

PCT analysts compiled a programme budget, locally dubbed the Evidence Bank, using activity, cost, needs, performance, quality, and user experience data. They analysed the data to produce a number of recommendations, which were debated at a meeting of the PCT's Executive Team with representatives of Plymouth Hospitals NHS Trust and Plymouth City Council's Adult Social Care Department. This resulted in the adoption of nine SIPs for NHS Plymouth (see Appendix 2).

A small group of staff developed and piloted a set of prioritisation criteria, which were based on the values of NHS Plymouth as published in its Strategic Framework (see Appendix 3). The Professional Executive Committee (PEC) approved the criteria, thereby involving clinicians and directors (Mitton & Donaldson, 2004).

Nine multidisciplinary Health Programme Groups, comprising clinicians and managers from a range of disciplines, developed proposals and business cases for initiatives to deliver the SIPs, using the Evidence Bank to identify potential quality or productivity improvements. Finance staff created a financial template to enable the net present value of each initiative's short- and long-term financial impacts to be calculated (Law, 2004). A multidisciplinary panel scored each initiative against the prioritisation criteria. Finally, the PEC debated the suggested initiatives on the basis of their scores against the prioritisation criteria and financial information, involving clinicians and directors from NHS Plymouth and Plymouth Hospitals NHS Trust in selecting the final set of initiatives for implementation.
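The scoring, net present value and trade-off steps described above can be sketched in code. The initiatives, criteria, weights and 3.5% discount rate below are illustrative assumptions, not NHS Plymouth's actual criteria or figures; the sketch simply shows how a weighted benefit score and an NPV can be combined to rank options for marginal analysis.

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    name: str
    scores: dict       # criterion -> score awarded by the scoring panel
    cash_flows: list   # annual net financial impact, year 0 first
                       # (negative = requires investment, positive = releases resources)

def npv(cash_flows, discount_rate=0.035):
    """Net present value of an initiative's short- and long-term financial impacts."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

def weighted_score(initiative, weights):
    """Weighted benefit score against the prioritisation criteria."""
    return sum(weights[c] * s for c, s in initiative.scores.items())

# Hypothetical criteria and weights for illustration only.
weights = {"health_gain": 0.5, "equity": 0.3, "feasibility": 0.2}

initiatives = [
    Initiative("Expand community rehab",
               {"health_gain": 8, "equity": 7, "feasibility": 5}, [-200, 50, 80, 80]),
    Initiative("Decommission low-value procedure",
               {"health_gain": 3, "equity": 4, "feasibility": 8}, [0, 120, 120, 120]),
    Initiative("Redesign outpatient follow-up",
               {"health_gain": 6, "equity": 5, "feasibility": 7}, [-50, 60, 60, 60]),
]

# Rank by benefit score; marginal analysis then trades off low-scoring
# resource-releasing items against high-scoring items that need investment.
ranked = sorted(initiatives, key=lambda i: weighted_score(i, weights), reverse=True)
for i in ranked:
    print(f"{i.name}: score={weighted_score(i, weights):.1f}, NPV={npv(i.cash_flows):.0f}")
```

A panel using a list like this would fund the highest-scoring investments by releasing resources from the lowest-scoring disinvestment candidates, which is the allocative-efficiency step of marginal analysis.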
Evaluation methods
We formulated research questions on the basis of a conceptual
framework, which was informed by a review of the literature on
implementing PBMA and issues raised in discussions with stakeholders (see Appendix 4) (Miles & Huberman, 1994).
Qualitative methods are particularly suited to process evaluation. While quantitative methods can tell you what has happened,
qualitative methods are better for illuminating why things have
happened, what effect this has had, and how things could be
improved (Clarke & Dawson, 1999). We undertook semi-structured
expert interviews with staff involved in the prioritisation process,
as they were best placed to describe its effectiveness and to suggest
improvements. One-to-one interviewing allows in-depth exploration of individuals views and experiences. The semi-structured
approach ensures coverage of all aspects of the research question
dened a priori, while enabling the participant to bring his/her own
perspective to bear, often revealing important unexpected ndings
(Flick, 2002; Marshall & Rossman, 1995; Posavac & Carey, 1992).
The main researcher for this evaluation was a Health Economist
employed as a permanent member of NHS Plymouth staff, who had
also led the design and implementation of the prioritisation process. She was responsible for developing the sampling strategy and
interview guides, conducting all interviews, recording these using a
digital Dictaphone, overseeing the verbatim transcription of recordings, and leading the data analysis. We adopted a purposive
sampling strategy (Silverman, 2010) and selected 13 from a possible
26 staff members in order to represent different roles within the
process, roles within the organisation (e.g. clinician, director,
manager), and functions (e.g. finance, primary care, public health).
We developed different interview guides for participants with
different roles, varied the wording and order of the questions to
allow the interviews to unfold naturally, and spontaneously added
additional questions to probe new themes that emerged during the
interviews (see Appendix 5) (Flick, 2002).
We identified the key themes emerging from the interviews
using a thematic coding process (Flick, 2002). Although the process
was primarily inductive, there was also a deductive element, as the
research questions inuenced some themes. We noted each new
theme, and gradually organised the themes into categories using
the qualitative analysis software NVIVO. We explored how the
themes related to one another by applying the resulting coding
frame back onto the data (Coffey & Atkinson, 1996). Analytic rigour
was enhanced by checking the interpretation of the data with the
second author and with research participants subsequent to
interview.
The fact that the interviews and analysis were conducted by the
main architect of the prioritisation process could constitute a risk of
biasing the results: the interviewer may have a vested interest in a
favourable result, and participants may feel uncomfortable criticising the process. Previous research into PBMA implementation,
however, suggests that this is mitigated by the fact that the interviewer had worked closely with the research participants, as a
colleague, to develop and implement the process (Patten, Mitton, &
Donaldson, 2006). From the researcher's point of view, the aim was to develop the process over a number of years, rather than getting it right first time, enabling criticism from colleagues to be perceived as a constructive factor in achieving a long-term vision. This was assisted by keeping a reflexive research diary, which included
personal reactions to critical comments. Participants were assured
that constructive criticism was welcome, and central to improving
the process. Interview error is common in evaluation research, as
staff may avoid expressing views of which managers may disapprove (Marshall & Rossman, 1995). Reassurance was provided by
ensuring that all participants were aware of confidentiality.
Research participants were advised that their views on the
process were being sought in order to evaluate the process and its
implementation, and to inform future improvements. All participants signed consent forms prior to interview (Gray, 2004). No
person-identifiable information was used. Digital recordings and
transcripts were password-protected, recordings were deleted
once the transcriptions had been checked, and all transcripts were
anonymised.
Results
All 13 people approached were happy to participate in the interviews, although one was unable to attend. The interviews ranged
from 35 to 80 min in length, and their content varied considerably
depending upon the interests and concerns of each interviewee.
Those interviewed included: SIP leads, i.e. the service improvement
and commissioning managers who led the teams responsible for
developing the initiatives; clinicians, finance, performance and
contracts staff, who contributed to the development of the initiatives; those who ran the implementation of the prioritisation
process; and members of the PEC, who were responsible for
agreeing the nal list of prioritised initiatives. Interviewees
comprised staff responsible for the acute care contract, primary
care, mental health and learning disability services, and public
health. The results for each dimension of the conceptual framework
that related to the outcomes of the prioritisation process are discussed below.
How satisfied were participants with the process?
People were unanimously happy with the process as a concept.
A key reason for this was that the process was universally considered an improvement. The process was applauded for being robust, structured, evidence based and systematic, in contrast to the PCT's previous approach to resource allocation, which was criticised by several participants, particularly those lower down the organisational hierarchy, for involving decibel rationing. These participants welcomed this more inclusive, evidence-based process as a means of limiting the power of influential individuals to determine priorities. At the same time, the combination of technical and political approaches to prioritisation inherent in the process was valued by some, including both clinicians, who appreciated the opportunity to offer expert opinion.
For some participants the ability to compare initiatives and
trade off investments against disinvestments was an important
feature of the process, while others struggled to grasp the concept.
This difficulty was apparent both during the interviews and during
the implementation, where the concept of trading off proved to be
the most challenging aspect of the process to explain to
participants.
Although participants were satisfied with the concept of PBMA,
several of those who were involved in developing initiatives reported that tight timescales and uncertainty around some aspects
of the process implementation had caused anxiety and pressure.
Fortunately, however, most felt that the PCT had made a good start.
"Even as we were struggling to try and deliver things quickly, all the way through the process I just thought that I actually like what we're trying to achieve."
Participants were confident that the process would be used again
in the future. They did suggest a number of improvements, none of
which were fundamental: better timing, more capacity for developing business cases, improved quality of business cases, more
stakeholder involvement, and better links to capacity planning.
Table 1
Change in hospital activity levels (general and acute), 2009/10–2011/12.

               Non-elective      First outpatient   Elective total   Elective ordinary   Elective daycase
               admissions(a,b)   attendances(c)     admissions       admissions          admissions
England
2009/10        5,235,766         15,276,762         6,895,357        1,628,113           5,267,244
2010/11        5,458,026         15,836,204         7,140,818        1,593,215           5,547,697
2011/12        5,404,048         15,914,410         7,398,177        1,567,446           5,830,730
Total change   +3.21%            +4.17%             +7.29%           −3.73%              +10.70%

NHS Plymouth
2009/10        29,231            74,969             34,403           9,041               25,362
2010/11        27,790            70,937             31,634           7,624               24,010
2011/12        25,675            72,751             33,710           7,494               26,216
Total change   −12.17%           −2.96%             −2.01%           −17.11%             +3.37%
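The total change figures in Table 1 are plain percentage changes between the first and last years of the period, as a quick sketch shows:

```python
def total_change(first_year, last_year):
    """Percentage change between the first and last year of the period."""
    return (last_year / first_year - 1) * 100

# NHS Plymouth elective ordinary admissions, 2009/10 to 2011/12 (Table 1)
print(f"{total_change(9041, 7494):.2f}%")  # -17.11%
```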
[Table: NHS Plymouth activity data, 2009/10–2012/13. Sources: Dr Foster; NHS Northern, Eastern and Western Devon Clinical Commissioning Group.]
References
Clarke, A., & Dawson, R. (1999). Evaluation research: An introduction to principles, methods and practice. London: Sage.
Coffey, A., & Atkinson, P. (1996). Making sense of qualitative data: Complementary research strategies. Thousand Oaks: Sage.
Department of Health. Hospital Activity Statistics 2009/10–2011/12. http://www.dh.gov.uk/en/Publicationsandstatistics/Statistics/Performancedataandstatistics/HospitalActivityStatistics/DH_129868 (Last Accessed 19.11.12).
Department of Health. (2011). 2010/11 programme budgeting PCT benchmarking tool. London: Department of Health. http://webarchive.nationalarchives.gov.uk//www.dh.gov.uk/en/Managingyourorganisation/Financeandplanning/Programmebudgeting/DH_075743 (Last Accessed 01.10.12).
Eddama, O., & Coast, J. (2009). Use of economic evaluation in local health care decision-making in England: a qualitative investigation. Health Policy, 89, 261–270.
Flick, U. (2002). An introduction to qualitative research (2nd ed.). London: Sage.
Gibson, J. L., Martin, D. K., & Singer, P. A. (2005). Priority setting in hospitals: fairness, inclusiveness and the problem of institutional power differences. Social Science & Medicine, 61, 2355–2362.
Gray, D. E. (2004). Doing research in the real world. London: Sage.
Halma, L., Mitton, C., Donaldson, C., & West, B. (2004). Case study on priority setting in rural Southern Alberta: keeping the house from blowing in. Canadian Journal of Rural Medicine, 9, 26–36.
Harrison, A., & Mitton, C. (2004). Physician involvement in setting priorities for health regions. Healthcare Management Forum, 17, 21–27.
Kemp, L., Fordham, R., Robson, A., Bate, A., Donaldson, C., Baughan, S., et al. (2008). Road testing programme budgeting and marginal analysis (PBMA) in three English regions: Hull (Diabetes), Newcastle (CAMHS), Norfolk (Mental health). York: Yorkshire and Humber Public Health Observatory. http://www.yhpho.org.uk/resource/item.aspx?RID=10049 (Last Accessed 01.10.12).
Law, M. A. (2004). Using Net Present Value as a decision-making tool. Air Medical Journal, 23, 28–33.
Marshall, C., & Rossman, G. B. (1995). Designing qualitative research (2nd ed.). Thousand Oaks: Sage.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks: Sage.
Mitton, C., & Donaldson, C. (2004). Health care priority setting: principles, practice and challenges. Cost Effectiveness and Resource Allocation, 2, 3–10.
Mitton, C., Donaldson, C., Waldner, H., & Eagle, C. (2003). The evolution of PBMA: towards a macro-level priority setting framework for health regions. Health Care Management Science, 6, 263–269.
Mitton, C., Patten, S., Waldner, H., & Donaldson, C. (2003). Priority setting in health authorities: a novel approach to a historical activity. Social Science & Medicine, 57, 1653–1663.
Mitton, C., Peacock, S., Donaldson, C., & Bate, A. (2003). Using PBMA in health care priority setting: description, challenges and experience. Applied Health Economics and Health Policy, 2, 121–134.
Patten, S., Mitton, C., & Donaldson, C. (2006). Using participatory action research to build a priority setting process in a Canadian Regional Health Authority. Social Science & Medicine, 63, 1121–1134.
Posavac, E. J., & Carey, R. G. (1992). Program evaluation: Methods and case studies. New Jersey: Prentice Hall.
Silverman, D. (2010). Doing qualitative research: A practical handbook (3rd ed.). London: Sage.
Wilson, E. C. F., Peacock, S. J., & Ruta, D. (2009). Priority setting in practice: what is the best way to compare costs and benefits? Health Economics, 18, 467–478.