
Lecture Notes: Models – Stochastic Models
Dr Shane Whelan, FFA, FSAI, FSA

Chapter 1: Introduction to (Actuarial) Modelling


“I have yet to see any problem, however complicated, which, when
you looked at it in the right way, did not become still more
complicated.”

Poul Anderson,
Science fiction writer,
New Scientist, 25th September, 1969.
Models
A model is a simple, stylised caricature of a real world system or process. A model is not
designed to capture the full complexity of reality, but rather to capture the essence or key
features of the system. Accordingly, as Poul Anderson observed above, building a model
requires simplification of complex reality. The power of the model comes from the fact
that it does not faithfully reflect reality in all its sophistication but throws into relief how
the key influences determine the state of the system.

Models can be used to understand how a system will evolve in the future. More
pragmatically, models are used to predict how the process might respond to given
changes, thus enabling the results of possible actions to be assessed. Using a model, we can
explore the consequences of our actions, allowing us to select the action that leads
to the most desirable outcome.

Rather than build a model, the alternative way of studying a system is to experiment and
observe how the real world system reacts to changing influences. This approach, though,
is often too slow, too risky, or too expensive – or simply unethical.

More philosophically, a model aids the organisation of empirical observations in such a
way that the consequences of that organisation can be deduced. It highlights what is relevant
and shows, at times, the need for detail.

Definition: A model is a simplification of a real system, facilitating understanding,
prediction and perhaps control of the real system.

Models can take many forms. Analog models, for instance, use a set of physical
properties to represent the properties of the system studied. The models treated here
may be termed abstract, mathematical, or computer-based models.

Example 1: A life office that sells life assurance to individuals wants to model the
number and size of claims in each future year so it can set up suitable
reserves.

Here the nature of the problem prohibits the wait-and-see attitude: we need
to set up reserves now. The model will require inputs such as the age, sex,
and other influences determining the mortality experience of policyholders,
the interest that can be earned on reserves and the likely future expenses of
administering the policies.

Example 2: The central bank wants to control inflation but not curtail economic growth
in an economy. Its chief tool to achieve its ends is that it can adjust the
interest rate on short-term deposits.

The central bank will need a macro-econometric model of the economy,
incorporating the short-term interest rate as an independent variable. Outputs
would include both inflation and GDP growth. Other variables that might
also be included would be current and past GDP, inflation, unemployment
rates, etc.

Clearly, it would be too expensive (in terms of the opportunity cost of lost
potential growth) to experiment with the economy rather than build such a
model.

Example 3: An astronomer wants to know the location of Mars at each future period.

A very accurate model would be based on Newton’s Law of Gravitation and
the Laws of Motion, with inputs the masses of Mars and the Sun (and
perhaps some neighbouring planets) and the current location and momentum
of each.

Most modelling exercises cannot achieve this level of accuracy as either the
relationship between the driving variables is not fully understood or the
inputs cannot be measured with sufficient accuracy.

“The astonishing success of celestial mechanics in predicting
the behaviour of the solar system has set a standard of
predictability that is impossible for models of more open
systems [i.e., free from external influences] to attain. Our
standards for models might not be so high [in terms of
accuracy] if the center of the solar system were twin stars rather
than a single dominant sun.”
James Hickman (1997)

Example 4: An investor wants to model the maximum value that a share will attain in the
next two years, so that s/he may sell it.

Consider, as an exercise, the inputs that might be required to build such a
model.

The above examples show that good modelling requires a thorough understanding of the
system modelled.

As noted above, models are built for a purpose – either to understand a phenomenon or
to help anticipate how the system will evolve under several different scenarios. The
purpose or objective of the model is paramount in assessing the adequacy or otherwise
of the model. In short, a model is satisfactory if it meets the objectives of the modelling
exercise satisfactorily. Note that the ‘best’ model does not generally coincide with the
most accurate model: there is a need to balance cost with benefits. In actuarial
applications, cost (which includes timeliness) typically is the key constraint.

Categorizing and Decomposing Models
Models can be categorized as to whether they are deterministic or stochastic.

A deterministic model has a unique output for a given set of inputs. The output is not a
random variable or, more strictly, takes a single value (a degenerate random variable) for
each input.

On the other hand, the output of a stochastic model is a (non-degenerate) random variable.
Some of the inputs may themselves also be random variables.

A deterministic model can be seen as a special case of a stochastic model – a degenerate
stochastic model. Better, a stochastic model can be seen as a richer form of a
deterministic model where not only an estimate of the output is given but also the range
of uncertainty about that estimate. Stochastic models give a range of outputs, each with an
associated probability.

Example 5: It is desired to model the salary progression of an actuarial student from
graduation (time 0) until retirement in 44 years’ time. The following model
has been proposed for the salary level at time t years since graduation:

Salary(t) = €25,000 e^(0.05t)

A graph of Salary(t) against time t is given below.

[Figure: Salary(t) = €25,000 e^(0.05t) plotted against t for t = 0 to 44 years; salary axis from €0 to €250,000.]

This model is deterministic because Salary(t) is a function of t – for each t it takes a
single value (a constant), not a (non-degenerate) random variable.

On the other hand, if the following model was used:

Salary(t) = €25,000 e^(0.05t) + X_t
where
X_t ~ N(0, (500t)^2)

then the model is stochastic, as the value of Salary(t) at each future time is a
random variable. Salary(t) is graphed against t for several different possible
outcomes below.
[Figure: several simulated outcomes of Salary(t) = €25,000 e^(0.05t) + X_t plotted against t for t = 0 to 44 years; salary axis from €0 to €250,000.]

Note that the distinction between the two is that the latter has several values – a
distribution of values – associated with each t.
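To make the distinction concrete, here is a minimal simulation sketch of both versions of the salary model. It is an illustration only: the choice of Python/NumPy, the number of simulated paths and the random seed are incidental, and the X_t are drawn independently across t, which the model statement above does not specify.

```python
import numpy as np

np.random.seed(1)                     # incidental choice, for reproducibility only

t = np.arange(0, 45)                  # years since graduation, t = 0, 1, ..., 44

# Deterministic model: a single value of Salary(t) for each t.
deterministic = 25_000 * np.exp(0.05 * t)

# Stochastic model: Salary(t) = 25,000 e^(0.05t) + X_t with X_t ~ N(0, (500t)^2).
# Each call returns one possible outcome (one sample path).
def simulate_salary_path(t):
    noise = np.random.normal(loc=0.0, scale=500.0 * t)   # standard deviation grows with t
    return 25_000 * np.exp(0.05 * t) + noise

paths = np.array([simulate_salary_path(t) for _ in range(5)])

# At retirement (t = 44) the deterministic model gives one number;
# the stochastic model gives a distribution of values around it.
print(f"Deterministic Salary(44): {deterministic[-1]:,.0f}")
print("Five simulated Salary(44) outcomes:", np.round(paths[:, -1], 0))
```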

Models can be decomposed into two main components. The structural part of the model
establishes the relationship between the variables modelled (the inputs) so as to
determine the functioning of the system (the outputs). These relationships are generally
expressed in logical or mathematical terms. The complexity of a model is a function of the
number of variables modelled and the form of relationship posited between them. The
other part of the model is the parameters, that is, the estimated value of the fixed inputs of
the model. The parameters are often estimated from past data, using appropriate
statistical techniques, but can also include current observation, subjective assessment, or
other forms of estimation.
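To make the split between structure and parameters concrete, the sketch below takes the exponential salary form of Example 5 as the structural part and estimates its two parameters (starting salary and growth rate) from past data. The data here are synthetic and the least-squares fit on the log scale is just one possible estimation technique; none of these choices come from the notes.

```python
import numpy as np

rng = np.random.default_rng(4)

# Structural part (the assumed form): Salary(t) = base * exp(growth * t),
# i.e. log Salary(t) = log(base) + growth * t.
# The parameters 'base' and 'growth' are the fixed inputs to be estimated.

# Hypothetical past data: salaries observed at years 0..9 (synthetic here).
t_obs = np.arange(10)
observed = 25_000 * np.exp(0.05 * t_obs) * np.exp(rng.normal(scale=0.02, size=10))

# Estimate the parameters by least squares on the log scale.
growth_hat, log_base_hat = np.polyfit(t_obs, np.log(observed), deg=1)

print(f"estimated growth rate:     {growth_hat:.3f}")             # close to 0.05
print(f"estimated starting salary: {np.exp(log_base_hat):,.0f}")  # close to 25,000
```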

Models can also be sub-divided into whether time is modeled as being discrete or
continuous. Accordingly, we might have a discrete time stochastic model or process or a
continuous time stochastic model or process. Similarly, the state space – meaning the set of all
possible values (or states) the process can take – can be modelled with a discrete set or a
continuous set. With two ways of modelling time and another two ways of modelling the
state space this gives a fourfold classification of stochastic models.

Note that the decision of which of the four model types above to use in a particular
situation has more to do with the purpose of the model than the underlying real system
being modelled. For instance, the price a security can take is clearly discrete (as the
smallest change is a cent), yet in many modelling situations the price is assumed to be
continuous – the state space taken as the positive real numbers. This is done for
modelling convenience.
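As a small illustration of two cells of this fourfold classification, the sketch below simulates a discrete-time, discrete-state process (annual claim counts) alongside a discrete-time, continuous-state process (a share price observed yearly). The Poisson and lognormal choices are my own toy examples, not part of the notes; their continuous-time counterparts would be a Poisson process and geometric Brownian motion respectively.

```python
import numpy as np

rng = np.random.default_rng(2)        # incidental seed, for reproducibility

n_years = 10

# Discrete time, discrete state space:
# the number of claims arising in each year, modelled here as Poisson counts.
claims_per_year = rng.poisson(lam=3.0, size=n_years)

# Discrete time, continuous state space:
# a share price observed at yearly intervals, modelled here as a lognormal
# random walk (a discrete-time analogue of geometric Brownian motion,
# which would be a continuous-time, continuous-state model).
log_returns = rng.normal(loc=0.05, scale=0.2, size=n_years)
share_price = 100.0 * np.exp(np.cumsum(log_returns))

print("Claim counts (discrete states):   ", claims_per_year)
print("Share prices (continuous states): ", np.round(share_price, 2))
```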

Building a Model
“It is clear, then, that the idea of a fixed method [in building scientific
models], or a fixed theory of rationality, rests on too naïve a view of man
and his social surroundings…it will become clear that there is only one
principle that can be defended under all circumstances and in all stages of
human development. It is the principle: anything goes.”
Paul Feyerabend (1993), Against Method, (3rd Edition) Verso Press,
London. Quote is from pp. 18-19.

Feyerabend, the philosopher of science, claims that there is no unique method for
building scientific models – anything goes. Accordingly, it is not possible to give a formulaic
way of building a model. But we can give helpful hints.

One overall tip is to have modest ambitions. It is difficult and time-consuming to come
up with a half-way decent model – even for a relatively straightforward system. The
econophysicist Bertrand Roehner classified the complexity of models by the number of
distinct objects simultaneously modelled and the nature of their interactions. He concludes
that science has had notable successes in modelling phenomena of first and second level
complexity but has, as yet, no model of level 3 or higher.

Orders of Complexity in Modelling

Level 1 - Two-body problem
    e.g., gravity, light through a prism, etc.
Level 2 - N identical bodies with local interaction
    e.g., Maxwell-Boltzmann thermodynamics,
    the Ising model of ferromagnetism
Level 3 - N identical bodies with long-range interaction
Level 4 - N non-identical bodies with multiple interactions
    e.g., modelling markets,
    modelling economic systems generally,
    general actuarial modelling

The history of science gives us no example of a complex problem of Level 3 or 4
being adequately modelled.
(Adapted from Roehner, B.M. (2002), Patterns of Speculation: A Study in Observational
Econophysics, Cambridge University Press)

As all actuarial applications are Level 4 in the above ordering, the history of science
underlines the need to be modest both in the aims of the model and in the accuracy expected
of its forecasts. In particular, actuaries must not be too dismissive of scientific
models and their failure to faithfully reproduce the complexity of reality, but adapt the
insights to enrich their modelling (see, for instance, Whelan (2006) for a discussion of the
tension historically between pragmatic actuaries and theoretical financial economists).

We can give further and more pragmatic advice to help in building a model. A key
constraint in building models in a business context is that they must be completed on
time and within budget. The following ten steps set out a logical and practical approach
to building models in such a context. Note that typically one cycles between the steps a
few times before completing the modelling exercise.

The 10 Step Guide to Building a Model

1. Set well-defined objectives for the model.
2. Plan how model is to be validated
o i.e., the diagnostic tests to ensure it meets the objectives
3. Define the essence of the structural model – the 1st order approximation.
Refinement, if necessary, can come later.
4. Collect and analyse the data to estimate the parameters in the model.
5. Involve experts on the real world system to get feedback on conceptual model.
o A Turing-type test (explained in lectures)
6. Decide how to implement model
o e.g. C, Excel, some statistical package.
o Often random number generator needed.
7. Write and debug program.
8. Test the reasonableness of the output from the model
o and otherwise analyse output.
9. Test sensitivity of output to input parameters
o We do not want a chaotic system (explained in lectures) in actuarial
applications.
10. Communicate and document results and the model.

Finally, the model needs to be constantly updated in the light of new data and other
changes. This might be regarded as the 11th point above: to monitor changes and update
the model in the light of these changes.

The step-by-step guide makes modelling seem a routine activity. It is anything but…it
requires insight, diligence, and patience. The mathematician turned modeller Bernard
Beauzamy gives an excellent summary of the challenges facing a modeller in his short
essay, Real Life Mathematics, Irish Math. Soc. Bulletin 48 (Summer 2002), 43–46. Some
quotes from his essay are given below to lend colour and context to our earlier guide.

“It is always our duty to put the problem in mathematical terms, and this part of
the work represents often one half of the total work…”

“My concern is, primarily, to find people who are able and willing to discuss with
our clients, trying to understand what they mean and what they want. This
requires diplomacy, persistence, sense of contact, and many other human
qualities.”

“Since our problem is “real life”, it never fits with the existing academic tools, so
we have to create our own tools. The primary concern for these new tools is the
robustness.”

“… real life mathematics do not require distinguished mathematicians. On the
contrary, it requires barbarians: people willing to fight, to conquer, to build, to
understand, with no predetermined idea about which tool should be used.”

Advantages of Modelling
Modelling can claim all the advantages of the scientific programme over any other –
logical, critical, and evidence-based study of a phenomenon that builds, often
incrementally, to a body of knowledge. Models offer a structured framework to update
our knowledge, as a model is only superseded when a better one comes along.

Models help us in our study of complex systems, including stochastic systems that are
otherwise not tractable mathematically (in closed form). Models allow the consequences
of different policy actions to be assessed – in a shorter timeframe and at less
expense than alternative methods of assessment.

However, the modelling exercise has pitfalls that must be guarded against. A check-list of
such drawbacks is given below.

Checklist of Pitfalls in Modelling

1. Model building requires considerable investment of time and expertise...so not
free.
2. Often time-consuming to use – many simulations needed and the results must be
analysed.
3. Sometimes difficult to interpret output from model.
4. The model is only as good as the parameter inputs – so need to critically assess the
quality and credibility of the data.
5. Models are often not especially good at optimising outputs but are better at
comparing results of input variations.
6. Impressive-looking models (especially complex ones) can lead to overconfidence
in model.
7. Must understand the limitations of the model to apply it properly.
8. Must recognise that a model may become obsolete with a change in
circumstances.

Computers & Modelling
The computer is revolutionising modelling, in particular actuarial modelling. Hickman &
Heacox (1999) gives an excellent overview of how computers shaped actuarial science and
actuarial practice in the couple of decades following World War II.

The first generation of electronic business computers (say, the UNIVAC computer of
1951, the IBM 650 in 1955 or the earlier IBM 702/705 machines) were used as
calculators – performing repetitive calculations. The first of these UNIVAC machines
was bought by the Census Bureau, the second by A.C. Nielsen Market Research and the
third, for actuarial purposes, by the Prudential Insurance Company. By the end of the
1950s the life assurance industry was further advanced in re-engineering their businesses
to harness the potential of computers than any other industry in the US (and therefore,
the world).

The first applications of electronic computers to actuarial work were to do similar
calculations to those that had been done manually, but now faster, with less approximation, and with
less chance of error. The computers, though extraordinarily limited and tedious to use
compared to modern machines, made more practical the building of stochastic models,
explored using simulation. An early example of such an analysis was Boermeester’s paper
of 1956 estimating the distribution of costs of an annuity. While simulations had been
done previously (see Chapter 3), such computing power made the technique more
practical. Simulation was to develop as an important technique in modelling from this
time and actuaries, such as Phelim Boyle, have played an important role in its
development and dissemination.

The second generation of computers (say the IBM 360 series) were used not only as
calculators but also as real time databases – for airline reservations processing and
inventory control. Once again, actuaries were quick to exploit their new capabilities, by
semi-automating the back office of insurance companies and using them to help in
valuation work. In fact, from the early 1950s, it was apparent that the advent of such
computers necessitated the re-engineering of life office organisations, and actuaries played
a key role in planning and managing the changes.

Subsequent generations of computers have developed even greater uses. Significantly,
they now aid in modelling of all kinds of things, from the design of cars and airplanes
to the design of the next generation of computer chips. J. Bradford DeLong noted
the general use of computers in modelling and, in particular, the spreadsheet program as
a general modelling (or ‘what-if’) tool:

“The value of this use as a what-if machine took most computer
scientists and computer manufacturers by surprise…nobody before Dan
Bricklin programmed Visicalc had any idea of the utility of a spreadsheet
program…Indeed, the computerization of America’s white-collar offices
in the 1980s was largely driven by the spreadsheet program’s utility – first
Visicalc, then Lotus 1-2-3, and finally Microsoft Excel.”
DeLong, J.B. (2002), Productivity Growth in the 2000s. Working Paper
(Draft 1.2), University of California at Berkeley and NBER. Quote is
from pp. 35-36.

Again, such increased utility has been exploited in actuarial applications by, for instance,
simulating the future profitability, and the associated risks, of life offices (the so-called
‘model life office’). This latter stage also saw the introduction of new assurance products
such as the unit-linked policies from the early 1970s and the flexible “universal life”
policies from the late 1970s. The cost of administering such products, with their embedded
options and choices, was brought down by the computer to a price customers were
willing to pay. Rieder (1948), quoted in Hickman & Heacox (1999), had earlier foreseen
this later evolution:

“If the new electronic machinery, with its tremendous computing and
memory capacity, had been available from the outset, we might have
developed life contracts and procedures along entirely different lines…it
might have been possible to design one policy which would have been
flexible enough to meet every policyholder’s insurance needs for the rest
of his lifetime.”

The impetus to change is not slowing as computer speeds rise and memory costs
continue to fall exponentially. Actuarial science, the science of the possible in risk
protection, has been revitalised by the new possibilities – both of modelling
risks previously intractable and of developing products to transfer them efficiently.


Stochastic modelling, made feasible by computing speed, has allowed us to price investment
guarantees and to contemplate even modelling the complete set of risks faced by a company,
so-called ‘enterprise risk’. The computer, as a cost-efficient record-keeper, has allowed
transaction sizes to fall and transaction numbers to increase – enabling ever greater volumes of risk
to be transferred to traditional intermediary institutions or to the capital markets. The change is
a revolution in financial services generally and is far from complete. Perhaps, with all the
new developments, actuarial practice must take leave of some traditional products – such as
the with-profits policy and the defined benefit pension promise – which appear
increasingly anachronistic in the brave new world.

Evaluation of Suitability of a Model
“And what is good, Phædrus,
And what is not good...
Need we ask anyone to tell us these things?”
Plato, The Phaedrus (and quoted in Robert M. Pirsig's Zen and the Art of
Motorcycle Maintenance)

On near-completion of the model, the key question is whether the proposed model is
‘fit-for-purpose’. This requires a critical appraisal of the model. A helpful checklist of
considerations that might form part of the evaluation is set out below.

Checklist for evaluation of model


1. Evaluate in context of objectives and purpose to which it is put.
2. Consider the data and techniques used to calibrate the model, especially
estimation errors. Are the inputs credible?
3. Consider correlation structure between variables driving the model.
4. Consider correlation structure of model outputs.
5. Consider the continued relevance of model (if using previously developed
model).
6. Consider the credibility of outputs.
7. Be alert to the dangers of spurious accuracy.
8. Consider the ease of use and how results can be communicated.

The checklist above is by no means complete. The objective of the model is clearly
paramount and the checklist must be structured so that the model is evaluated in that
context. Some further considerations are set out below that are often important in
evaluating models for actuarial applications.

Further evaluation checklist for actuarial models


1) Consider the short and long run properties of model
i. Are the coded relationships stable over time?
ii. Should we factor in relationships that are second order in the short-term
but manifest over the long-term?
2) Analysing the output –
i. Generally by statistical sampling techniques…but beware: the observations
are, in general, correlated, so the IID assumption is not, in general, valid.
ii. Use failures in the Turing-type (or Working) test to improve the model.
3) Sensitivity Testing
i. Check that small changes to inputs produce small changes to outputs. Check that
results are robust to the assumed statistical distribution of the inputs (a minimal
sketch is given after this list).
ii. Explore and, perhaps, expand on key sensitivities in the model.
iii. Use optimistic, best estimate, and pessimistic assumptions.
4) Communication & documentation of results
i. Take account of knowledge and background of audience.
ii. Build confidence in the model so that it is seen as a useful tool.
iii. Outline limitations of models.
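The following is a minimal sketch of the sensitivity test in point 3(i), applied to the deterministic salary model of Example 5: the growth parameter is perturbed and the effect on the projected salary at retirement is compared. The ±10% perturbation size and the choice of output statistic are illustrative assumptions only.

```python
import numpy as np

def projected_salary(growth_rate, t=44, base=25_000):
    """Deterministic salary model from Example 5: base * exp(growth_rate * t)."""
    return base * np.exp(growth_rate * t)

base_rate = 0.05
for shock in (-0.10, 0.0, +0.10):                    # perturb the input by +/- 10%
    rate = base_rate * (1 + shock)
    print(f"growth rate {rate:.3f} -> Salary(44) = {projected_salary(rate):,.0f}")

# A small change in the growth rate produces a proportionate (not explosive)
# change in the output, so the model is not chaotic in this input. For a
# stochastic model the same exercise would be repeated on summary statistics
# of the simulated output (e.g. its mean and quantiles).
```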

Macro-Econometric Modelling: A Case Study


[This section is largely based on Economic Models and Policy-Making. Bank of England,
Quarterly Bulletin, May 1997.]

A case study can give focus to some of the above comments. Many actuarial models
require as an input some estimate of future general economic growth, inflation, and
interest rates. Macro-economic models, or more accurately macro-econometric models,
provide forecasts of these variables, so it is useful to evaluate the reliability of these
forecasts. Moreover, the history of developing macro-econometric models provides us
with valuable lessons that are pertinent to any model building exercise. Accordingly, we
take as a case study the development of macro-econometric models in the UK.

Macro-economics and Macro-econometrics


Macro-economics and macro-econometrics study the same thing. The subject matter for
both is the modeling of aggregate measures for the whole economy, such as: inflation
and monetary growth, economic growth, capital investment, unemployment, trade,
interest rates both long and short, levels of stock markets, exchange rates. Macro-
economics contents itself with describing the structural form of the relationship between
the variables (see earlier) while macro-econometrics takes the structural form and
completes the model by estimating all unknown parameters and the variance of any error
terms from historic economic data. Clearly, macro-econometric models are more useful
to policymakers (and actuaries).

So how good is macro-economics at describing the form of the relationships between
macro-economic variables? Then, building on this, how good are the calibrated
econometric models?

The answer to both questions is the same: pretty much awful. The four decades during which
macro-econometrics has been informing economic policy decisions have been a period of
constant retreat into uncertainty. With so many and such big mistakes made in its brief
history, the development of macro-econometrics provides us with a great many lessons.

Model Building
It all began in the 1960s, and the fad was most acute in the UK. The macroeconomists
thought they had cracked how the economy worked. They thought they had a reliable
model of the whole economy. Macro-economists were high on their enthusiasm and
convinced governments to spend heavily on building complex models of the economy
that could answer all questions. The logic was that, as one thing in an economy is
connected to everything else, either one models everything or nothing.

Below we graph how two key macro-economic variables in the UK economy evolved
from the end of WWII to the late 1960s.

[Figure: UK Inflation and Real Economic Growth, 1948-1968 (annual percentage rates; vertical axis from -5% to 30%). Series shown: UK Real GDP Growth and UK Inflation.]

Growth was the primary output variable to be maximised – the ‘target’ variable as they
called it. The input variables were those under the control of the government – short-
term interest rates, taxation, public spending, etc. Inflation can be seen as an unintended
output which, if it rises, breaks down the order in the system. In short, rising inflation
is a key undesirable output.

Real GDP growth was healthily positive over the 1950s and much of the 1960s and
inflation was benign. In short, it was a great time. Being slightly cynical, maybe the
certainty that economists had in their modelling was connected to the extraordinary
economic growth and well-being of the 1960s: everyone likes to take credit for a good
thing.

In fairness to the UK economists, there was enthusiasm worldwide that all the major
problems in macro-economics were solved or nearly solved. There was a dominant
economic doctrine, with few challengers. As Richard Nixon declared: “We are all
Keynesians now”. And we find legislation giving credence to the belief that economic
growth could be controlled in the US, the UK, and even monetarist Germany, with its 1967
Act to Promote Economic Stability & Growth.

By the early 1970s, four main models had been developed in the UK with generous support
from public funds. These were at the:

• Bank of England
• Treasury
• NIESR (the National Institute of Economic and Social Research)
• London Business School

These were the original black boxes. They were a labyrinth of small equations, with the
output of one forming the input of another – a total of between 500 and 1,000 equations.
The models were calibrated and initialized to prevailing conditions. Now, if one of the
inputs – say, interest rates or taxation or government spending – was altered, the model
would predict its effects, as the change ricochets through all the equations to ultimately
impact on everything else. The key output variables monitored at that time were
economic growth, employment and inflation.

The Models in Practice
All four models were fundamentally the same. Each was based on the dominant
economic theory of the time – Keynesianism. But just because it was popular did not make
it right…. Extending the earlier graph of the two key macro-economic outputs up to
1980 shows that the trajectory taken by the UK economy was diametrically opposite to that
intended.

[Figure: UK Inflation and Real Economic Growth, 1948-1980 (annual percentage rates; vertical axis from -5% to 30%). Series shown: UK Real GDP Growth and UK Inflation.]

The graph above shows inflation taking off into double digits while economic growth
crosses the x-axis twice, indicating two recessions in the 1970s. What went wrong?

Errors in Modelling
There are three generic sorts of errors associated with any modelling exercise. They are,
in ascending order of importance (a small illustration follows the list):
• General uncertainty in model, represented by the error term in the model. This is
because the model does not model everything relevant – just the main drivers –
and the effect of the rest is gathered in this term.

• Parameter mis-estimation – the form of the model is right but the parameters are
not. This can lead to the modelled output being systematically above or below its
true value.

• Model misspecification – the form of the model is wrong so that what is
observed in practice might not be anticipated at all by the model.
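To illustrate the distinction between these error types, the toy sketch below (my own construction, not a macro-econometric model) generates data from a quadratic trend plus noise, then forecasts it with (a) the correct functional form whose parameters are estimated from a short sample, so its errors come from general uncertainty and parameter mis-estimation, and (b) a misspecified straight-line model, whose forecasts drift systematically away from the truth.

```python
import numpy as np

rng = np.random.default_rng(3)

# "True" process: a quadratic trend plus noise.
# The noise term is the irreducible 'general uncertainty'.
t_all = np.arange(30)
true_curve = 2.0 + 0.5 * t_all + 0.05 * t_all**2
data = true_curve + rng.normal(scale=1.0, size=t_all.size)

# Fit on the first 15 observations, forecast the remaining 15.
fit_t, fit_y = t_all[:15], data[:15]

# (a) Correct functional form, parameters estimated from limited data:
#     forecast errors reflect parameter mis-estimation plus general uncertainty.
quad_coef = np.polyfit(fit_t, fit_y, deg=2)
quad_forecast = np.polyval(quad_coef, t_all[15:])

# (b) Misspecified model: a straight line fitted to the same data.
lin_coef = np.polyfit(fit_t, fit_y, deg=1)
lin_forecast = np.polyval(lin_coef, t_all[15:])

print("Mean absolute forecast error, correct form :",
      round(float(np.mean(np.abs(quad_forecast - true_curve[15:]))), 2))
print("Mean absolute forecast error, misspecified :",
      round(float(np.mean(np.abs(lin_forecast - true_curve[15:]))), 2))
```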

The macro-econometric models made all three errors. First, captured in the general
uncertainty in the model, was the oil price shock which pushed models to extremes. This
could be put down to bad luck. Second, Bretton Woods, the fixed exchange rate system,
failed at the start of the 1970s so that exchange rates became floating. With no data to
model floating exchange rates, the models just ignored it. This might be regarded as
parameter mis-estimation. Finally, all the models suffered the most serious defect of all:
model misspecification.

The 1974-1975 stagflation – that is, inflation rising at the same time as economic
growth falters – simply was not thought possible under Keynesian economics. Remember,
macroeconomic theory should give the right structure of the model. Here, though,
something happened that was not thought possible in theory. There was model failure –
as soon as the models had begun to be used to inform policy decisions.

The Finger-pointing
The models could be, and were, adjusted, and some new models came along (e.g., Liverpool’s
new classical model and the City University Business School Model). But the models
were getting even more complicated and it was increasingly difficult to know how to fix
them when they went wrong. The models were not learning by experience and
sometimes the outputs were just plain silly. We can see this best from some quotes.

Treasury forecasters [in 1980] were predicting the worst economic downturn since the Great Slump
of 1929-1931. Yet they expected no fall in inflation at all. This clearly was absurd and underlined
the inadequacies of the model.

Nigel Lawson, The View from No. 11.

The users could no longer explain why the model was producing the results it was or why
one model gave one result and another model a significantly different one. Soon
everyone turned cold on the big macro-econometric models. They turned on the
modellers:

Modelling was seen as a second-rate activity done by people who were not good enough to get proper
academic jobs.

Earlier expectations of what models might achieve had evidently been set too high, with unrealistic
claims about their reliability and scope.
Quoted from Economic Models and Policy-Making. Bank of England,
Quarterly Bulletin, May 1997, p. 165 & p. 164.

Lessons Learned
We can make two observations:
• Economists adopted a very optimistic view of their creations when selling the
blueprints to the government agencies for financing and, thus committed, they
could not judge the output impartially.
• There were large vested interests in the models by those who signed the cheques.
Who was going to shout that the emperor had no clothes on?

These big monolithic models faded out of existence in the early 1980s. When
Keynesianism was challenged by the monetarism of Milton Friedman, there was no
longer a widespread consensus on the structural form of the models and so no uncontentious
assumptions on which to build macro-econometric models.

Yet policymakers, governments in the fiscal sphere and central banks in the monetary
one, still require some idea of the impact of their policies. As Alan Blinder, a former
Vice-Chairman of the Board of Governors of the Federal Reserve, remarked: “you can
get your information about the economy from admittedly fallible statistical relationships,
or you can ask your uncle” (Blinder, A.S. (1999)).

However, lessons have been learned and the ambitions of macro-econometric modellers
have been reduced. Models in use today satisfy four criteria:

• Models and their outputs can be explained in a way consistent with basic
economic analysis.
• The judgement part of the process is made explicit
• Models must be able to produce results consistent with historic economic
episodes.
• Results must be consistent over time (e.g., parameters must not be sensitive to
the period studied)

The above criteria tend to result in small-scale models. Smaller, more stylised models are
now the order of the day – recognising that the uncertainty inherent in the underlying
economy attaches not just to parameter values but to the economic structure too. Models
are now seen as flexible friends.

In fact, the most complicated model used in practice is probably the Dornbusch
Overshooting Model, that is, an equation linking five key variables – growth, money,
prices, exchange rates, and interest rates. Models should get no more elaborate than that
and, in fact, the modeller should begin with just 2 or 3 parameters – introducing just
enough parameters to reduce the error term in the model to a level acceptable for the
purpose. The user of the model must then allow for the risk of model misspecification
(or as economists say to deflect blame from their modelling, ‘structural shifts in the
economy’). No model can be used blindly when so much uncertainty surrounds how the
underlying economy functions.

The Brave New World
So what has policy decision-making been like under this new, less ambitious way of
modelling? Well, projecting forward from 1980, the time when Nigel Lawson was ridiculing
the output of the old-style Treasury model, we see how the two key variables evolved in
the UK.

[Figure: UK Inflation and Real Economic Growth, 1948 to the late 1990s (annual percentage rates; vertical axis from -5% to 30%). Series shown: UK Real GDP Growth and UK Inflation.]

The graph shows that there was no great depression as the Treasury model predicted but,
on the contrary, 1981 was pretty much the turning point with inflation falling and growth
picking up. And the picture has become brighter since the early 1990s with strong
growth and low inflation.

How much credit can the new style of modelling take for the strength of the UK
economy? First, the modellers do not confidently make the terrible, unthinking blunders
– they are more modest and more circumspect. This is the major breakthrough. Indeed,
the new models are less sure that they are modelling cause-and-effect and can be seen, at
one level, as a pithy short-hand of the past – simple summaries of past behaviour.
Second, structural changes are still occurring in the economies – meaning that even the
appropriate form of macroeconomic model is changing. In fact, the US economy – the
largest in the world – entered a new phase of growth in 1995/1996 when it extended its
growth cycle without inflation rising, despite the unemployment rate falling below the
6% non-accelerating inflation rate of unemployment (NAIRU). Almost a decade later the
debate is still raging as to what is going on – with the equity market taking a very positive
view and then, dramatically (as is the way of the market), a not-so-positive view. Some
economists conjecture that the ubiquitous use of computers in modelling at the micro and
macro level has helped to shift sustainable economic growth in the developed
economies higher.

In Conclusion
So what are the lessons for other modellers from the history of econometric modelling?
They can be summarised simply:

• Have limited ambitions for models
• Build disposable models
• Small, stylised, parsimonious models are beautiful.

Concluding Thoughts
Modelling obviously requires a sound knowledge of the process to be modelled but it
also requires something from the art of the modeller. The good modeller will have the
insight of knowing what is possible, the toolbox of robust methods, and the experience
of managing projects and teams to deliver on time and within budget a model that is ‘fit-
for-purpose’.

Actuaries have deep but narrow expertise in building models for actuarial applications.
However, there is evidence to suggest that this modelling experience can be applied in
broader fields. Hickman & Heacox (1998) recounts how the modelling expertise of
American actuaries was called upon during World War II. General purpose modellers
were required in the war effort to model anything from the optimum depth at which to explode an
underwater bomb, to devising optimum aircraft search strategies for enemy
submarines, to how to calibrate radar systems. Actuaries were recruited to the original
Anti-Submarine Warfare Operations Research Group (ASWOG), which, on
demonstrable success, widened its scope to all US Navy operations when it became the
Operations Research Group (ORG), and then the Operations Evaluation Group (OEG).
This group comprised some 80 scientists (many very distinguished, such as the Nobel
Prize winner William Shockley), of whom no fewer than 18 were actuaries. This was the
origin of the discipline Operations Research, defined as the application of the scientific
method to providing decision-makers with a quantitative basis for decisions on
operations under their control. Donald Cody, one of the actuaries with the original
ASWOG, suggests that the modeller must pay unswerving attention to the objectives and
“must treat truly important problems with the simplest available techniques and not seek
any old problems which enable use of fancy techniques” (op. cit., p. 8).

We conclude with a final remark on models from one of the twentieth century’s most
insightful modellers:

“One thing I have learned in a long life: that all our science, measured
against reality, is primitive and childlike – and yet it is the most precious
thing we have.”

Albert Einstein (1879-1955)

Further Reading

Beauzamy, B. (2002), Real Life Mathematics, Irish Math. Soc. Bulletin 48 (Summer), 43–46.
Currently available on the web at: www.maths.tcd.ie/pub/ims/bull48/M4801.pdf

Blinder, A.S. (1999), Central Banking in Theory and Practice, MIT Press.

Bowie, D. et al. (1996), Models, Useful Models, and Reality. The Actuary (UK), December
1996, 27-28.

DeLong, J.B. (2002), Productivity Growth in the 2000s. Working Paper (Draft 1.2),
University of California at Berkeley and NBER.

Feyerabend, P. (1993), Against Method, (3rd Edition) Verso Press, London.

Hickman, J.C. (1997), Introduction to Actuarial Modeling, North American Actuarial Journal,
1, 3, 1-5.

Hickman, J.C. & Heacox, L. (1998), Actuaries in History: The Wartime Birth of Operations
Research. North American Actuarial Journal, 2, 4 (Oct.), 1-10.

Hickman, J.C. & Heacox, L. (1999), Actuaries at the Dawn of the Computer Age. North
American Actuarial Journal, 3, 3, 1-13.

Faculty/Institute of Actuaries Core Reading for Subject 103: Stochastic Modelling - Chapter 1.
Edinburgh, London & Oxford.

Roehner, B.M. (2002), Patterns of Speculation: A Study in Observational Econophysics.
Cambridge University Press.

Whelan, S. (2006), Not Such a Great Controversy: Actuarial Science & Financial Economics. The
Actuary (US), Society of Actuaries (US). December 2006/January 2007.

Whitley, J. (1997), Economic Models and Policy-Making. Bank of England, Quarterly Bulletin,
May 1997.

© Shane Whelan 2006
