
Session 1A: Overview


John Geweke
Bayesian Econometrics and its Applications

August 13, 2012

Motivation

Motivating examples
Drug testing and approval
Climate change
Mergers and acquisitions
Oil refining

Common features of decision-making

1. Must act on the basis of less than perfect information.
2. Must be made at a specified time.
3. Important aspects of information bearing on the decision, and the consequences of the decision, are quantitative. The relationship between information and consequences is not deterministic.
4. Multiple sources of information bear on the decision.

Investigators and clients

Investigator: Econometrician who conveys quantitative information so as to facilitate and thereby improve decisions

Client:
Actual decision-maker (known)
Another scientist (known or anonymous)
A reader of a paper (anonymous)

Communicating effectively with clients

1. Make all assumptions explicit.
2. Explicitly quantify all of the essentials, including the assumptions.
3. Synthesize, or provide the means to synthesize, different approaches and models.
4. Represent the inevitable uncertainty in ways that will be useful to the client.

An example

An example: value at risk


$p_t$: market price of portfolio, close of day $t$

Value at risk:
Specify $t^* = t + s$, $s$ fixed
Define $v_{t,t^*}$: $P(p_{t^*} \leq p_t - v_{t,t^*}) = .05$

Return at risk:
$r_t = (p_t - p_{t-1})/p_{t-1}$
$y_t = \log(1 + r_t) = \log(p_t/p_{t-1})$

(Overly) simple model:
$y_t \overset{iid}{\sim} N(\mu, \sigma^2)$
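To make this concrete, here is a minimal Python sketch (not from the slides) of the 5% value at risk implied by the simple model with $s = 1$, treating $\mu$ and $\sigma$ as known; all numerical values are illustrative assumptions.

```python
# Minimal sketch: one-day 5% value at risk under y_t ~ iid N(mu, sigma^2),
# with mu and sigma treated as known. All numbers are assumptions.
import numpy as np
from scipy.stats import norm

mu, sigma = 0.0005, 0.01   # assumed daily log-return mean and std. dev.
p_t = 1_000_000.0          # assumed current portfolio value

# 5th percentile of the one-day log return y_{t+1}
y_05 = norm.ppf(0.05, loc=mu, scale=sigma)

# P(p_{t+1} <= p_t - v) = .05  and  p_{t+1} = p_t * exp(y_{t+1})
# imply v = p_t * (1 - exp(y_05))
v = p_t * (1.0 - np.exp(y_05))
print(f"one-day 5% value at risk: {v:,.2f}")
```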

Observables, unobservables and objects of interest

Putting models in context


George Box: "All models are wrong; some are useful."
John Geweke: "And with inspiration and perspiration they can be improved."

Well-known example: Newtonian physics
Works fine in sending people to the moon.
Doesn't work so well using an electronic navigation system to drive a few kilometers.

A first pass at models (and notation)

$y$: a vector of observables
$\theta$: a vector of unobservables (think widely)

Part of the model:
$p(y \mid \theta)$

This may restrict behavior, but is typically useless if you know nothing about $\theta$.
Examples: the gravitational constant, and the value at risk simple model

Information about unobservables

Representing what we know about $\theta$:
$p(\theta)$

Then, formally,
$p(y) = \int p(\theta)\, p(y \mid \theta)\, d\theta.$

This is potentially useful.

Important part of our technical work this week:
How we obtain information about $\theta$
How $p(\theta)$ changes in response to new information
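A minimal sketch, under an assumed model, of how this integral can be approximated by simple Monte Carlo: draw $\theta$ from the prior and average the likelihood. Here $\theta$ is the mean $\mu$ of iid $N(\mu, 1)$ data with an assumed $N(0, 1)$ prior; the data are simulated for illustration.

```python
# Monte Carlo approximation of p(y) = int p(theta) p(y|theta) d(theta),
# drawing theta from the prior. Assumed model: y_t ~ iid N(mu, 1),
# prior mu ~ N(0, 1). Illustration only, not course code.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
y = rng.normal(0.3, 1.0, size=50)          # illustrative "observed" data

M = 100_000
mu_draws = rng.normal(0.0, 1.0, size=M)    # theta drawn from the prior p(theta)

# log p(y | theta) for each draw, then a numerically stable average
log_lik = norm.logpdf(y[:, None], loc=mu_draws[None, :], scale=1.0).sum(axis=0)
log_p_y = np.log(np.mean(np.exp(log_lik - log_lik.max()))) + log_lik.max()
print(f"Monte Carlo estimate of log p(y): {log_p_y:.3f}")
```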

Conditioning on a model

We have been implicitly conditioning on a model. Let's make this explicit:
$p(y \mid \theta_A, A)$
$p(\theta_A \mid A)$
$\theta_A \in \Theta_A \subseteq \mathbb{R}^{k_A}$

Different models lead to different conclusions.

This week, we shall see how to avoid conditioning on a particular model.
The overriding principle: Use distributions of the things you don't know conditional on the things you do know.

The vector of interest

$\omega$: the vector of interest
Directly affects the consequences of a decision
(We will be more precise in the next session.)

The model must specify
$p(\omega \mid y, \theta_A, A)$
Otherwise, it can't be used for the decision at hand.

Example: $\omega$: $5 \times 1$, value of the portfolio at the close of the next 5 business days

A complete model A

Three components:
$p(\theta_A \mid A)$
$p(y \mid \theta_A, A)$
$p(\omega \mid y, \theta_A, A)$

Implies the joint probability density
$p(\theta_A, y, \omega \mid A) = p(\theta_A \mid A)\, p(y \mid \theta_A, A)\, p(\omega \mid y, \theta_A, A).$

Conditioning and updating

Ex ante and ex post

A critical distinction:
Before we observe the observable, $y$, it is random.
After we observe the observable, it is fixed.

To preserve this distinction:
$y$: ex ante
$y^o$: ex post

Implication: the relevant probability density for a decision based on the model A is
$p(\omega \mid y^o, A)$

This is the single most important principle in Bayesian inference in support of decision making.

Details and notation

Prior density:
$p(\theta_A \mid A)$
Observables density:
$p(y \mid \theta_A, A)$

The distribution of the unobservable $\theta_A$, conditional on the observed $y^o$, has density
$p(\theta_A \mid y^o, A) = \frac{p(\theta_A, y^o \mid A)}{p(y^o \mid A)} = \frac{p(\theta_A \mid A)\, p(y^o \mid \theta_A, A)}{p(y^o \mid A)} \propto p(\theta_A \mid A)\, p(y^o \mid \theta_A, A).$

This is the posterior density of $\theta_A$.
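A minimal Python sketch of this prior-times-likelihood calculation in the conjugate normal case: unknown mean $\mu$ (playing the role of $\theta_A$), known variance, prior $\mu \sim N(m_0, v_0)$. The hyperparameters and data are assumptions for illustration, not course code.

```python
# Conjugate normal posterior: y_t ~ iid N(mu, sigma^2) with sigma known,
# prior mu ~ N(m0, v0). All numerical values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.0
y_obs = rng.normal(0.5, sigma, size=40)   # illustrative observed sample

m0, v0 = 0.0, 10.0                        # assumed prior mean and variance
n = y_obs.size

# standard conjugate result: posterior precision = prior + data precision
v_post = 1.0 / (1.0 / v0 + n / sigma**2)
m_post = v_post * (m0 / v0 + y_obs.sum() / sigma**2)
print(f"posterior for mu: N({m_post:.3f}, {v_post:.4f})")
```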

Being explicit about time

For $t = 0, \ldots, T$ define
$Y_t' = (y_1', \ldots, y_t')$
where $Y_0 = \{\varnothing\}$.

Then
$p(y \mid \theta_A, A) = \prod_{t=1}^{T} p(y_t \mid Y_{t-1}, \theta_A, A).$

This forward recursion is the way we construct dynamic models in economics.
A generalization of time in this context: information
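To make the recursion concrete, here is a short Python sketch (an illustration, not course material) that accumulates $\log p(y \mid \theta) = \sum_t \log p(y_t \mid Y_{t-1}, \theta)$ period by period for an assumed AR(1) model $y_t = \phi\, y_{t-1} + \varepsilon_t$, $\varepsilon_t \sim N(0, \sigma^2)$, conditioning on $y_0 = 0$.

```python
# Forward-recursion log-likelihood for an assumed AR(1) model.
import numpy as np
from scipy.stats import norm

def ar1_loglik(y, phi, sigma):
    """Accumulate log p(y_t | Y_{t-1}, theta) one period at a time."""
    ll, y_prev = 0.0, 0.0          # condition on y_0 = 0
    for y_t in y:
        ll += norm.logpdf(y_t, loc=phi * y_prev, scale=sigma)
        y_prev = y_t
    return ll

rng = np.random.default_rng(4)
y = 0.1 * rng.normal(size=100).cumsum()    # illustrative series
print(f"log-likelihood at phi=0.9, sigma=0.1: {ar1_loglik(y, 0.9, 0.1):.2f}")
```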

Bayesian updating

Suppose $Y_t^o = (y_1^{o\prime}, \ldots, y_t^{o\prime})'$ is available, but $y_{t+1}^o, \ldots, y_T^o$ is not. Then
$p(\theta_A \mid Y_t^o, A) \propto p(\theta_A \mid A)\, p(Y_t^o \mid \theta_A, A) = p(\theta_A \mid A) \prod_{s=1}^{t} p(y_s^o \mid Y_{s-1}^o, \theta_A, A).$

When $y_{t+1}^o$ becomes available, then
$p(\theta_A \mid Y_{t+1}^o, A) \propto p(\theta_A \mid A) \prod_{s=1}^{t+1} p(y_s^o \mid Y_{s-1}^o, \theta_A, A) \propto p(\theta_A \mid Y_t^o, A)\, p(y_{t+1}^o \mid Y_t^o, \theta_A, A).$

The concepts of prior (ex ante) and posterior (ex post) are relative, not absolute.
Bayesian updating changes prior into posterior.
Example: August 13, 2013 closing value of the S&P 500 index
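A small numerical illustration of this relativity (assumed model and prior, not from the slides): update a discretized posterior for $\mu$ one observation at a time, so each period's posterior serves as the next period's prior; by the recursion above this coincides with conditioning on all the data at once.

```python
# Sequential updating on a grid: p(theta | Y_{t+1}) propto
# p(theta | Y_t) * p(y_{t+1} | Y_t, theta). Assumed model: y_t ~ iid N(mu, 1),
# assumed diffuse prior mu ~ N(0, 100). Illustration only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
y = rng.normal(0.5, 1.0, size=30)             # illustrative data

grid = np.linspace(-3.0, 3.0, 601)
dens = norm.pdf(grid, loc=0.0, scale=10.0)    # assumed prior for mu
dens /= np.trapz(dens, grid)

for y_t in y:                                 # today's posterior = tomorrow's prior
    dens *= norm.pdf(y_t, loc=grid, scale=1.0)
    dens /= np.trapz(dens, grid)

post_mean = np.trapz(grid * dens, grid)
print(f"posterior mean of mu after {y.size} updates: {post_mean:.3f}")
```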

Concluding our first session

The probability density relevant for decision making is
$p(\omega \mid y^o, A) = \int p(\theta_A \mid y^o, A)\, p(\omega \mid \theta_A, y^o, A)\, d\theta_A.$

If you've only seen non-Bayesian econometrics, this is really different.
Likelihood-based non-Bayesian statistics conditions on $A$ and $\theta_A$, and compares the implication $p(y \mid \theta_A, A)$ with $y^o$. This avoids the need for any statement about the prior density $p(\theta_A \mid A)$, at the cost of conditioning on what is unknown.
Bayesian statistics conditions on $y^o$, and utilizes the full density $p(\theta_A, y, \omega \mid A)$ to build up coherent tools for decision making, but demands specification of $p(\theta_A \mid A)$.

The conditioning in Bayesian statistics is:
driven by the actual availability of information,
fully integrated with dynamic economic theory.
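As a final illustration (an assumed conjugate-normal setup, not the course's code), this sketch evaluates the decision-relevant density by simulation: draw $\theta_A$ from the posterior, then draw $\omega$ given $\theta_A$; here $\omega$ is taken to be the next observable $y_{T+1}$.

```python
# Posterior predictive by simulation: theta ~ p(theta | y^o), then
# omega ~ p(omega | theta). Assumed model: y_t ~ iid N(mu, 1) with
# sigma known and assumed prior mu ~ N(0, 10). Illustration only.
import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0
y_obs = rng.normal(0.5, sigma, size=40)    # illustrative observed sample

# conjugate posterior for mu under the assumed prior
v_post = 1.0 / (1.0 / 10.0 + y_obs.size / sigma**2)
m_post = v_post * (y_obs.sum() / sigma**2)

M = 100_000
mu_draws = rng.normal(m_post, np.sqrt(v_post), size=M)   # theta | y^o
omega_draws = rng.normal(mu_draws, sigma)                # omega | theta
print(f"posterior predictive mean {omega_draws.mean():.3f}, "
      f"5% quantile {np.quantile(omega_draws, 0.05):.3f}")
```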

Bayesian updating: Practical example

1. Name and institution
2. Do you require formal evaluation of your work in this course?
3. Did you bring a laptop?
4. If so: operating system (e.g. Windows XP, Mac OS X, Linux, ...)?
5. If so: does it have Matlab installed?
6. Have you used mathematical applications software in econometrics (e.g. R, Stata, SAS, ...)?
7. Specifically: Have you used Matlab at all?