MONTE CARLO METHODS IN A NUTSHELL
A short course
MALCOLM SAMBRIDGE
RESEARCH SCHOOL OF EARTH SCIENCES
OCTOBER 2012
COURSE OVERVIEW
AUSCOPE INVERSION LAB
The AuScope ilab is a component of the Australian Geophysical Observing System (AGOS) for the development and distribution of advanced data-inference software tools to the geoscience community (2011-2014).
• Parallelized uniform nested grid search in hypercube and hypersphere
• Direct search in multi-dimensional parameter space (Neighbourhood algorithm, Parallel Tempering)
ilab Software
SOME REFERENCES

Integrals and data fitting
Press, W. H., Teukolsky, S. A., Vetterling, W. T. and Flannery, B. P., Numerical Recipes (3rd ed.), Chapters 7 & 10, Cambridge, 2007. http://www.nr.com

Error analysis
Davison, A. C. and Hinkley, D. (2006). Bootstrap Methods and their Applications. Cambridge: Cambridge Series in Statistical and Probabilistic Mathematics.

Data fitting
Mosegaard, K. and Sambridge, M., Monte Carlo analysis of inverse problems, Inverse Problems, 18, R29-R54, 2002.

There are many resources around. Google and Wikipedia can be a good place to start.
HEALTH WARNING:
THIS SHORT COURSE MAY CONTAIN
SOME MATHEMATICS
'There was a tale about a mathematician who went slightly mad, and was greatly affected by the moon. He was so madly obsessed with the moon that he made a beautiful model of it, and then he became convinced that his model was the real moon, and the thing in the sky merely a figment of the imagination! Need I say more than "Mathematicians, beware of being seduced by your beautiful models".'
‘Beware of mathematicians, and all those who make empty prophecies. The danger already exists that
the mathematicians have made a covenant with the devil to darken the spirit and to confine man in the
bonds of Hell.’
St. Augustine (AD 354-430), De Genesi ad Litteram
‘I think that most of us have met the type of mathematician who is so dazzled by the beauty of his
mathematics that he applies it blindly to all and sundry without adequate analysis of the premises on
which he bases his deductions, the type whose mind has been made over-rigid by pure mathematics! An
important feature of applied mathematics is that it tends to correct this type of mind and, when well
taught, to focus needed attention on the problem of initial premises.’
WHAT ARE MONTE CARLO
METHODS ?
A BROAD CLASS OF COMPUTATIONAL METHODS FOR
SOLVING SEVERAL TYPES OF PROBLEM
PROPERTIES
CAN BE CONCEPTUALLY SIMPLE BUT COMPUTATIONALLY INTENSIVE.
IN SOME CASES THEY ARE THE ONLY WAY FORWARD (WHEN MATHEMATICAL
SOLUTIONS DO NOT EXIST).
MONTE CARLO
A SHORT HISTORY
WHERE IS THAT?
The Monte Carlo Casino
MONTE CARLO: HISTORY
S = k ln W
http://en.wikipedia.org/wiki/Boltzmann_equation
MONTE CARLO
SOLVING PROBLEMS WITH
RANDOM NUMBERS
WHAT ARE RANDOM NUMBERS?
Computers generate a sequence of numbers between 0 and 1 which have
statistical properties that approximate a uniform random sequence.
A seed number is required to start the sequence off (usually a large integer)
a = ran(iseed)
Here’s a sequence from a random number generator with seed = 61254557
[Figure: histograms of the uniform deviates for increasing sample sizes (roughly 600, 6000 and 60000 draws); frequency versus x on (0, 1). The histograms flatten toward uniformity as the sample size grows.]
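The idea above can be sketched in a few lines, with Python's `random` module standing in for the Fortran-style `ran(iseed)` routine; the seed is the one quoted on the slide, and the 10-bin histogram is an assumed binning for checking uniformity.

```python
import random

# Seed the generator, analogous to the Fortran-style a = ran(iseed)
rng = random.Random(61254557)

# Draw uniform deviates on [0, 1) and bin them into a 10-bin histogram
n = 6000
deviates = [rng.random() for _ in range(n)]
counts = [0] * 10
for x in deviates:
    counts[int(x * 10)] += 1

# For a good generator each bin should hold roughly n/10 = 600 counts
print(counts)
```

Each count fluctuates around n/10 with a spread of about √(n/10), which is the statistical behaviour a true uniform random sequence would show.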
Experiment: h red needles cross the line out of n throws (Buffon's needle experiment).
SOLVING PROBLEMS WITH
RANDOM NUMBERS
A major use of MC techniques is in solving integrals
What is the probability that my random dart will hit the dart board?

With A_s the area of the square and A_c the area of the circle, and h hits out of N throws:

P = A_c / A_s = h / N
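The relation P = A_c/A_s = h/N gives an estimate of π, since for a circular board inscribed in the square A_c/A_s = π/4. A minimal sketch (the seed and number of throws are arbitrary choices, not from the slide):

```python
import random

rng = random.Random(12345)

# Throw N random darts uniformly at the square [-1, 1] x [-1, 1]
N = 100000
h = 0
for _ in range(N):
    x = rng.uniform(-1.0, 1.0)
    y = rng.uniform(-1.0, 1.0)
    if x * x + y * y <= 1.0:   # the dart landed on the circular board
        h += 1

# P = A_c/A_s = h/N and A_c/A_s = pi/4, so pi ~ 4 h / N
pi_estimate = 4.0 * h / N
print(pi_estimate)
```

The estimate improves only slowly, with an error shrinking like 1/√N, which is characteristic of Monte Carlo methods.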
I = ∫_D f(x) dx = ∫_D [f(x)/φ(x)] φ(x) dx ≈ (1/N) Σ_{i=1}^{N} f_i/φ_i

where φ(x) is the known sampling density and the error term is given by

σ_MC = { (1/N) [ (1/N) Σ_{i=1}^{N} (f_i/φ_i)² - ( (1/N) Σ_{i=1}^{N} f_i/φ_i )² ] }^{1/2}

Hey presto: you can use this to solve any integral!
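A sketch of the formula above for the 1-D integral of exp(x) over [0, 1], using the uniform sampling density φ(x) = 1 so that each term f_i/φ_i is just f(x_i); the exact answer e - 1 is printed for comparison.

```python
import math
import random

rng = random.Random(2012)

# Integrate f(x) = exp(x) over [0, 1] by sampling from the uniform
# density phi(x) = 1, so each term f_i/phi_i is just f(x_i).
f = math.exp
N = 100000
samples = [f(rng.random()) for _ in range(N)]

mean = sum(samples) / N
mean_sq = sum(s * s for s in samples) / N

I_mc = mean                                       # estimate of the integral
sigma_mc = math.sqrt((mean_sq - mean * mean) / N)  # one-sigma error bar

print(I_mc, sigma_mc, math.e - 1.0)               # exact answer is e - 1
```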
MONTE CARLO INTEGRATION
PRACTICAL
The mass of an irregularly shaped object:

I = ∫∫_R f(x, y) dx dy

[Figure: the object's regions R1 and R3 inside the square from (-1, -1) to (1, 1).]

With no analytical solution we can get a numerical approximation through Monte Carlo integration.
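The slide does not give a formula for the region R, so this sketch uses a hypothetical stand-in: a disk of radius 0.8 inside the square, with an assumed density f(x, y) = 1 + x² + y². The estimate is the square's area times the average of f over uniform points that land inside the region, and the polar-coordinate answer is printed as a check.

```python
import math
import random

rng = random.Random(99)

# Hypothetical stand-in for the irregular object: a disk of radius 0.8
# inside the square (-1,-1) to (1,1), with density f(x, y) = 1 + x^2 + y^2
def density(x, y):
    return 1.0 + x * x + y * y

def inside(x, y):
    return x * x + y * y <= 0.8 ** 2

# I = A_square * (average of f over uniform points that fall inside R)
N = 200000
total = 0.0
for _ in range(N):
    x = rng.uniform(-1.0, 1.0)
    y = rng.uniform(-1.0, 1.0)
    if inside(x, y):
        total += density(x, y)

A_square = 4.0
I_mc = A_square * total / N

# Polar-coordinate check for the disk: I = 2*pi*(r^2/2 + r^4/4), r = 0.8
I_exact = 2.0 * math.pi * (0.8 ** 2 / 2.0 + 0.8 ** 4 / 4.0)
print(I_mc, I_exact)
```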
p(x) = (1/(σ√(2π))) exp( -(x - x₀)² / (2σ²) )

a = gasdev(iseed)

There are many computer codes available to generate Normally distributed random deviates. Often these routines just generate uniform deviates and transform them to Normal deviates, e.g. using the Box-Muller method:

x₁ = √(-2 ln y₁) cos(2π y₂)
x₂ = √(-2 ln y₁) sin(2π y₂)

[Figure: histogram of the deviates x₁, x₂; approximately Normal on (-3, 3).]
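A minimal Python version of the Box-Muller transform above (using 1 - random() so that the logarithm stays finite); the sample mean and standard deviation should come out near 0 and 1.

```python
import math
import random

rng = random.Random(61254557)

def box_muller(rng):
    # Transform two uniform deviates y1, y2 into two Normal deviates.
    # 1 - random() lies in (0, 1], which keeps log(y1) finite.
    y1 = 1.0 - rng.random()
    y2 = rng.random()
    r = math.sqrt(-2.0 * math.log(y1))
    return r * math.cos(2.0 * math.pi * y2), r * math.sin(2.0 * math.pi * y2)

# Generate 10^5 deviates and check the sample mean and standard deviation
xs = []
for _ in range(50000):
    x1, x2 = box_muller(rng)
    xs.extend([x1, x2])

mean = sum(xs) / len(xs)
std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
print(mean, std)
```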
WHY ALWAYS GAUSSIAN ?
Given five arbitrary but independent probability distributions p(xᵢ), i = 1, ..., 5, what is the PDF of their sum, p(X), where X = x₁ + x₂ + x₃ + x₄ + x₅?

[Figure: histogram of 100000 deviates of X; approximately Gaussian.]

The Central Limit Theorem
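A sketch of the Central Limit Theorem at work; as an assumed simple case the five distributions are all uniform on [0, 1], so X should be approximately Normal with mean 5/2 and standard deviation √(5/12) ≈ 0.645.

```python
import math
import random

rng = random.Random(7)

# Five independent distributions; here, as an assumed simple case,
# each x_i is uniform on [0, 1], and X = x1 + x2 + x3 + x4 + x5.
N = 100000
X = [sum(rng.random() for _ in range(5)) for _ in range(N)]

mean = sum(X) / N
std = math.sqrt(sum((x - mean) ** 2 for x in X) / N)

# Central Limit Theorem: X is approximately Normal with
# mean 5 * 1/2 = 2.5 and standard deviation sqrt(5 * 1/12) ~ 0.645
print(mean, std)
```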
MONTE CARLO
UNCERTAINTY PROPAGATION
'He believed in the primacy of doubt, not as a blemish on our ability to know but as the essence of knowing.'
James Gleick on Feynman
MONTE CARLO
UNCERTAINTY PROPAGATION
In the Earth Sciences we often do the following. For example, the outputs y₁, y₂ may represent the age of a zircon, the uplift rate of a coral reef, the location of an earthquake, or anything else. If f(x) is simple then calculus can solve this problem, but as it gets more complicated Monte Carlo methods become a powerful alternative.
MONTE CARLO
UNCERTAINTY PROPAGATION
The general procedure
xᵢ → xᵢ + εᵢ,  (i = 1, ..., n)

This is where the random deviates are added in.
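A minimal sketch of the procedure with a hypothetical forward function f(x) = x² and input x = 10 ± 0.1: perturb the input with a Gaussian deviate, re-run the procedure, repeat, and compare the spread of outputs with the linearized answer σ_y ≈ |f'(x₀)| σ_x = 2.

```python
import math
import random

rng = random.Random(42)

# Hypothetical forward function: the "procedure" applied to the input
def f(x):
    return x * x

# Input x with a Gaussian error: x = 10 +/- 0.1 (one sigma)
x0, sigma_x = 10.0, 0.1

# Monte Carlo propagation: perturb the input, re-run the procedure, repeat
N = 100000
ys = [f(x0 + sigma_x * rng.gauss(0.0, 1.0)) for _ in range(N)]

mean = sum(ys) / N
std = math.sqrt(sum((y - mean) ** 2 for y in ys) / N)

# Linearized (calculus) answer for comparison: sigma_y ~ |f'(x0)| sigma_x = 2
print(mean, std)
```

Unlike the calculus route, nothing here requires f to be differentiable or even simple; only that it can be evaluated many times.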
MONTE CARLO
UNCERTAINTY PROPAGATION
Example estimating the error in an earthquake hypocentre
With travel time data to each seismic station we can calculate a best
fit estimate of the earthquake’s position (x,y,z) and origin time, t.
t₁ → t₁ + ε₁
t₂ → t₂ + ε₂
... etc.

Relocate the earthquake and repeat many times.
MONTE CARLO
UNCERTAINTY PROPAGATION
Earthquake hypocentre location: Results of Monte Carlo error simulation
Data → Data processing → Important result!
Errors in inputs → Errors in outputs
Bootstrap

[Cover: The Travels and Surprising Adventures of Baron Munchausen, by Raspe.]

This idea is often attributed to Raspe's book about Baron Munchausen (1785), 'to pull oneself up by one's bootstraps', but he actually did it with his pigtail!
INDEPENDENT IDENTICALLY
DISTRIBUTED (I.I.D.)
Caution: the bootstrap method can ONLY BE APPLIED when the data are Independent and Identically Distributed (IID) random deviates. This means they are independently drawn from the same probability distribution.

All things being equal, a sequence of outcomes of spins of a roulette wheel is an example of IID data: if a roulette wheel lands on red 20 times in a row it is no more likely to land on red than on black on the next spin, because the spins are independent.
Data set 1 (y₁) → Procedure → Result X₁
Data set 2 (y₂) → Procedure → Result X₂
...
Data set M (y_M) → Procedure → Result X_M

From the results X₁, ..., X_M we could calculate the mean, standard deviation and confidence intervals.

[Figure: histograms of the data sets and of the resulting X values.]

With many repeat data sets we could look at the distribution of X values and estimate uncertainty!
The bootstrap method is a way of generating new data from your existing data.
Bootstrap principle: the relationship between uncollected data and your actual data is the same as that between your data and resamples of your data.

Real world:       uncollected data → collected data          (we don't have this!)
Bootstrap world:  collected data → resamples of the data     (but we do have this!)
By using the data itself we can generate ‘new’ data sets by randomly resampling it with replacement
BOOTSTRAP: AN EXAMPLE
Suppose I have five measured ages of my rock, y = (1.1, 2.4, 1.7, 3.1, 4.5), and the spread in the values is due to some random uncertainty associated with unknown factors in the measurement process. I decide to take the mean of my data as my best estimate of the age (Data → Procedure → Result):

X₀ = (1/5) Σ_{i=1}^{5} yᵢ = 2.56

I would like to know how the errors in the data map into this mean. I do not know anything about the probability distribution of the errors in the data except that each measurement is independent. We have I.I.D. data and can use the bootstrap!

[Figure: bootstrap resamples fed through the same procedure give a histogram of X values and a confidence interval.]
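A sketch of the bootstrap for this example: resample the five ages with replacement, recompute the mean each time, and read the uncertainty off the distribution of bootstrap means (the 2.5/97.5 percentile interval below is one common, assumed choice).

```python
import random

rng = random.Random(1)

# The five measured ages from the example
y = [1.1, 2.4, 1.7, 3.1, 4.5]
X0 = sum(y) / len(y)          # the best estimate: the mean, 2.56

# Bootstrap: resample the data WITH replacement, recompute the mean
B = 10000
boot_means = []
for _ in range(B):
    resample = [rng.choice(y) for _ in range(len(y))]
    boot_means.append(sum(resample) / len(resample))

# The spread of the bootstrap means estimates the uncertainty in X0
boot_means.sort()
lo = boot_means[int(0.025 * B)]    # 2.5th percentile
hi = boot_means[int(0.975 * B)]    # 97.5th percentile
print(X0, lo, hi)
```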
BOOTSTRAP FORMULAE
What can you do with the bootstrap samples?
The beauty of the bootstrap is that it can be applied to ANY type of data errors
(not just Gaussian), and ANY function of data (not just the mean).
Input Output
The number of bootstrap samples needed depends on the number of data, n, and type
of output, x. Rule of thumb: 1000 for a quick look, more for the details.
There are many advanced bootstrap implementations available in statistical packages such as S-Plus, SAS, SPSS and Minitab, or you can write your own.
BOOTSTRAP PRACTICAL
[Figure: elevation (m) versus time (s) data (yᵢ, tᵢ); the best-fit solution and the bootstrap solutions, with the 68% confidence region in the (m₁, m₂) plane.]

rᵢ = yᵢ - m₁⁰ - m₂⁰ tᵢ + (1/2) m₃⁰ tᵢ²

yⱼᵇ = r_{b(j)} + m₁⁰ + m₂⁰ tⱼ - (1/2) m₃⁰ tⱼ²
Bootstrap resampling of residuals gives new data sets and from these we get Bootstrap solutions.
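A sketch of residual bootstrapping on synthetic data; for brevity a straight-line fit y = a + b t stands in for the slide's quadratic trajectory model, but the recipe (fit, compute residuals, add resampled residuals back to the prediction, refit) is the same.

```python
import math
import random

rng = random.Random(5)

# Synthetic (t, y) data from a hypothetical straight line with noise;
# a line stands in here for the slide's quadratic trajectory model.
t = [0.5 * i for i in range(20)]
y = [2.0 + 1.5 * ti + rng.gauss(0.0, 1.0) for ti in t]

def fit_line(t, y):
    # Closed-form least squares for y = a + b*t
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    b = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
         / sum((ti - tbar) ** 2 for ti in t))
    return ybar - b * tbar, b

a0, b0 = fit_line(t, y)
residuals = [yi - (a0 + b0 * ti) for ti, yi in zip(t, y)]

# Residual bootstrap: new data = best-fit prediction + resampled residuals
boot_slopes = []
for _ in range(2000):
    y_b = [a0 + b0 * ti + rng.choice(residuals) for ti in t]
    boot_slopes.append(fit_line(t, y_b)[1])

mean_b = sum(boot_slopes) / len(boot_slopes)
std_b = math.sqrt(sum((s - mean_b) ** 2 for s in boot_slopes)
                  / len(boot_slopes))
print(b0, mean_b, std_b)   # spread of bootstrap slopes = slope uncertainty
```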
MONTE CARLO
BASED METHODS FOR DATA
FITTING
FITTING MODELS TO DATA
Suppose we have a set of unknowns x = (x1 , x2 , x3 , . . . , xm )
Suppose we have a measure of how well the predictions fit the data, e.g.
χ(x) = Σ_{i=1}^{n} (yᵢ - y_{p,i})² / σᵢ²
We want to find the values of the unknowns, x which best fit the data, and so the
data fit problem becomes an optimization problem
We represent our model as a set of unknowns and try to find their values which
fit observations and other criteria well.
...the field of nonlinear optimization

y_p = f(x)

[Figure: the data-fit surface over a 2-D parameter space (unknown 1, unknown 2); gradient-based methods use derivatives of the surface.]
...Parallel Tempering
Genetic algorithms and
Evolutionary computation
(> 1992)
UNIFORM SEARCH
IN NESTED GRIDS
Parameter space
PARAMETER SEARCH
PRACTICAL
Searching for the minimum of the Rosenbrock function by uniform nested grid sampling.

[Figure: the Rosenbrock function and the nested grid samples.]

Finding the parameters that provide the best fit to data is hard when the misfit function is complex.
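A sketch of uniform nested grid search applied to the Rosenbrock function; the grid size, the number of levels and the box-halving rule are assumed choices, not taken from the slide.

```python
# Uniform nested grid search for the minimum of the Rosenbrock function,
# f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, with minimum f = 0 at (1, 1).
def rosenbrock(x, y):
    return (1.0 - x) ** 2 + 100.0 * (y - x * x) ** 2

def nested_grid_search(f, lo, hi, points=11, levels=5):
    best_xy, best_f = None, float("inf")
    for _ in range(levels):
        # Evaluate f on a uniform (points x points) grid over the box
        for i in range(points):
            for j in range(points):
                x = lo[0] + i * (hi[0] - lo[0]) / (points - 1)
                y = lo[1] + j * (hi[1] - lo[1]) / (points - 1)
                fv = f(x, y)
                if fv < best_f:
                    best_f, best_xy = fv, (x, y)
        # Nest: halve the box and re-centre it on the best point so far
        hw = [(hi[k] - lo[k]) / 4.0 for k in range(2)]
        lo = [best_xy[k] - hw[k] for k in range(2)]
        hi = [best_xy[k] + hw[k] for k in range(2)]
    return best_xy, best_f

best_xy, best_f = nested_grid_search(rosenbrock, [-2.0, -2.0], [2.0, 2.0])
print(best_xy, best_f)
```

The Rosenbrock function's curved valley is exactly the kind of complex misfit surface the slide warns about; if the box is shrunk too aggressively, the search can crawl along the valley and miss the minimum entirely.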
RANDOM SEARCH
IN GEOPHYSICS
Press (1968)
RANDOM SAMPLING OF THE
HYPERSPHERE
[Figure: an n-sphere of diameter 2r inscribed in an n-cube of side 2r.]

n     Nc        Vns/Vnc
2     1024      78%
4     1.05E6    31%
10    1.3E30    0.2%
16    1.5E48    3.6E-4%

The curse of dimensionality: the fraction of the cube's volume occupied by the inscribed sphere collapses as the dimension n grows.

[Figure: sampling density versus radius r, for uniform and Gaussian sampling.]

Estimated compute time: sampling only within the inscribed sphere needs fewer samples than the full grid, e.g. N_s = 804 versus N_g = 1024 in the 2-D case.
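The ratios in the table follow from the closed-form expression V_ns/V_nc = π^{n/2} / (2ⁿ Γ(n/2 + 1)); a few lines of Python reproduce the column.

```python
import math

# Ratio of the volume of an n-sphere (radius r) to its enclosing
# n-cube (side 2r): V_ns / V_nc = pi^(n/2) / (2^n * Gamma(n/2 + 1))
def volume_ratio(n):
    return math.pi ** (n / 2.0) / (2.0 ** n * math.gamma(n / 2.0 + 1.0))

for n in (2, 4, 10, 16):
    print(n, volume_ratio(n))
```

Note the ratio is π/4 ≈ 78% for n = 2 but already below 4 × 10⁻⁶ for n = 16, which is why uniform sampling of a hypercube wastes almost all of its effort outside the inscribed sphere in high dimensions.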
NEIGHBOURHOOD ALGORITHM
Motivation
Given the value of the objective function at a set of (random) positions in parameter space, how can we make use of the ensemble as a whole to determine where to sample next?
HOW TO DEFINE
NEIGHBOURHOODS
Nearest-neighbour regions are Voronoi cells (Descartes, 1644; Voronoi, 1908).
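Since a sample's Voronoi cell is just the set of points nearer to it than to any other sample, cell membership reduces to a nearest-neighbour query; a minimal sketch with hypothetical cell centres:

```python
# A point's Voronoi cell is the set of locations closer to it than to any
# other point, so membership reduces to a nearest-neighbour query.
def nearest_cell(point, centres):
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(range(len(centres)), key=lambda k: dist2(point, centres[k]))

# Four hypothetical sample positions in a 2-D parameter space
centres = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]

# (0.2, 0.1) lies in the Voronoi cell of the first sample
print(nearest_cell((0.2, 0.1), centres))
```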
VORONOI CELLS ARE
EVERYWHERE
NA EXPLAINED
Conceptually simple
NA EXPLAINED
A RANDOM WALK INSIDE VORONOI CELLS
[Figure: six stages (1-6) of a random walk inside the Voronoi cells.]
NA EXAMPLE
Mt. Kosciuszko
NA EXAMPLE: INFRASOUND
NA EXAMPLE: SEISMIC RECEIVER
FUNCTIONS
A COMPARISON OF METHODS
Searching for best-fit solutions
versus
Searching for all acceptable-fit solutions
BAYESIAN
TRANS-DIMENSIONAL
INFERENCE
& RESEARCH TRENDS
A PROBABILISTIC FRAMEWORK
Likelihood
The theory of probabilities is at bottom nothing but common sense reduced to calculus.
Laplace, Théorie analytique des probabilités, 1820
[Figure: the PDF before (prior) and after (posterior) the measurement.]

Gaussian distribution:

p(m) = (1/(σ√(2π))) exp( -(m - m̄)² / (2σ²) )

Where we start (Prior PDF):  p(m) = (1/√(2π)) exp( -(1/2)(m - 10.0)² )

Measure m = 11.2 ± 0.5 (Likelihood):  p(d = 11.2|m) = (1/(0.5√(2π))) exp( -2(m - 11.2)² )

Where we end (Posterior PDF):  p(m|d = 11.2) = (1/(√(1/5)√(2π))) exp( -(m - 10.96)² / (2 × 1/5) )
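The numbers on this slide can be checked with the standard conjugate-Gaussian update, in which the posterior precision is the sum of the prior and data precisions:

```python
# Conjugate Gaussian update: prior N(10.0, 1^2), one datum d = 11.2
# measured with standard deviation 0.5.
mu_prior, var_prior = 10.0, 1.0
d, var_data = 11.2, 0.5 ** 2

# Posterior precision is the sum of prior and data precisions
var_post = 1.0 / (1.0 / var_prior + 1.0 / var_data)
mu_post = var_post * (mu_prior / var_prior + d / var_data)

print(mu_post, var_post)   # 10.96 and 1/5, matching the slide
```

Because the datum is four times more precise than the prior, the posterior mean sits much closer to the measurement than to the prior.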
BAYESIAN INFERENCE PRACTICAL
Given numbers of winners in each division of a lottery, calculate how many tickets were sold.
Division i:     1           2          3         4        5       6
Winners dᵢ:     14          169        3059      149721   369543  802016
Win prob. pᵢ:   1/8145060   1/678756   1/36696   1/732    1/300   1/144
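One simple way to attack this, under the assumed model that the winners in division i are Poisson with mean N pᵢ, is the pooled maximum-likelihood estimate N̂ = Σ dᵢ / Σ pᵢ; this is a sketch of one possible answer, not the slide's own solution.

```python
# Winners in each division and the win probability per ticket
d = [14, 169, 3059, 149721, 369543, 802016]
p = [1.0 / 8145060, 1.0 / 678756, 1.0 / 36696,
     1.0 / 732, 1.0 / 300, 1.0 / 144]

# Assumed model: winners in division i are Poisson with mean N * p_i,
# so the pooled maximum-likelihood estimate of tickets sold is
N_hat = sum(d) / sum(p)
print(round(N_hat))
```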
Find an answer by:

[Figure: a parameter space and its maximum, explored by Markov chain Monte Carlo sampling or by gradient optimization.]
M-H acceptance probability:

α = min(1, [p(m₂|d) q(m₁|m₂)] / [p(m₁|d) q(m₂|m₁)])

Issues: efficiency; dimension.

[Examples: a rejection algorithm and the M-H algorithm.]
SIMULATION OF A MULTI-MODAL
PROBABILITY DISTRIBUTION
A sum of two Gaussians
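A sketch of the M-H rule applied to a multi-modal target; the equal-weight mixture of Gaussians at ±3 and the proposal width are hypothetical choices, since the slide does not give parameters. With a symmetric proposal the q terms cancel, so α = min(1, p(x')/p(x)).

```python
import math
import random

rng = random.Random(2718)

# Hypothetical multi-modal target: an equal-weight sum of two Gaussians
# centred at -3 and +3, each with unit standard deviation.
def target(x):
    return (math.exp(-0.5 * (x + 3.0) ** 2) +
            math.exp(-0.5 * (x - 3.0) ** 2))

# Metropolis-Hastings: the Gaussian proposal is symmetric, so the q terms
# cancel and the acceptance probability is alpha = min(1, p(x')/p(x)).
x = 0.0
samples = []
for _ in range(200000):
    x_new = x + rng.gauss(0.0, 4.0)
    # Accept with probability target(x_new)/target(x), without dividing
    if rng.random() * target(x) < target(x_new):
        x = x_new
    samples.append(x)

mean = sum(samples) / len(samples)
frac_right = sum(1 for s in samples if s > 0.0) / len(samples)
print(mean, frac_right)   # mean near 0; roughly half the samples per mode
```

The wide proposal is what lets the chain hop between the two modes; a much narrower proposal would mix poorly and leave the chain stuck near one peak, illustrating the efficiency issue noted above.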
TRANSDIMENSIONAL INVERSION
Variable dimension
α = min(1, [p(d|m₂, k₂) p(m₂|k₂) p(k₂) q(m₁, k₁|m₂, k₂)] / [p(d|m₁, k₁) p(m₁|k₁) p(k₁) q(m₂, k₂|m₁, k₁)] × |J|)

k₂ = k₁ ± 1,    m(x) = Σ_{j=1}^{k} mⱼ φⱼ(x)
EXAMPLE REGRESSION
'Make everything as simple as possible, but not simpler.' (Einstein)
Birth-death example: 1-D regression data, partition modelling.

[Figure: data on 0 ≤ x ≤ 10 with the true model, the ensemble average and the data. Properties of the ensemble may be examined, e.g. the uncertainty in model dimension.]
REGRESSION
WITH TRANSDIMENSIONAL MCMC
Dynamically fit the heights and positions of discontinuities: Bayesian sampling of partition models.

[Figure: sampled partition models with N = 6, 11, 8 and 3 cells; the mean solution versus the true solution.]

No explicit regularization; self-adaptive.
MCMC PRACTICAL
1-D regression problems with multiple partitions and unknown noise
[Figure panels: the original signal and data (xᵢ, yᵢ); Bayesian sampling results; estimating the data noise.]
1-D SPATIAL INFERENCE
[Figure: a 1-D layered model m(z). Depth z increases through interfaces z₁, ..., z_{i-1}, zᵢ; the (i-1)-th layer has value m_{i-1}, and the i-th layer has value mᵢ and thickness hᵢ.]
[Figure: depth section (10-80 m) showing the uncertainty in discontinuity position. Courtesy R. Brodie, M. Hartley.]
NOISE AND MODEL COMPLEXITY
BOTH VARIABLE
L(d|m) = (1/(σ√(2π))ᴺ) exp( -Σ_{i=1}^{N} (dᵢ - gᵢ(m))² / (2σ²) )

Transdimensional with fixed data uncertainty, mᵀ = (f, x, k)ᵀ, versus transdimensional with variable data uncertainty, mᵀ = (f, x, σ, k)ᵀ: k and σ are related.

[Figure: A) σ_estimated = 4: model too complex. B) σ_estimated correct: model complexity OK, data fit good. C) σ_estimated = 30: model too smooth, data under-fit. When model complexity and data noise are both variable, the estimated model complexity (number of partitions) and the estimated data noise σ are both accurate.]
GEOCLIMATE EXAMPLE
mᵀ = (fᵢ, σᵢ, x, k)ᵀ

Data from a 6 m core of the Hongyuan peat bog, located on the eastern Qinghai-Tibetan plateau (from Large et al. 2009). Geochemical proxies for constraining the Holocene evolution of the East Asian monsoon.

[Map (80˚-120˚E, 20˚-40˚N): the East Asian winter monsoon, East Asian summer monsoon and Indian summer monsoon, with the Hongyuan site marked.]

[Data and results: Total C (g/int./m²), δ13C (‰), N (wt %), C (wt %) and H (wt %) versus depth (0-600 cm), with the probability of a change point, p(change point), versus depth. Data noise differs between proxies, and the number of change points is not known in advance. From Gallagher et al. (2010).]
Unknowns include the number of layers and the noise magnitude and correlation length.

[Figure: waveform versus time (-5 to 30 s); b) dispersion data, group velocity 3.0-4.0.]

[Maps of Australia (110˚-150˚E, 10˚-45˚S): Tikhonov solutions for different smoothing choices, using a grid of 50 x 35 B-spline nodes.]
[Maps (9˚-45˚S, 110˚-150˚E): 20 Voronoi meshes.]

Birth-death McMC with unknowns mᵀ = (f, x, k)ᵀ. Point-by-point averaging of the output gives an ensemble mean, covariance and resolution measures.
2-D TOMOGRAPHY
THE WISDOM OF THE CROWD
Optimization solution (1750 cells + regularization) versus the ensemble mean of variable-cell models, for a synthetic "True Data" test.

[Maps (110˚-150˚E): the two solutions, velocity scale 2.0-3.4. Number of cells versus iteration: over 4 x 10⁵ iterations the sampler settles at roughly 48-62 cells, far fewer than the 1750-cell optimization model.]

Uncertainty estimates for free.
mᵀ = (v, σᵢ, x, k)ᵀ

[Figure: posterior PDFs p(n|d), p(a|d) and p(b|d) for the data-uncertainty hyperparameters λ, a and b; maps (110˚-150˚E) of velocity, cell density and the output error in the ensemble mean. From Bodin et al. (2011).]
AUSMOHO EXAMPLE
Constraining the Moho depth using seismic reflection, refraction, receiver functions, tomography, historical seismicity and gravity (oceans).
Linear inversion with fixed parametrization (Tkalčić et al., 2002) versus transdimensional probabilistic sampling (mobile in position and variable in complexity).
Malinverno, A. & Briggs, V., 2004. Expanded uncertainty quantification in inverse problems: Hierarchical Bayes and empirical Bayes,
Geophysics, 69, 1005.
Climate histories
Hopcroft, P., Gallagher, K., & Pain, C., 2007. Inference of past climate from borehole temperature data using Bayesian Reversible Jump
Markov chain Monte Carlo, Geophys. J. Int., 171(3), 1430–1439.
Hopcroft, P., Gallagher, K., & Pain, C., 2009. A Bayesian partition modelling approach to resolve spatial variability in climate records from
borehole temperature inversion, Geophys. J. Int., 178(2), 651–666.
Thermochronology
Gallagher, K., Charvin, K., Nielsen, S., Sambridge, M., & Stephenson, J., 2009. Markov chain Monte Carlo (MCMC) sampling methods to
determine optimal models, model resolution and model choice for Earth Science problems, Marine and Petroleum Geology, 26(4), 525–535.
Stratigraphic modelling
Charvin, K., Gallagher, K., Hampson, G., & Labourdette, R., 2009. A Bayesian approach to inverse modelling of stratigraphy,
part 1: method, Basin Research, 21(1), 5–25.
Geochronology
Jasra, A., Stephens, D., Gallagher, K., & Holmes, C., 2006. Bayesian mixture modelling in geochronology via Markov chain Monte Carlo,
Mathematical Geology, 38(3), 269–300.
Geochemistry
Gallagher, K., Bodin, T., Sambridge, M., Weiss, D., Kylander, M., & Large, D., 2011. Inference of abrupt changes in noisy geochemical records
using Bayesian Transdimensional changepoint models, Earth and Planetary Science Letters, doi:10.1016/j.epsl.2011.09.015.
SOME PUBLICATIONS ON TRANS-DIMENSIONAL INVERSION
Bodin, T. & Sambridge, M., 2009. Seismic tomography with the reversible jump algorithm,
Geophys. J. Int., 178(3), 1411–1436.
Bodin, T., Sambridge, M., and Gallagher, K., 2009. A self-parameterising partition model approach to tomographic inverse problems,
Inverse Problems, 25, 055009, doi:10.1088/0266-5611/25/5/055009.
Sambridge, M., Bodin, T., Reading, A., and Gallagher, K., 2010. Data inference in the 21st Century: Some ideas from outside the box,
Australian Soc. of Exploration Geophysics 21st International Geophysics Conference and Exhibition, Extended Abstracts, 22-26 August, Sydney,
Australia.
Reading, A., Bodin, T., Sambridge, M., Howe, S. and Roach, S., 2010. Down the borehole but outside the box: innovative approaches to
wireline log data interpretation, Australian Soc. of Exploration Geophysics 21st International Geophysics Conference and Exhibition,
Extended Abstracts, 22-26 August, Sydney, Australia.
Reading, A. M., Cracknell, M. J., Sambridge, M., and Forster, J. G., 2011. Turning geophysical data into geological information, or, why a broader
range of mathematical strategies is needed to better enable discovery, Preview, 140, 24-29, doi:10.1071/PVv2011b151p24.
T. Bodin, M. Sambridge, H. Tkalcic, P. Arroucau, K. Gallagher, and N. Rawlinson, 2012. Transdimensional Inversion of Receiver Functions
and Surface Wave Dispersion, J. Geophys. Res., 117, B02301. doi:10.1029/2011JB008560.
Brodie, R. C. and Sambridge, M., 2012. Transdimensional Monte Carlo Inversion of AEM Data
Australian Soc. of Exploration Geophysics 22nd International Geophysics Conference and Exhibition,
Extended Abstracts, 26-29 February, Brisbane, Australia.
Bodin, T., Sambridge, M., Rawlinson, N., and Arroucau, P., 2012. Transdimensional tomography with unknown data noise,
Geophys. J. Int., 189, 1536-1556, doi: 10.1111/j.1365-246X.2012.05414.x.
MONTE CARLO
We have seen that Monte Carlo methods are, in essence, computational techniques that use (pseudo-)random numbers, with applications to integration, uncertainty propagation, bootstrap error analysis, data fitting and Bayesian inference.
MONTE CARLO PRACTICALS
1. Monte Carlo integration
[Figure: regions R1 and R3 inside the square with corner (-1, -1).]

2. Bootstrap example
[Figure: the best-fit solution and bootstrap solutions in the (m₁, m₂) plane, with 68%, 95% and 99% confidence contours.]

3. Parameter search