
Chapter 1 Mathematical Methods

1.1 Time average vs. ensemble average

time average:
mean: $\overline{x} = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,dt$
mean-square: $\overline{x^2} = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\,dt$
variance: $\sigma_x^2 = \overline{x^2} - \overline{x}^2$
auto-correlation: $\phi_x(\tau) = \overline{x(t)\,x(t+\tau)} = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,x(t+\tau)\,dt$

ensemble average:
mean: $\langle x(t) \rangle = \int x\, f_1(x, t)\,dx$, where $f_1(x, t)$ is the first-order probability density function
mean-square: $\langle x^2(t) \rangle = \int x^2 f_1(x, t)\,dx$
variance: $\sigma^2(t) = \langle x^2(t) \rangle - \langle x(t) \rangle^2$
covariance: $\langle x(t_1)\,x(t_2) \rangle = \iint x_1 x_2\, f_2(x_1, t_1; x_2, t_2)\,dx_1\,dx_2$, where $f_2$ is the second-order probability density function

If the time average equals the ensemble average, the ensemble is called ergodic.
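As a numerical illustration (a minimal sketch, not from the original notes, assuming numpy), the following compares the time average of a single realization with the ensemble average at a fixed time, for the random-phase sinusoid used as Example 2 in Section 1.2; the amplitude, frequency, and sample counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
A, omega = 1.0, 2 * np.pi * 5.0        # arbitrary amplitude and frequency
t = np.linspace(0.0, 100.0, 100_000)   # long record for the time average

# One realization: x(t) = A cos(omega t + theta), theta uniform in [0, 2*pi)
theta = rng.uniform(0.0, 2 * np.pi)
x = A * np.cos(omega * t + theta)

# Time averages over a single realization
print("time-average mean       :", x.mean())            # -> 0
print("time-average mean-square:", (x**2).mean())       # -> A^2/2

# Ensemble averages at one fixed time t0, over many realizations
thetas = rng.uniform(0.0, 2 * np.pi, size=100_000)
x_t0 = A * np.cos(omega * t[123] + thetas)
print("ensemble mean           :", x_t0.mean())         # -> 0
print("ensemble mean-square    :", (x_t0**2).mean())    # -> A^2/2
```

Both averaging procedures agree, as expected for an ergodic ensemble.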

1.2 Stationary vs. non-stationary processes

If the k-th order probability density function is invariant with respect to a shift of the time origin,

$f_k(x_1, t_1; \ldots; x_k, t_k) = f_k(x_1, t_1 + \epsilon; \ldots; x_k, t_k + \epsilon)$,

the process is stationary of order k. If a stochastic process is stationary of every order k = 1, 2, ..., it is strictly stationary.



If $\langle x(t) \rangle$ and $\langle x^2(t) \rangle$ are independent of $t$, i.e. constants, and if the covariance $\langle x(t_1)\,x(t_2) \rangle$ depends only on the time difference $\tau = t_2 - t_1$, the process is wide-sense stationary (weakly stationary).

If $\overline{x} = \langle x \rangle$, the process is ergodic in the mean; if $\phi_x(\tau) = \langle x(t)\,x(t+\tau) \rangle$, it is ergodic in the autocorrelation.

Example 1: a basket full of batteries. Each ensemble member is a constant voltage drawn at random, so the process is stationary but not ergodic: the time average of any one battery generally differs from the ensemble average.

Example 2: a sinusoid with a random phase uniformly distributed in $[0, 2\pi)$ (uniform dist.). This process is ergodic in both the mean and the autocorrelation.
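A companion sketch for Example 1 (again illustrative only, assuming numpy; the voltage range is an arbitrary choice): each ensemble member is a constant in time, so every time average is member-dependent while the ensemble mean is fixed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of "batteries": each member is one constant voltage, drawn once.
voltages = rng.uniform(1.0, 2.0, size=100_000)    # arbitrary voltage range

print("ensemble mean:", voltages.mean())          # -> 1.5
# The time average of any single member is just its own constant voltage:
print("time average of member 0:", voltages[0])   # generally != 1.5
```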

1.3 Basic stochastic processes

1.3.1 Probability distribution functions and characteristic functions

probability mass function (PMF): $P(x)$, for a discrete random variable
z-transform: $P(z) = \sum_{x} P(x)\, z^x$

probability density function (PDF): $f(x)$, for a continuous random variable
s-transform: $F(s) = \langle e^{-sx} \rangle = \int f(x)\, e^{-sx}\,dx$

mean: $\langle x \rangle$
variance: $\sigma^2 = \langle x^2 \rangle - \langle x \rangle^2$
third-order cumulant: $\langle \Delta x^3 \rangle = \langle (x - \langle x \rangle)^3 \rangle$
fourth-order cumulant: $\langle \Delta x^4 \rangle - 3\,\langle \Delta x^2 \rangle^2$
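To illustrate how moments fall out of these transforms, here is a minimal sympy sketch (not part of the original notes); it differentiates a z-transform at z = 1, using the binomial transform of Section 1.3.2 B as the example.

```python
import sympy as sp

z, p, n = sp.symbols("z p n", positive=True)
P = (1 - p + p * z) ** n            # e.g. the binomial z-transform of 1.3.2 B

mean = sp.diff(P, z).subs(z, 1)                    # <k> = dP/dz at z = 1
fact2 = sp.diff(P, z, 2).subs(z, 1)                # <k(k-1)> = d^2P/dz^2 at z = 1
var = sp.simplify(fact2 + mean - mean**2)          # sigma^2 = <k^2> - <k>^2

print(mean)   # n*p
print(var)    # n*p*(1 - p), possibly in an equivalent rearranged form
```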



1.3.2 The Bernoulli process

A. Bernoulli trial

PMF: $P(1) = p$, $P(0) = 1 - p$

z-transform: $P(z) = 1 - p + p z$

$\langle x \rangle = p, \quad \sigma_x^2 = p\,(1 - p)$

B. Binomial distribution

A series of n independent Bernoulli trials with the same probability of success p produces k successes.

z-transform: $P(z) = (1 - p + p z)^n = (1-p)^n + n (1-p)^{n-1} p\, z + \binom{n}{2} (1-p)^{n-2} p^2 z^2 + \ldots$
(the successive terms correspond to no success, one success, two successes, ...)

PMF (read off from the definition of the z-transform): $P(k) = \binom{n}{k} p^k (1 - p)^{n-k}$

$\langle k \rangle = n p, \quad \sigma_k^2 = n p\,(1 - p)$
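A quick Monte Carlo check of these binomial moments (illustrative only, assuming numpy; n and p are arbitrary values):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 20, 0.3                     # arbitrary trial count and success probability

# k successes in n Bernoulli trials, repeated over many ensemble members
k = rng.binomial(n, p, size=200_000)

print("mean    :", k.mean(), "expected:", n * p)             # n*p = 6.0
print("variance:", k.var(),  "expected:", n * p * (1 - p))   # n*p*(1-p) = 4.2
```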

C. Geometric distribution

The number of Bernoulli trials after any one success, up to and including the next success.

PMF: $P(l_1) = (1 - p)^{l_1 - 1}\, p$, $l_1 = 1, 2, \ldots$
($l_1 - 1$ successive failures, followed by the first success)

z-transform: $P(z) = \dfrac{p z}{1 - (1 - p) z}$

$\langle l_1 \rangle = \dfrac{1}{p}, \quad \sigma_{l_1}^2 = \dfrac{1 - p}{p^2}$
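These statistics can be checked by measuring the spacing between successes in a long simulated Bernoulli sequence (a sketch with an arbitrary p, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(3)
p = 0.2                                            # arbitrary success probability

trials = rng.random(2_000_000) < p                 # long Bernoulli sequence
success_idx = np.flatnonzero(trials)
l1 = np.diff(success_idx)                          # trials from one success to the next

print("mean    :", l1.mean(), "expected:", 1 / p)              # 5.0
print("variance:", l1.var(),  "expected:", (1 - p) / p**2)     # 20.0
```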

1.3.3 The Poisson process

A. Poisson distribution

A series of identical and independent Bernoulli trials, one every time interval $\Delta t$, with a probability of success $p = \lambda \Delta t$ for a sufficiently small $\Delta t$.

Number of successes over a time interval $[0, t]$?

$P(k, t + \Delta t) = P(k, t)\,(1 - \lambda \Delta t) + P(k-1, t)\,\lambda \Delta t$ (mutually exclusive histories)

$\dfrac{d}{dt} P(k, t) = -\lambda P(k, t) + \lambda P(k-1, t)$ (continuous limit)

The iterative solution for k = 0, 1, 2, ... with the initial condition $P(k, 0) = \delta_{k0}$ is

$P(k, t) = \dfrac{(\lambda t)^k}{k!}\, e^{-\lambda t}$

With $\mu = \lambda t$ the average number of successful events, the PMF is $P(k) = \dfrac{\mu^k}{k!}\, e^{-\mu}$.

z-transform: $P(z) = e^{\mu (z - 1)}$

$\langle k \rangle = \mu, \quad \sigma_k^2 = \mu$
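The Bernoulli-trial construction can be checked numerically (a sketch assuming numpy and scipy; the rate, interval, and time step are arbitrary values):

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)
lam, t, dt = 3.0, 2.0, 1e-3                       # arbitrary rate, interval, time step
n_steps = int(t / dt)
p = lam * dt                                      # success probability per step

# Count successes over [0, t] for many ensemble members
k = rng.binomial(n_steps, p, size=100_000)

mu = lam * t
print("mean    :", k.mean(), "expected:", mu)     # 6.0
print("variance:", k.var(),  "expected:", mu)     # 6.0
print("P(k=6)  :", (k == 6).mean(), "Poisson:", poisson.pmf(6, mu))
```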

B. Erlang distribution

The time interval $l_r$ between any one success and the r-th success after that.

PDF: $f(l_r) = \dfrac{\lambda\, (\lambda l_r)^{r-1}}{(r-1)!}\, e^{-\lambda l_r}$
($(r-1)$ successful events in $[0, l_r)$, one successful event in $[l_r, l_r + dl_r)$)

For r = 1: $f(l_1) = \lambda\, e^{-\lambda l_1}$ (exponential distribution)

s-transform: $F(s) = \left( \dfrac{\lambda}{\lambda + s} \right)^r$, since $l_r$ is the sum of r independent exponential random variables.

$\langle l_1 \rangle = \dfrac{1}{\lambda}, \quad \sigma_{l_1}^2 = \dfrac{1}{\lambda^2}$

$\langle l_r \rangle = \dfrac{r}{\lambda}, \quad \sigma_{l_r}^2 = \dfrac{r}{\lambda^2}$
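Since $l_r$ is a sum of r independent exponential intervals, a direct simulation check is straightforward (illustrative, assuming numpy; the rate and order are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
lam, r = 2.0, 5                                   # arbitrary rate and order

# l_r as a sum of r independent exponential waiting times
l_r = rng.exponential(1 / lam, size=(200_000, r)).sum(axis=1)

print("mean    :", l_r.mean(), "expected:", r / lam)        # 2.5
print("variance:", l_r.var(),  "expected:", r / lam**2)     # 1.25
```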

C. Addition and random deletion of a Poisson process

i) $w = x + y$: the sum of two independent Poisson random variables.

$P_w(z) = P_x(z)\, P_y(z) = e^{\bar{x}(z-1)}\, e^{\bar{y}(z-1)} = e^{(\bar{x} + \bar{y})(z-1)}$
(compound z-transform, due to the independence of x and y)

$P(w_0) = \dfrac{\bar{w}^{\,w_0}}{w_0!}\, e^{-\bar{w}}, \quad \bar{w} = \bar{x} + \bar{y}$

The sum $w = x + y$ again obeys a Poisson distribution. cf. the weak law of large numbers: the sample mean $M_n = \frac{1}{n} \sum_{i=1}^{n} y_i$ converges to the ensemble mean $\langle y \rangle$ as $n \to \infty$.

ii) Random deletion: starting from an initial Poisson distribution with mean $\mu$, each event is independently retained with probability p (a binomial deletion process). The final distribution is again Poisson, with mean $p\mu$.
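A numerical check of the random-deletion property (illustrative, assuming numpy; μ and p are arbitrary): the surviving count keeps its variance equal to its mean, the Poisson signature.

```python
import numpy as np

rng = np.random.default_rng(6)
mu, p = 10.0, 0.4                                  # arbitrary initial mean, survival probability

n = rng.poisson(mu, size=200_000)                  # initial Poisson random variable
m = rng.binomial(n, p)                             # each event kept with probability p

print("mean    :", m.mean(), "expected:", p * mu)  # 4.0
print("variance:", m.var(),  "expected:", p * mu)  # 4.0 (Poisson: variance = mean)
```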

D. Binomial to Poisson distribution

In the limit of a very small probability of success p with $n p = \mu$ held constant, the binomial distribution approaches the final Poisson distribution:

$P(k) = \binom{n}{k} p^k (1-p)^{n-k} \;\longrightarrow\; \dfrac{\mu^k}{k!}\, e^{-\mu} \quad (n \to \infty,\ p \to 0,\ n p = \mu)$

A sequence of independent Bernoulli trials with a constant and small probability of success produces a Poisson distribution: n independent Bernoulli trials with the probability of success $p = \mu / n$ ($\mu$: constant, $n \to \infty$) is the definition of a Poisson process. Physically, it corresponds to a memoryless system with a very fast internal relaxation.
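The convergence can be quantified by the total variation distance between the two PMFs (a sketch assuming scipy; μ and the n values are arbitrary choices):

```python
import numpy as np
from scipy.stats import binom, poisson

mu = 4.0                                          # fixed mean, np = mu
k = np.arange(30)

for n in (10, 100, 10_000):
    p = mu / n
    # total variation distance between binomial(n, mu/n) and Poisson(mu)
    tv = 0.5 * np.abs(binom.pmf(k, n, p) - poisson.pmf(k, mu)).sum()
    print(f"n = {n:6d}: total variation distance = {tv:.2e}")
```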

1.3.4 The Gaussian process

binomial distribution with n very large and p, 1 - p not very close to zero: there is a pronounced peak at $k_0 = n p$, and $P(k)$ can be considered as a function of a continuous variable $k$.

Taylor series expansion of $\ln P(k)$ in the small deviation $\delta = k - k_0$ about $k_0$, using the binomial PMF and truncating the expansion at second order, gives

$P(k) \simeq \dfrac{1}{\sqrt{2\pi n p (1-p)}}\, \exp\!\left[ -\dfrac{(k - n p)^2}{2 n p (1-p)} \right]$

Gaussian distribution. cf. the central limit theorem: regardless of the PDF of the individual random variables, the sum of n independent, identically distributed random variables converges to a Gaussian PDF as $n \to \infty$.

s-transform: $F(s) = \exp\!\left( -\langle x \rangle s + \tfrac{1}{2} \sigma^2 s^2 \right)$
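A standard central-limit-theorem demonstration (illustrative, assuming numpy; the uniform summands and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100                                            # number of summed variables

# Sum of n i.i.d. uniform random variables (mean 0.5, variance 1/12 each)
s = rng.random((100_000, n)).sum(axis=1)

# Standardize and compare a tail probability with the Gaussian value
zs = (s - n * 0.5) / np.sqrt(n / 12)
print("P(z > 1) empirical:", (zs > 1).mean())      # ~0.1587 for a standard Gaussian
```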


1.4 Burgess variance theorem

n constant, random deletion with survival probability p (deletion probability 1 - p): the surviving particle number m obeys a binomial distribution,

$\langle m \rangle = p\, n, \quad \sigma_m^2 = p (1 - p)\, n$

If the number of incident particles fluctuates, the final particle number does not obey a simple binomial distribution. With an initial distribution $P(n)$ followed by binomial deletion,

$P(m) = \sum_{n} P(n) \binom{n}{m} p^m (1 - p)^{n - m}$

$\langle m \rangle = p \langle n \rangle$

$\sigma_m^2 = p^2 \sigma_n^2 + p (1 - p) \langle n \rangle$ : Burgess variance theorem

The first term is the initial variance reduced by the attenuation factor $p^2$; the second term is the partition noise.
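The theorem is easy to verify numerically for a strongly fluctuating input (a sketch assuming numpy; the geometric input distribution and p are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(8)
p = 0.3                                            # arbitrary survival probability

# Fluctuating input: e.g. a geometric (thermal-like) initial distribution
n = rng.geometric(0.01, size=500_000) - 1          # mean ~99, large variance
m = rng.binomial(n, p)                             # random deletion

expected = p**2 * n.var() + p * (1 - p) * n.mean() # Burgess variance theorem
print("variance of m:", m.var())
print("Burgess value:", expected)
```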

1.5 Fourier transform (analysis)

If x(t) is absolutely integrable, $\int_{-\infty}^{\infty} |x(t)|\,dt < \infty$, the Fourier transform of x(t) exists and is defined by

$X(\omega) = \int_{-\infty}^{\infty} x(t)\, e^{-i \omega t}\,dt$

The inverse relation is

$x(t) = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} X(\omega)\, e^{i \omega t}\,d\omega$

A statistically stationary process is not absolutely integrable, so strictly speaking, its Fourier transform does not exist.


gated function: $x_T(t) = x(t)$ for $|t| \le T/2$ and $x_T(t) = 0$ otherwise; this truncated waveform is absolutely integrable.

1.5.1 Parseval theorem

$\int_{-\infty}^{\infty} x(t)^2\,dt = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} |X(\omega)|^2\,d\omega$ : Parseval theorem

For example, when the left-hand side is the total energy of the waveform, this is called the energy theorem:

$X(\omega)$ : complex amplitude of the $e^{i \omega t}$ component of $x(t)$
$|X(\omega)|^2$ : energy density of $x(t)$ at $\omega$
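The discrete analogue of the Parseval/energy theorem can be checked with the FFT (a sketch assuming numpy; note the factor 1/N that numpy's unnormalized FFT convention requires):

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.normal(size=4096)                          # arbitrary finite-energy record

X = np.fft.fft(x)
lhs = np.sum(x**2)                                 # time-domain energy
rhs = np.sum(np.abs(X)**2) / len(x)                # discrete Parseval: (1/N) sum |X_k|^2

print(lhs, rhs)                                    # equal to numerical precision
```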

1.5.2 Power spectral density

average power of x(t) (Parseval theorem applied to the gated function):

$\overline{x^2} = \lim_{T \to \infty} \dfrac{1}{T} \int_{-T/2}^{T/2} x(t)^2\,dt = \lim_{T \to \infty} \dfrac{1}{2\pi T} \int_{-\infty}^{\infty} |X_T(\omega)|^2\,d\omega$

ensemble average: $\langle \overline{x^2} \rangle = \lim_{T \to \infty} \dfrac{1}{2\pi T} \int_{-\infty}^{\infty} \langle |X_T(\omega)|^2 \rangle\,d\omega$ : ensemble-averaged power of x(t)

power spectral density: $S_x(\omega) = \lim_{T \to \infty} \dfrac{\langle |X_T(\omega)|^2 \rangle}{T}$

For a statistically stationary process this limit exists; for a statistically non-stationary process it does not, and the spectral density retains a dependence on the gating (measurement) time T.


1.5.3 Wiener-Khintchine theorem

Applying the Parseval theorem to the gated function and taking the ensemble average,

$\langle |X_T(\omega)|^2 \rangle = \int_{-T/2}^{T/2} \!\! \int_{-T/2}^{T/2} \langle x(t_1)\, x(t_2) \rangle\, e^{-i \omega (t_2 - t_1)}\,dt_1\,dt_2$

where $\langle x(t_1)\, x(t_2) \rangle$ is the ensemble-averaged auto-correlation (covariance). For a non-stationary process the double integral must be kept as it is; for a stationary process $\langle x(t)\, x(t + \tau) \rangle = \phi_x(\tau)$, and the power spectral density becomes

$S_x(\omega) = \int_{-\infty}^{\infty} \phi_x(\tau)\, e^{-i \omega \tau}\,d\tau$

inverse relation: $\phi_x(\tau) = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} S_x(\omega)\, e^{i \omega \tau}\,d\omega$

The power spectral density and the ensemble-averaged auto-correlation form a Fourier transform pair.

Example 1: White noise. Consider exponentially correlated noise, $\phi_x(\tau) = \overline{x^2}\, e^{-|\tau| / \tau_c}$, with mean square $\overline{x^2}$ and correlation time (memory time) $\tau_c$:

$S_x(\omega) = \dfrac{2\, \overline{x^2}\, \tau_c}{1 + \omega^2 \tau_c^2}$ : Lorentzian

If $\tau_c \to 0$ (infinitesimally short memory time), the power spectrum becomes white.

Example 2: Wiener-Levy process
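The Lorentzian example can be reproduced numerically by filtering white noise into an exponentially correlated record and estimating its unilateral spectrum (a sketch assuming numpy and scipy; the step size, correlation time, and mean square are arbitrary):

```python
import numpy as np
from scipy.signal import lfilter, welch

rng = np.random.default_rng(10)
dt, tau_c, x2 = 1e-3, 0.05, 1.0                    # time step, correlation time, mean square

# AR(1) discretization of exponentially correlated noise
a = np.exp(-dt / tau_c)
noise = rng.normal(scale=np.sqrt(x2 * (1 - a**2)), size=2_000_000)
x = lfilter([1.0], [1.0, -a], noise)               # x[i] = a*x[i-1] + noise[i]

f, S = welch(x, fs=1 / dt, nperseg=8192)           # unilateral PSD estimate (per Hz)
S_lorentz = 4 * x2 * tau_c / (1 + (2 * np.pi * f * tau_c)**2)
print(S[1:6] / S_lorentz[1:6])                     # ratios close to 1 at low frequency
```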

Figure: stationary vs. nonstationary noisy waveforms, showing a statistically stationary noisy waveform x(t) and a statistically nonstationary noisy waveform, both starting at t = 0.

The Wiener-Levy process is the integral of a stationary process y(t): $x(t) = \int_0^t y(t')\,dt'$.

covariance: $\langle x(t_1)\, x(t_2) \rangle = \int_0^{t_1}\!\! \int_0^{t_2} \langle y(t')\, y(t'') \rangle\,dt'\,dt''$

If y(t) is ergodic in the correlation, $\langle y(t')\, y(t'') \rangle = \phi_y(t'' - t')$ and the Wiener-Khintchine theorem applies to y.

If $\phi_y(\tau) = 2D\,\delta(\tau)$ (white noise),

$\langle x(t_1)\, x(t_2) \rangle = 2D \min(t_1, t_2), \quad \langle x^2(t) \rangle = 2D\,t$

D: diffusion constant. x(t) is a cumulative process; its increments over non-overlapping intervals have no correlation.

physical systems and their (y(t), x(t)) pairs:
laser: frequency $\omega(t)$, phase $\phi(t)$
Brownian particle: velocity $v(t)$, position $x(t)$
current-carrying resistor: current $i(t)$, charge $q(t)$

Figure: the autocorrelation function and unilateral power spectrum of a stationary noisy waveform.

Figure: the autocorrelation function and unilateral power spectrum of a nonstationary noisy waveform y(t).

1.5.4 Cross-correlation

x(t), y(t): statistically stationary processes

cross-correlation function: $\phi_{xy}(\tau) = \langle x(t)\, y(t + \tau) \rangle$

cross-spectral density (via the Parseval theorem applied to the gated functions):

$S_{xy}(\omega) = \lim_{T \to \infty} \dfrac{\langle X_T^*(\omega)\, Y_T(\omega) \rangle}{T}$

$S_{xy}(\omega)$ is a c-number (it carries both the amplitude and the phase).

$S_{xy}(\omega) = \int_{-\infty}^{\infty} \phi_{xy}(\tau)\, e^{-i \omega \tau}\,d\tau$ : generalized Wiener-Khintchine theorem

coherence function: $\gamma_{xy}(\omega) = \dfrac{S_{xy}(\omega)}{\sqrt{S_x(\omega)\, S_y(\omega)}}$

$\gamma_{xy} = 1$ : complete positive correlation
$\gamma_{xy} = -1$ : complete negative correlation
$\gamma_{xy} = 0$ : no correlation
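For a numerical feel (a sketch assuming scipy; the signal-to-noise ratios are arbitrary), note that scipy.signal.coherence estimates the magnitude-squared coherence, so the sign of the correlation appears only in the phase of $S_{xy}$:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(11)
n = 200_000
s = rng.normal(size=n)                             # common signal
x = s + 0.5 * rng.normal(size=n)                   # s plus independent noise
y = -s + 0.5 * rng.normal(size=n)                  # anti-correlated copy plus noise

# scipy estimates the magnitude-squared coherence |gamma_xy|^2 in [0, 1];
# the sign information lives in the phase of the cross-spectral density.
f, Cxy = coherence(x, y, fs=1.0, nperseg=4096)
print(Cxy[:5])                                     # ~ (1/1.25)^2 = 0.64 at all frequencies
```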

1.6 Random pulse train

1.6.1 Carson theorem

Figure: a random pulse train, with pulses of amplitudes $a_1, a_2, a_3, \ldots, a_k$ emitted at times $t_1, t_2, t_3, \ldots, t_k$.

random pulse train: $x(t) = \sum_k a_k\, f(t - t_k)$, where the emission times $t_k$ and amplitudes $a_k$ are random variables.

Fourier transform: $X_T(\omega) = F(\omega) \sum_k a_k\, e^{-i \omega t_k}$

power spectral density: $S_x(\omega) = \lim_{T \to \infty} \dfrac{|F(\omega)|^2}{T} \sum_k \sum_m \langle a_k a_m\, e^{-i \omega (t_k - t_m)} \rangle$

i) k = m terms: these give $2 \nu \langle a^2 \rangle\, |F(\omega)|^2$ in the unilateral spectrum, where $\nu$ is the average rate of pulse emission and $\langle a^2 \rangle$ the mean-square of the pulse amplitude.

ii) k ≠ m terms: if the pulse emission times form a Poisson point process and the pulse amplitudes are completely independent, then $\langle a_k a_m \rangle = \langle a \rangle^2$ ($\langle a \rangle$: mean of the pulse amplitude), and each $t_k$ is uniformly distributed in $[0, T]$, so $\langle e^{i \omega t_k} \rangle = \frac{1}{T} \int_0^T e^{i \omega t}\,dt$ vanishes as $T \to \infty$ except at $\omega = 0$. These terms contribute only a DC component.

$S_x(\omega) = 2 \nu \langle a^2 \rangle\, |F(\omega)|^2 + 4\pi\, \nu^2 \langle a \rangle^2\, |F(0)|^2\, \delta(\omega)$ : Carson theorem (unilateral spectrum)
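As a numerical cross-check of Carson's theorem, and of Campbell's theorem one subsection ahead, the following sketch builds a Poisson pulse train with the exponential pulse shape $f(t) = e^{-t/\tau}$ and exponentially distributed amplitudes; all parameter values are arbitrary illustration choices, and numpy/scipy are assumed:

```python
import numpy as np
from scipy.signal import lfilter, welch

rng = np.random.default_rng(12)
nu, tau, dt, T = 50.0, 0.02, 1e-4, 200.0           # pulse rate, decay time, step, record length
n = int(T / dt)

# Poisson emissions (at most ~one per bin since nu*dt << 1), exponential amplitudes
counts = rng.poisson(nu * dt, size=n)
a = counts * rng.exponential(1.0, size=n)          # <a> = 1, <a^2> = 2

# Pulse shape f(t) = exp(-t/tau), applied as an AR(1) filter over the emission record
x = lfilter([1.0], [1.0, -np.exp(-dt / tau)], a)

print("mean    :", x.mean(), " Campbell:", nu * 1.0 * tau)        # nu <a> ∫f dt = 1.0
print("variance:", x.var(),  " Campbell:", nu * 2.0 * tau / 2)    # nu <a^2> ∫f^2 dt = 1.0

f, S = welch(x - x.mean(), fs=1 / dt, nperseg=16384)
S_carson = 2 * nu * 2.0 * tau**2 / (1 + (2 * np.pi * f * tau)**2) # 2 nu <a^2> |F(2*pi*f)|^2
print(S[1:6] / S_carson[1:6])                      # ratios close to 1 at low frequency
```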

1.6.2 Campbell's theorem

Integrating the power spectral density (Wiener-Khintchine theorem) over the unilateral frequency range and applying the Parseval theorem,

$\langle \Delta x^2 \rangle = \dfrac{1}{2\pi} \int_0^{\infty} 2 \nu \langle a^2 \rangle\, |F(\omega)|^2\,d\omega = \nu \langle a^2 \rangle \cdot \dfrac{1}{2\pi} \int_{-\infty}^{\infty} |F(\omega)|^2\,d\omega = \nu \langle a^2 \rangle \int_{-\infty}^{\infty} f(t)^2\,dt$ : mean-square

Campbell's theorem of the mean-square (the factor 1/2 from folding the bilateral spectrum cancels the factor 2 in the Carson spectrum).

$\langle x \rangle = \nu \langle a \rangle \int_{-\infty}^{\infty} f(t)\,dt$ : Campbell's theorem of the mean

1.6.3 Shot noise in a vacuum diode

Figure: a vacuum diode at bias voltage V with external circuit current i(t); surface charges $Q_C = -CV$ (cathode) and $Q_A = CV$ (anode), with an emitted electron of charge $-q$ in transit between the electrodes.

When an electron is thermionically emitted, this event creates an additional surface charge of +q on the cathode. This surface charge shields the electric field created by the electron and realizes charge neutrality inside the cathode conductor.

As the electron travels from the cathode to the anode, the surface charge on the cathode decreases and the surface charge on the anode increases. This change in the surface charge is achieved by an external circuit current.


Ramo Theorem

If the external circuit has a negligible resistance, the voltage between the two electrodes is kept constant (constant-voltage operation). Equating the energy gain of the electron per unit time, set by the electron velocity $v(t)$, to the energy supplied by the circuit relaxation current, $V i(t)$, gives a current pulse $i(t) \propto q\, v(t)$ that lasts for the transit time and carries a total area $\int i(t)\,dt = q$.

If each electron emission is independent, such a memoryless system obeys Carson's theorem. If the electron transit time is much shorter than any relevant time constant, we can assume the relaxation current pulse is an impulse with a constant area q; this idealization gives a white spectrum and hence an infinite total noise power. Carson's theorem then gives

$S_i(\omega) = 2 \nu \langle a^2 \rangle, \quad \langle a^2 \rangle = q^2 \;\Rightarrow\; S_i(\omega) = 2 \nu q^2 = 2 q\, \overline{i}$ (white)

: Schottky formula of shot noise
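A direct simulation of the impulse limit (illustrative, assuming numpy and scipy; the emission rate and time step are arbitrary) reproduces the flat spectrum at $2 q \overline{i}$:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(13)
q, nu, dt = 1.602e-19, 1e9, 1e-10                  # electron charge, emission rate, time step

# Impulse approximation: each emitted electron deposits its full charge q in one bin
counts = rng.poisson(nu * dt, size=2_000_000)      # at most a few emissions per bin
i = q * counts / dt                                # current record i(t)

I = i.mean()                                       # average current, ~ nu*q ≈ 0.16 nA
f, S = welch(i - I, fs=1 / dt, nperseg=8192)
print("S_i   :", S[1:4])                           # flat (white) spectrum
print("2 q I :", 2 * q * I)                        # Schottky formula
```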


If the electron transit time is not negligible, the Fourier transform of i(t) provides the information about the cut-off of the shot noise spectrum, and the total noise power becomes finite. If the external circuit has a finite resistance and thus a finite circuit relaxation time, the voltage between the two electrodes is no longer constant. However, if the average inter-emission time of the electrons is much longer than the circuit relaxation time, each electron emission can still be considered an independent process (constant voltage = memoryless).

Figure: current pulses i(t) and the corresponding voltage relaxation V(t), with circuit time constant $C R_s$.

If the average inter-emission time of the electrons becomes shorter than the circuit relaxation time, the electron emission becomes a self-regulated, sub-Poissonian process (constant-current operation).

ensemble-averaged autocorrelation:

$\langle i(t)\, i(t + \tau) \rangle = \langle \Delta i(t)\, \Delta i(t + \tau) \rangle + \langle i(t) \rangle^2$ : Campbell's theorem

Applying the Wiener-Khintchine theorem,

$S_i(\omega) = 2 q \langle i \rangle + 4\pi \langle i \rangle^2\, \delta(\omega)$

full shot noise

Figure: the spectrum is flat at the full shot noise value $2 q \langle i \rangle$ from $\omega = 0$ up to the cut-off frequency, with a delta-function spike at $\omega = 0$ carrying the DC power.

