
Systems Identification

Ali Karimpour
Assistant Professor
Ferdowsi University of Mashhad

Reference: System Identification: Theory for the User, Lennart Ljung (1999)


Lecture 2
Introduction
Topics to be covered include:

Impulse responses and transfer functions

Frequency-domain expressions

Stochastic processes

Signal spectra

Disturbances

Ergodicity


Impulse responses
It is well known that a linear, time-invariant, causal system can be described as:

$$y(t) = \int_{-\infty}^{t} g(t-\tau)\,u(\tau)\,d\tau = \int_{\tau=0}^{\infty} g(\tau)\,u(t-\tau)\,d\tau$$

Sampling at the instants t = kT gives

$$y(kT) = \int_{\tau=0}^{\infty} g(\tau)\,u(kT-\tau)\,d\tau$$

Most often, the input signal u(t) is kept constant between the sampling instants:

$$u(t) = u_k, \qquad kT \le t < (k+1)T$$

So

$$y(kT) = \int_{\tau=0}^{\infty} g(\tau)\,u(kT-\tau)\,d\tau
        = \sum_{l=1}^{\infty} \int_{(l-1)T}^{lT} g(\tau)\,u(kT-\tau)\,d\tau
        = \sum_{l=1}^{\infty} \left[\int_{(l-1)T}^{lT} g(\tau)\,d\tau\right] u_{k-l}
        = \sum_{l=1}^{\infty} g_T(l)\,u_{k-l}$$


Impulse responses

$$y(kT) = \sum_{l=1}^{\infty} g_T(l)\,u_{k-l}$$

where

$$g_T(l) = \int_{(l-1)T}^{lT} g(\tau)\,d\tau$$

For ease of notation assume that T is one time unit and use t to enumerate
the sampling instants

$$y(t) = \sum_{k=1}^{\infty} g(k)\,u(t-k), \qquad t = 0, 1, 2, 3, \ldots$$
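As a concrete (and purely illustrative) numerical sketch of this sampled convolution, assume a first-order continuous-time impulse response g(τ) = e^(−τ); then g_T(l) is available in closed form and y(kT) follows from the sum above.

```python
# A minimal sketch of the sampled convolution, assuming (for illustration only)
# a first-order continuous impulse response g(tau) = exp(-tau).
import numpy as np

T = 0.5                      # sampling interval
L = 40                       # number of impulse-response terms kept

# g_T(l) = integral of g(tau) from (l-1)T to lT, available here in closed form
l = np.arange(1, L + 1)
gT = np.exp(-(l - 1) * T) - np.exp(-l * T)

# piecewise-constant input u_k (a unit step, as an example)
N = 100
u = np.ones(N)

# y(kT) = sum_{l=1}^{L} g_T(l) u_{k-l}
y = np.zeros(N)
for k in range(N):
    for li in range(1, L + 1):
        if k - li >= 0:
            y[k] += gT[li - 1] * u[k - li]

print(y[:5])                 # approaches the steady-state gain of g
```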


Transfer functions
Define the forward and backward shift operators q and q⁻¹ as

$$q\,u(t) = u(t+1), \qquad q^{-1}u(t) = u(t-1)$$

Now we can write the output as:

$$y(t) = \sum_{k=1}^{\infty} g(k)\,u(t-k) = \sum_{k=1}^{\infty} g(k)\,q^{-k}\,u(t) = G(q)\,u(t)$$

where

$$G(q) = \sum_{k=1}^{\infty} g(k)\,q^{-k}$$

G(q) is the transfer operator or transfer function.

Similarly, for the disturbance we have

$$v(t) = H(q)\,e(t)$$

So the basic description of a linear system with additive disturbance is:

$$y(t) = G(q)\,u(t) + H(q)\,e(t)$$
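A minimal simulation sketch of this description, with first-order example filters G(q) = 0.1q⁻¹/(1 − 0.9q⁻¹) and H(q) = 1/(1 − 0.5q⁻¹) chosen here only for illustration (they are not specified in the slides):

```python
# Simulate y(t) = G(q) u(t) + H(q) e(t) with illustrative first-order filters.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
N = 500
u = np.sign(np.sin(0.05 * np.arange(N)))      # some deterministic input
e = rng.standard_normal(N)                    # white noise

Gu = lfilter([0.0, 0.1], [1.0, -0.9], u)      # G(q) u(t) = 0.1 q^-1 / (1 - 0.9 q^-1) u(t)
He = lfilter([1.0], [1.0, -0.5], e)           # H(q) e(t) = 1 / (1 - 0.5 q^-1) e(t)
y = Gu + He
```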


Transfer functions
Some terminology
G(q) is the transfer operator or transfer function

$$G(q) = \sum_{k=1}^{\infty} g(k)\,q^{-k} \qquad \text{or} \qquad G(z) = \sum_{k=1}^{\infty} g(k)\,z^{-k}$$

We shall say that the transfer function G(q) is stable if

$$\sum_{k=1}^{\infty} |g(k)| < \infty$$

This means that G(z) is analytic on and outside the unit circle.

We shall say the filter H(q) is monic if h(0) = 1:

$$H(q) = \sum_{k=0}^{\infty} h(k)\,q^{-k}, \qquad h(0) = 1$$


Frequency-domain expressions
Let

$$u(t) = \cos \omega t$$

It will be convenient to write

$$u(t) = \operatorname{Re}\{e^{i\omega t}\}$$

Now we can write the output as:

$$y(t) = \sum_{k=1}^{\infty} g(k)\,\operatorname{Re}\{e^{i\omega(t-k)}\}
       = \operatorname{Re}\Big\{\sum_{k=1}^{\infty} g(k)\,e^{i\omega(t-k)}\Big\}
       = \operatorname{Re}\Big\{e^{i\omega t}\sum_{k=1}^{\infty} g(k)\,e^{-i\omega k}\Big\}
       = \operatorname{Re}\big\{e^{i\omega t}\,G(e^{i\omega})\big\}$$

So we have

$$y(t) = \big|G(e^{i\omega})\big| \cos\!\big(\omega t + \arg G(e^{i\omega})\big)$$
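The formula can be checked numerically. The impulse response g(k) = 0.5^k below is an arbitrary illustrative choice, not one from the slides:

```python
# Check the sinusoidal response formula y(t) = |G| cos(wt + arg G) numerically.
import numpy as np

k = np.arange(1, 31)
g = 0.5 ** k                                   # illustrative (finite) impulse response
w = 0.7                                        # frequency in rad/sample
G = np.sum(g * np.exp(-1j * w * k))            # G(e^{iw})

t = np.arange(200)
y = np.zeros(len(t))
for ti in t:
    y[ti] = np.sum(g * np.cos(w * (ti - k)))   # u(t) = cos(wt) applied for all t

y_pred = np.abs(G) * np.cos(w * t + np.angle(G))
print(np.max(np.abs(y - y_pred)))              # ~ 0 (exact here, since g is finite)
```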


Frequency-domain expressions
In the previous slide the input u(t) = cos ωt was applied for all t, giving

$$y(t) = \big|G(e^{i\omega})\big| \cos\!\big(\omega t + \arg G(e^{i\omega})\big)$$

Now suppose the input is switched on at t = 0:

$$u(t) = \begin{cases} \cos \omega t, & t \ge 0 \\ 0, & t < 0 \end{cases}
\qquad\text{i.e.}\qquad u(t) = \operatorname{Re}\{e^{i\omega t}\},\ t \ge 0$$

Then

$$y(t) = \sum_{k=1}^{t} g(k)\,\operatorname{Re}\{e^{i\omega(t-k)}\}
       = \operatorname{Re}\Big\{e^{i\omega t}\sum_{k=1}^{t} g(k)\,e^{-i\omega k}\Big\}
       = \operatorname{Re}\big\{e^{i\omega t}\,G(e^{i\omega})\big\}
         - \operatorname{Re}\Big\{e^{i\omega t}\sum_{k=t+1}^{\infty} g(k)\,e^{-i\omega k}\Big\}$$

so

$$y(t) = \big|G(e^{i\omega})\big| \cos\!\big(\omega t + \arg G(e^{i\omega})\big)
         - \operatorname{Re}\Big\{e^{i\omega t}\sum_{k=t+1}^{\infty} g(k)\,e^{-i\omega k}\Big\}$$

For a stable system the second (transient) term tends to zero as t → ∞.


Periodograms of signals over finite intervals

Fourier transform (FT) of a continuous-time signal f(t):

$$g(\omega) = \frac{1}{2\pi}\int_{-\infty}^{\infty} f(t)\,e^{-i\omega t}\,dt, \qquad
  f(t) = \int_{-\infty}^{\infty} g(\omega)\,e^{i\omega t}\,d\omega$$

Discrete Fourier transform (DFT): for a finite record u(1), u(2), u(3), u(4), ..., u(N), define

$$U_N(\omega) = \frac{1}{\sqrt{N}}\sum_{t=1}^{N} u(t)\,e^{-i\omega t}$$

The record can be recovered from the values of U_N at the grid frequencies ω = 2πk/N:

$$u(t) = \frac{1}{\sqrt{N}}\sum_{k=1}^{N} U_N\!\left(\frac{2\pi k}{N}\right) e^{i 2\pi k t / N}$$

so U_N is fully described by its values U_N(2π/N), U_N(4π/N), U_N(6π/N), U_N(8π/N), ..., U_N(2πN/N).

Exercise 1: Show that u(t) can be recovered by inserting U_N(2πk/N) into the expression for u(t) above.
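A small numerical sanity check of this DFT pair (and a way to experiment with Exercise 1): with the 1/√N scaling, inserting U_N(2πk/N) into the inversion sum recovers u(t).

```python
# Verify that the DFT with 1/sqrt(N) scaling is inverted by the sum above.
import numpy as np

rng = np.random.default_rng(1)
N = 64
u = rng.standard_normal(N)
t = np.arange(1, N + 1)
k = np.arange(1, N + 1)

def U_N(w):
    return np.sum(u * np.exp(-1j * w * t)) / np.sqrt(N)

U = np.array([U_N(2 * np.pi * kk / N) for kk in k])
u_rec = np.array([np.sum(U * np.exp(1j * 2 * np.pi * k * tt / N)) / np.sqrt(N)
                  for tt in t])
print(np.max(np.abs(u - u_rec.real)))          # ~ 1e-14
```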


Periodograms of signals over finite intervals


Some properties of U_N(ω):

$$U_N(-\omega) = \overline{U_N(\omega)}, \qquad U_N(\omega + 2\pi) = U_N(\omega)$$

The function U_N(ω) is therefore uniquely defined by its values over the interval [0, 2π]. It is, however, customary to consider U_N(ω) over the interval [−π, π], so u(t) can also be written as

$$u(t) = \frac{1}{\sqrt{N}}\sum_{k=-N/2+1}^{N/2} U_N\!\left(\frac{2\pi k}{N}\right) e^{i 2\pi k t / N}$$

The number |U_N(ω)| tells us the weight that the frequency ω carries in the decomposition. So

$$\big|U_N(\omega)\big|^2$$

is known as the periodogram of the signal u(t), t = 1, 2, 3, ..., N.


Parseval's relationship:

$$\sum_{k=1}^{N} \left|U_N\!\left(\frac{2\pi k}{N}\right)\right|^2 = \sum_{t=1}^{N} u^2(t)$$
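A quick numerical check of Parseval's relationship with the 1/√N scaling, on an arbitrary random record:

```python
# Both sides of Parseval's relationship agree to numerical precision.
import numpy as np

rng = np.random.default_rng(2)
N = 128
u = rng.standard_normal(N)
t = np.arange(1, N + 1)

U = np.array([np.sum(u * np.exp(-2j * np.pi * k * t / N)) / np.sqrt(N)
              for k in range(1, N + 1)])
print(np.sum(np.abs(U) ** 2), np.sum(u ** 2))   # the two sums are equal
```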


Periodograms of signals over finite intervals


Example: Periodogram of a sinusoid

$$u(t) = A \cos \omega_0 t$$

Suppose u(t) is periodic, so ω₀ = 2π/N₀ for some integer N₀ > 1. Let t = 1, 2, 3, ..., N, where N is a multiple of N₀ (N = sN₀). Then

$$U_N(\omega) = \frac{1}{\sqrt{N}}\sum_{t=1}^{N} A\cos(\omega_0 t)\,e^{-i\omega t}
 = \frac{A}{2\sqrt{N}}\sum_{t=1}^{N}\big(e^{i\omega_0 t} + e^{-i\omega_0 t}\big)\,e^{-i\omega t}
 = \frac{A}{2\sqrt{N}}\sum_{t=1}^{N}\big(e^{i(\omega_0-\omega)t} + e^{-i(\omega_0+\omega)t}\big)$$

Evaluated at the grid frequencies this gives

$$\big|U_N(\omega)\big|^2 = \begin{cases}\dfrac{A^2 N}{4}, & \omega = \omega_0 = \dfrac{2\pi s}{N} \ \ \big(\text{and } \omega = 2\pi - \omega_0\big)\\[2mm] 0, & \omega = \dfrac{2\pi k}{N},\ \ k \ne s\end{cases}$$
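The example can be reproduced numerically; the amplitude, period and number of periods below are arbitrary illustrative choices:

```python
# Periodogram of a sinusoid observed over a whole number of periods:
# A^2 N / 4 at omega_0 (and its mirror frequency), zero at the other grid points.
import numpy as np

A, N0, s = 2.0, 16, 8
N = s * N0
w0 = 2 * np.pi / N0
t = np.arange(1, N + 1)
u = A * np.cos(w0 * t)

k = np.arange(1, N + 1)
U = np.array([np.sum(u * np.exp(-1j * 2 * np.pi * kk / N * t)) / np.sqrt(N)
              for kk in k])
per = np.abs(U) ** 2
print(per[s - 1], A ** 2 * N / 4)               # peak at k = s, i.e. omega = omega_0
print(np.sort(per)[-3])                         # all other grid points ~ 0
```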


Periodograms of signals over finite intervals


Discrete Fourier transform (DFT): given the finite record u(1), u(2), u(3), u(4), ..., u(N),

$$U_N(\omega) = \frac{1}{\sqrt{N}}\sum_{t=1}^{N} u(t)\,e^{-i\omega t},$$

described by its values U_N(2π/N), U_N(4π/N), U_N(6π/N), U_N(8π/N), ..., U_N(2πN/N).

The periodogram defines, in a sense, the frequency content of the record u₁ᴺ over a finite time interval. But we seek a definition of a similar concept for signals over the whole interval [1, ∞), such as

$$\lim_{N\to\infty}\frac{1}{\sqrt{N}}\sum_{t=1}^{N} u(t)\,e^{-i\omega t}$$

But this limit fails to exist for many signals of practical interest.


Transformation of Periodograms
As a signal is filtered through a linear system, its periodogram changes.

Let: s(t) = G(q) w(t), where G(q) is a stable transfer function and |w(t)| ≤ C_w for all t.

Define: the DFTs S_N(ω) and W_N(ω) of s(t) and w(t), t = 1, ..., N.

Claim:

$$S_N(\omega) = G(e^{i\omega})\,W_N(\omega) + R_N(\omega)$$

where

$$|R_N(\omega)| \le \frac{2\,C_w\,C_G}{\sqrt{N}}, \qquad C_G = \sum_{k=1}^{\infty} k\,|g(k)|$$

so that for large N the periodogram of the filtered signal satisfies |S_N(ω)|² ≈ |G(e^{iω})|² |W_N(ω)|².
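A simulation sketch of the claim: for a stable filter (an illustrative first-order choice below, not taken from the slides), the periodogram of the filtered signal is close to |G(e^{iω})|² times the periodogram of the input, with the discrepancy shrinking as N grows.

```python
# Check |S_N(w)|^2 ~ |G(e^{iw})|^2 |W_N(w)|^2 for a filtered signal.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
N = 4096
w = rng.standard_normal(N)
b, a = [0.0, 0.3], [1.0, -0.7]                  # G(q) = 0.3 q^-1 / (1 - 0.7 q^-1)
s = lfilter(b, a, w)

t = np.arange(1, N + 1)
k = 200                                         # look at one grid frequency
om = 2 * np.pi * k / N
S = np.sum(s * np.exp(-1j * om * t)) / np.sqrt(N)
W = np.sum(w * np.exp(-1j * om * t)) / np.sqrt(N)
G = (0.3 * np.exp(-1j * om)) / (1 - 0.7 * np.exp(-1j * om))
print(abs(S) ** 2, abs(G) ** 2 * abs(W) ** 2)   # close for large N
```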



Stochastic Processes
A random variable (RV) is a rule (or function) that assigns a real number to every
outcome of a random experiment.
(Figure: the closing price of the Iranian power market observed from Apr. 15 to Sep. 22, 2009.)

For a scalar RV e (and componentwise for a vector RV), the probability density function (PDF) f_e(x) is defined by

$$P(e \in B) = \int_{B} f_e(x)\,dx$$

If e may assume a certain value with nonzero probability, then f_e contains a δ-function (an impulse) at that value.

Two random variables e₁ and e₂ are independent if

$$P(e_1 \le x_1,\ e_2 \le x_2) = P(e_1 \le x_1)\cdot P(e_2 \le x_2)$$

Definition: The expectation E[e] of a random variable e is

$$E[e] = \int x\,f_e(x)\,dx$$

Definition: The variance (covariance), Cov[e], of a random variable e is

$$\operatorname{Cov}[e] = E\big[(e - E[e])(e - E[e])^{T}\big]$$


Stochastic Processes
A stochastic process is a rule (or function) that assigns a time function
to every outcome of a random experiment.
Consider the random experiment of tossing a die at t = 0 and observing the number on the top face.
The sample space of this experiment consists of the outcomes {1, 2, 3, ..., 6}.
For each outcome of the experiment, let us arbitrarily assign a function of time t in the following manner.

The set of functions {x₁(t), x₂(t), ..., x₆(t)} represents a stochastic process.



Stochastic Processes
Mean of a random process X(t):

$$m_X(t) = E[X(t)]$$

In general, m_X(t) is a function of time.

Correlation R_X(t₁, t₂) of a random process X(t):

$$R_X(t_1, t_2) = E[X(t_1)\,X(t_2)]$$

Note that R_X(t₁, t₂) is a function of t₁ and t₂.

Autocovariance C_X(t₁, t₂) of a random process X(t) is defined as the covariance of X(t₁) and X(t₂):

$$C_X(t_1, t_2) = E\big[(X(t_1) - m_X(t_1))(X(t_2) - m_X(t_2))\big] = R_X(t_1, t_2) - m_X(t_1)\,m_X(t_2)$$

In particular, when t₁ = t₂ = t, we have

$$C_X(t, t) = \operatorname{Var}[X(t)]$$
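A small ensemble (Monte Carlo) sketch of these definitions, estimating m_X(t) and R_X(t₁, t₂) by averaging over many realizations; the process x(t) = A cos(0.2 t) with A ~ N(1, 1) is an illustrative choice only:

```python
# Ensemble estimates of the mean and correlation of a random process.
import numpy as np

rng = np.random.default_rng(4)
M, T = 20000, 50                       # number of realizations, time points
t = np.arange(T)
A = rng.normal(1.0, 1.0, size=(M, 1))
X = A * np.cos(0.2 * t)                # each row is one realization x(t)

m_hat = X.mean(axis=0)                 # estimate of m_X(t) = E[A] cos(0.2 t)
R_hat = X.T @ X / M                    # estimate of R_X(t1, t2) = E[X(t1) X(t2)]
print(m_hat[:3])
print(R_hat[0, 1], 2.0 * np.cos(0.0) * np.cos(0.2))   # E[A^2] = 1 + 1 = 2
```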


Stochastic Processes
Example: Sinusoid with random amplitude. Take x(t) = A cos ω₀t, where A is a random variable. Then

$$m_X(t) = E[A]\,\cos\omega_0 t, \qquad R_X(t_1, t_2) = E[A^2]\,\cos\omega_0 t_1\,\cos\omega_0 t_2$$

both of which depend on the time instants themselves, not only on t₁ − t₂.


Stochastic Processes
Example: Sinusoid with random phase. Take x(t) = A cos(ω₀t + Θ), where A is a constant and Θ is uniformly distributed on [0, 2π]. Then

$$m_X(t) = 0, \qquad R_X(t_1, t_2) = \frac{A^2}{2}\cos\big(\omega_0 (t_1 - t_2)\big)$$

which depends only on the difference t₁ − t₂.


Stochastic Processes
x(t) is (wide-sense) stationary if m_X(t) does not depend on t and R_X(t₁, t₂) depends only on the difference τ = t₁ − t₂.

Example: Sinusoid with random phase — clearly x(t) is stationary (WSS).

Example: Sinusoid with random amplitude — clearly x(t) is not stationary.

This may be a limiting (too restrictive) definition.


Signal Spectra
A Common Framework for Deterministic and Stochastic Signals

$$y(t) = G(q)\,u(t) + H(q)\,e(t)$$

With a deterministic input u(t), y(t) is not a stationary process, since

$$E\,y(t) = G(q)\,u(t)$$

depends on t. Stationarity is therefore too limiting a framework here. To deal with this problem, we introduce the following definition:

Quasi-stationary signals


Stochastic Processes
x(t) is stationary if m_X(t) is constant and R_X(t₁, t₂) depends only on t₁ − t₂.

Quasi-stationary signals: A signal {s(t)} is said to be quasi-stationary if it is subject to

$$E\,s(t) = m_s(t), \qquad |m_s(t)| \le C \ \ \text{for all } t$$

and

$$E\,s(t)\,s(r) = R_s(t, r), \qquad |R_s(t, r)| \le C, \qquad
  R_s(\tau) = \lim_{N\to\infty}\frac{1}{N}\sum_{t=1}^{N} R_s(t, t-\tau) \ \ \text{exists for every } \tau$$

If {s(t)} is a deterministic sequence, quasi-stationarity means that {s(t)} is a bounded sequence and that

$$R_s(\tau) = \lim_{N\to\infty}\frac{1}{N}\sum_{t=1}^{N} s(t)\,s(t-\tau)$$

exists.

If {s(t)} is a stationary stochastic process, it is quasi-stationary, since E s(t)s(t−τ) = R_s(τ) does not depend on t.
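A sketch of computing R_s(τ) by the time average in this definition, for the deterministic quasi-stationary signal s(t) = cos(0.3 t) (an illustrative choice); the limit is cos(0.3 τ)/2.

```python
# Time-average estimate of R_s(tau) for a deterministic quasi-stationary signal.
import numpy as np

N = 200000
t = np.arange(1, N + 1)
s = np.cos(0.3 * t)

def R_hat(tau):
    # (1/N) * sum_t s(t) s(t - tau), dropping the first tau terms
    return np.sum(s[tau:] * s[:N - tau]) / N

for tau in range(4):
    print(tau, R_hat(tau), 0.5 * np.cos(0.3 * tau))   # converges to cos(0.3 tau)/2
```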


Signal Spectra

Notation:

$$\bar{E}\,f(t) := \lim_{N\to\infty}\frac{1}{N}\sum_{t=1}^{N} E\,f(t)$$

The notation implies that the limit exists. With it, the quasi-stationarity conditions read

$$|m_s(t)| \le C, \qquad |R_s(t, r)| \le C, \qquad R_s(\tau) = \bar{E}\,s(t)\,s(t-\tau) \ \ \text{exists}$$

Sometimes, with some abuse of notation, we call R_s(τ) the covariance function of s.

Exercise 2: Show that sometimes it is exactly the covariance function of s.


Signal Spectra
Two signals {s(t)} and {w(t)} are jointly quasi-stationary if:
1- they both are quasi-stationary, and
2- the cross-covariance function

$$R_{sw}(\tau) = \bar{E}\,s(t)\,w(t-\tau)$$

exists.

They are called uncorrelated if R_sw(τ) ≡ 0.


Signal Spectra
Recall the discrete Fourier transform of the finite record u(1), u(2), u(3), u(4), ..., u(N):

$$U_N(\omega) = \frac{1}{\sqrt{N}}\sum_{t=1}^{N} u(t)\,e^{-i\omega t}$$

The periodogram |U_N(ω)|² defines, in a sense, the frequency content of the signal over a finite time interval. But we seek a definition of a similar concept for signals over the whole interval [1, ∞), and the limit

$$\lim_{N\to\infty}\frac{1}{\sqrt{N}}\sum_{t=1}^{N} u(t)\,e^{-i\omega t}$$

fails to exist for many signals of practical interest.

So we shall develop a framework for describing signals and their spectra that is applicable to deterministic as well as stochastic signals.

Signal Spectra
Use the Fourier transform of the covariance function (the spectrum, or spectral density).

We define the (power) spectrum of {s(t)} as

$$\Phi_s(\omega) = \sum_{\tau=-\infty}^{\infty} R_s(\tau)\,e^{-i\tau\omega}$$

when this limit (the infinite sum) exists, and the cross spectrum between {s(t)} and {w(t)} as

$$\Phi_{sw}(\omega) = \sum_{\tau=-\infty}^{\infty} R_{sw}(\tau)\,e^{-i\tau\omega}$$

when it exists.

Exercise 3: Show that the spectrum is always a real-valued function, but that the cross spectrum is in general a complex-valued function.

Signal Spectra
Exercise 4 (Spectrum of a periodic signal): Consider a deterministic, periodic signal with period M, i.e., s(t) = s(t + M). Show that

$$\Phi_s(\omega) = \Phi_s^{\,p}(\omega)\,F(\omega, M)$$

where

$$\Phi_s^{\,p}(\omega) = \sum_{\tau=0}^{M-1} R_s(\tau)\,e^{-i\tau\omega}
\qquad\text{and}\qquad
F(\omega, M) = \sum_{l=-\infty}^{\infty} e^{-i l M \omega}$$

And finally show that

$$\Phi_s(\omega) = \frac{2\pi}{M}\sum_{k=0}^{M-1} \Phi_s^{\,p}\!\left(\frac{2\pi k}{M}\right)\delta\!\left(\omega - \frac{2\pi k}{M}\right), \qquad 0 \le \omega < 2\pi$$
k 0


Signal Spectra
Exercise 5 (Spectrum of a sinusoid): Consider

$$u(t) = A\cos\omega_0 t$$

extended to the interval [1, ∞). Show that

$$\Phi_u(\omega) = \frac{A^2}{4}\,2\pi\,\big[\delta(\omega - \omega_0) + \delta(\omega + \omega_0)\big]$$


Signal Spectra
Example (Stationary stochastic processes): Consider v(t) = H(q)e(t) as a stationary stochastic process. We will assume that e(t) has zero mean and variance λ. It is clear that

$$R_v(\tau) = \bar{E}\,v(t)\,v(t-\tau) = E\,v(t)\,v(t-\tau) = \lambda \sum_{k=\max(0,\tau)}^{\infty} h(k)\,h(k-\tau)$$

The spectrum is

$$\Phi_v(\omega) = \sum_{\tau=-\infty}^{\infty} R_v(\tau)\,e^{-i\tau\omega}
= \lambda \sum_{\tau=-\infty}^{\infty}\ \sum_{k=\max(0,\tau)}^{\infty} h(k)\,h(k-\tau)\,e^{-i\tau\omega}
\stackrel{(I)}{=} \lambda\,\big|H(e^{i\omega})\big|^2$$

where

$$H(e^{i\omega}) = \sum_{s=0}^{\infty} h(s)\,e^{-is\omega}$$

Exercise 6: Show (I).


Signal Spectra
Spectrum of Stationary Stochastic Processes

The stochastic process described by v(t) = H(q)e(t), where {e(t)} is a sequence of independent random variables with zero mean values and covariances λ, has the spectrum

$$\Phi_v(\omega) = \lambda\,\big|H(e^{i\omega})\big|^2$$
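A numerical illustration of this result, comparing λ|H(e^{iω})|² with an averaged periodogram of simulated data; the monic filter H(q) = 1/(1 − 0.8q⁻¹) is an illustrative assumption, not taken from the slides.

```python
# Averaged periodogram of v(t) = H(q) e(t) versus the theoretical spectrum.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(5)
lam = 2.0                                    # variance of e(t)
N, M = 1024, 200                             # samples per realization, realizations
w = 2 * np.pi * np.arange(N) / N

per = np.zeros(N)
for _ in range(M):
    e = rng.normal(0.0, np.sqrt(lam), N)
    v = lfilter([1.0], [1.0, -0.8], e)       # v(t) = H(q) e(t)
    V = np.fft.fft(v) / np.sqrt(N)           # DFT with the 1/sqrt(N) scaling
    per += np.abs(V) ** 2 / M                # averaged periodogram, ~ E|V_N(w)|^2

H = 1.0 / (1.0 - 0.8 * np.exp(-1j * w))
print(per[100], lam * np.abs(H[100]) ** 2)   # approximately equal
```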


Signal Spectra
Spectrum of a Mixed Deterministic and Stochastic Signal

Let s(t) = u(t) + v(t), where u(t) is deterministic (quasi-stationary) and v(t) is a stationary stochastic process with zero mean. Then

$$R_s(\tau) = R_u(\tau) + R_v(\tau)$$

$$\Phi_s(\omega) = \Phi_u(\omega) + \Phi_v(\omega)$$

Exercise 7: Prove it.


Transformation of Spectra by Linear Systems


Theorem: Let {w(t)} be quasi-stationary with spectrum Φ_w(ω), and let G(q) be a stable transfer function. Let s(t) = G(q)w(t). Then {s(t)} is also quasi-stationary and

$$\Phi_s(\omega) = \big|G(e^{i\omega})\big|^2\,\Phi_w(\omega)$$

$$\Phi_{sw}(\omega) = G(e^{i\omega})\,\Phi_w(\omega)$$


Disturbances
There are always signals beyond our control that also affect the system.
We assume that such effects can be lumped into an additive term v(t) at
the output
(Block diagram: the input u(t) and the additive disturbance v(t) together produce the output y(t).)

So

$$y(t) = \sum_{k=1}^{\infty} g(k)\,u(t-k) + v(t)$$

There are many sources and causes for such a disturbance term:

Measurement noise.

Uncontrollable inputs (e.g., a person in a room produces about 100 W of heat).


Disturbances
Characterization of disturbances
Its value is not known beforehand.
Making qualified guesses about future values is possible.
It is natural to employ a probabilistic framework to describe future disturbances.
We put ourselves at time t and would like to know the disturbance at t + k, k ≥ 1, so we use the following approach:

$$v(t) = \sum_{k=0}^{\infty} h(k)\,e(t-k)$$

where e(t) is white noise.

This description does not allow a completely general characterization of all possible probabilistic disturbances, but it is versatile enough (a small simulation sketch is given below).
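A sketch of generating such disturbance realizations. The particular h(k) and the distribution of e(t) below are illustrative assumptions (the slides' specific PDF is not reproduced here): e(t) is zero with probability 1 − μ and standard normal otherwise, so small μ gives infrequent, pulse-like excitation, in the spirit of the exercises on the next slides.

```python
# Generate disturbance realizations v(t) = sum_k h(k) e(t-k) for different mu.
import numpy as np
from scipy.signal import lfilter

def disturbance(mu, N=400, seed=0):
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(N) * (rng.random(N) < mu)   # sparse "white-ish" noise
    # h(k) taken as the impulse response of 1 / (1 - 0.95 q^-1), an arbitrary choice
    return lfilter([1.0], [1.0, -0.95], e)

v_small_mu = disturbance(0.1)    # step/pulse-like disturbance patterns
v_large_mu = disturbance(0.9)    # a more "noisy" disturbance
```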

Disturbances
Consider, for example, the following PDF for e(t):

Small values of μ are suitable to describe classical disturbance patterns: steps, pulses, sinusoids, and ramps.

(Figure: a realization of v(t) for the proposed e(t).)

Exercise 8: Reproduce the figure above for μ = 0.1 and μ = 0.9 and a suitable h(k).


Disturbances
On the other hand, the PDF:

(Figure: a realization of v(t) for the proposed e(t).)

Often we only specify the second-order properties of the sequence {e(t)}, that is, the mean and the variances.

Exercise 9: What is white noise?

Exercise 10: Reproduce the figure above for μ = 0.1 and μ = 0.9 and a suitable h(k).


Disturbances
We will assume that e(t) has zero mean and variance λ. Now we want to know the characteristics of v(t):

Mean:

$$E\,v(t) = \sum_{k=0}^{\infty} h(k)\,E\,e(t-k) = 0$$

Covariance:

$$E\,v(t)\,v(t-\tau) = \sum_{k=0}^{\infty}\sum_{s=0}^{\infty} h(k)\,h(s)\,E\,e(t-k)\,e(t-\tau-s)
 = \lambda \sum_{k=0}^{\infty}\sum_{s=0}^{\infty} h(k)\,h(s)\,\delta(k - s - \tau)
 = \lambda \sum_{k=0}^{\infty} h(k)\,h(k-\tau) \;=\; R_v(\tau)$$

We know that h(r) = 0 for r < 0 (because of causality).
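A numerical check of this covariance formula against a sample covariance computed from a long simulated realization; the impulse response h(k) = 0.8^k is an illustrative choice.

```python
# Compare R_v(tau) = lam * sum_k h(k) h(k - tau) with a sample covariance.
import numpy as np

lam = 1.5
K = 200
k = np.arange(K)
h = 0.8 ** k                                  # illustrative impulse response

def Rv(tau):
    # lam * sum_k h(k) h(k - tau), with h(j) = 0 for j < 0
    return lam * np.sum(h[tau:] * h[:K - tau])

rng = np.random.default_rng(6)
N = 200000
e = rng.normal(0.0, np.sqrt(lam), N)
v = np.convolve(e, h)[:N]                     # v(t) = sum_k h(k) e(t-k)
for tau in range(3):
    print(tau, Rv(tau), np.mean(v[tau:] * v[:N - tau]))
```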


Disturbances
We will assume that e(t) has zero mean and variance λ. Now we want to know the characteristics of v(t):

Mean: E v(t) = 0.

Covariance:

$$R_v(\tau) = E\,v(t)\,v(t-\tau) = \lambda \sum_{k=0}^{\infty} h(k)\,h(k-\tau)$$

Since the mean and covariance do not depend on t, the process is said to be stationary.


Ergodicity
Suppose you are concerned with determining what the most visited parks in a
city are.
One idea is to take a momentary snapshot: to see how many people are this moment
in park A, how many are in park B and so on.
Another idea is to look at one individual (or few of them) and to follow him for a
certain period of time, e.g. a year.
The first one may not be representative for a longer period of time, while the second
one may not be representative for all the people.
The idea is that an ensemble is ergodic if the two types of statistics give the same result.
Many ensembles, like the human populations, are not ergodic.


Ergodicity
Let x(t) be a stochastic process.

Most of our computations will depend on a given realization of a quasi-stationary process.
Ergodicity will allow us to make statements about repeated experiments.

