
Lyapunov stability theory

Consider the nonlinear system

\dot{x} = f(x)    (1)

Let us assume that x_{eq} = 0 is an equilibrium point of (1). Q.: What if the equilibrium point x_{eq} \ne 0? A.: There is no loss of generality in this assumption. We can always choose a shifted coordinate system of the form y = x - x_{eq}. The derivative of y is given by
\dot{y} = \dot{x} = f(x) = f(y + x_{eq}) = g(y),    g(0) = 0

The system described in terms of the new variable y has its equilibrium at the origin.

Definition
The equilibrium point x = 0 of (1) is
- stable, if for each \varepsilon > 0 there is \delta = \delta(\varepsilon) > 0 such that
  \|x(0)\| < \delta  \Rightarrow  \|x(t)\| < \varepsilon,  \forall t \ge 0
- unstable, if it is not stable
- asymptotically stable, if it is stable and \delta can be chosen such that
  \|x(0)\| < \delta  \Rightarrow  \lim_{t \to \infty} x(t) = 0
- marginally stable, if it is stable but not asymptotically stable

[Figure: trajectories near the equilibrium point x_{eq}, illustrating asymptotically stable, marginally stable and unstable behavior]

Lyapunov First Method (The indirect method)

According to the basic definitions, stability properties depend only on the nature of the system near the equilibrium point.

Let us linearize the system description!

For small deviations from the equilibrium point, the behavior of the system is approximately governed by the linear terms. These terms dominate and thus determine stability, provided that the linear terms do not vanish. The idea of checking stability by examining a linearized version of the system is referred to as Lyapunov's first method or Lyapunov's indirect method.

Theorem
Let x = 0 be an equilibrium point of the nonlinear system

\dot{x} = f(x)

where f : D \to R^n is continuously differentiable and D is a neighborhood of the equilibrium point. Let \lambda_i denote the eigenvalues of the matrix

A = \left. \frac{\partial f}{\partial x} \right|_{x=0}

1. If Re \lambda_i < 0 for all i, then x = 0 is asymptotically stable for the nonlinear system.
2. If Re \lambda_i > 0 for one or more i, then x = 0 is unstable for the nonlinear system.
3. If Re \lambda_i \le 0 for all i and Re \lambda_j = 0 for at least one j, then x = 0 may be either stable, asymptotically stable or unstable for the nonlinear system.

Conclusion: except for the boundary situation, the eigenvalues of the linearized system completely reveal the stability properties of an equilibrium point of a nonlinear system. If there are boundary eigenvalues, a separate analysis is required.
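As a minimal sketch of how the indirect method can be applied in practice, the snippet below numerically approximates the Jacobian A = \partial f / \partial x at the origin and inspects the real parts of its eigenvalues. The example system (a damped pendulum, \dot{x}_1 = x_2, \dot{x}_2 = -\sin x_1 - x_2) and all function names are illustrative assumptions, not taken from these notes.

```python
import numpy as np

def f(x):
    """Right-hand side of xdot = f(x); an illustrative damped pendulum."""
    return np.array([x[1], -np.sin(x[0]) - x[1]])

def jacobian(f, x0, eps=1e-6):
    """Approximate A = df/dx at x0 by central finite differences."""
    n = len(x0)
    A = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        A[:, j] = (f(x0 + e) - f(x0 - e)) / (2 * eps)
    return A

A = jacobian(f, np.zeros(2))
lam = np.linalg.eigvals(A)
print("eigenvalues of the linearization:", lam)

if np.all(lam.real < 0):
    print("case 1: x = 0 is asymptotically stable for the nonlinear system")
elif np.any(lam.real > 0):
    print("case 2: x = 0 is unstable for the nonlinear system")
else:
    print("case 3: boundary eigenvalues, the linearization is inconclusive")
```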

An example: mass-spring-damper system

[Figure: mass M attached to a spring with stiffness K and a damper with coefficient B, driven by a force f, with displacement y]

M \frac{d^2 y(t)}{dt^2} + B \frac{dy(t)}{dt} + K y(t) = f(t)
Moreover, since we are interested in stability properties, we set f(t) = 0.


M \ddot{y} + B \dot{y} + K y = 0,    equilibrium point: y = 0

State variables

x_1(t) = y(t),    x_2(t) = \dot{y}(t)

\dot{x}_1 = \dot{y} = x_2
\dot{x}_2 = \ddot{y} = -\frac{K}{M} x_1 - \frac{B}{M} x_2
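As a brief cross-check with the first method (an aside, not part of the original example): since this system is linear, the Jacobian is the system matrix itself,

A = \begin{pmatrix} 0 & 1 \\ -K/M & -B/M \end{pmatrix}

whose characteristic equation is \lambda^2 + (B/M)\lambda + K/M = 0. For M, B, K > 0 both roots have negative real parts, so the first method already predicts asymptotic stability of the origin. The energy argument below reaches the same conclusion by what will be formalized as the second method.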

The total stored energy is given by


V(t) = \frac{1}{2} K x_1^2 + \frac{1}{2} M x_2^2

which has the following properties: it is positive for all nonzero values of x_1(t) and x_2(t), and it equals zero when x_1(t) = x_2(t) = 0. The time derivative of V(t) is given by:
\frac{dV(t)}{dt} = \frac{\partial V}{\partial x_1}\dot{x}_1 + \frac{\partial V}{\partial x_2}\dot{x}_2 = K x_1 x_2 + M x_2 \left(-\frac{K}{M} x_1 - \frac{B}{M} x_2\right) = -B x_2^2

dV/dt is negative \Rightarrow the state must move from its initial state in the direction of smaller values of V(t)

[Figure: nested level curves V = C_1, V = C_2, V = C_3 in the (x_1, x_2) plane, with C_1 < C_2 < C_3; trajectories cross the level curves inward, toward smaller values of V]
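The energy argument can also be observed numerically. The sketch below integrates the state equations for arbitrary illustrative parameter values and checks that V(t) decreases along the trajectory; the parameter values, the initial state and the function names are assumptions made only for this illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

M, B, K = 1.0, 0.5, 2.0          # illustrative parameter values

def rhs(t, x):
    # x[0] = position x1 = y, x[1] = velocity x2 = ydot
    return [x[1], -(K / M) * x[0] - (B / M) * x[1]]

def V(x1, x2):
    # total stored energy: (1/2) K x1^2 + (1/2) M x2^2
    return 0.5 * K * x1**2 + 0.5 * M * x2**2

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0], max_step=0.01)
energy = V(sol.y[0], sol.y[1])

print("V at t = 0 :", energy[0])
print("V at t = 20:", energy[-1])                       # close to zero
print("largest increase between samples:", np.diff(energy).max())
```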

Lyapunov Second Method (The direct method)

Theorem
Let x = 0 be an equilibrium point of the nonlinear system

\dot{x} = f(x)

Let V : D \to R be a continuously differentiable function on a neighborhood D of x = 0, such that V(0) = 0, V(x) > 0 in D \setminus \{0\}, and

\dot{V}(x) \le 0    in D

Then x = 0 is stable. Moreover, if \dot{V}(x) < 0 in D \setminus \{0\}, then x = 0 is asymptotically stable.

The task: to find V(x), called a Lyapunov function, which must satisfy the following requirements:
- V is continuous
- V(x) has a unique minimum at x_{eq} with respect to all other points in D
- along any trajectory of the system contained in D, the value of V never increases
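For the mass-spring-damper example above, these requirements can be checked with a computer algebra system. The fragment below is a small sketch using sympy; the symbol names are illustrative choices.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
M, B, K = sp.symbols('M B K', positive=True)

# candidate Lyapunov function: the stored energy
V = sp.Rational(1, 2) * K * x1**2 + sp.Rational(1, 2) * M * x2**2

# system right-hand side
f1 = x2
f2 = -(K / M) * x1 - (B / M) * x2

# Vdot along trajectories: dV/dx1 * x1dot + dV/dx2 * x2dot
Vdot = sp.simplify(sp.diff(V, x1) * f1 + sp.diff(V, x2) * f2)
print(Vdot)   # -B*x2**2, which is <= 0, so V never increases along trajectories
```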

What if the stability of x = 0 has been established? The first Lyapunov method determines stability only in the immediate vicinity of the equilibrium point. The second Lyapunov method allows one to determine how far from the equilibrium point the trajectory can start and still converge to it as t approaches infinity.

Region of asymptotic stability (region of attraction, basin)
Let \phi(t; x) be the solution of the system equation that starts at initial state x at time t = 0. Then the region of attraction is defined as the set of all points x such that \lim_{t \to \infty} \phi(t; x) = 0.
If \Omega_c = \{ x \in R^n : V(x) \le c \} is bounded and contained in D, then every trajectory starting in \Omega_c remains in \Omega_c and approaches the equilibrium point as t \to \infty. Thus, \Omega_c is an estimate of the region of attraction.
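A small worked illustration of such an estimate (an added example, not from the notes): consider the scalar system

\dot{x} = -x + x^3,    V(x) = \tfrac{1}{2} x^2

Then \dot{V} = x\dot{x} = -x^2(1 - x^2) < 0 for 0 < |x| < 1, so the hypotheses hold on D = (-1, 1). The level set \Omega_c = \{ x : V(x) \le c \} = [-\sqrt{2c}, \sqrt{2c}] is bounded and contained in D whenever c < 1/2, so every trajectory starting in \Omega_c converges to the origin. Letting c approach 1/2, the estimate approaches the true region of attraction (-1, 1), whose boundary consists of the unstable equilibria x = \pm 1.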

Types of stability with reference to the region of attraction:
- local stability (stability in the small): the system remains within an infinitesimal region around the equilibrium when subjected to a small perturbation
- finite stability: the system returns to the equilibrium point from any point within a region R of finite dimensions surrounding it
- global stability (stability in the large): the region R includes the entire state space

Theorem
Let x = 0 be an equilibrium point of the nonlinear system
\dot{x} = f(x)

Let V : R^n \to R be a continuously differentiable function such that

V(0) = 0  and  V(x) > 0  \forall x \ne 0,

\|x\| \to \infty  \Rightarrow  V(x) \to \infty,

\dot{V}(x) < 0  \forall x \ne 0.

Then x = 0 is globally asymptotically stable.

Another example (a pursuit problem)
Suppose a hound is chasing a rabbit, in such a way that its velocity vector always points directly toward the rabbit. The speeds of the rabbit and the hound are constant and denoted by R and H, respectively.

Let x_r, y_r and x_h, y_h denote the x and y coordinates of the rabbit and the hound, respectively. Then

\dot{x}_r = R,    \dot{y}_r = y_r = 0

\dot{x}_h^2 + \dot{y}_h^2 = H^2

The fact that the velocity vector of the hound always points toward the rabbit means that

\dot{x}_h = -k (x_h - x_r),    \dot{y}_h = -k (y_h - y_r),    k a positive constant

So

\dot{x}_h = -\frac{H (x_h - x_r)}{\sqrt{(x_h - x_r)^2 + y_h^2}},    \dot{y}_h = -\frac{H y_h}{\sqrt{(x_h - x_r)^2 + y_h^2}}

Let us introduce the relative coordinates, i.e. the coordinates of the difference in position between the hound and the rabbit:

x = x_h - x_r,    y = y_h - y_r = y_h

Then

\dot{x} = -\frac{H x}{\sqrt{x^2 + y^2}} - R,    \dot{y} = -\frac{H y}{\sqrt{x^2 + y^2}}    (*)

Will the hound always catch the rabbit?

Will a trajectory with an arbitrary initial condition eventually get to the point where the relative coordinates are zero?

We can consider the origin as an equilibrium point

What are the conditions for global stability of the system?


We have to find a suitable Lyapunov function for the system given by (*). Let us choose as a Lyapunov function V(x, y) = x^2 + y^2. Then

\dot{V}(x, y) = 2x\dot{x} + 2y\dot{y} = -2H\sqrt{x^2 + y^2} - 2Rx

If H > R:
- if x = 0 and y \ne 0, it is clear that \dot{V}(x, y) < 0
- if x \ne 0, then

-H\sqrt{x^2 + y^2} - Rx \le -H|x| + R|x| = -(H - R)|x| < 0

Thus, \dot{V}(x, y) < 0 for all (x, y) except the origin.

If the hound runs faster than the rabbit, it always catches the rabbit.
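A quick numerical check of this conclusion (a sketch with arbitrary illustrative speeds and initial offset, not part of the notes): integrating the relative-coordinate system (*) with H > R should drive V = x^2 + y^2 monotonically to zero.

```python
import numpy as np
from scipy.integrate import solve_ivp

H, R = 2.0, 1.0                       # hound faster than the rabbit

def rhs(t, z):
    x, y = z
    d = np.hypot(x, y)                # distance between hound and rabbit
    return [-H * x / d - R, -H * y / d]

def caught(t, z):
    # terminate when the hound is within 1e-3 of the rabbit
    return np.hypot(z[0], z[1]) - 1e-3
caught.terminal = True

sol = solve_ivp(rhs, (0.0, 20.0), [3.0, 4.0], events=caught, max_step=0.05)
V = sol.y[0]**2 + sol.y[1]**2

print("largest increase of V between samples:", np.diff(V).max())
print("capture time (distance below 1e-3):", sol.t_events[0])
```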

Comments on the second Lyapunov method:
- it determines stability without actually having to solve the differential equation
- it can be applied even if the system model cannot be linearized
- it allows one to estimate the stability region
- in some cases there are natural Lyapunov function candidates, like energy functions in electrical or mechanical systems
- the stability conditions are sufficient, but not necessary
- there is no systematic method for finding Lyapunov functions; it is sometimes a matter of trial and error, and a Lyapunov function for a particular system is not unique

Nonlinear phenomena:

Finite escape time: the state of an unstable linear system goes to infinity as time approaches infinity; a nonlinear system's state, however, can go to infinity in finite time (a one-line illustration follows below).

Multiple isolated equilibria: a linear system can have only one isolated equilibrium point; hence it can have only one steady-state operating point, which attracts the state of the system irrespective of the initial state. A nonlinear system can have more than one isolated equilibrium point. The state may converge to one of several steady-state operating points, depending on the initial state of the system.

Limit cycles: for a linear time-invariant system to oscillate, it must have a pair of eigenvalues on the imaginary axis, which is a non-robust condition that is almost impossible to maintain in the presence of perturbations. Even if it is maintained, the amplitude of the oscillation depends on the initial state. In real life, stable oscillation must be produced by nonlinear systems. There are nonlinear systems which can go into an oscillation of fixed amplitude and frequency irrespective of the initial state (a so-called limit cycle).
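The finite escape time mentioned above can be seen in a standard one-line example (added here for illustration): for the scalar system \dot{x} = x^2 with x(0) = x_0 > 0, separation of variables gives

x(t) = \frac{x_0}{1 - x_0 t}

so the solution grows without bound as t approaches 1/x_0; the state escapes to infinity in finite time, which is impossible for a linear system.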

Subharmonic, harmonic or almost-periodic oscillations: a stable linear system under a periodic input produces an output of the same frequency. A nonlinear system under periodic excitation can oscillate with frequencies which are submultiples or multiples of the input frequency. It may even generate an almost-periodic oscillation, an example of which is the sum of periodic oscillations with frequencies that are not multiples of each other.

Chaos: a nonlinear system can have more complicated steady-state behavior that is not an equilibrium, a periodic oscillation or an almost-periodic oscillation. Such behavior is usually referred to as chaos. Some of these chaotic motions exhibit randomness, despite the deterministic nature of the system.

Multiple modes of behavior: it is not unusual for two or more modes of behavior to be exhibited by the same nonlinear system. An unforced system may have more than one limit cycle. A forced system with periodic excitation may exhibit harmonic, subharmonic or more complicated steady-state behavior, depending upon the amplitude and frequency of the input. It may even exhibit a discontinuous jump in the mode of behavior as the amplitude or frequency of the excitation is smoothly changed.
