Lyapunov stability

Stability analysis of dynamical systems is often carried out using Lyapunov stability theory.
An equilibrium point is stable if all solutions that start in its neighbourhood stay nearby as time increases. If those solutions do not just stay nearby but converge to the equilibrium point as time goes to infinity, the equilibrium point is asymptotically stable. If solutions do not stay near the equilibrium point, it is unstable. The methods described below are used to determine Lyapunov stability.
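These three notions can be seen numerically. As a sketch (the oscillator systems, step size, and horizon below are illustrative choices, not from the text), the undamped oscillator x'' = -x is stable at the origin: solutions orbit it without approaching it. Adding damping, x'' = -x - x', makes the origin asymptotically stable: solutions spiral in toward it.

```python
import math

def simulate(f, x0, dt=0.001, steps=20000):
    """Integrate x' = f(x) with forward Euler from x0; return the final state."""
    x = list(x0)
    for _ in range(steps):
        dx = f(x)
        x = [xi + dt * dxi for xi, dxi in zip(x, dx)]
    return x

# Undamped oscillator x'' = -x: solutions stay on (approximately) the same
# orbit around the origin -- stable, but not asymptotically stable.
undamped = lambda s: [s[1], -s[0]]
# Damped oscillator x'' = -x - x': solutions spiral into the origin --
# asymptotically stable.
damped = lambda s: [s[1], -s[0] - s[1]]

r = lambda s: math.hypot(s[0], s[1])  # distance to the equilibrium at the origin
print(r(simulate(undamped, [1.0, 0.0])))  # stays close to its initial radius 1
print(r(simulate(damped, [1.0, 0.0])))    # has shrunk toward zero
```

The distance to the equilibrium stays roughly constant in the undamped case and decays toward zero in the damped case, matching the two definitions above.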


Lyapunov's linearization method

Consider a nonlinear system that has at least one equilibrium state. The stability of an equilibrium point can be determined by linearizing the nonlinear system equations. The linearization leads to a state-space model with system matrix A. If the location of the equilibrium point is substituted into matrix A, the eigenvalues of the resulting matrix determine the stability of the system:
- If all eigenvalues have a negative real part, the equilibrium point is asymptotically stable.

- If one or more of the eigenvalues has a positive real part, the equilibrium point is unstable.

- If one or more of the eigenvalues has a zero real part (and none has a positive real part), nothing can be concluded about the stability of the equilibrium point from the linearization.

This stability analysis method is known as Lyapunov's linearization method, or Lyapunov's first method.
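The procedure above can be sketched in a few lines. As an illustration (the damped pendulum used here is a standard textbook example, not taken from the text): the pendulum theta'' = -sin(theta) - theta' has equilibria at theta = 0 (hanging) and theta = pi (inverted), and evaluating the eigenvalues of the system matrix A at each equilibrium classifies them.

```python
import numpy as np

def jacobian(theta):
    """System matrix A of the damped pendulum theta'' = -sin(theta) - theta',
    written as a first-order system and linearized at the equilibrium (theta, 0)."""
    return np.array([[0.0, 1.0],
                     [-np.cos(theta), -1.0]])

def classify(theta_eq):
    """Apply Lyapunov's first method to the real parts of the eigenvalues of A."""
    real = np.linalg.eigvals(jacobian(theta_eq)).real
    if np.all(real < 0):
        return "asymptotically stable"
    if np.any(real > 0):
        return "unstable"
    return "inconclusive"

print(classify(0.0))    # hanging position:  "asymptotically stable"
print(classify(np.pi))  # inverted position: "unstable"
```

At theta = 0 both eigenvalues have real part -1/2, so the hanging position is asymptotically stable; at theta = pi one eigenvalue is positive, so the inverted position is unstable.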

Lyapunov's direct method

In cases where Lyapunov's linearization method does not give a conclusion about the stability of an equilibrium point, Lyapunov's direct method (also known as the second method) can be used. Any scalar function that conforms to certain constraints is called a Lyapunov function. Take, for example, a function that measures the energy contained in a system. If the function is zero at the equilibrium point, larger than zero everywhere else in the domain, and its time derivative is smaller than or equal to zero, then the equilibrium point is stable. If the time derivative is negative definite instead of negative semidefinite, local asymptotic stability is proved.
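The check on a candidate Lyapunov function can be carried out symbolically. As a sketch (the system and the candidate function below are illustrative choices, not from the text), for the damped oscillator x' = y, y' = -x - y, the energy-like candidate V = x^2 + y^2 satisfies all the conditions above:

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)

# Illustrative system: x' = y, y' = -x - y (a damped oscillator).
f = sp.Matrix([y, -x - y])

# Candidate Lyapunov function: zero at the origin, positive everywhere else.
V = x**2 + y**2

# Its time derivative along trajectories is grad(V) . f.
Vdot = sp.simplify((sp.Matrix([V]).jacobian([x, y]) * f)[0])
print(Vdot)  # -2*y**2
```

Here V' = -2y^2 is negative semidefinite (it vanishes on the whole line y = 0, not only at the origin), so this candidate proves stability of the origin but not, by the basic theorem alone, asymptotic stability; a candidate with a negative definite derivative would be needed for that.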

Several conditions are available to determine the type of stability: local, global, and asymptotic, for example. More information about whether a function is suitable to conclude stability of an equilibrium state can be found on this page: Lyapunov function.

If stability cannot be concluded by the use of a Lyapunov function, this does not necessarily mean the system is unstable; it may indeed be unstable, but it may also mean that the right function has not been found yet.