Various types of stability may be discussed for the solutions of differential equations or difference equations describing dynamical systems. The most important type is that concerning the stability of solutions near to a point of equilibrium. This may be discussed by the theory of Aleksandr Lyapunov. In simple terms, if the solutions that start out near an equilibrium point $x_e$ stay near $x_e$ forever, then $x_e$ is Lyapunov stable. More strongly, if $x_e$ is Lyapunov stable and all solutions that start out near $x_e$ converge to $x_e$, then $x_e$ is said to be asymptotically stable (see asymptotic analysis). The notion of exponential stability guarantees a minimal rate of decay, i.e., an estimate of how quickly the solutions converge. The idea of Lyapunov stability can be extended to infinite-dimensional manifolds, where it is known as structural stability, which concerns the behavior of different but "nearby" solutions to differential equations. Input-to-state stability (ISS) applies Lyapunov notions to systems with inputs.
Lyapunov stability is named after Aleksandr Mikhailovich Lyapunov, a Russian mathematician who defended the thesis The General Problem of Stability of Motion at Kharkov University in 1892. [1] A. M. Lyapunov was a pioneer in successful endeavors to develop a global approach to the analysis of the stability of nonlinear dynamical systems, in contrast to the widespread local method of linearizing them about points of equilibrium. His work, initially published in Russian and then translated to French, received little attention for many years. The mathematical theory of stability of motion, founded by A. M. Lyapunov, considerably anticipated the time of its implementation in science and technology. Moreover, Lyapunov did not himself make applications in this field, his own interest being in the stability of rotating fluid masses with astronomical application. He did not have doctoral students who followed his research in the field of stability, and his own fate was tragic: he died by suicide in 1918 [ citation needed ]. For several decades the theory of stability sank into complete oblivion. The Russian-Soviet mathematician and mechanician Nikolay Gur'yevich Chetaev, working at the Kazan Aviation Institute in the 1930s, was the first to realize the magnitude of the discovery made by A. M. Lyapunov. The contribution to the theory made by N. G. Chetaev [2] was so significant that many mathematicians, physicists and engineers consider him Lyapunov's direct successor and the next-in-line scientific descendant in the creation and development of the mathematical theory of stability.
Interest in the theory skyrocketed during the Cold War period, when the so-called "Second Method of Lyapunov" (see below) was found to be applicable to the stability of aerospace guidance systems, which typically contain strong nonlinearities not treatable by other methods. A large number of publications have appeared since then in the control and systems literature. [3] [4] [5] [6] [7] More recently, the concept of the Lyapunov exponent (related to Lyapunov's First Method of discussing stability) has received wide interest in connection with chaos theory. Lyapunov stability methods have also been applied to finding equilibrium solutions in traffic assignment problems. [8]
Consider an autonomous nonlinear dynamical system
$\dot{x} = f(x(t)), \qquad x(0) = x_0,$
where $x(t) \in \mathcal{D} \subseteq \mathbb{R}^n$ denotes the system state vector, $\mathcal{D}$ an open set containing the origin, and $f : \mathcal{D} \to \mathbb{R}^n$ is a continuous vector field on $\mathcal{D}$. Suppose $f$ has an equilibrium at $x_e$ so that $f(x_e) = 0$, then
1. This equilibrium is said to be Lyapunov stable if for every $\varepsilon > 0$ there exists a $\delta > 0$ such that, if $\|x(0) - x_e\| < \delta$, then for every $t \geq 0$ we have $\|x(t) - x_e\| < \varepsilon$.
2. The equilibrium is said to be asymptotically stable if it is Lyapunov stable and there exists $\delta > 0$ such that if $\|x(0) - x_e\| < \delta$, then $\lim_{t \to \infty} \|x(t) - x_e\| = 0$.
3. The equilibrium is said to be exponentially stable if it is asymptotically stable and there exist $\alpha > 0$, $\beta > 0$, $\delta > 0$ such that if $\|x(0) - x_e\| < \delta$, then $\|x(t) - x_e\| \leq \alpha \|x(0) - x_e\| e^{-\beta t}$ for all $t \geq 0$.
Conceptually, the meanings of the above terms are the following:
1. Lyapunov stability of an equilibrium means that solutions starting "close enough" to the equilibrium (within a distance $\delta$ from it) remain "close enough" forever (within a distance $\varepsilon$ of it). Note that this must be true for any $\varepsilon$ that one may want to choose.
2. Asymptotic stability means that solutions that start close enough not only remain close enough but also eventually converge to the equilibrium.
3. Exponential stability means that solutions not only converge, but in fact converge at least as fast as the known rate $\alpha \|x(0) - x_e\| e^{-\beta t}$.
The trajectory $x$ is (locally) attractive if
$\|y(t) - x(t)\| \to 0 \quad \text{as } t \to \infty$
for all trajectories $y(t)$ that start close enough to $x(t)$, and globally attractive if this property holds for all trajectories.
That is, if x belongs to the interior of its stable manifold, it is asymptotically stable if it is both attractive and stable. (There are examples showing that attractivity does not imply asymptotic stability. [9] [10] [11] Such examples are easy to create using homoclinic connections.)
If the Jacobian of the dynamical system at an equilibrium happens to be a stability matrix (i.e., if the real part of each eigenvalue is strictly negative), then the equilibrium is asymptotically stable.
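As an illustration of this linearization test, the following minimal Python sketch (NumPy assumed; the damped-pendulum system and its damping constant are illustrative choices, not taken from the text above) evaluates the Jacobian at the origin and checks the signs of the real parts of its eigenvalues.

```python
import numpy as np

# Illustrative system (not from the article): a damped pendulum,
# x1' = x2, x2' = -sin(x1) - c*x2, with equilibrium at the origin.
c = 0.5

# Jacobian of the right-hand side evaluated at the equilibrium (0, 0)
A = np.array([[0.0,           1.0],
              [-np.cos(0.0),  -c ]])

eigs = np.linalg.eigvals(A)
print("eigenvalues:", eigs)
# Asymptotic stability follows if every eigenvalue has strictly negative
# real part, i.e. if A is a stability (Hurwitz) matrix.
print("asymptotically stable:", bool(np.all(eigs.real < 0)))
```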
Instead of considering stability only near an equilibrium point (a constant solution $x(t) = x_e$), one can formulate similar definitions of stability near an arbitrary solution $x(t) = b(t)$. However, one can reduce the more general case to that of an equilibrium by a change of variables called a "system of deviations". Define $y = x - b(t)$, obeying the differential equation:
$\dot{y} = f(y + b(t)) - f(b(t)) =: g(t, y).$
This is no longer an autonomous system, but it has a guaranteed equilibrium point at $y = 0$ whose stability is equivalent to the stability of the original solution $b(t)$.
Lyapunov, in his original 1892 work, proposed two methods for demonstrating stability. [1] The first method developed the solution in a series which was then proved convergent within limits. The second method, which is now referred to as the Lyapunov stability criterion or the Direct Method, makes use of a Lyapunov function V(x) which has an analogy to the potential function of classical dynamics. It is introduced as follows for a system $\dot{x} = f(x)$ having a point of equilibrium at $x = 0$. Consider a function $V : \mathbb{R}^n \to \mathbb{R}$ such that
1. $V(x) = 0$ if and only if $x = 0$,
2. $V(x) > 0$ if and only if $x \neq 0$,
3. $\dot{V}(x) = \frac{d}{dt} V(x(t)) = \nabla V \cdot f(x) \leq 0$ for all values of $x \neq 0$ (for asymptotic stability, $\dot{V}(x) < 0$ is required for $x \neq 0$).
Then V(x) is called a Lyapunov function and the system is stable in the sense of Lyapunov. (Note that $V(0) = 0$ is required; otherwise, for example, $V(x) = 1/(1 + |x|)$ would "prove" that $\dot{x} = x$ is locally stable.) An additional condition called "properness" or "radial unboundedness" is required in order to conclude global stability. Global asymptotic stability (GAS) follows similarly.
It is easier to visualize this method of analysis by thinking of a physical system (e.g. vibrating spring and mass) and considering the energy of such a system. If the system loses energy over time and the energy is never restored then eventually the system must grind to a stop and reach some final resting state. This final state is called the attractor. However, finding a function that gives the precise energy of a physical system can be difficult, and for abstract mathematical systems, economic systems or biological systems, the concept of energy may not be applicable.
Lyapunov's realization was that stability can be proven without requiring knowledge of the true physical energy, provided a Lyapunov function can be found to satisfy the above constraints.
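A small numerical sanity check in the same spirit is sketched below (the damped spring-mass system and its constants are illustrative choices, not from the text; the candidate $V$ is the total mechanical energy). It samples $V$ and $\dot{V} = \nabla V \cdot f$ on a grid around the origin.

```python
import numpy as np

# Illustrative damped spring-mass system: x1' = x2, x2' = -k*x1 - c*x2.
k, c = 2.0, 0.3

def f(x):
    x1, x2 = x
    return np.array([x2, -k * x1 - c * x2])

def V(x):
    # Candidate Lyapunov function: total mechanical energy
    x1, x2 = x
    return 0.5 * k * x1**2 + 0.5 * x2**2

def Vdot(x):
    # dV/dt along trajectories = grad(V) . f(x); analytically this is -c*x2**2
    x1, x2 = x
    grad = np.array([k * x1, x2])
    return grad @ f(x)

pts = [np.array([a, b]) for a in np.linspace(-1.0, 1.0, 21)
                        for b in np.linspace(-1.0, 1.0, 21)]
print("V >= 0 at all sampled points:    ", all(V(p) >= 0.0 for p in pts))
print("dV/dt <= 0 at all sampled points:", all(Vdot(p) <= 1e-12 for p in pts))
```

Such a check does not constitute a proof (it only samples finitely many points), but it is a convenient way to test a candidate Lyapunov function before attempting an analytic argument.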
The definition for discrete-time systems is almost identical to that for continuous-time systems. The definition below provides this, using an alternate language commonly used in more mathematical texts.
Let (X, d) be a metric space and f : X → X a continuous function. A point x in X is said to be Lyapunov stable, if,
$\forall \varepsilon > 0\ \exists \delta > 0\ \forall y \in X\ \left[ d(x, y) < \delta \Rightarrow \forall n \in \mathbb{N}\ d\!\left(f^n(x), f^n(y)\right) < \varepsilon \right].$
We say that x is asymptotically stable if it belongs to the interior of its stable set, i.e. if,
$\exists \delta > 0\ \left[ d(x, y) < \delta \Rightarrow \lim_{n \to \infty} d\!\left(f^n(x), f^n(y)\right) = 0 \right].$
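For a concrete discrete-time illustration (a sketch with an arbitrarily chosen contraction on the real line, not part of the definition above), iterating the map shows nearby orbits staying near the fixed point and converging to it.

```python
# Map on the metric space (R, |x - y|): f(x) = 0.5*x has fixed point x* = 0.
# Orbits starting within delta of x* stay within epsilon of it (Lyapunov
# stability) and converge to it (asymptotic stability).
def f(x):
    return 0.5 * x

x_star = 0.0
for y0 in (0.2, -0.05, 0.001):
    y = y0
    max_dist = abs(y - x_star)
    for _ in range(60):
        y = f(y)
        max_dist = max(max_dist, abs(y - x_star))
    print(f"start {y0:+.3f}: max distance {max_dist:.3f}, "
          f"final distance {abs(y - x_star):.2e}")
```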
A linear state space model
$\dot{\textbf{x}} = A\textbf{x},$
where $A$ is a finite matrix, is asymptotically stable (in fact, exponentially stable) if all real parts of the eigenvalues of $A$ are negative. This condition is equivalent to the following one: [12]
$A^{\textsf{T}}M + MA$
is negative definite for some positive definite matrix $M = M^{\textsf{T}}$. (The relevant Lyapunov function is $V(x) = x^{\textsf{T}}Mx$.)
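The equivalence can be checked numerically with SciPy's continuous Lyapunov-equation solver; the sketch below uses an arbitrary Hurwitz matrix $A$ and calls solve_continuous_lyapunov(A.T, -Q), which solves $A^{\textsf{T}}M + MA = -Q$.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Arbitrary Hurwitz example: the eigenvalues of A are -1 and -2.
A = np.array([[0.0,  1.0],
              [-2.0, -3.0]])

Q = np.eye(2)                            # any positive definite choice
M = solve_continuous_lyapunov(A.T, -Q)   # solves A^T M + M A = -Q

print("eigenvalues of A:", np.linalg.eigvals(A))
print("M positive definite:", bool(np.all(np.linalg.eigvalsh(M) > 0)))
# V(x) = x^T M x is then a quadratic Lyapunov function for x' = A x.
```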
Correspondingly, a time-discrete linear state space model
$\textbf{x}_{t+1} = A\textbf{x}_t$
is asymptotically stable (in fact, exponentially stable) if all the eigenvalues of $A$ have a modulus smaller than one.
This latter condition has been generalized to switched systems: a linear switched discrete time system (ruled by a set of matrices $\{A_1, \dots, A_m\}$)
$\textbf{x}_{t+1} = A_{i_t}\textbf{x}_t, \qquad A_{i_t} \in \{A_1, \dots, A_m\},$
is asymptotically stable (in fact, exponentially stable) if the joint spectral radius of the set $\{A_1, \dots, A_m\}$ is smaller than one.
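Both discrete-time criteria can be explored numerically as sketched below (the matrices are arbitrary examples; the joint-spectral-radius estimate uses the standard upper bound $\max_{|P|=k} \|P\|^{1/k}$ over products of length $k$, which certifies stability only when it is below one).

```python
import numpy as np
from itertools import product

# Single discrete-time system x_{t+1} = A x_t: stable iff spectral radius < 1.
A = np.array([[0.5, 0.2],
              [0.0, 0.7]])
print("spectral radius of A:", max(abs(np.linalg.eigvals(A))))

# Switched system ruled by {A1, A2}: an upper bound on the joint spectral
# radius is the maximum of ||P||^(1/k) over all products P of length k.
A1 = 0.5 * np.array([[1.0, 1.0], [0.0, 1.0]])
A2 = 0.5 * np.array([[1.0, 0.0], [1.0, 1.0]])
k = 8
bound = max(np.linalg.norm(np.linalg.multi_dot(P), 2) ** (1.0 / k)
            for P in product([A1, A2], repeat=k))
print("joint spectral radius upper bound:", bound)   # below one here
```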
A system with inputs (or controls) has the form
$\dot{\textbf{x}} = \textbf{f}(\textbf{x}, \textbf{u}(t)),$
where the (generally time-dependent) input u(t) may be viewed as a control, external input, stimulus, disturbance, or forcing function. It has been shown [13] that near to a point of equilibrium which is Lyapunov stable the system remains stable under small disturbances. For larger input disturbances the study of such systems is the subject of control theory and applied in control engineering. For systems with inputs, one must quantify the effect of inputs on the stability of the system. The two main approaches to this analysis are BIBO stability (for linear systems) and input-to-state stability (ISS) (for nonlinear systems).
This example shows a system where a Lyapunov function can be used to prove Lyapunov stability but cannot show asymptotic stability. Consider the following equation, based on the Van der Pol oscillator equation with the friction term changed:
$\ddot{y} + y - \varepsilon\left(\frac{\dot{y}^3}{3} - \dot{y}\right) = 0.$
Let
$x_1 = y, \qquad x_2 = \dot{y},$
so that the corresponding system is
$\dot{x}_1 = x_2, \qquad \dot{x}_2 = -x_1 + \varepsilon\left(\frac{x_2^3}{3} - x_2\right).$
The origin is the only equilibrium point. Let us choose as a Lyapunov function
$V = \frac{1}{2}\left(x_1^2 + x_2^2\right),$
which is clearly positive definite. Its derivative is
$\dot{V} = x_1\dot{x}_1 + x_2\dot{x}_2 = x_1 x_2 - x_1 x_2 + \varepsilon\left(\frac{x_2^4}{3} - x_2^2\right) = \varepsilon\left(\frac{x_2^4}{3} - x_2^2\right).$
It seems that if the parameter $\varepsilon$ is positive, stability is asymptotic for $x_2^2 < 3$. But this is wrong, since $\dot{V}$ does not depend on $x_1$ and will be 0 everywhere on the $x_1$ axis, i.e. it is only negative semi-definite. This Lyapunov function therefore establishes only that the equilibrium is Lyapunov stable; asymptotic stability cannot be concluded from it.
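A short simulation sketch (SciPy assumed; the value of $\varepsilon$ and the initial condition are arbitrary) illustrates both facts: $\dot{V}$ vanishes on the whole $x_1$ axis, and $V$ is non-increasing along trajectories starting near the origin.

```python
import numpy as np
from scipy.integrate import solve_ivp

eps = 0.8                      # arbitrary positive parameter

def rhs(t, x):
    x1, x2 = x
    return [x2, -x1 + eps * (x2**3 / 3.0 - x2)]

def V(x1, x2):
    return 0.5 * (x1**2 + x2**2)

def Vdot(x1, x2):
    return eps * (x2**4 / 3.0 - x2**2)

# dV/dt is zero on the entire x1 axis, not only at the origin:
print(Vdot(5.0, 0.0))                                  # 0.0

# Along a trajectory starting near the origin, V never increases:
sol = solve_ivp(rhs, (0.0, 30.0), [0.5, 0.0], dense_output=True)
print([round(V(*sol.sol(t)), 4) for t in np.linspace(0.0, 30.0, 7)])
```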
It may be difficult to find a Lyapunov function with a negative definite derivative as required by the Lyapunov stability criterion; however, a function $V$ whose derivative $\dot{V}$ is only negative semi-definite may be available. In autonomous systems, the invariant set theorem can be applied to prove asymptotic stability, but this theorem is not applicable when the dynamics are a function of time. [14]
Instead, Barbalat's lemma allows for Lyapunov-like analysis of these non-autonomous systems. The lemma is motivated by the following observations. Assuming f is a function of time only:
- Having $\dot{f}(t) \to 0$ does not imply that $f(t)$ has a limit as $t \to \infty$. For example, $f(t) = \sin(\ln t),\ t > 0$.
- Having $f(t)$ approach a limit as $t \to \infty$ does not imply that $\dot{f}(t) \to 0$. For example, $f(t) = \sin(t^2)/t,\ t > 0$.
- Having $f(t)$ lower bounded and decreasing ($\dot{f} \leq 0$) implies it converges to a limit. But it does not say whether or not $\dot{f}(t) \to 0$ as $t \to \infty$.
Barbalat's Lemma says:
If $f(t)$ has a finite limit as $t \to \infty$ and if $\dot{f}(t)$ is uniformly continuous (a sufficient condition is that $\ddot{f}(t)$ is bounded), then $\dot{f}(t) \to 0$ as $t \to \infty$.
An alternative version is as follows:
Let $p \in [1, \infty)$. If $f \in L^p(0, \infty)$ and $\dot{f} \in L^\infty(0, \infty)$, then $f(t) \to 0$ as $t \to \infty$.
In the following form the Lemma is true also in the vector valued case:
Let $f(t)$ be a uniformly continuous function with values in a Banach space $E$ and assume that $\int_0^t f(\tau)\,\mathrm{d}\tau$ has a finite limit as $t \to \infty$. Then $f(t) \to 0$ as $t \to \infty$.
The following example is taken from page 125 of Slotine and Li's book Applied Nonlinear Control. [14]
Consider a non-autonomous system
$\dot{e} = -e + g \cdot w(t),$
$\dot{g} = -e \cdot w(t).$
This is non-autonomous because the input $w(t)$ is a function of time. Assume that the input $w(t)$ is bounded.
Taking $V = e^2 + g^2$ gives
$\dot{V} = 2e\dot{e} + 2g\dot{g} = -2e^2 \leq 0.$
This says that $V(t) \leq V(0)$ by the first two conditions and hence $e$ and $g$ are bounded. But it does not say anything about the convergence of $e$ to zero, as $\dot{V}$ is only negative semi-definite (note $g$ can be non-zero when $\dot{V} = 0$) and the dynamics are non-autonomous.
Using Barbalat's lemma:
$\ddot{V} = -4e\dot{e} = -4e\left(-e + g \cdot w(t)\right).$
This is bounded because $e$, $g$ and $w$ are bounded. This implies $\dot{V} \to 0$ as $t \to \infty$ and hence $e \to 0$. This proves that the error converges.
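The conclusion can also be observed numerically; the sketch below uses an arbitrary bounded input $w(t)$ and arbitrary initial conditions (neither is specified in the text above).

```python
import numpy as np
from scipy.integrate import solve_ivp

def w(t):                          # arbitrary bounded input
    return np.sin(t) + 0.5 * np.sin(0.3 * t)

def rhs(t, state):
    e, g = state
    return [-e + g * w(t), -e * w(t)]

sol = solve_ivp(rhs, (0.0, 200.0), [1.0, 1.0], max_step=0.05)
e, g = sol.y
print("final |e|:", abs(e[-1]))    # tends to zero, as Barbalat's lemma predicts
print("final g:", g[-1])           # remains bounded, but need not converge to zero
```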
The logistic map is a polynomial mapping of degree 2, often referred to as an archetypal example of how complex, chaotic behaviour can arise from very simple nonlinear dynamical equations. The map, initially utilized by Edward Lorenz in the 1960s to showcase irregular solutions, was popularized in a 1976 paper by the biologist Robert May, in part as a discrete-time demographic model analogous to the logistic equation written down by Pierre François Verhulst. Mathematically, the logistic map is written
$x_{n+1} = r x_n (1 - x_n).$
In probability theory, the central limit theorem (CLT) states that, under appropriate conditions, the distribution of a normalized version of the sample mean converges to a standard normal distribution. This holds even if the original variables themselves are not normally distributed. There are several versions of the CLT, each applying in the context of different conditions.
In mathematics, the Lyapunov exponent or Lyapunov characteristic exponent of a dynamical system is a quantity that characterizes the rate of separation of infinitesimally close trajectories. Quantitatively, two trajectories in phase space with initial separation vector $\delta_0$ diverge at a rate given by
$|\delta(t)| \approx e^{\lambda t}|\delta_0|,$
where $\lambda$ is the Lyapunov exponent.
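As a concrete illustration (a sketch, not part of the excerpt above), the Lyapunov exponent of the logistic map $x_{n+1} = r x_n (1 - x_n)$ can be estimated as the orbit average of $\log|f'(x_n)|$, a standard numerical recipe.

```python
import numpy as np

# Estimate the Lyapunov exponent of the logistic map f(x) = r*x*(1 - x)
# as the orbit average of log|f'(x_n)|, with f'(x) = r*(1 - 2x).
def lyapunov_exponent(r, x0=0.4, n_transient=1_000, n=200_000):
    x = x0
    for _ in range(n_transient):        # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += np.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n

print(lyapunov_exponent(3.2))   # negative: attracting 2-cycle, nearby orbits converge
print(lyapunov_exponent(4.0))   # positive (close to ln 2): chaotic regime
```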
In the theory of ordinary differential equations (ODEs), Lyapunov functions, named after Aleksandr Lyapunov, are scalar functions that may be used to prove the stability of an equilibrium of an ODE. Lyapunov functions are important to stability theory of dynamical systems and control theory. A similar concept appears in the theory of general state-space Markov chains usually under the name Foster–Lyapunov functions.
In control engineering and system identification, a state-space representation is a mathematical model of a physical system specified as a set of input, output, and variables related by first-order differential equations or difference equations. Such variables, called state variables, evolve over time in a way that depends on the values they have at any given instant and on the externally imposed values of input variables. Output variables’ values depend on the state variable values and may also depend on the input variable values.
In mathematics, Laplace's method, named after Pierre-Simon Laplace, is a technique used to approximate integrals of the form
$\int_a^b e^{M f(x)}\,\mathrm{d}x,$
where $f$ is a twice-differentiable function and $M$ is a large number.
The Lyapunov equation, named after the Russian mathematician Aleksandr Lyapunov, is a matrix equation used in the stability analysis of linear dynamical systems.
In mathematics, a Hurwitz matrix, or Routh–Hurwitz matrix, in engineering stability matrix, is a structured real square matrix constructed with coefficients of a real polynomial.
The Duffing equation, named after Georg Duffing (1861–1944), is a non-linear second-order differential equation used to model certain damped and driven oscillators. The equation is given by
$\ddot{x} + \delta\dot{x} + \alpha x + \beta x^3 = \gamma\cos(\omega t),$
where the (unknown) function $x = x(t)$ is the displacement at time $t$.
Nonlinear control theory is the area of control theory which deals with systems that are nonlinear, time-variant, or both. Control theory is an interdisciplinary branch of engineering and mathematics that is concerned with the behavior of dynamical systems with inputs, and how to modify the output by changes in the input using feedback, feedforward, or signal filtering. The system to be controlled is called the "plant". One way to make the output of a system follow a desired reference signal is to compare the output of the plant to the desired output, and provide feedback to the plant to modify the output to bring it closer to the desired output.
In the mathematics of evolving systems, the concept of a center manifold was originally developed to determine stability of degenerate equilibria. Subsequently, the concept of center manifolds was realised to be fundamental to mathematical modelling.
LaSalle's invariance principle is a criterion for the asymptotic stability of an autonomous dynamical system.
In applied mathematics, comparison functions are several classes of continuous functions, which are used in stability theory to characterize the stability properties of control systems as Lyapunov stability, uniform asymptotic stability etc.
In mathematics, stability theory addresses the stability of solutions of differential equations and of trajectories of dynamical systems under small perturbations of initial conditions. The heat equation, for example, is a stable partial differential equation because small perturbations of initial data lead to small variations in temperature at a later time as a result of the maximum principle. In partial differential equations one may measure the distances between functions using Lp norms or the sup norm, while in differential geometry one may measure the distance between spaces using the Gromov–Hausdorff distance.
In control theory, a control-Lyapunov function (CLF) is an extension of the idea of Lyapunov function to systems with control inputs. The ordinary Lyapunov function is used to test whether a dynamical system is (Lyapunov) stable or asymptotically stable. Lyapunov stability means that if the system starts in a state in some domain D, then the state will remain in D for all time. For asymptotic stability, the state is also required to converge to $x = 0$. A control-Lyapunov function is used to test whether a system is asymptotically stabilizable, that is, whether for any state x there exists a control $u(x, t)$ such that the system can be brought to the zero state asymptotically by applying the control u.
In control theory, backstepping is a technique developed circa 1990 by Myroslav Sparavalo, Petar V. Kokotovic, and others for designing stabilizing controls for a special class of nonlinear dynamical systems. These systems are built from subsystems that radiate out from an irreducible subsystem that can be stabilized using some other method. Because of this recursive structure, the designer can start the design process at the known-stable system and "back out" new controllers that progressively stabilize each outer subsystem. The process terminates when the final external control is reached. Hence, this process is known as backstepping.
For applied mathematics, in nonlinear control theory, a non-linear system of the form $\dot{x} = f(x, u)$ is said to satisfy the small control property if for every $\varepsilon > 0$ there exists a $\delta > 0$ so that for all $\|x\| < \delta$ there exists a $\|u\| < \varepsilon$ so that the time derivative of the system's Lyapunov function is negative definite at that point.
The Lyapunov–Malkin theorem is a mathematical theorem detailing stability of nonlinear systems.
In mathematical physics and the theory of partial differential equations, the solitary wave solution of the form $u(x, t) = e^{-i\omega t}\phi(x)$ is said to be orbitally stable if any solution with the initial data sufficiently close to $\phi(x)$ forever remains in a given small neighborhood of the trajectory of $e^{-i\omega t}\phi(x)$.
Input-to-state stability (ISS) is a stability notion widely used to study stability of nonlinear control systems with external inputs. Roughly speaking, a control system is ISS if it is globally asymptotically stable in the absence of external inputs and if its trajectories are bounded by a function of the size of the input for all sufficiently large times. The importance of ISS is due to the fact that the concept has bridged the gap between input–output and state-space methods, widely used within the control systems community.
This article incorporates material from asymptotically stable on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.