
    Asymptotic stability

    What are eigenvalues and eigenvectors?

    Consider a square matrix F with n rows and n columns. If we can find a complex number s and a non-zero, complex-valued vector v that satisfy

        F v = s v

    then we call s an eigenvalue of F and v the corresponding eigenvector of F. If we wanted, we could rearrange this equation to put it in a form that might be more familiar:

        (s I - F) v = 0.

    One way to find eigenvalues is to solve the equation

        det(s I - F) = 0

    where "det" means taking a determinant. In general, this equation will be an nth-order polynomial in s, and so will have n solutions --- we might call them s_1, ..., s_n. To find an eigenvector v_i that corresponds to each eigenvalue s_i, we solve

        (s_i I - F) v_i = 0

    for v_i. Note that there are many possible solutions to this equation and that eigenvectors are only unique up to a scale factor. In particular, for any real number k, we have

        F (k v_i) = k (F v_i) = k (s_i v_i) = s_i (k v_i).

    Apparently, if v_i is an eigenvector corresponding to s_i, then so is k v_i for any non-zero k. For this reason, algorithms to find eigenvectors typically normalize them to have unit length.
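    These definitions are easy to check numerically. Here is a short sketch using numpy (an assumed tool, not something the notes prescribe); numpy.linalg.eig returns the eigenvalues of a matrix along with eigenvectors normalized to unit length.

```python
# A numerical check of the eigenvalue/eigenvector definitions, using numpy
# (an assumption; the notes themselves do not prescribe a tool).
import numpy as np

# An example matrix (illustrative, not from the notes).
F = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# s holds the eigenvalues; the columns of V are the corresponding eigenvectors.
s, V = np.linalg.eig(F)

for i in range(len(s)):
    v = V[:, i]
    # F v_i = s_i v_i, up to numerical round-off
    assert np.allclose(F @ v, s[i] * v)
    # eig normalizes each eigenvector to have unit length
    assert np.isclose(np.linalg.norm(v), 1.0)
    # any non-zero scalar multiple k v_i is also an eigenvector
    k = 3.7
    assert np.allclose(F @ (k * v), s[i] * (k * v))
```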

    How do I diagonalize a square matrix?

    Suppose we have found the eigenvalues s_1, ..., s_n and eigenvectors v_1, ..., v_n of a square matrix F. Define the matrix

        V = [ v_1  v_2  ...  v_n ]

    with an eigenvector in each column, and also the matrix

        diag(s_1, s_2, ..., s_n)

    with the eigenvalues along the diagonal.

    Two things are true.

    First, the following equality holds:

        F V = V diag(s_1, ..., s_n).

    You could easily verify this result for yourself.

    Second, if s_1, ..., s_n are all distinct (i.e., if no two eigenvalues are the same), then the matrix V is invertible. This result is harder to verify --- it has to do with the fact that if the eigenvalues are distinct then the eigenvectors are linearly independent.

    The key consequence of V being invertible is that we can solve the above equality to write:

        F = V diag(s_1, ..., s_n) V^{-1}.

    In this case --- if all eigenvalues are distinct and so the matrix of eigenvectors is invertible --- we say that F is diagonalizable. The process of “diagonalizing F” is finding the matrix V.
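    As a quick numerical illustration --- again using numpy as an assumed tool, with an example matrix of our own choosing --- we can build V and diag(s_1, ..., s_n) and check both equalities:

```python
# A numerical sketch of diagonalization, assuming numpy (not part of the notes).
import numpy as np

F = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # illustrative example matrix

s, V = np.linalg.eig(F)        # eigenvalues s, eigenvectors in the columns of V
D = np.diag(s)                 # diagonal matrix with the eigenvalues on its diagonal

# The eigenvalues of this F are distinct, so V is invertible...
assert np.linalg.matrix_rank(V) == V.shape[0]

# ...and both forms of the equality hold, up to round-off:
assert np.allclose(F @ V, V @ D)                    # F V = V diag(s_1, ..., s_n)
assert np.allclose(F, V @ D @ np.linalg.inv(V))     # F = V diag(s_1, ..., s_n) V^{-1}
```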

    What is the matrix exponential of a diagonal matrix?

    It is easy to find the matrix exponential of a diagonal matrix, starting from the definition:

        e^{diag(s_1, ..., s_n) t} = I + diag(s_1, ..., s_n) t + (1/2) (diag(s_1, ..., s_n) t)^2 + ...
                                  = diag(1 + s_1 t + (1/2) (s_1 t)^2 + ..., ..., 1 + s_n t + (1/2) (s_n t)^2 + ...)
                                  = diag(e^{s_1 t}, ..., e^{s_n t}).
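    A quick numerical check of this fact, assuming numpy and scipy are available (neither is prescribed by the notes); scipy.linalg.expm computes a matrix exponential, which for a diagonal matrix should agree with taking the scalar exponential of each diagonal entry.

```python
# Check: the matrix exponential of a diagonal matrix is the diagonal matrix of
# scalar exponentials. Assumes numpy and scipy.
import numpy as np
from scipy.linalg import expm

s = np.array([-1.0, -2.0, 0.5])   # example eigenvalues (illustrative)
t = 1.3                           # example time

left = expm(np.diag(s) * t)       # e^{diag(s_1, ..., s_n) t}, computed directly
right = np.diag(np.exp(s * t))    # diag(e^{s_1 t}, ..., e^{s_n t})

assert np.allclose(left, right)
```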

    What is the solution to a linear system that is diagonalizable?

    We have seen that the solution to

        ẋ = F x

    with the initial condition

        x(0) = x_0

    is

        x(t) = e^{F t} x_0.

    Suppose F is a diagonalizable matrix, so that

        F = V diag(s_1, ..., s_n) V^{-1}

    where

        diag(s_1, ..., s_n)

    is a diagonal matrix that contains the eigenvalues of F and where

        V

    is a matrix of the corresponding eigenvectors. Then, applying the definition of matrix exponential again (note that each power (V diag(s_1, ..., s_n) V^{-1})^k collapses to V diag(s_1, ..., s_n)^k V^{-1}), we have

        x(t) = e^{(V diag(s_1, ..., s_n) V^{-1}) t} x_0
             = V e^{diag(s_1, ..., s_n) t} V^{-1} x_0
             = V diag(e^{s_1 t}, ..., e^{s_n t}) V^{-1} x_0

    where the last step comes from what we just found out about the matrix exponential of a diagonal matrix. In this expression, the terms V, V^{-1}, and x_0 are constant. The only terms that depend on t, in fact, are the scalar exponential functions

        e^{s_1 t}, ..., e^{s_n t}

    that appear in the diagonal of

        e^{diag(s_1, ..., s_n) t}.

    Therefore, we can infer the behavior of x(t) based entirely on these scalar exponential functions. In particular, suppose that each eigenvalue s_i --- a complex number --- has real part a_i and imaginary part b_i, or in other words that

        s_i = a_i + j b_i.

    Euler’s formula tells us that

        e^{s_i t} = e^{(a_i + j b_i) t} = e^{a_i t} e^{j b_i t} = e^{a_i t} ( cos(b_i t) + j sin(b_i t) ).

    Apparently, as time gets large, one of three things is true about each of these terms:

    • if a_i > 0, then e^{s_i t} grows quickly
    • if a_i = 0, then e^{s_i t} is constant (b_i = 0) or is sinusoidal with constant magnitude (b_i ≠ 0)
    • if a_i < 0, then e^{s_i t} decays quickly to zero
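    Here is a numerical sketch of this solution, assuming numpy and scipy (the example matrix and initial condition are illustrative). It compares x(t) computed via diagonalization with x(t) = e^{F t} x_0 computed directly, and checks that eigenvalues with negative real part make x(t) decay:

```python
# Compare x(t) = V diag(e^{s_1 t}, ..., e^{s_n t}) V^{-1} x_0 with the direct
# computation x(t) = e^{F t} x_0. Assumes numpy and scipy.
import numpy as np
from scipy.linalg import expm

F = np.array([[0.0, 1.0],
              [-2.0, -3.0]])    # eigenvalues -1 and -2, both with negative real part
x0 = np.array([1.0, -1.0])      # example initial condition
t = 2.0                         # example time

s, V = np.linalg.eig(F)
Vinv = np.linalg.inv(V)

x_diag = V @ np.diag(np.exp(s * t)) @ Vinv @ x0   # via diagonalization
x_expm = expm(F * t) @ x0                         # via e^{F t} directly

assert np.allclose(x_diag, x_expm)

# Both eigenvalues have negative real part, so x(t) -> 0 as t grows:
assert np.linalg.norm(expm(F * 10.0) @ x0) < 1e-3
```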

    It is possible to show that (more or less) the same result holds for any system ẋ = F x, not only ones for which F is diagonalizable. This takes more work, and involves the transformation of F into Jordan normal form rather than into a diagonal matrix. We would discover that the terms that depend on t all look like

        t^m e^{s_i t}

    where m is an integer that is at most the multiplicity of the eigenvalue s_i. Since e^{a_i t} increases or decreases a lot faster than t^m, the same three things we listed above would be true of each term in x(t), just like before.

    See Feedback Systems: An Introduction for Scientists and Engineers (Åström and Murray) for details.
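    To see one of these t^m e^{s_i t} terms concretely, consider a small matrix that is not diagonalizable: a 2x2 Jordan block with repeated eigenvalue s. Its matrix exponential works out to e^{s t} [[1, t], [0, 1]], so a t e^{s t} term appears. A quick check with numpy and scipy (assumed tools):

```python
# For a matrix that is NOT diagonalizable --- a 2x2 Jordan block with repeated
# eigenvalue s --- the matrix exponential contains a t e^{s t} term (here m = 1).
# Assumes numpy and scipy.
import numpy as np
from scipy.linalg import expm

s = -1.0
F = np.array([[s, 1.0],
              [0.0, s]])        # Jordan block: repeated eigenvalue, only one eigenvector

t = 1.7
expected = np.exp(s * t) * np.array([[1.0, t],
                                     [0.0, 1.0]])
assert np.allclose(expm(F * t), expected)
```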

    When is a linear system asymptotically stable?

    The system

        ẋ = F x

    is called asymptotically stable if x(t) → 0 as t → ∞, starting from any initial condition

        x(0) = x_0.
    Based on our observations about the solution to linear systems that are diagonalizable, we can state the following important result:

    Asymptotic stability

    The system

        ẋ = F x

    is asymptotically stable if and only if all eigenvalues of F have negative real part.

    In particular, we now have a test for whether or not a controller “works.” Suppose we apply linear state feedback

        u = -K x

    to the state space system

        ẋ = A x + B u

    so that

        ẋ = A x + B (-K x) = (A - B K) x.

    The controller “works” when this system is asymptotically stable, i.e., when x goes to zero as time gets large. We now know, therefore, that the controller “works” when all eigenvalues of A - B K have negative real part.

    We may not yet have a systematic way of finding a matrix K that makes the closed-loop system stable, but we certainly do have a systematic way now of deciding whether or not a given matrix K makes the closed-loop system stable.
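    This test is only a few lines of numerical code. A sketch with numpy, where the matrices A and B and the gain K are illustrative examples (not from the notes):

```python
# Stability test for a closed-loop system x_dot = (A - B K) x: check whether
# every eigenvalue of A - B K has negative real part. Assumes numpy; the
# example A, B, and K are illustrative.
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])      # example: a double integrator
B = np.array([[0.0],
              [1.0]])
K = np.array([[2.0, 3.0]])      # a candidate gain matrix

F = A - B @ K                   # closed-loop system matrix
s = np.linalg.eigvals(F)

# The controller "works" if and only if every eigenvalue has negative real part:
is_asymptotically_stable = bool(np.all(s.real < 0))
print(is_asymptotically_stable)   # prints True for this K
```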