
I have a question regarding the stability of linear systems. Let's assume we have two stable linear systems represented by matrices $A_1$ and $A_2$, where both matrices have eigenvalues with strictly negative real parts (i.e., both systems are stable in the sense that their eigenvalues are in the left half of the complex plane).

Now, consider the linear interpolation of these two matrices:

$$A(\alpha) = \alpha A_1 + (1 - \alpha)A_2, \qquad \alpha \in [0, 1]$$

My question is: Can we always say that the interpolated matrix $A(\alpha)$ is also stable for all $\alpha \in [0, 1]$? In other words, does the interpolation of two stable matrices always result in a stable matrix?

If this is not always true, could you provide an example where the interpolation results in an unstable system? Any insights or references to existing research would be greatly appreciated.

  • If it were true for A+B in general, then it would be true here, since multiplying a matrix by a positive constant only scales its eigenvalues, and you can keep forming sums such as (A+B) + B to get as close to the ratio $\alpha$ as you want. So I think you should look at the answers about eigenvalues of matrix sums. Eigenvalues of matrix sums, URL (version: 2018-06-19): https://mathoverflow.net/q/4224 – kirk beatty Oct 24 '24 at 04:57
  • Btw, if you want insight into this question from recent research, you can look at the topics of linear parameter-varying systems and/or switched systems. In those settings, switching between stable systems (or varying the parameters between stable cases) can lead to exploding solutions/unstable behaviour... – MatteoDR Oct 24 '24 at 11:58

3 Answers


No. Here is a counterexample:

$$A_1 = \begin{pmatrix} -\frac{1}{2} & 2 \\ 0 & -\frac{1}{2}\end{pmatrix}$$ $$A_2 = \begin{pmatrix} -\frac{1}{2} & 0 \\ 2 & -\frac{1}{2}\end{pmatrix}$$ $$A(\frac{1}{2}) = \frac{1}{2}A_1 + \frac{1}{2}A_2 = \frac{1}{2}\begin{pmatrix} -1 & 2 \\ 2 & -1\end{pmatrix}$$

Then both $A_1$ and $A_2$ have only the strictly negative eigenvalue $-\frac{1}{2}$ (with multiplicity two), but $A(\frac{1}{2})$ has eigenvalues $\frac{1}{2}$ and $-\frac{3}{2}$, so it has a positive eigenvalue and is unstable.
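As a quick sanity check, here is a minimal NumPy sketch that computes the eigenvalues of $A_1$, $A_2$, and $A(\frac{1}{2})$ from this counterexample:

```python
# Numerical check of the counterexample above.
import numpy as np

A1 = np.array([[-0.5, 2.0], [0.0, -0.5]])
A2 = np.array([[-0.5, 0.0], [2.0, -0.5]])
A_half = 0.5 * A1 + 0.5 * A2

print(np.linalg.eigvals(A1))      # both eigenvalues are -0.5
print(np.linalg.eigvals(A2))      # both eigenvalues are -0.5
print(np.linalg.eigvals(A_half))  # 0.5 and -1.5: one eigenvalue is positive
```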

David Gao

While the other answers have focused on providing counterexamples, it may also be interesting to ask under what general conditions on the matrices $A_1$ and $A_2$ the statement is actually true.

This is going to be true whenever there is a single change of basis under which both matrices are expressed in upper-triangular form: the eigenvalues of $A(\alpha)$ are then the convex combinations $\alpha \lambda_i(A_1) + (1-\alpha)\lambda_i(A_2)$ of the diagonal entries, which retain strictly negative real parts.

Special cases include matrices with the same eigenvectors or which can be put in their Jordan form using the same change of basis.

Implicit (sufficient) conditions include the existence of a positive definite matrix $P$ such that $A_i^TP+PA_i$ is negative definite for both $i=1$ and $i=2$ (a common quadratic Lyapunov function). Since $A(\alpha)^TP+PA(\alpha) = \alpha(A_1^TP+PA_1) + (1-\alpha)(A_2^TP+PA_2)$, the same $P$ then certifies stability of every interpolation.
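Here is a numerical sketch of that common-Lyapunov-function test (the two example matrices are my own, chosen so the test happens to succeed): solve the Lyapunov equation for $A_1$, then check that the same $P$ also works for $A_2$.

```python
# Sketch: test for a common quadratic Lyapunov function.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A1 = np.array([[-1.0, 0.0], [0.0, -2.0]])   # example matrices, not from the answer
A2 = np.array([[-1.0, 1.0], [-1.0, -1.0]])

# Solve A1^T P + P A1 = -I; P is positive definite because A1 is stable.
P = solve_continuous_lyapunov(A1.T, -np.eye(2))

# If A2^T P + P A2 is also negative definite, the same P certifies stability
# of every A(alpha), since the Lyapunov expression is linear in A.
print(np.linalg.eigvalsh(A1.T @ P + P @ A1))  # all negative by construction
print(np.linalg.eigvalsh(A2.T @ P + P @ A2))  # all negative here as well

for alpha in np.linspace(0.0, 1.0, 11):
    A = alpha * A1 + (1 - alpha) * A2
    assert np.max(np.linalg.eigvals(A).real) < 0  # every interpolation is stable
```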

I will be happy to add more conditions if you mention them in the comments.

KBS

Adapting David Gao's example above, I just want to note that the interpolation can also have an eigenvalue of $0$, so be careful when linearly interpolating matrices.

$$A_1 = \begin{pmatrix} -1 & -2 \\ 0 & -1\end{pmatrix}$$

$$A_2 = \begin{pmatrix} -1 & 0 \\ -2 & -1 \end{pmatrix}$$

$$A(\frac{1}{2}) = \frac{1}{2}A_1 + \frac{1}{2}A_2 = \frac{1}{2}\begin{pmatrix} -2 & -2 \\ -2 & -2\end{pmatrix}$$

Here both $A_1$ and $A_2$ have the repeated eigenvalue $-1$, but $A(\frac{1}{2})$ has eigenvalues $0$ and $-2$, so it is only marginally stable.
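A quick NumPy check of this marginal case:

```python
# The midpoint of these two stable matrices has an eigenvalue exactly at 0.
import numpy as np

A1 = np.array([[-1.0, -2.0], [0.0, -1.0]])
A2 = np.array([[-1.0, 0.0], [-2.0, -1.0]])
A_half = 0.5 * (A1 + A2)

print(np.linalg.eigvals(A_half))  # eigenvalues 0 and -2: marginal, not asymptotically stable
```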