
Suppose $A$ and $B$ are two $n\times n$ matrices over the field $\mathbb{R}$. We do not assume that either $A$ or $B$ is invertible, but we want to prove that there exists a real number $c$ such that $A+cB$ is invertible.

Additionally, we assume that $A+iB$ is invertible, where $i=\sqrt{-1}$ is the imaginary unit.

I have seen a proof that uses the determinant of $A+cB$. However, I am interested in finding a more elementary proof that does not rely on the determinant or the eigenvalues of the matrices. If possible, I would like to use only basic concepts of linear algebra, such as rank, linear independence, linear transformation, etc. If polynomials are involved, please avoid using the characteristic polynomial or the minimal polynomial of the matrices.

Any help or hint would be greatly appreciated!

Anne Bauval
  • First of all, it is not generally true that such a $c$ will exist under the assumptions you presented. What conditions on $A$ and $B$ would you like to impose in order to ensure that such a $c$ exists? – Ben Grossmann Dec 12 '23 at 22:43
  • @BenGrossmann Thanks for your comment. I want to complete the proof presented here. So $A$ and $B$ are $U$ and $V$ respectively. That means $A+iB$ is invertible. –  Dec 12 '23 at 22:45
  • So we are given that $A + iB$ is an invertible complex matrix. Please make this explicit in your question. – Ben Grossmann Dec 12 '23 at 22:49

2 Answers


Let $f(t) := \det(A+tB)$. Since $f(i)=\det(A+iB)\neq0$, $f(t)$ is not the zero polynomial.

Since $f$ is a nonzero polynomial of degree at most $n$, it has at most $n$ complex roots.

Therefore there exists a real number $c_0$ such that $f(c_0) = \det(A+c_0 B) \neq 0$, i.e. $A+c_0 B$ is invertible.
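As a numerical illustration of this argument (not part of the proof), the following Python sketch draws random real matrices $A$ and $B$, checks that $A+iB$ is invertible, and then searches a short list of real candidates for a $c$ with $\det(A+cB)\neq0$. The dimension, seed, and candidate list are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Hypothesis of the question: A + iB is invertible (true for generic A, B).
assert abs(np.linalg.det(A + 1j * B)) > 1e-9

# f(t) = det(A + tB) is a polynomial of degree <= n with f(i) != 0,
# so f is not the zero polynomial and has at most n real roots.
# Hence almost every real c works; sample a few candidates.
candidates = [0.0, 1.0, -1.0, 0.5, 2.0]
c = next(t for t in candidates if abs(np.linalg.det(A + t * B)) > 1e-9)
print(c, np.linalg.det(A + c * B))
```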

Zhiwei
  • Thanks for your answer. I appreciate it. But as I said, I want to see if a more elementary method can be found. Is it possible to write this proof without talking about determinant? –  Dec 13 '23 at 05:12
  • @IlkayBurak It is, as shown in the other answer, but, as stressed by Mariano, this requirement is unreasonable. And you must admit that the other answer is much less "elementary". – Anne Bauval Dec 13 '23 at 10:53
  • Why can you guarantee that the roots of $f(t)$ are not all complex? – Deif Dec 13 '23 at 17:59
  • @Deif Nobody said so, and it is not needed. – Anne Bauval Dec 14 '23 at 05:20

Let us ask a slightly more general question first.

Let $F$ be a proper subfield of an extension field $E$ and $A,B\in M_n(F)$. If $A+\lambda B\in M_n(E)$ is invertible for some $\lambda\in E\setminus F$, is it true that $A+cB$ is invertible for some $c\in F$?

It turns out that the answer depends on the cardinality of $F$. If $|F|\le n$, the answer can be negative. E.g. when $$ F=GF(2)=\{0,1\},\quad A=\pmatrix{0&0\\ 1&1}\quad\text{and}\quad B=I_2, $$ $A+cB$ is singular for all $c\in F$. However, as $\det(A+xB)=x(x-1)$, $A+\lambda B$ is invertible for all $\lambda\in E\setminus F$.
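Because this counterexample is tiny, it can be checked directly. The Python sketch below (an illustration of the claim, not part of the argument) evaluates $\det(A+cB)$ modulo $2$ for both elements $c$ of $GF(2)$.

```python
# Verify the GF(2) counterexample: A + cB is singular for every c in {0, 1}.
A = [[0, 0], [1, 1]]
B = [[1, 0], [0, 1]]  # the identity I_2

def det2x2_mod2(M):
    # determinant of a 2x2 matrix, reduced mod 2
    return (M[0][0] * M[1][1] - M[0][1] * M[1][0]) % 2

dets = []
for c in (0, 1):  # the two elements of GF(2)
    M = [[(A[i][j] + c * B[i][j]) % 2 for j in range(2)] for i in range(2)]
    dets.append(det2x2_mod2(M))

print(dets)  # [0, 0]: A + cB is singular for all c in GF(2)
```

Consistent with $\det(A+xB)=x(x-1)$, both evaluations vanish, while any $\lambda$ outside $GF(2)$ avoids both roots.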

Yet, when $|F|>n$, the answer is affirmative. As you already know, this is evident from the usual determinantal approach: since $p(x)=\det(A+xB)$ is not the zero polynomial (because $p(\lambda)\ne0$) and its degree is at most $n$ (because the matrices are $n\times n$), it has at most $n$ roots in $F$. Hence $p(c)$ must be nonzero for some $c\in F$ when $|F|>n$. In turn, $A+cB$ is invertible.

As you want other approaches, we present two proofs below. The first proof works over any field $F$ with $|F|>n$, but for brevity, we will consider only the case where $F$ is infinite. The second proof is tailor-made for your specific case, where $F=\mathbb R$, $E=\mathbb C$ and $\lambda=i$.


Proof 1. Let $x$ be an indeterminate and $Q=F(x)$ be the quotient field of the polynomial ring $F[x]$. If $A+xB\in M_n(Q)$ is singular, there exists some nonzero vector $v(x)\in Q^n$ such that $(A+xB)v(x)=0$. By definition of $Q$, each element $v_j(x)$ of $v(x)$ is a fraction of two polynomials in $x$ with coefficients in $F$. We may assume that this fraction is in lowest terms. Therefore, by multiplying $v$ with the least common multiple of the denominators of the $v_j(x)$s, we may assume that each $v_j(x)$ is a polynomial. Then, by multiplying $v(x)$ by $\gcd(v_1(x),\ldots,v_n(x))^{-1}$, we may further assume that $v_1(x),\ldots,v_n(x)$ are coprime.

It follows that $v_j(\lambda)$ is nonzero for some $j$: otherwise, if $\lambda$ is transcendental over $F$ then every $v_j$ is the zero polynomial, and if $\lambda$ is algebraic over $F$ then its minimal polynomial over $F$ divides every $v_j(x)$; either way, this contradicts the coprimality of the $v_j(x)$s. Let $h:F[x]\to E$ be the evaluation homomorphism defined by $h(x)=\lambda$ and $h(a)=a$ for every $a\in F$. For any vector $w\in F[x]^n$, let $h(w)$ be the application of $h$ to $w$ entrywise. Then $(A+\lambda B)v(\lambda)=h\left((A+xB)v(x)\right)=h(0)=0$ but $v(\lambda)\ne0$. This contradicts the assumption that $A+\lambda B$ is invertible over $E$. Hence $A+xB$ must be invertible over $Q=F(x)$. Write $(A+xB)^{-1}$ as $\frac{1}{q(x)}P(x)$, where $q(x)\in F[x]\setminus\{0\}$ and $P(x)\in M_n(F[x])$. Since $F$ is an infinite field, we can always pick some $c\in F$ such that $q(c)\ne0$. (The degree of $q$ can actually be shown to be $\le n$, so it actually suffices to assume only that $|F|>n$; to keep our answer short, we shall skip the proof of $\deg q\le n$ here.) Then $\frac{1}{q(c)}P(c)(A+cB)=I$. Hence $A+cB$ is invertible.
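The decomposition $(A+xB)^{-1}=\frac{1}{q(x)}P(x)$ can be computed symbolically. The sympy sketch below runs Proof 1's final step on a small example of my own choosing (both $A$ and $B$ singular, yet $A+iB$ invertible since $\det(A+iB)=-i$); it is an illustration, not the proof itself.

```python
from functools import reduce
import sympy as sp

x = sp.symbols('x')
# Illustrative example: A and B are both singular, but A + iB is invertible.
A = sp.Matrix([[0, 1], [0, 0]])
B = sp.Matrix([[0, 0], [1, 0]])

M = A + x * B
Minv = M.inv()  # inverse over the function field F(x)

# Write the inverse as P(x)/q(x): q is the lcm of the entry denominators,
# and P = q * Minv is then a polynomial matrix.
q = reduce(sp.lcm, [sp.denom(sp.cancel(e)) for e in Minv])
P = (q * Minv).applyfunc(sp.cancel)

c = 1  # any c in F with q(c) != 0 works; here q(x) = x, so c = 1 does
assert q.subs(x, c) != 0
check = sp.simplify((P.subs(x, c) / q.subs(x, c)) * (A + c * B))
print(check)  # the identity matrix, so A + cB is invertible
```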


Proof 2. (Here $F=\mathbb R,\, E=\mathbb C$ and $A+iB$ is invertible.) Suppose, on the contrary, that $A+cB$ is singular for every real number $c$. Let $c_1,c_2,\ldots,c_{n+1}$ be any $n+1$ distinct real numbers. Then for each $j$ we may pick a nonzero vector $\mathbf v_j$ such that $(A+c_jB)\mathbf v_j=0$. Let $k\ge2$ be the smallest integer such that $\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_k$ are linearly dependent (such a $k$ exists because $\mathbf v_1,\mathbf v_2,\ldots,\mathbf v_{n+1}$ must be linearly dependent in $\mathbb R^n$). Write $\mathbf v_k$ as a linear combination $\sum_{j=1}^{k-1}w_j\mathbf v_j$ of the other $\mathbf v_j$s. By removing from the list any $\mathbf v_j$ whose coefficient $w_j$ is zero, we may assume that each $w_j$ is nonzero. So, if we define $V=\pmatrix{\mathbf v_1&\mathbf v_2&\cdots&\mathbf v_k}$, then $\ker(V)=\operatorname{span}\{\mathbf w\}$, where $\mathbf w=(w_1,w_2,\ldots,w_k)^T$ (with $w_k:=-1$) is an entrywise nonzero vector.

Let $C=\operatorname{diag}(c_1,c_2,\ldots,c_k)$. By construction, we have $AV=-BVC$. Since $C$ is a diagonal matrix with distinct diagonal entries and $\mathbf w$ is entrywise nonzero, $C\mathbf w$ and $\mathbf w$ are linearly independent. Therefore, if $$ \pmatrix{C&I_k\\ I_k&-C}\pmatrix{p\mathbf w\\ q\mathbf w}=\pmatrix{r\mathbf w\\ s\mathbf w} $$ for some $p,q,r,s\in\mathbb R$, i.e., if $pC\mathbf w=(r-q)\mathbf w$ and $-qC\mathbf w=(s-p)\mathbf w$, we must have $p=q=r=s=0$. It follows that for any nonzero vector $$ \pmatrix{\mathbf x\\ \mathbf y}\in\ker\left[\pmatrix{V\\ &V}\pmatrix{C&I_k\\ I_k&-C}\right] $$ (such a vector exists because the block-diagonal matrix containing two copies of $V$ has deficient rank), at least one of $\mathbf x$ and $\mathbf y$ is not a scalar multiple of $\mathbf w$, i.e., at least one of $V\mathbf x$ and $V\mathbf y$ is nonzero.
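The key claim here, that $C\mathbf w$ and $\mathbf w$ are linearly independent whenever $C$ has distinct diagonal entries and $\mathbf w$ is entrywise nonzero, is easy to sanity-check numerically; the concrete numbers below are arbitrary choices of mine.

```python
import numpy as np

c = np.array([1.0, 2.0, 3.0])    # distinct diagonal entries of C
w = np.array([0.5, -1.0, 2.0])   # entrywise nonzero vector w
Cw = c * w                       # C @ w for diagonal C

# rank 2 <=> w and Cw are linearly independent
rank = np.linalg.matrix_rank(np.column_stack([w, Cw]))
print(rank)  # 2
```

(If some entry $w_j$ were zero, or two diagonal entries coincided, the rank could drop to 1, which is exactly why both hypotheses are needed.)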

By our choice of $\mathbf x$ and $\mathbf y$, we have $VC\mathbf x=-V\mathbf y$ and $VC\mathbf y=V\mathbf x$. Therefore $VC(\mathbf x+i\mathbf y)=iV(\mathbf x+i\mathbf y)$, i.e., $VC\mathbf z=iV\mathbf z$, where $\mathbf z=\mathbf x+i\mathbf y$. Since at least one of $V\mathbf x$ and $V\mathbf y$ is nonzero, $V\mathbf z$ is nonzero. Therefore $$ (A+iB)V\mathbf z=(AV)\mathbf z+iBV\mathbf z=-BVC\mathbf z+iBV\mathbf z=-B(iV\mathbf z)+iBV\mathbf z=0, $$ contradicting the assumption that $A+iB$ is invertible. Hence $A+cB$ must be invertible for some real number $c$.

user1551