12

The true or false question states: "True or False: Every 3-dimensional subspace of $ \Bbb R^{2 \times 2}$ contains at least one invertible matrix."

Here $ \Bbb R^{2 \times 2}$ denotes the space of all $2 \times 2$ real matrices. It seems like the statement is true, but I am not sure how to prove or disprove it. (If it is true, then it is easy to see that every 3-dimensional subspace of $ \Bbb R^{2 \times 2}$ in fact contains infinitely many invertible matrices.)
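
A quick numerical sanity check (a minimal sketch with NumPy, not a proof; randomly drawn spans are almost surely $3$-dimensional) seems to agree:

```python
import numpy as np

# Sample random 3-dimensional subspaces of the 2x2 matrices (via vectorization
# to R^4) and search each one for an invertible element.
rng = np.random.default_rng(0)

for trial in range(1000):
    B = rng.standard_normal((4, 3))        # columns span a random subspace
    found = any(
        abs(np.linalg.det((B @ rng.standard_normal(3)).reshape(2, 2))) > 1e-8
        for _ in range(100)
    )
    if not found:
        print("no invertible element found in trial", trial)
        break
else:
    print("every sampled subspace contained an invertible matrix")
```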

  • Hint: think about the standard basis of $M_2(\mathbb{R})$, and what a matrix might look like if it were spanned by three of those basis matrices – Santana Afton Feb 24 '17 at 01:17
  • @JazzyMatrix not all of those are necessarily in the span of a given three standard basis matrices. – Adam Hughes Feb 24 '17 at 01:32
  • @JazzyMatrix Even taking the standard basis, the linear span of any three of them always contains an invertible matrix... – DonAntonio Feb 24 '17 at 01:35
  • @AdamHughes Yes, not all subspaces are generated by a given set of three. However, looking at a given set might give intuition as to why it's true for an arbitrary subspace generated by three. – Santana Afton Feb 24 '17 at 01:40
  • 1
    Nice question! Does anyone have any idea what the situation is in higher dimensions (the smallest $d$ such that every $d$-dimensional subspace of $n \times n$ matrices contains an invertible matrix)? By considering upper triangular matrices it's at least not hard to see that $d \ge \frac{n(n+1)}{2}$. – Qiaochu Yuan Feb 24 '17 at 02:10
  • 1
    @QiaochuYuan Over any field, if all matrices of the subspace $V$ have rank at most $k$, then the dimension of $V$ is at most $nk$, and this bound is achieved (so you have $d=(n-1)n+1$). See On the maximal rank in a subspace of matrices by Meshulam. – Jose Brox Aug 14 '18 at 17:24

5 Answers

10

Here's a nice solution using the fact that $\Bbb R^{2 \times 2}$ has a "dot-product" given by $$ \DeclareMathOperator{\tr}{Tr} \langle A,B \rangle = \tr(AB^T) $$ With that, we can describe any dimension $3$ subspace by $$ S = \{A : \tr(AM) = 0\} $$ for a fixed non-zero matrix $M$.

If $M$ is invertible, then we can note that $$ A = \pmatrix{1&0\\0&-1}M^{-1} $$ is an invertible element of $S$ (it is a product of invertible matrices, and $\tr(AM) = \tr\pmatrix{1&0\\0&-1} = 0$).

If $M$ is not invertible, then $M = uv^T$ for non-zero column vectors $u$ and $v$. Since $\tr(Auv^T) = v^TAu$, it suffices to select an invertible $A$ such that $Au$ is perpendicular to $v$.
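
A numerical sketch of this construction (the helper name `invertible_in_S` is mine, and NumPy's SVD is used only to extract $u$ and $v$ in the singular case):

```python
import numpy as np

def invertible_in_S(M, tol=1e-12):
    """Return an invertible 2x2 matrix A with trace(A @ M) = 0,
    following the two cases of this answer (assumes M is non-zero)."""
    if abs(np.linalg.det(M)) > tol:
        # Invertible case: tr(A M) = tr(diag(1, -1)) = 0 and A is invertible.
        return np.diag([1.0, -1.0]) @ np.linalg.inv(M)
    # Singular non-zero case: M is proportional to u v^T, and
    # tr(A u v^T) = v^T A u, so we need A u orthogonal to v with A invertible.
    U, s, Vt = np.linalg.svd(M)
    u, v = U[:, 0], Vt[0, :]
    w = np.array([-v[1], v[0]])       # w is orthogonal to v
    u2 = np.array([-u[1], u[0]])      # {u, u2} is a basis of R^2
    # Define A by A u = w and A u2 = v; the columns w, v are independent,
    # so A is invertible, and v^T A u = v^T w = 0.
    return np.column_stack([w, v]) @ np.linalg.inv(np.column_stack([u, u2]))

M = np.array([[1.0, 2.0], [2.0, 4.0]])    # a rank-1 (singular, non-zero) M
A = invertible_in_S(M)
print(np.trace(A @ M), np.linalg.det(A))  # ~0 and non-zero
```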

Ben Grossmann
  • 234,171
  • Excellent answer. +1 Yet why would we worry about $M$ not being invertible? We can choose $M$ to be invertible and even with trace $\neq 0$, and then take the linear functional $T_MA := \operatorname{tr}(AM)$, which will clearly be non-zero and thus its kernel has dimension three... – DonAntonio Feb 24 '17 at 01:55
  • 1
    @DonAntonio we have to account for all possible 3-dimensional subspaces. Some are the orthogonal complement of an invertible $M$, but some are necessarily the orthogonal complement of a non-zero, non-invertible $M$. For example: to describe $$ \left\{\pmatrix{a&b\\c&0} : a,b,c \in \Bbb R\right\} $$ we necessarily require a singular $M$. – Ben Grossmann Feb 24 '17 at 01:58
  • Of course, thanks. When you first wrote "we have to account for all possible 3 dimensional..." I just remembered the question accurately. – DonAntonio Feb 24 '17 at 02:05
  • @DonAntonio. Ah. Nevertheless, I'm sure you won't be the only one with this in mind. – Ben Grossmann Feb 24 '17 at 02:08
  • @BlueRoses For any subspace $S$ of a finite dimensional inner-product space $V$, we have $\dim S + \dim S^\perp = \dim V$ (where $S^\perp$ denotes the orthogonal complement of $S$). So, if $S$ is $3$-dimensional and $V$ is $4$-dimensional, then $S^\perp$ is $1$-dimensional. That is, $S^\perp$ is spanned by one vector. If $w$ is a fixed vector which spans $S^\perp$, then we can say that $$ S = (S^\perp)^\perp = \{w\}^\perp = \{v \in V : \langle v, w \rangle = 0\} $$ – Ben Grossmann Aug 19 '18 at 07:02
5

Here's a quick proof which uses special properties of the field $\mathbb{R}$. Consider the set of matrices of the form $\begin{pmatrix}a & -b \\ b & a\end{pmatrix}$. Note that every nonzero matrix in this set is invertible, since such a matrix has determinant $a^2+b^2$, which is nonzero unless $a=b=0$ (here is where we use the fact that our field is $\mathbb{R}$). But these matrices form a $2$-dimensional subspace of $\mathbb{R}^{2\times 2}$, which must have nontrivial intersection with any $3$-dimensional subspace, since $2+3>4=\dim\mathbb{R}^{2\times 2}$. So any $3$-dimensional subspace contains a nonzero matrix of this form, which is invertible.
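
Here is a short numerical illustration of this intersection argument (a sketch; the example subspace and variable names are mine):

```python
import numpy as np

# W = span{I, J} is the 2-dimensional space of matrices [[a, -b], [b, a]].
I2 = np.eye(2).reshape(4)
J = np.array([[0.0, -1.0], [1.0, 0.0]]).reshape(4)

# An example 3-dimensional subspace V, given by three (vectorized) basis matrices.
V = np.column_stack([np.array([[1.0, 0], [0, 0]]).reshape(4),
                     np.array([[0, 1.0], [0, 0]]).reshape(4),
                     np.array([[0, 0], [1.0, 0]]).reshape(4)])

# A nonzero vector in V ∩ W corresponds to a kernel element of the 4x5 matrix
# [V | -I2 | -J], which always exists because 5 > 4 (and the basis is independent,
# so the resulting matrix A is nonzero).
K = np.column_stack([V, -I2, -J])
coeffs = np.linalg.svd(K)[2][-1]          # right singular vector for sigma = 0
A = (V @ coeffs[:3]).reshape(2, 2)        # the common element, written in V
print(A)
print(np.linalg.det(A))                   # equals a^2 + b^2 > 0
```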


OK, now here's a more complicated proof that works over any field. Let $V\subseteq\mathbb{R}^{2\times 2}$ be $3$-dimensional and let $\{e_1,e_2\}$ be a basis for $\mathbb{R}^2$. Let $W$ be the $2$-dimensional subspace of $\mathbb{R}^{2\times 2}$ consisting of all $A$ such that $A(e_1)=0$. Note that $\dim V=\dim V\cap W+\dim V/(V\cap W)$ and $V\cap W$ and $V/(V\cap W)$ are each at most $2$-dimensional. So one has dimension $1$, and the other has dimension $2$.

Suppose $\dim V\cap W=1$ so $\dim V/(V\cap W)=2$. Let $A\in V\cap W$ be nonzero, so $A(e_1)=0$ and $A(e_2)\neq 0$. Note that $\dim V/(V\cap W)=2$ means that every element of $\mathbb{R}^{2\times 2}/W$ has a representative in $V$. That is, for any matrix $B$, there is $C\in V$ such that $B-C\in W$, which means $B(e_1)=C(e_1)$. In particular, choosing $B$ such that $B(e_1)$ is linearly independent from $A(e_2)$, there is some $C\in V$ such that $C(e_1)$ is linearly independent from $A(e_2)$. If $C$ is invertible, we're done. Otherwise, $C(e_2)$ is a multiple of $C(e_1)$, and so $C(e_2)+A(e_2)$ is not a multiple of $C(e_1)$. Taking $D=C+A$, we then have that $D(e_1)=C(e_1)$ and $D(e_2)=C(e_2)+A(e_2)$ are linearly independent. Thus $D$ is an invertible element of $V$.

The case that $\dim V\cap W=2$ and $\dim V/(V\cap W)=1$ is similar. Let $A\in V\setminus (V\cap W)$, so $A(e_1)\neq 0$. If $A$ is invertible, we're done; otherwise $A(e_2)$ is a multiple of $A(e_1)$. Since $\dim V\cap W=2$, we have $W\subset V$. In particular, let $B$ be a matrix such that $B(e_1)=0$ and $B(e_2)$ is not a multiple of $A(e_1)$. Then $A(e_2)+B(e_2)$ is not a multiple of $A(e_1)$, and $B\in W\subset V$. So $C=A+B\in V$ is invertible since $C(e_1)=A(e_1)$ and $C(e_2)=A(e_2)+B(e_2)$ are linearly independent.

(In fact, with a little work you can prove you can always choose $e_1$ so that you're in the first case, so the second case is unnecessary.)
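
A numerical sketch of this constructive argument over $\mathbb{R}$ (the function names and the example basis are mine; the proof itself is field-agnostic, but floating point forces the tolerance checks):

```python
import numpy as np

def rot90(v):
    """Return a vector linearly independent from (in fact orthogonal to) v != 0."""
    return np.array([-v[1], v[0]])

def invertible_element(basis, tol=1e-12):
    """Produce an invertible element of V = span(basis), following the case
    analysis above; basis is a list of three independent 2x2 matrices."""
    e1 = np.array([1.0, 0.0])
    # L sends coefficients c to (sum_i c_i B_i) e1; its kernel is V ∩ W.
    L = np.column_stack([B @ e1 for B in basis])               # 2 x 3
    _, s, Vt = np.linalg.svd(L)
    kernel = [Vt[i] for i in range(3) if i >= len(s) or s[i] < tol]
    combine = lambda c: sum(ci * Bi for ci, Bi in zip(c, basis))

    if len(kernel) == 1:                  # dim(V ∩ W) = 1, so L is onto R^2
        A = combine(kernel[0])            # A e1 = 0, A e2 != 0
        target = rot90(A[:, 1])           # a vector independent from A e2
        c = np.linalg.lstsq(L, target, rcond=None)[0]
        C = combine(c)                    # C e1 = target
        return C if abs(np.linalg.det(C)) > tol else C + A
    else:                                 # dim(V ∩ W) = 2, so W ⊂ V
        A = basis[int(np.argmax(np.abs(L).sum(axis=0)))]       # A e1 != 0
        if abs(np.linalg.det(A)) > tol:
            return A
        B = np.column_stack([np.zeros(2), rot90(A @ e1)])      # B ∈ W ⊂ V
        return A + B

basis = [np.array([[1.0, 0], [0, 0]]),
         np.array([[0, 1.0], [0, 0]]),
         np.array([[0, 0], [1.0, 0]])]
D = invertible_element(basis)
print(D, np.linalg.det(D))
```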

Eric Wofsey
  • 342,377
3

I think this is the most straightforward way to see it, using just the basic operations and dimensional considerations. Let $X$ be a $3$-dimensional subspace, and let $$A=\pmatrix{1&0\\0&0}, B=\pmatrix{0&1\\0&0}, C=\pmatrix{0&0\\1&0},D=\pmatrix{0&0\\0&1}$$

Since $X$ is $3$-dimensional, its intersection with the $2$-dimensional span of $A$ and $D$ is non-trivial, so $X$ contains some non-zero linear combination $aA + dD$. If $a$ and $d$ are both non-zero then this is an invertible diagonal matrix and you're done; if not, $X$ contains $A$ or $D$, so assume without loss of generality it contains $A$. Similarly (using $B$ and $C$ in place of $A$ and $D$), assume WLOG it contains $B$.

Since $X$ is $3$-dimensional it contains some third matrix linearly independent from $A$ and $B$; this matrix must have some non-zero entry in its bottom row (otherwise it would be a linear combination of $A$ and $B$). By adding multiples of $A$ and $B$, we see that $X$ contains some matrix

$$\pmatrix{0&0\\x&y}$$

with $x$ or $y$ non-zero. Adding this matrix to $A$ (if $y \neq 0$) or to $B$ (if $x \neq 0$) yields an invertible matrix in $X$.
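
For a concrete illustration (my example, not from the answer): take $X = \left\{\pmatrix{a&b\\c&0} : a,b,c \in \Bbb R\right\}$. The only combinations $aA+dD$ in $X$ have $d = 0$, so $X$ contains $A$; it also contains $B$. The third basis matrix $C$ has bottom row $(1, 0)$, i.e. $x = 1$, $y = 0$, and adding it to $B$ gives $\pmatrix{0&1\\1&0}$, which is invertible.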

juan arroyo
  • 1,405
1

$\newcommand{\Reals}{\mathbf{R}}$Every $2 \times 2$ real matrix can be written uniquely in the form $$ \left[\begin{array}{cc} a + c & b - d \\ b + d & a - c \\ \end{array}\right] = a\left[\begin{array}{rr} 1 & 0 \\ 0 & 1 \\ \end{array}\right] + b\left[\begin{array}{rr} 0 & 1 \\ 1 & 0 \\ \end{array}\right] + c\left[\begin{array}{rr} 1 & 0 \\ 0 & -1 \\ \end{array}\right] + d\left[\begin{array}{rr} 0 & -1 \\ 1 & 0 \\ \end{array}\right] $$ for some real numbers $a$, $b$, $c$, and $d$.

The set of non-invertible matrices is the locus $$ \det\left[\begin{array}{cc} a + c & b - d \\ b + d & a - c \\ \end{array}\right] = a^{2} + d^{2} - (b^{2} + c^{2}) = 0, $$ which is a cone on a product of circles. (The intersection with the unit $3$-sphere is the Clifford torus.)

In particular, no three-dimensional subspace of $\Reals^{2 \times 2}$ is contained in the set of non-invertible matrices. Consequently, in every three-dimensional subspace of $\Reals^{2 \times 2}$, the set of non-invertible matrices is a closed algebraic set, and the set of invertible matrices is open and dense.
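
A quick symbolic check of the determinant identity (a sketch using SymPy):

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d', real=True)
M = sp.Matrix([[a + c, b - d],
               [b + d, a - c]])
print(sp.expand(M.det()))                                     # a**2 - b**2 - c**2 + d**2
print(sp.simplify(M.det() - (a**2 + d**2 - (b**2 + c**2))))   # 0
```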

  • 2
    +1 The description "a cone on a product of circles" is very lively, and I can mentally visualise it easily. However, is there any simple proof that this cone does not contain any three-dimensional subspace? – user1551 Feb 24 '17 at 09:39
  • @user1551: One approach is to note that a three-dimensional subspace cuts the unit sphere in a great $2$-sphere, which does not contain a torus. (Admittedly, this argument is turning out to be less low-tech than I'd originally foreseen.) – Andrew D. Hwang Feb 24 '17 at 11:34
  • ...where by "does not contain", of course I meant "is not contained in". – Andrew D. Hwang Feb 25 '17 at 00:56
0

Something inspired by @omnomnomnom's answer:

Let $S = \{ A \mid \operatorname{tr}(AM^T) = 0 \}$ and let $M=U\Sigma V^T$ be the SVD of $M$. Then $A = UTV^T\in S$ if and only if $\operatorname{tr}(T\Sigma) = 0$. For example, take $T$ to be the flipped identity $$T = \begin{bmatrix}0 & 1 \\ 1 & 0 \end{bmatrix},$$ which satisfies $\operatorname{tr}(T\Sigma) = 0$, and for which $A=UTV^T$ is invertible (it is a product of invertible matrices).
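
A small numerical check of this construction for a singular, non-zero $M$ (a sketch; the example matrix is mine):

```python
import numpy as np

M = np.array([[1.0, 2.0], [2.0, 4.0]])      # rank 1, so M is not invertible
U, sigma, Vt = np.linalg.svd(M)             # M = U @ diag(sigma) @ Vt
T = np.array([[0.0, 1.0], [1.0, 0.0]])      # the "flipped identity"
A = U @ T @ Vt
print(np.trace(A @ M.T))                    # ~0, so A is in S
print(np.linalg.det(A))                     # ±1, so A is invertible
```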

user251257
  • 9,417