7

Let $A$ be an $n\times n$ matrix over $\mathbb R$. I'm trying to prove that $A$ is invertible $\iff\det A\neq 0$.

If $A$ is invertible, there is $B$ s.t. $AB=I$, and thus $$\det(A)\det(B)=\det(AB)=\det(I)=1,$$ so $\det A\neq 0$.

I'm having trouble proving the converse. Let $A$ be such that $\det(A)\neq 0$. I would like to say that $$\det(A)\det(A)^{-1}=1.$$ I know that if $A^{-1}$ exists then $\det(A)^{-1}=\det(A^{-1})$, but since I have to prove that $A^{-1}$ exists, I can't use this formula... so how can I conclude?

user330587
  • 1,654

6 Answers

6

To continue the proof using your approach:

Use the definition of $\det(\cdot)$ as the unique alternating multilinear function acting on the columns of the matrix such that $\det(I) = 1$ (as opposed to the definition that gives an algebraic expression for $\det(\cdot)$ in terms of permutations).

$\det(A) \ne 0 \implies $ the columns of $A$ are linearly independent $ \implies A$ has full rank.

To get the first arrow, you can do a proof by contradiction. So assume the columns of $A$ are dependent, say (after relabeling) $A_1 = \sum_{i=2}^n c_iA_i$ for some scalars $c_i$. Then $\det(A_1, \dots, A_n) = \det\left(A_1 - \sum_{i=2}^n c_iA_i,\ A_2, \dots, A_n\right) = \det(0, A_2, \dots, A_n) = 0$, since subtracting multiples of the other columns from the first column does not change the determinant, and a matrix with a zero column has determinant $0$.

This is the contradiction.

To get the second arrow, it's just the definition of rank.

So $A$ is surjective. By the rank-nullity theorem, it is also injective. Therefore it is bijective, proving the existence of $A^{-1}$ as a function. Then it is a small step to prove that $A^{-1}$ is linear, and then you can complete the proof.
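
In case it is useful, here is one way to carry out that small step (a sketch of my own, not part of the original answer): for $x, y \in \mathbb{R}^n$ and scalars $\alpha, \beta$, linearity of $A$ gives

$$A\left(\alpha A^{-1}x + \beta A^{-1}y\right) = \alpha x + \beta y = A\left(A^{-1}(\alpha x + \beta y)\right),$$

and injectivity of $A$ then forces $A^{-1}(\alpha x + \beta y) = \alpha A^{-1}x + \beta A^{-1}y$.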

Mark
  • 6,038
5

To elaborate on some of the other answers.

The determinant is an alternating multilinear form on the columns (or rows) of a matrix. That means that if you write your matrix as $(v_1, \dots, v_n)$ where $v_i \in \mathbf{R}^n$ are the columns of the matrix, then

  1. $\det$ is linear in each column

  2. If $v_i = v_j$ for any $i \ne j$ then $\det(v_1, \dots, v_n) = 0$.

These two properties imply that if $v_1,\dots,v_n$ are linearly dependent, then $\det(v_1,\dots, v_n) = 0$. For suppose that $v_1, \dots, v_n$ are linearly dependent. Then some column can be written as a linear combination of the others; for simplicity, we will suppose it is the first. Thus, let

$$v_1 = \sum_{i = 2}^n \alpha_i v_i. $$

Then, we have by 1. and 2.,

\begin{align*} \det(v_1,v_2,\dots,v_n) &= \det \left( \sum_{i = 2}^n \alpha_i v_i,v_2,\dots,v_n\right) \\ &= \sum_{i = 2}^n \alpha_i\det \left(v_i,v_2,\dots,v_n\right) \tag{by 1.} \\ &= \sum_{i = 2}^n 0 \tag{by 2.} \\ &= 0 \end{align*}

Thus if $A$ is not invertible, its columns cannot form a basis of $\mathbf{R}^n$, so they are linearly dependent, and hence $\det A = 0$. This is the first proof.


For the second proof, in terms of elementary matrices, we know that there are 3 kinds of elementary row (or column) operations:

  1. Scale any row by a non-zero $\alpha \in \mathbf{R}$
  2. Swap any two rows
  3. Add $\alpha$ times one row to a different row

Each row operation can be written in terms of matrix multiplication, and likewise each column operation corresponds to multiplying by an elementary matrix on the right. Moreover, if $E$ is the matrix that performs the operation, then $\det E$ is $\alpha$, in case 1; $-1$ in case 2; and $1$ in case 3. When we fully row and column reduce $A$, we multiply $A$ on the left by elementary matrices $E_1, E_2, \dots, E_k$ and on the right by elementary matrices $F_1, \dots, F_m$ to get

$$ E_kE_{k-1}\cdots E_2E_1 \, A \, F_1F_2\cdots F_m = \begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}$$

where $I_r$ is the identity matrix of size $r = \operatorname{rank}(A)$.

Taking determinants of both sides and using multiplicativity,

$$ \det(E_k)\cdots \det(E_1)\, \det(A)\, \det(F_1)\cdots\det(F_m) = \det\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}.$$

We have $\det(E_i) \ne 0$ and $\det(F_j) \ne 0$ for every $i$ and $j$. If $\det(A) \ne 0$ then we have

$$ \det\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix} \ne 0$$

which is only possible if $r = n$, i.e. when $A$ has full rank.
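
As a quick sanity check (a worked example of my own, not from the original answer): take $A = \begin{pmatrix} 2 & 4 \\ 1 & 3 \end{pmatrix}$, so $\det A = 2$. Scaling row 1 by $\tfrac12$ ($\det E_1 = \tfrac12$), subtracting row 1 from row 2 ($\det E_2 = 1$), and subtracting $2$ times row 2 from row 1 ($\det E_3 = 1$) reduces $A$ to $I_2$ with row operations alone, and indeed

$$\det(E_3)\det(E_2)\det(E_1)\det(A) = 1 \cdot 1 \cdot \tfrac12 \cdot 2 = 1 = \det(I_2),$$

consistent with $r = n = 2$.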

Sera Gunn
  • 27,981
4

Well, you see, $A^{-1}= \operatorname{Adj}(A) / \det(A)$ is also an important result, and it clearly proves both directions.
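
For concreteness (a standard $2\times 2$ illustration, added here since the formula was not spelled out above): writing $\operatorname{Adj}(A)$ for the adjugate, i.e. the transposed cofactor matrix of $A$, we have

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad \operatorname{Adj}(A) = \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}, \qquad A\operatorname{Adj}(A) = (ad-bc)\,I_2 = \det(A)\,I_2,$$

so whenever $\det(A) \neq 0$, dividing $\operatorname{Adj}(A)$ by $\det(A)$ produces an explicit inverse.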

  • Sorry, but I know neither the formula nor what $\operatorname{Adj}(A)$ is. Is there another way? – user330587 Jun 24 '18 at 05:05
  • Depending on what you are allowed to assume, we also have that the determinant of $A$ is non-zero iff the determinant of the row echelon form of $A$ (say $R$) is non-zero iff $A$ is invertible. Using elementary matrices, $R=E_nE_{n-1}\dots E_1A$, with each $E_i$ an elementary (and so invertible) matrix. – AnyAD Jun 24 '18 at 05:07
  • This result is an expression of Cramer's rule. – lhf Jun 24 '18 at 19:17

Devendra Singh Rana
1

Devendra Singh Rana's answer is the actual reason why the determinant is useful and why we have this characterization: the formula $(\mathrm{Com}\,A)^TA= \det(A)I_n$ shows that it suffices for $\det A$ to be invertible (hence, over a field, $\neq 0$) for $A$ to be invertible!

However, in the specific case of $\mathbb{R}$ (or, more generally, of a field) there is another proof that works great: let $f: M_n(\mathbb{R})\to \mathbb{R}$ be a multiplicative ($f(AB)= f(A)f(B)$), nonconstant function. Then $f(A)\neq 0$ if and only if $A$ is invertible.

Of course, applying this to $f=\mathrm{det}$ gives the desired conclusion.

How do we prove the claim? Well, in one direction, argue as you do that invertible matrices have nonzero $f$. For this you first have to show that $f(I_n)=1$; this follows from $I_nI_n = I_n$ and the fact that $f$ is nonconstant.

Thus you know that invertible matrices have nonzero $f$. Now this implies that "$f(A)\neq 0$" is invariant under equivalence of matrices (if $P,Q$ are invertible, then $f(PAQ) = 0 \iff f(A)=0$).

But two matrices are equivalent iff they have the same rank! Now if $A$ is not invertible, its rank is $<n$, and so $A$ is equivalent to some nilpotent matrix $N$: but $f(N)^n= f(N^n) = f(0)$, and you can argue (because $0\times 0 = 0$ and $f$ is nonconstant) that $f(0)=0$, and so $f(N)^n=0$, hence $f(N)=0$, hence $f(A) = 0$.
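
To make the nilpotent step concrete (an illustration of my own, not part of the original answer): for every $r < n$ there is a nilpotent matrix of rank $r$, namely the matrix with $1$s in positions $(1,2), (2,3), \dots, (r, r+1)$ and $0$s elsewhere. For instance, with $n = 3$ and $r = 2$,

$$N = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}, \qquad N^3 = 0, \qquad \operatorname{rank} N = 2,$$

so any matrix of rank $r < n$ is indeed equivalent to a nilpotent one.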

So the fact that $\det$ characterizes invertibility can be seen either through the comatrix formula I restated at the beginning, or through abstract properties of the function $\det$.

Maxime Ramzi
  • 45,086
0

Depending on what you are allowed to assume, we also have that the determinant of $A$ is non-zero iff the determinant of the row echelon form of $A$ (say $R$) is non-zero iff $A$ is invertible. Using elementary matrices, $R=E_nE_{n-1}\dots E_1A$, with each $E_i$ an elementary (and so invertible) matrix.
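
To spell out the middle equivalence (my addition, not part of the original answer): a square row echelon form $R$ is upper triangular, so $\det R$ is the product of its diagonal entries; hence $\det R \neq 0$ exactly when $R$ has $n$ nonzero pivots, which is exactly when $R$, and therefore $A = E_1^{-1}E_2^{-1}\dots E_n^{-1}R$, is invertible.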

AnyAD
  • 2,672
0

If $\det(A)$ were $0$ then $0$ would be an eigenvalue of $A$, hence $A$ wouldn't be invertible; see Show that a matrix $A$ is singular if and only if $0$ is an eigenvalue.
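
Unpacking that one-liner (my paraphrase of the linked argument): $0$ is an eigenvalue of $A$ iff there is some $x \neq 0$ with $Ax = 0$, i.e. iff $\det(A - 0\cdot I) = \det(A) = 0$; and such an $x$ rules out invertibility, since it would force $x = A^{-1}Ax = A^{-1}0 = 0$.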

Michael Hoppe
  • 18,614