To elaborate on some of the other answers.
The determinant is an alternating multilinear form on the columns (or rows) of a matrix. That means that if you write your matrix as $(v_1, \dots, v_n)$ where $v_i \in \mathbf{R}^n$ are the columns of the matrix, then
1. $\det$ is linear in each column.
2. If $v_i = v_j$ for some $i \ne j$, then $\det(v_1, \dots, v_n) = 0$.
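As a quick numerical sanity check (not part of the proof), both properties can be verified on small examples with a hand-rolled cofactor-expansion determinant — the helpers `det` and `from_columns` below are illustrative names of my own, not from any library:

```python
def det(m):
    """Determinant by cofactor (Laplace) expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def from_columns(*cols):
    """Build a matrix (list of rows) from column vectors."""
    return [list(row) for row in zip(*cols)]

v1, v2, v3 = [1, 0, 2], [3, 1, 1], [0, 4, 1]

# Property 1 (linearity in a column):
# det(a*v1 + b*w, v2, v3) = a*det(v1, v2, v3) + b*det(w, v2, v3)
w, a, b = [5, 2, 7], 3, -2
lhs = det(from_columns([a * x + b * y for x, y in zip(v1, w)], v2, v3))
rhs = a * det(from_columns(v1, v2, v3)) + b * det(from_columns(w, v2, v3))
assert lhs == rhs

# Property 2 (alternating): a repeated column forces determinant 0.
assert det(from_columns(v1, v1, v3)) == 0
```

Integer inputs keep the arithmetic exact, so the two identities hold on the nose rather than up to floating-point error.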
These two properties imply that if $v_1,\dots,v_n$ are linearly dependent, then $\det(v_1,\dots, v_n) = 0$. Indeed, suppose that $v_1, \dots, v_n$ are linearly dependent. Then one of the columns can be written as a linear combination of the others; for simplicity, suppose it is the first. Thus, let
$$v_1 = \sum_{i = 2}^n \alpha_i v_i. $$
Then, using 1. and 2.,
\begin{align*}
\det(v_1,v_2,\dots,v_n) &= \det \left( \sum_{i = 2}^n \alpha_i v_i,v_2,\dots,v_n\right) \\
&= \sum_{i = 2}^n \alpha_i\det \left(v_i,v_2,\dots,v_n\right) \tag{by 1.} \\
&= \sum_{i = 2}^n \alpha_i \cdot 0 \tag{by 2.} \\
&= 0
\end{align*}
Thus if $A$ is not invertible, then the columns of $A$ are linearly dependent, so $\det A = 0$. This is the first proof.
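To see the first proof in action numerically, here is a short sketch: we build a matrix whose first column is a linear combination of the other two and check that its determinant vanishes. The cofactor-expansion `det` is a hand-rolled helper (an illustrative name, not a library function):

```python
def det(m):
    """Determinant by cofactor (Laplace) expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

# v1 = 2*v2 - 1*v3, so the columns are linearly dependent.
v2, v3 = [3, 1, 1], [0, 4, 1]
alpha2, alpha3 = 2, -1
v1 = [alpha2 * x + alpha3 * y for x, y in zip(v2, v3)]

matrix = [list(row) for row in zip(v1, v2, v3)]  # columns -> rows
assert det(matrix) == 0
```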
For the second proof, in terms of elementary matrices, recall that there are three kinds of elementary row (or column) operations:
1. Scale a row by a non-zero $\alpha \in \mathbf{R}$.
2. Swap two rows.
3. Add $\alpha$ times one row to a different row.
Each row operation can be implemented by left-multiplication by an elementary matrix, and each column operation by right-multiplication. Moreover, if $E$ is the matrix that performs the operation, then $\det E$ is $\alpha$ in case 1, $-1$ in case 2, and $1$ in case 3; in particular, $\det E \ne 0$ in every case. By row reducing $A$ and then column reducing, we can find elementary matrices $E_1, E_2, \dots, E_k$ and $F_1, F_2, \dots, F_m$ such that
$$ E_kE_{k-1}\cdots E_2E_1 \, A \, F_1F_2 \cdots F_m = \begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}$$
where $I_r$ is the identity matrix of size $r = \operatorname{rank}(A)$. Since the determinant is multiplicative,
$$ \det(E_k)\cdots \det(E_1)\, \det(A)\, \det(F_1)\cdots \det(F_m) = \det\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}.$$
The product of the $\det(E_i)$ and $\det(F_j)$ is non-zero, so if $\det(A) \ne 0$ then
$$ \det\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix} \ne 0$$
which is possible only if $r = n$, i.e. only if $A$ has full rank. Taking the contrapositive: if $A$ is not invertible, then $r < n$, and hence $\det(A) = 0$.
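The claimed determinant of each kind of elementary matrix can also be checked directly — a minimal sketch, again with a hand-rolled cofactor-expansion `det` (the names are illustrative):

```python
def det(m):
    """Determinant by cofactor (Laplace) expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

n = 4

# Case 1: scale row 2 by alpha = 5  ->  det = 5
E1 = identity(n)
E1[2] = [5 * x for x in E1[2]]
assert det(E1) == 5

# Case 2: swap rows 0 and 3  ->  det = -1
E2 = identity(n)
E2[0], E2[3] = E2[3], E2[0]
assert det(E2) == -1

# Case 3: add 7 times row 3 to row 1  ->  det = 1
E3 = identity(n)
E3[1][3] = 7
assert det(E3) == 1
```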