
I need to prove that the determinant $\det: M(n, \mathbb{R}) \rightarrow \mathbb{R}$ is a continuous function given the euclidean metric on the vector space of all $n \times n$ matrices over $\mathbb{R}$, i.e. $\Vert M \Vert = \sqrt{\sum_{i,j=1}^n M_{i,j}^2}$.

So what I need to prove, I think, is that for every $\epsilon > 0$ there exists a $\delta > 0$ such that for any $M' \in M(n,\mathbb{R})$ with $\Vert M - M' \Vert < \delta$ it follows that $|\det(M) - \det(M')| < \epsilon$.

Unfortunately I have no idea how to derive the correct inequalities between the given euclidean metric and a determinant, since I'm kind of struggling with the permutation part in the definition of a determinant. Can anybody help, please?

eager2learn
  • Do you have to use $\epsilon-\delta$? For example, can you also use that polynomials are continuous? Products of continuous functions are continuous? –  Jun 06 '15 at 11:40
  • I'm not entirely sure but I think the proof is supposed to use $\epsilon - \delta$, since the question explicitly says that the metric on the vector space of matrices is the euclidean norm. Would a specific metric on the domain of det somehow be relevant to a proof involving polynomials? – eager2learn Jun 06 '15 at 11:46
  • You may think of it this way: identify $M(n, \mathbb R)$ with $\mathbb R^{n^2}$. Note that the metric on $M(n, \mathbb R)$ is exactly the usual Euclidean metric on $\mathbb R^{n^2}$. If you think of $\det$ as a function on $\mathbb R^{n^2}$, this function is really a polynomial of degree $n$ in $n^2$ variables. –  Jun 06 '15 at 11:49
  • Very interesting and apparently horrible, horrible question if it really has to be done with the epsilon delta definition and not using the fact that polynomials are continuous functions. And I wrote "apparently" above because there might be some slick trick to overcome the seemingly tough difficulties. – Timbuc Jun 06 '15 at 12:02
  • @user99914 how would the continuity of polynomials be helpful? The determinant can be computed easily using the characteristic polynomial, but a perturbation in the variable $\lambda$ of the characteristic polynomial is only tantamount to a perturbation in the diagonals of the matrix $A$. – Him Jul 03 '19 at 15:20

2 Answers


Let $M=(a_{ij})_{n\times n}$ and $M'=(a'_{ij})_{n\times n}$. Recall $$ \det M=\sum_{\sigma\in S_n}\text{sgn}(\sigma)\prod_{i=1}^na_{i,\sigma_i},\qquad \det M'=\sum_{\sigma\in S_n}\text{sgn}(\sigma)\prod_{i=1}^na'_{i,\sigma_i} $$ where $S_n$ is the set of all permutations of $\{1,2,\cdots,n\}$ and $\sigma_i=\sigma(i)$. Define $m=\max_{1\le i, j\le n}\{|a_{ij}|,|a'_{ij}|\}$. Then \begin{eqnarray*} \vert\det M-\det M'\vert&=&\bigg|\sum_{\sigma\in S_n}\text{sgn}(\sigma)\Big(\prod_{i=1}^na_{i,\sigma_i}-\prod_{i=1}^na'_{i,\sigma_i}\Big)\bigg|\\ &\le&\sum_{\sigma\in S_n}\bigg|\prod_{i=1}^na_{i,\sigma_i}-\prod_{i=1}^na'_{i,\sigma_i}\bigg|\\ &=&\sum_{\sigma\in S_n}\bigg|a_{1,\sigma_1}a_{2,\sigma_2}\cdots a_{n,\sigma_n}-a'_{1,\sigma_1}a'_{2,\sigma_2}\cdots a'_{n,\sigma_n}\bigg|\\ &=&\sum_{\sigma\in S_n}\bigg|(a_{1,\sigma_1}-a'_{1,\sigma_1})a_{2,\sigma_2}a_{3,\sigma_3}\cdots a_{n,\sigma_n}+a'_{1,\sigma_1}(a_{2,\sigma_2}-a'_{2,\sigma_2})a_{3,\sigma_3}\cdots a_{n,\sigma_n}\\ &&+a'_{1,\sigma_1}a'_{2,\sigma_2}(a_{3,\sigma_3}-a'_{3,\sigma_3})a_{4,\sigma_4}\cdots a_{n,\sigma_n}+\dots+a'_{1,\sigma_1}a'_{2,\sigma_2}\cdots a'_{n-1,\sigma_{n-1}}(a_{n,\sigma_n}-a'_{n,\sigma_n})\bigg|\\ &\le&\sum_{\sigma\in S_n}\sum_{i=1}^nm^{n-1}|a_{i,\sigma_i}-a'_{i,\sigma_i}|, \end{eqnarray*} since each of the $n$ summands in the telescoping expansion is a product of $n-1$ entries bounded by $m$ and one difference $|a_{i,\sigma_i}-a'_{i,\sigma_i}|$. Now let $\varepsilon>0$. Since $m$ depends on $M'$, first bound it in terms of $M$ alone: if $\|M-M'\|<1$, then $|a'_{ij}|\le|a_{ij}|+1$ for all $i,j$, so $m\le m_0:=1+\max_{1\le i,j\le n}|a_{ij}|$. Set $$ \delta=\min\left(1,\ \frac{\varepsilon}{2nn!m_0^{n-1}}\right) $$ and suppose $\|M-M'\|<\delta$. Then $|a_{ij}-a'_{ij}|\le\|M-M'\|<\delta$ for all $1\le i,j\le n$, and hence $$ |\det M-\det M'|\le\sum_{\sigma\in S_n}\sum_{i=1}^nm_0^{n-1}\delta=nn!m_0^{n-1}\delta\le\frac{\varepsilon}{2}<\varepsilon. $$
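As a sanity check, the final bound can be tested numerically. The sketch below (pure Python, with made-up $3\times 3$ matrices chosen only for illustration) evaluates the Leibniz formula directly, then verifies that $|\det M-\det M'|\le nn!m^{n-1}\max_{i,j}|a_{ij}-a'_{ij}|$:

```python
import math
from itertools import permutations

def det(M):
    # Leibniz formula: det M = sum over permutations sigma of sgn(sigma) * prod_i M[i][sigma[i]]
    n = len(M)
    total = 0.0
    for sigma in permutations(range(n)):
        # sgn(sigma) = (-1)^(number of inversions)
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if sigma[i] > sigma[j])
        term = (-1.0) ** inv
        for i in range(n):
            term *= M[i][sigma[i]]
        total += term
    return total

# Arbitrary test matrices (hypothetical values)
M  = [[2.0, 1.0, 0.5], [0.0, 3.0, 1.0], [1.0, 1.0, 2.0]]
Mp = [[2.1, 0.9, 0.5], [0.1, 3.0, 1.1], [1.0, 0.9, 2.0]]
n = len(M)

m = max(abs(x) for row in M + Mp for x in row)         # max |entry| over both matrices
diff = max(abs(a - b) for ra, rb in zip(M, Mp) for a, b in zip(ra, rb))

lhs = abs(det(M) - det(Mp))
rhs = n * math.factorial(n) * m ** (n - 1) * diff      # n * n! * m^(n-1) * max|a_ij - a'_ij|
assert lhs <= rhs
```

Since each $|a_{ij}-a'_{ij}|\le\|M-M'\|$, the same bound holds with `diff` replaced by the Euclidean norm of $M-M'$.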

Sha Vuklia
xpaul

For a more geometric approach, you can use the following inequality (Hadamard's inequality): \begin{equation}\tag{1} |\det M|\le \prod_{j=1}^n \lVert M_j\rVert. \end{equation} Here $\lVert M_j\rVert$ denotes the Euclidean norm of the $j$-th column of $M$. This inequality is best understood if the determinant of $M$ is interpreted as the signed volume of the parallelepiped spanned by the vectors $M_1,\ldots, M_n$. It expresses the intuitive fact that a skewed parallelepiped has smaller volume than the rectangular box whose edges have the same lengths. (Look here for more information and a proof.)

Observing that $\det(M)$ is a linear map in each column $M_j$, one obtains the following formula, in which the right hand side is a telescoping sum: $$ \det M-\det M'=\sum_{j=1}^n\det\begin{bmatrix} M'_1&\ldots& M'_{j-1}& M_j-M'_j& M_{j+1}&\ldots& M_n\end{bmatrix}.$$ Taking absolute values and using inequality $(1)$ one has $$\tag{2} \lvert \det M-\det M'\rvert \le \sum_{j=1}^n\lVert M'_1\rVert\ldots \lVert M_j-M'_j\rVert\ldots\lVert M_n\rVert.$$
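The telescoping identity above can be checked numerically. The following pure-Python sketch (the $3\times 3$ matrices are arbitrary made-up values, and `det` evaluates the Leibniz formula) builds each matrix in the sum column by column and confirms that the sum equals $\det M-\det M'$:

```python
from itertools import permutations

def det(M):
    # Leibniz formula: sum over permutations sigma of sgn(sigma) * prod_i M[i][sigma[i]]
    n = len(M)
    total = 0.0
    for sigma in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if sigma[i] > sigma[j])
        term = (-1.0) ** inv  # sgn(sigma) = (-1)^inversions
        for i in range(n):
            term *= M[i][sigma[i]]
        total += term
    return total

def column(M, j):
    return [row[j] for row in M]

def from_columns(cols):
    # Build the matrix whose j-th column is cols[j]
    n = len(cols)
    return [[cols[j][i] for j in range(n)] for i in range(n)]

# Arbitrary test matrices (hypothetical values)
M  = [[1.0, 2.0, 0.0], [0.0, 1.0, 3.0], [2.0, 0.0, 1.0]]
Mp = [[1.2, 2.0, 0.1], [0.0, 0.8, 3.0], [1.9, 0.0, 1.0]]
n = len(M)

# det M - det M' = sum_j det[M'_1 .. M'_{j-1}, M_j - M'_j, M_{j+1} .. M_n]
total = 0.0
for j in range(n):
    cols = [column(Mp, k) for k in range(j)]
    cols.append([a - b for a, b in zip(column(M, j), column(Mp, j))])
    cols += [column(M, k) for k in range(j + 1, n)]
    total += det(from_columns(cols))

assert abs((det(M) - det(Mp)) - total) < 1e-12
```

The identity is exact (the assertion tolerance only absorbs floating-point rounding): writing $D_j$ for the determinant with the first $j-1$ columns taken from $M'$ and the rest from $M$, the $j$-th summand equals $D_j-D_{j+1}$ by linearity in column $j$, and the sum telescopes to $D_1-D_{n+1}=\det M-\det M'$.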

The inequality $(2)$ shows that $\det$ is locally Lipschitz continuous.

Indeed, if $C>0$ is such that $$\lVert M_k\rVert\le C,\ \lVert M'_k\rVert\le C, \qquad \forall k=1\ldots n, $$ then using the elementary inequality $$ \sum_{j=1}^n a_j\le \sqrt{n}\left(\sum_{j=1}^n a_j^2\right)^{\frac{1}{2}}\qquad \forall a_1\ldots a_n\ge 0, $$ one sees from $(2)$ that $$ \begin{split} \lvert \det M-\det M'\rvert&\le C^{n-1}\sum_{j=1}^n\lVert M_j-M'_j\rVert \\ &\le \sqrt{n}C^{n-1}\sqrt{\sum_{j=1}^n \lVert M_j-M'_j\rVert^2} \\ &=\sqrt{n}C^{n-1}\lVert M-M'\rVert_{\mathrm{matrix}} \end{split} $$ where $\lVert M\rVert_{\mathrm{matrix}}^2=\sum_{i,j=1}^n\lvert M_{ij}\rvert^2.$
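Both inequality $(1)$ and the final Lipschitz estimate can be verified numerically. Here is a minimal pure-Python sketch with made-up matrices (`det` evaluates the Leibniz formula, `col_norm` the Euclidean norm of a column):

```python
import math
from itertools import permutations

def det(M):
    # Leibniz formula: sum over permutations sigma of sgn(sigma) * prod_i M[i][sigma[i]]
    n = len(M)
    total = 0.0
    for sigma in permutations(range(n)):
        inv = sum(1 for i in range(n) for j in range(i + 1, n) if sigma[i] > sigma[j])
        term = (-1.0) ** inv
        for i in range(n):
            term *= M[i][sigma[i]]
        total += term
    return total

def col_norm(M, j):
    # Euclidean norm of the j-th column
    return math.sqrt(sum(M[i][j] ** 2 for i in range(len(M))))

# Arbitrary test matrices (hypothetical values)
M  = [[1.0, 0.0, 2.0], [0.0, 3.0, 1.0], [2.0, 1.0, 1.0]]
Mp = [[1.1, 0.0, 2.0], [0.0, 2.9, 1.0], [2.0, 1.1, 0.9]]
n = len(M)

# Hadamard's inequality (1): |det M| <= product of column norms
hadamard_rhs = math.prod(col_norm(M, j) for j in range(n))
assert abs(det(M)) <= hadamard_rhs

# Local Lipschitz bound: |det M - det M'| <= sqrt(n) * C^(n-1) * ||M - M'||
C = max(max(col_norm(M, j), col_norm(Mp, j)) for j in range(n))
frob = math.sqrt(sum((M[i][j] - Mp[i][j]) ** 2 for i in range(n) for j in range(n)))
assert abs(det(M) - det(Mp)) <= math.sqrt(n) * C ** (n - 1) * frob
```

(`math.prod` requires Python 3.8 or later.)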

  • Side note: The "telescoping sum trick" is exactly the same one xpaul used. This proof and xpaul's are not so different, actually. This proof is "column-oriented" while xpaul's is "element-oriented", but that's all. The algebraic machinery is the same, and it relies on the multilinear character of the determinant function. – Giuseppe Negro Jun 09 '15 at 21:01
  • Do you know of a text or paper where one might cite this factoid? – Him Jul 03 '19 at 15:22
  • @Scott: Which "factoid"? I am sorry, I don't understand what you mean. – Giuseppe Negro Jul 03 '19 at 15:48
  • That the determinant is a continuous function. Is this proof you've provided available in a textbook somewhere? – Him Jul 04 '19 at 09:20
  • Well, surely it is, but I don't know exactly. You can safely dismiss this as "well-known", though, no need for an exact reference. – Giuseppe Negro Jul 04 '19 at 09:22
  • I could do so in a Mathematics publication. These things are not well-known in other fields. I have been previously admonished for not providing a citation for what I considered a "well-known" thing that was more-or-less basic multivariate calculus, to which I then provided a citation to Newton's Principia. – Him Jul 04 '19 at 09:25
  • Newton's Principia is a very tough citation! :-) Well, anyway, I don't know a reference, however it surely exists. You could write something along the lines of "since $\det A$ is a polynomial in the entries $a_{i, j}$ of $A$, it is a continuous function of such". It is a concise proof, much better than a citation, especially if such citation leads to an obscure textbook. – Giuseppe Negro Jul 04 '19 at 09:32
  • AH! of course. that is a much better solution. Thank you. – Him Jul 04 '19 at 10:10