
I have the following inequality to prove.

For $A \in M_n(\mathbb{R})$, show that: $$ (\det(A))^2 \leq \prod_{i=1}^n\left( \sum_{k=1}^n A_{k,i}^2\right) $$

What I have so far: I found that $$G(v_1,\ldots,v_n) = \det(A^T A)=\det(A^T)\cdot\det(A)=(\det(A))^2. $$

Also, since $G(v_1,\ldots,v_n) = (\det(A))^2$, it follows that

$$\operatorname{Vol}(v_1,\ldots,v_n)= \left|\det(A)\right| =\prod_{i=1}^n |s(v_i,u_i)|$$

I don't even know if this is right. It would be really helpful if someone could help me with the proof.
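A quick numerical sanity check of the claim (just a sketch assuming NumPy; testing on a random matrix is of course not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

gram_det = np.linalg.det(A.T @ A)      # G(v_1, ..., v_n)
det_sq = np.linalg.det(A) ** 2         # (det A)^2
assert np.isclose(gram_det, det_sq)

# right-hand side: product over columns i of sum_k A_{k,i}^2
rhs = np.prod((A ** 2).sum(axis=0))
assert det_sq <= rhs + 1e-12
```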

PhysX

3 Answers


There's a quick proof using the $QR$ decomposition. In particular, note that any matrix $A$ can be written as $$ A = QR $$ where $Q$ is orthogonal and $R$ is upper triangular (in fact, the columns of $Q$ can be taken to be the orthonormal basis obtained via the Gram-Schmidt process). Let $q_j$ denote the columns of $Q$, let $a_j$ denote the columns of $A$ (for $j = 1,\dots,n$), and let $r_{ij}$ denote the entries of $R$. Then $$ a_j = \sum_{i=1}^j r_{ij}q_i. $$ Since the $q_i$ are orthonormal, it follows that $$ \|a_j\|^2 = \sum_{i=1}^j |r_{ij}|^2\|q_i\|^2 = \sum_{i=1}^j |r_{ij}|^2 \geq |r_{jj}|^2 \implies |r_{jj}| \leq \|a_{j}\|. $$ Finally, we have $$ |\det(A)| = |\det(Q)| |\det(R)| = 1 \cdot \left|\prod_{j=1}^n r_{jj}\right| \leq \prod_{j=1}^n \|a_j\| $$ as desired.
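For a concrete sanity check, here is a minimal sketch (assuming NumPy; note that `numpy.linalg.qr` may flip the signs of some $r_{jj}$, which doesn't matter here since only absolute values are used):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))          # random test matrix

Q, R = np.linalg.qr(A)                   # A = QR, Q orthogonal, R upper triangular
col_norms = np.linalg.norm(A, axis=0)    # ||a_j|| for each column

# |r_jj| <= ||a_j||, column by column
assert np.all(np.abs(np.diag(R)) <= col_norms + 1e-12)

# Hadamard's inequality: |det A| <= prod_j ||a_j||
assert abs(np.linalg.det(A)) <= np.prod(col_norms) + 1e-9
```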

Ben Grossmann

This inequality is 'geometrically intuitive': note that $|\det(A)|$ is the volume of the parallelepiped spanned by the columns $a_i$, and that $\prod_{j = 1}^n \|a_j\|_2$ is the volume of the box whose side lengths are the lengths of the $a_i$. Your assertion is equivalent to the statement that, given fixed side lengths, you enclose the most volume when your parallelepiped is a box. This is clear in $2$D by cutting and pasting.

In fact, I think (but haven't checked this carefully) you can prove the statement with such reasoning: if you have any two vectors that are not orthogonal, you can cut and paste to get a new parallelepiped with the same volume in which one of the side lengths is strictly smaller. Now scale that side back up to its original length; the result is a parallelepiped with strictly greater volume but the same side lengths. Keep doing this until all of the vectors are orthogonal, and you are left with a rectangular box of volume $\prod \|a_i\|$, and the course of the proof shows that this is at least $|\det(A)|$.

Reading over this again: essentially what I'm asserting with the cutting and pasting idea is that Gram-Schmidt orthogonalization (not orthonormalization) preserves the determinant. Algebraically this is because orthogonalization writes $A = A' T$, where $T$ is an upper triangular matrix with $1$s along the diagonal, so $\det(T) = 1$ and multiplicativity of the determinant gives $\det(A) = \det(A')$. On the other hand, orthogonalization only ever makes the vectors shorter (or leaves them unchanged), since we subtract from each one its projection onto the span of the previous vectors. Because the columns of $A'$ are orthogonal, $|\det(A')|$ equals the product of their $L^2$ norms, which is at most the original product of $L^2$ norms.

Since another name for Gram-Schmidt orthogonalization is the $QR$ decomposition, this is similar to the other answer.
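Here is a small numerical sketch of that claim (assuming NumPy; the `gram_schmidt` helper below is just an illustrative implementation of the orthogonalization step, nothing canonical):

```python
import numpy as np

def gram_schmidt(A):
    """Orthogonalize the columns of A without normalizing, i.e. A = A' T
    with T unit upper triangular, so det(A') = det(A)."""
    Ap = A.astype(float).copy()
    n = Ap.shape[1]
    for j in range(n):
        for i in range(j):
            # subtract the projection of column j onto the (already orthogonal) column i
            Ap[:, j] -= (Ap[:, i] @ Ap[:, j]) / (Ap[:, i] @ Ap[:, i]) * Ap[:, i]
    return Ap

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
Ap = gram_schmidt(A)

# orthogonalization preserves the determinant
assert np.isclose(np.linalg.det(Ap), np.linalg.det(A))

# columns only get shorter (or keep their length)
assert np.all(np.linalg.norm(Ap, axis=0) <= np.linalg.norm(A, axis=0) + 1e-12)

# for orthogonal columns, |det| equals the product of the column norms
assert np.isclose(abs(np.linalg.det(Ap)), np.prod(np.linalg.norm(Ap, axis=0)))
```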

Elle Najt

I will give my own proof of this inequality. (In fact there are many proofs of it, but I haven't found one with the same idea as mine.)

Hadamard's inequality has two equivalent forms. The other is that for any positive semi-definite matrix $A$ we have \[\det A\leqslant \prod a_{ii},\] with equality when $A$ is a diagonal matrix. The equivalence can be inferred from the fact that a positive semi-definite matrix $A$ can be written as $B^\mathsf TB$.
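To see the equivalence concretely: if $A = B^\mathsf TB$, then $a_{ii}$ is the squared norm of the $i$-th column of $B$ and $\det A = (\det B)^2$, so the two forms say the same thing. A minimal numerical sketch (assuming NumPy, with $B$ just a random test matrix):

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
A = B.T @ B                       # positive semi-definite

# diagonal entries of B^T B are the squared column norms of B
assert np.allclose(np.diag(A), (B ** 2).sum(axis=0))

# so det A <= prod a_ii is exactly (det B)^2 <= prod ||b_i||^2
assert np.isclose(np.linalg.det(A), np.linalg.det(B) ** 2)
assert np.linalg.det(A) <= np.prod(np.diag(A)) + 1e-9
```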


Proof. We decompose $A$ as $A=U^\mathsf T\Lambda U$, where $U$ is an orthogonal matrix and $\Lambda=\operatorname{diag}(\lambda_1,\dots,\lambda_n)\succcurlyeq 0$. Then $\det A=\prod \lambda_i$ and $a_{ii}=\boldsymbol u_i^\mathsf T\Lambda \boldsymbol u_i$, where $\boldsymbol u_i$ is the $i$-th column of $U$. What we want becomes \[ \prod \lambda_i\leqslant\prod\boldsymbol u_i^\mathsf T\Lambda \boldsymbol u_i. \] Just expand the right-hand side: \[ \prod_i\boldsymbol u_i^\mathsf T\Lambda \boldsymbol u_i=\prod_i\sum_j\lambda_j u_{ij}^2\geqslant\prod_i\prod_j\lambda_j^{u_{ij}^2}=\prod_j\lambda_j^{\sum_i u_{ij}^2}=\prod_j \lambda_j. \] Here the weighted AM-GM inequality is used, with weights $u_{ij}^2$, which sum to $1$ for each $i$ since $U$ is orthogonal. If you are not familiar with it, Jensen's inequality (concavity of $\ln$) can be used instead: \[ \prod_i\sum_j\lambda_j u_{ij}^2=\exp\Big(\sum_i\ln\sum_j\lambda_j u_{ij}^2\Big)\geqslant\exp\Big(\sum_i\sum_ju_{ij}^2\ln\lambda_j\Big)=\exp\Big(\sum_j\ln\lambda_j\Big)=\prod_j\lambda_j. \]

It can be seen that equality holds when all the $\lambda_i$ are equal (i.e., $A$ is a scalar matrix), or when $U$ is the identity matrix. Hence equality holds when $A$ is a diagonal matrix.
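A quick numerical check of the key AM-GM step (a sketch assuming NumPy, using a random orthogonal $U$ and nonnegative eigenvalues):

```python
import numpy as np

rng = np.random.default_rng(4)
lam = rng.random(5)                                # nonnegative eigenvalues
U, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # random orthogonal matrix
A = U.T @ np.diag(lam) @ U                         # positive semi-definite

W = U ** 2                # W[j, i] = u_{ij}^2; each column of W sums to 1
lhs = lam @ W             # lhs[i] = sum_j lam_j u_{ij}^2
rhs = np.prod(lam[:, None] ** W, axis=0)           # prod_j lam_j^{u_{ij}^2}

assert np.allclose(lhs, np.diag(A))                # a_ii = u_i^T Lambda u_i
assert np.all(lhs >= rhs - 1e-12)                  # weighted AM-GM, term by term
assert np.isclose(np.linalg.det(A), np.prod(lam))  # det A = prod lambda_i
assert np.linalg.det(A) <= np.prod(np.diag(A)) + 1e-12   # Hadamard (PSD form)
```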

ImbalanceDream