
For any $n \times 1$ matrix $X$ define $|X|$ as $\sqrt{\sum\limits_{k=1}^n X_k ^2}$ where $X_k$ is the $k$-th row (or just the Euclidean norm of a vector in $\mathbb{R}^n$).

For an $n \times n$ matrix $A$ and an $n \times 1$ vector $X$, both with real entries and $\det(A) \ne 0$, I am interested in whether there is a general inequality or relation between $|AX|$ and $|\det(A)| \cdot |X|$. By "relation," I mean an inequality that could involve additional assumptions or restrictions on $A$ or $X$ to make the relationship work for all $n$. For example, one could assume that $A$ has only positive entries or that $\det(A)$ exceeds a certain threshold.

For $n=1$, $|AX| = |\det(A) X|$, but for $n>1$ I don't think $|AX| \le |\det(A) X|$ (or $|AX| \ge |\det(A) X|$) is generally true.

pie

2 Answers


For $A = \begin{pmatrix} k & 0 \\ 0 & 1 \end{pmatrix}$ (any real $k>0$) and $X = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, $|AX| = 1$ and $|\det(A)||X| = k$. There is no relation between the two quantities unless we put restrictions on the matrix $A$. (See @Severin's solution.)
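A quick numeric sanity check of this counterexample (a minimal sketch in plain Python; the value $k=5$ is an arbitrary choice):

```python
import math

def matvec(A, X):
    # multiply a 2x2 matrix by a length-2 vector
    return [A[0][0]*X[0] + A[0][1]*X[1],
            A[1][0]*X[0] + A[1][1]*X[1]]

def norm(X):
    # Euclidean norm, matching the definition in the question
    return math.sqrt(sum(x*x for x in X))

def det2(A):
    # determinant of a 2x2 matrix
    return A[0][0]*A[1][1] - A[0][1]*A[1][0]

k = 5.0
A = [[k, 0.0], [0.0, 1.0]]
X = [0.0, 1.0]

print(norm(matvec(A, X)))      # |AX| = 1, independent of k
print(abs(det2(A)) * norm(X))  # |det(A)| |X| = k = 5
```

Since $k$ can be made arbitrarily large or arbitrarily close to $0$, neither direction of the inequality can hold without extra assumptions.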

Edit: Another example: For $A = \begin{pmatrix} \sqrt{k^2+1} & k \\ k & \sqrt{k^2+1} \end{pmatrix}$ (any real $k$) and $X = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$, $|AX| = \sqrt{2k^2+1}$ and $|\det(A)||X| = 1$. The matrix $A$ in this case is symmetric and diagonalizable, with $$ \left(\begin{array}{cc} \sqrt{k^2+1} & k \\ k & \sqrt{k^2+1} \end{array}\right)=\left(\begin{array}{cc} -1 & 1 \\ 1 & 1 \end{array}\right) \left(\begin{array}{cc} \sqrt{k^2+1}-k & 0 \\ 0 & \sqrt{k^2+1}+k \end{array}\right) \left(\begin{array}{cc} \frac{-1}{2} & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} \end{array}\right) $$
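This second example can be checked the same way (a sketch in plain Python; $k=3$ is an arbitrary choice):

```python
import math

k = 3.0
s = math.sqrt(k*k + 1.0)
A = [[s, k], [k, s]]   # the symmetric matrix from the example
X = [0.0, 1.0]

AX = [A[0][0]*X[0] + A[0][1]*X[1],
      A[1][0]*X[0] + A[1][1]*X[1]]
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]

# |AX| = sqrt(2k^2 + 1) grows with k, while |det(A)| |X| = 1 stays fixed
print(math.hypot(AX[0], AX[1]))  # sqrt(19) for k = 3
print(abs(det))                  # 1.0 up to floating-point rounding
```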

Sam
  • I put the restriction that $\det(A)\ne 0$, and the question asks to find a restriction and a relation such that the inequality works. Also, it doesn't have to be a basic inequality like $|AX|< |\det(A)X|$; it could be, for example, $|AX|^n< |\det(A)X|$ (this is what I meant by "relation"). – pie Sep 15 '24 at 16:38
  • Thanks, edited the solution @pie – Sam Sep 15 '24 at 16:42
  • "There is no relation between the two quantities." So no matter what restriction I use or what inequality I use, the general case for all $n$ won't work? – pie Sep 15 '24 at 16:44
  • @pie This answer tells you that even for diagonal matrices it does not work. What do you hope for? – Severin Schraven Sep 15 '24 at 16:47
  • @SeverinSchraven I hope to discover some restriction under which this inequality works in some sense; for example, if one puts the restriction $\det(A)\ge 1$, this answer would be incorrect (or at least the method). – pie Sep 15 '24 at 16:51
  • One can choose more restrictions on $A$ as they please... – pie Sep 15 '24 at 16:52
  • Sure, I can give you conditions which make this inequality true. The point is more what kind of conditions you are looking for.... – Severin Schraven Sep 15 '24 at 16:55
  • @SeverinSchraven The most general one of them; if $A=I$ the question is trivial, so this restriction works, but is there a more general one? – pie Sep 15 '24 at 16:56
  • One set of conditions would be $A$ positive definite with smallest eigenvalue $\geq 1$. Respectively, $A$ symmetric and $$\min_{\lambda\in \mathrm{spec}(A)} \vert \lambda\vert \geq 1.$$ – Severin Schraven Sep 15 '24 at 16:58
  • @SeverinSchraven This is awesome; is there any weaker hypothesis? – pie Sep 15 '24 at 17:08

Let's start from Sam's solution. Namely, pick a diagonal matrix $$A=\begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}.$$ In this setting we easily see that $$ \vert \det(A)\vert= \vert \lambda_1\vert \cdot \vert \lambda_2\vert$$ and $$\Vert A\Vert_\mathrm{op} =\max\{ \vert \lambda_1\vert, \vert \lambda_2\vert\}.$$ Thus, if $\vert \lambda_1\vert, \vert \lambda_2\vert\geq 1$, then $$ \Vert A\Vert_\mathrm{op} =\max\{\vert \lambda_1\vert, \vert \lambda_2\vert\} \leq \vert \lambda_1\vert \cdot\vert \lambda_2\vert =\vert \det (A)\vert.$$ In particular we get $$\vert AX\vert \leq \Vert A\Vert_\mathrm{op} \vert X\vert \leq \vert \det(A)\vert \vert X\vert =\vert \det(A) X\vert.$$ Note that if either of the two eigenvalues has absolute value less than $1$, then the inequality no longer holds for all $X$.
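The diagonal case can be illustrated numerically; a sketch in plain Python (the eigenvalues and test vectors below are arbitrary choices):

```python
import math

def holds(l1, l2, X):
    # does |AX| <= |det(A)| |X| hold for A = diag(l1, l2) and this X?
    AX = [l1 * X[0], l2 * X[1]]
    lhs = math.hypot(AX[0], AX[1])
    rhs = abs(l1 * l2) * math.hypot(X[0], X[1])
    return lhs <= rhs + 1e-12

# both |lambda_i| >= 1: the inequality holds for any X
print(holds(1.5, 2.0, [3.0, 4.0]))   # True

# one eigenvalue below 1: fails for some X, e.g. the eigenvector (0, 1)
print(holds(0.5, 2.0, [0.0, 1.0]))   # False
```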

More general conditions: So, your question is really asking whether there is a relation between the operator norm and the absolute value of the determinant. Recall that the determinant is the product of the eigenvalues (call them $\lambda_1,\dots, \lambda_n$, listed with multiplicity) of $A$; thus, $$\vert \det(A)\vert = \prod_{j=1}^n \vert \lambda_j\vert.$$ On the other hand, under some conditions (see "Is spectral radius = operator norm for a positive valued matrix?" for an overview, respectively "Quick question: matrix with norm equal to spectral radius" for the characterisation) we have that the operator norm of $A$ is equal to $$ \max_{j=1, \dots, n} \vert \lambda_j\vert.$$ Hence, if we assume in addition that $\vert \lambda_j\vert\geq 1$ for all $j$, then the product, i.e. $\vert \det(A)\vert$, is at least the maximum of the absolute values of the eigenvalues (which by assumption equals the operator norm of $A$).
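A numeric sketch of this condition for a symmetric matrix, in plain Python (the particular entries are an arbitrary illustration; the eigenvalues come from the usual closed form for a $2\times 2$ symmetric matrix):

```python
import math

# symmetric A = [[a, b], [b, c]] has eigenvalues tr/2 +/- sqrt(tr^2/4 - det)
a, b, c = 3.0, 1.0, 2.0
tr, det = a + c, a*c - b*b
disc = math.sqrt(tr*tr/4.0 - det)
lams = [tr/2.0 - disc, tr/2.0 + disc]

# both eigenvalues are >= 1, so the operator norm max|lambda_j|
# is bounded by the product of the |lambda_j|, i.e. |det(A)|
assert all(abs(l) >= 1.0 for l in lams)
assert max(abs(l) for l in lams) <= abs(det) + 1e-12

# consequently |AX| <= |det(A)| |X|, e.g. for one sample X:
X = [1.0, 2.0]
AX = [a*X[0] + b*X[1], b*X[0] + c*X[1]]
assert math.hypot(AX[0], AX[1]) <= abs(det) * math.hypot(X[0], X[1]) + 1e-12
print("inequality verified for this A and X")
```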