
a) Show that the tridiagonal matrix $$A=\begin{pmatrix} 2 & 1 & & 0 \\ 1 & 2 & \ddots & \\ & \ddots & \ddots & 1 \\ 0 & & 1 & 2\end{pmatrix}\in\mathbb{R}^{n\times n}$$ is positive definite.

b) Show that a symmetric, positive definite matrix $A\in \mathbb{R}^{n\times n}, \ A=\{a_{k\ell}\}_{k,\ell=1}^n$ has only positive diagonal elements and that it holds that $\displaystyle{\max_{k=1,\ldots, n}| a_{kk}| = \max_{k,\ell=1,\ldots,n}| a_{k\ell}|}$.


I have done the following:

a) Let $x\in \mathbb{R}^n\setminus \{\vec{0}\}$ and use the convention $x_0=x_{n+1}=0$. \begin{align*}x^TAx&=\sum_{i,k=1}^na_{ik}x_ix_k \\ & [\text{ if } i=k : a_{ii}=2 \ , \ \text{ if } k=i-1 : a_{i(i-1)}=1 \ , \ \text{ if } k=i+1 : a_{i(i+1)}=1 ]\\ & =\sum_{i=1}^nx_i x_{i-1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^n x_ix_{i+1} \\ & =x_1 x_{0}+\sum_{i=2}^nx_i x_{i-1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^{n-1} x_ix_{i+1} +x_nx_{n+1} \\ & =\sum_{i=2}^nx_i x_{i-1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^{n-1} x_ix_{i+1} \\ & = \sum_{i=1}^{n-1}x_i x_{i+1}+2\sum_{i=1}^n x_i^2+\sum_{i=1}^{n-1} x_ix_{i+1} \\ & = 2\sum_{i=1}^{n-1}x_i x_{i+1}+2\sum_{i=1}^n x_i^2 \\ & = \left (\sum_{i=1}^{n-1} x_i^2+x_n^2\right )+2\sum_{i=1}^{n-1}x_i x_{i+1}+\left (x_1^2+\sum_{i=2}^n x_i^2\right ) \\ & = \sum_{i=1}^{n-1} x_i^2+2\sum_{i=1}^{n-1}x_i x_{i+1}+\sum_{i=2}^n x_i^2+x_1^2+x_n^2 \\ & = \sum_{i=1}^{n-1} x_i^2+2\sum_{i=1}^{n-1}x_i x_{i+1}+\sum_{i=1}^{n-1} x_{i+1}^2+x_1^2+x_n^2 \\ & = \sum_{i=1}^{n-1} \left (x_i^2+2x_i x_{i+1}+ x_{i+1}^2\right )+x_1^2+x_n^2 \\ & = \sum_{i=1}^{n-1} \left (x_i+ x_{i+1}\right )^2+x_1^2+x_n^2 \\ & >0 \end{align*} This is a sum of nonnegative terms (squares). It equals zero only if every square vanishes, i.e. $x_1=x_n=0$ and $x_i+x_{i+1}=0$ for $i=1,\ldots,n-1$, which happens only for $\vec{x}=\vec{0}$; since $x\neq \vec{0}$, the sum is strictly positive.

Is that correct and complete?
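(Not part of the proof, just a numerical sanity check, assuming `numpy` is available: the closing identity $x^TAx=\sum_{i=1}^{n-1}(x_i+x_{i+1})^2+x_1^2+x_n^2$ can be verified on random vectors.)

```python
import numpy as np

n = 6
# tridiagonal matrix: 2 on the diagonal, 1 on both off-diagonals
A = 2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.standard_normal(n)
    quad = x @ A @ x
    # right-hand side of the derived identity
    rhs = np.sum((x[:-1] + x[1:]) ** 2) + x[0] ** 2 + x[-1] ** 2
    assert np.isclose(quad, rhs)
    assert quad > 0
```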


For b) I need some help.

We have that $A$ is symmetric, that means that $\displaystyle{a_{k\ell}=a_{\ell k}}$, or not?

We also have that $A$ is positive definite, that means that $\displaystyle{x^TAx=\sum_{k, \ell=1}^na_{k \ell }x_kx_\ell >0}$ for $x\neq 0$, right?

How do we get that $a_{kk}>0$ ?

Do we maybe take the half sum and then the other sum is the same due to symmetric property?


EDIT :

For b) I have done the following:

We assume the contrary, i.e. that $\displaystyle{\max_{k,\ell=1,\ldots,n}|a_{k\ell}|=|a_{ij}|}$ for some pair $i<j$, so that the maximum is attained strictly off the diagonal.

Since $A$ is positive definite we have that $x^T A x >0 \quad \forall x\neq 0$.

We consider the vector $x_{ij}\in\mathbb{R}^n$ whose $i$-th entry is $1$, whose $j$-th entry is $-1$, and whose remaining entries are $0$, for the above indices $i<j$.

For this vector we get : \begin{align*}&x_{ij}^T A x_{ij} >0 \\ & \Rightarrow \begin{pmatrix}0 & \ldots & 0 & 1 & 0 & \ldots & -1 & \ldots & 0\end{pmatrix}\begin{pmatrix}a_{11} & \ldots & a_{1n} \\ \ldots & \ldots & \ldots \\ a_{i1} & \ldots & a_{in} \\ \ldots & \ldots & \ldots \\ a_{j1} & \ldots & a_{jn} \\ \ldots & \ldots & \ldots \\ a_{n1} & \ldots & a_{nn}\end{pmatrix}\begin{pmatrix}0 \\ \vdots \\ 0 \\ 1 \\ 0 \\ \vdots \\ -1 \\ \vdots \\ 0\end{pmatrix}>0 \\ & \Rightarrow \begin{pmatrix}a_{i1}-a_{j1} & \ldots & a_{ii}-a_{ji} & \ldots & a_{ij}-a_{jj} & \ldots & a_{in}-a_{jn}\end{pmatrix}\begin{pmatrix}0 \\ \vdots \\ 0 \\ 1 \\ 0 \\ \vdots \\ -1 \\ \vdots \\ 0\end{pmatrix}>0 \\ & \Rightarrow (a_{ii}-a_{ji})-(a_{ij}-a_{jj})>0\\ & \Rightarrow a_{ii}-a_{ji}-a_{ij}+a_{jj}>0\end{align*} Since $A$ is symmetric we get that $a_{ji}=a_{ij}$, and so we get \begin{equation*}a_{ii}-a_{ji}-a_{ij}+a_{jj}>0 \Rightarrow a_{ii}-a_{ij}-a_{ij}+a_{jj}>0\Rightarrow a_{ii}+a_{jj}>a_{ij}+a_{ij}\end{equation*} and in that way we get a contradiction.

Is that correct and complete?
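As a numerical illustration of the step above (assuming `numpy`; the matrix here is a randomly built SPD example, not the tridiagonal one from a)), the test vector indeed gives $x_{ij}^TAx_{ij}=a_{ii}-2a_{ij}+a_{jj}>0$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
B = rng.standard_normal((n, n))
A = B.T @ B + n * np.eye(n)  # symmetric positive definite by construction

for i in range(n):
    for j in range(i + 1, n):
        x = np.zeros(n)
        x[i], x[j] = 1.0, -1.0
        # x^T A x = a_ii - 2*a_ij + a_jj, and positive definiteness forces it > 0
        assert np.isclose(x @ A @ x, A[i, i] - 2 * A[i, j] + A[j, j])
        assert A[i, i] + A[j, j] > 2 * A[i, j]
```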

Mary Star
  • For b), consider $x=[1\ 0\ \cdots\ 0]$, what do you find? As for a) it seems right, though I read it diagonally – G Frazao Nov 16 '22 at 10:47
  • For this $x$ do we get $a_{11}>0$ ? @GFrazao – Mary Star Nov 16 '22 at 10:58
  • Yes, and if you continue with $x = [0\ 1\ 0\ \cdots\ 0]$ and so on? – G Frazao Nov 16 '22 at 10:59
  • Then we will get that $a_{kk}>0$ for all $k$, right? Can we just take this specific $x$ ? Do we not have to show that the diagonal elements are positive for all vectors $x$ ? @GFrazao – Mary Star Nov 16 '22 at 11:01
  • For the first question you can simply use the Gershgorin circle theorem. I guess you can use it also for the second question. – yes Nov 16 '22 at 11:02
  • Note that $a_{kk}$ are elements of the matrix $A$; they have no relation with $x$. We chose these specific $x$'s because they tell us specific information for each $a_{kk}$, given that $x^T A x >0$ holds for any $x \neq 0$. – G Frazao Nov 16 '22 at 11:03
  • Ah ok! I got it! And what about the second question with the maximum? How do we get that equality? @GFrazao – Mary Star Nov 16 '22 at 11:05
  • One possible way is to use the Gershgorin circle theorem, as mentioned by VanBaffo: https://en.wikipedia.org/wiki/Gershgorin_circle_theorem. This might be a bit of an overkill, let me think if there is a simpler way to show it. – G Frazao Nov 16 '22 at 11:06
  • Gershgorin discs tell you the matrix is PSD but don't tell you it is PD. You need to do something like apply Taussky's refinement of G-discs, use Perron–Frobenius theory, or directly calculate the eigenvalues of the tridiagonal matrix... i.e. part (1) is basically a duplicate of https://math.stackexchange.com/questions/3903369/prove-that-this-block-matrix-is-positive-definite/ . Part 2 comes from writing $A=B^TB$ and seeing the $(a_{i,k})^2=(\mathbf b_i^T\mathbf b_k)^2 \leq \big \Vert \mathbf b_i\big\Vert_2\cdot \Vert \mathbf b_k\big\Vert_2=a_{i,i}\cdot a_{k,k}$ by Cauchy–Schwarz – user8675309 Nov 16 '22 at 16:58
  • that should read $(a_{i,k})^2=(\mathbf b_i^T\mathbf b_k)^2 \leq \big \Vert \mathbf b_i\big\Vert_2^2\cdot \Vert \mathbf b_k\big\Vert_2^2=a_{i,i}\cdot a_{k,k}$ which of course is bounded above by $\max\big(a_{i,i}, a_{k,k}\big)^2$ – user8675309 Nov 16 '22 at 18:03
  • Can you check please the edit part of the above question? Is that correct and complete? @user8675309 – Mary Star Nov 16 '22 at 18:59
  • Can you check please the edit part of the above question? Is that correct and complete? @VanBaffo – Mary Star Nov 16 '22 at 18:59

1 Answer

1. Show that a (symmetric) positive definite matrix $A\in \mathbb{R}^{n\times n}$ has positive diagonal elements.

Positive definite: $x^T A x >0 \quad \forall x\neq 0$.

Consider the vector $x_k$ with $k^{\text{th}}$ entry $1$ and remaining entries $0$.

$x_k^T A x_k = a_{kk} >0 \quad { }_\blacksquare$

2. Show that the matrix element with maximum absolute value lies on the diagonal.

Consider the vector $x = [x_1\ x_2\ 0 \cdots 0]$.

$\begin{align} &x^T A x = a_{11}x_1^2 + 2a_{12}x_1 x_2 + a_{22}x_2^2 >0 \quad \forall (x_1,x_2)\neq 0 \\ &\Rightarrow (2a_{12})^2-4a_{11}a_{22}<0 \quad \text{(the quadratic in } x_1 \text{ has no real root, so its discriminant is negative)} \\ &\Rightarrow a_{11}a_{22} > a_{12}^2 \\ &\Rightarrow \frac{a_{11}}{|a_{12}|} \frac{a_{22}}{|a_{12}|} > 1 \quad \text{(assuming } a_{12}\neq 0\text{; otherwise the claim is trivial)} \end{align}$

This implies that either $a_{11}$ or $a_{22}$ must be greater than $|a_{12}|$. You can easily generalize (using the vector supported on entries $k$ and $l$) to show that $|a_{kl}|$ is upper-bounded by either $a_{kk}$ or $a_{ll}$ for all $k,l$, which completes the proof.
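A quick numerical illustration of this bound (assuming `numpy`; random SPD matrices built as $B^TB+\varepsilon I$, a construction chosen here for convenience): the entry of maximum absolute value always sits on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(2)
for _ in range(20):
    B = rng.standard_normal((4, 4))
    A = B.T @ B + 1e-6 * np.eye(4)  # symmetric positive definite
    # the entry of maximum absolute value must lie on the diagonal
    assert np.max(np.abs(A)) == np.max(np.diag(A))
```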


Of course, as mentioned earlier, the Gershgorin circle theorem is a useful tool to have in your linear algebra tool box.

G Frazao
  • Can you check please the edit part of the above question? Is that correct and complete? – Mary Star Nov 16 '22 at 18:59
  • @MaryStar your edit is on the right path, but not complete. You showed $a_{ii}+a_{jj}>2a_{ij}$, which is correct but is not equivalent to $a_{ii}+a_{jj}>|2a_{ij}|$. Imagine that $a_{ii}=a_{jj}=1$ and $a_{ij}=-2$, the equation you derive holds even though $|a_{ij}|>\text{max}(|a_{ii}|,|a_{jj}|)$. – G Frazao Nov 17 '22 at 10:25
  • So what is missing in my edit part to get the absolute value? – Mary Star Nov 17 '22 at 10:27
  • @MaryStar Repeat the reasoning with the vector with $i$-th and $j$-th entries equal to $1$ and remaining equal to $0$. You should arrive at $a_{ii}+a_{jj}>-2a_{ij}$, combine it with the previous equation to get the absolute value. – G Frazao Nov 17 '22 at 10:30
  • Ah ok! Thank you very much!!! :-) – Mary Star Nov 17 '22 at 10:42
  • Do you maybe have also an idea about my other question : https://math.stackexchange.com/questions/4578124/matrix-exchanging-elements ? – Mary Star Nov 17 '22 at 10:42