
I'm trying to prove that this symmetric block matrix is positive definite $$ \begin{bmatrix} B & -I & 0 & \dots& 0 \\ -I & B & -I & \ddots & \vdots\\ 0 & -I & \ddots & \ddots & 0\\ \vdots & \ddots & \ddots & \ddots & -I\\ 0 & \dots & 0& - I & B \\ \end{bmatrix} $$

with $I$ the identity matrix

$$B = \begin{bmatrix} 4 & -1 & 0 & \dots& 0 \\ -1 & 4 & -1 & \ddots & \vdots\\ 0 & -1 & \ddots & \ddots & 0\\ \vdots & \ddots & \ddots & \ddots & -1\\ 0 & \dots & 0& - 1 & 4 \\ \end{bmatrix}$$

by showing that $x^T A x > 0$ for every nonzero vector $x$, but given the fact that it is a block matrix I'm stuck. Could you help me, please?

Sofia

3 Answers


Call your matrix $A$. By the Gerschgorin disc theorem, all eigenvalues of $A$ lie inside the union of the Gerschgorin discs $X=\overline{B}(4,4),\,Y=\overline{B}(4,3)$ and $Z=\overline{B}(4,2)$, where $\overline{B}(z,r)$ denotes the closed disc centred at $z\in\mathbb C$ with radius $r$. Since these eigenvalues are also real (because $A$ is real symmetric), they must be nonnegative.

Now, by Taussky's extension of the Gerschgorin disc theorem, since $A$ is irreducibly diagonally dominant (i.e. $A$ is irreducible, diagonally dominant, and strictly diagonally dominant in at least one row), it is nonsingular. Hence all eigenvalues of $A$ are positive and $A$ is positive definite.
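As a quick numerical sanity check (not part of the proof above, and assuming NumPy is available), one can build $A$ for a few sizes and confirm that its smallest eigenvalue is indeed positive:

```python
import numpy as np

def tridiag(k):
    # k x k matrix with 0 on the diagonal and -1 on the sub-/superdiagonals
    return -np.eye(k, k=1) - np.eye(k, k=-1)

def block_matrix(m, n):
    # A = I_m (x) B + M_m (x) I_n, where B = 4 I_n + M_n and (x) is the Kronecker product;
    # this reproduces the OP's block tridiagonal matrix with B on the diagonal and -I off it
    B = 4 * np.eye(n) + tridiag(n)
    return np.kron(np.eye(m), B) + np.kron(tridiag(m), np.eye(n))

# sizes are arbitrary test values
for m, n in [(2, 2), (3, 4), (6, 6)]:
    A = block_matrix(m, n)
    assert np.allclose(A, A.T)              # A is symmetric
    assert np.linalg.eigvalsh(A).min() > 0  # all eigenvalues are positive
```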

user1551
  • Ben has this as being singular in some cases, but I have the same read as you... there is a single communicating class in the underlying graph so it must be PD – user8675309 Nov 11 '20 at 20:59
  • @user8675309 I wasn't aware of Taussky's extension until I read one of your comments a couple of days ago. Thanks to you, I can use this extension now. And yes, I think the OP's matrix is irreducible. Ben's answer perhaps has an error, but I cannot spot what exactly is wrong. – user1551 Nov 11 '20 at 21:08
  • @user8675309 I had made a mistake; my results now agree with user1551's – Ben Grossmann Nov 11 '20 at 21:30
  • Taussky's extension is a very nice trick! And this is definitely easier than computing eigenvalues. – Ben Grossmann Nov 11 '20 at 21:31

As the comment on your post mentions, the Gershgorin circle theorem is enough to prove that the matrix is at least positive semidefinite.

However, it turns out that we can actually find an expression for the eigenvalues of this matrix without too much effort. Suppose that the block matrix is $m \times m$, with each block an $n \times n$ matrix. Let $M_k$ denote the $k \times k$ matrix $$ M_k = \begin{bmatrix} 0 & -1 & 0 & \dots& 0 \\ -1 & 0 & -1 & \ddots & \vdots\\ 0 & -1 & \ddots & \ddots & 0\\ \vdots & \ddots & \ddots & \ddots & -1\\ 0 & \dots & 0& - 1 & 0 \\ \end{bmatrix}. $$ Because $M_k$ is tridiagonal and Toeplitz, its eigenvalues can easily be found to be $$ \lambda_j = 2 \cos\left(\frac{\pi j}{k + 1}\right), \quad j = 1,\dots,k. $$

Using the Kronecker product and Kronecker sum, we can write the block matrix in the form $$ A = 4 I_{mn} + M_m \otimes I_n + I_m \otimes M_n = 4I_{mn} + (M_m \oplus M_n). $$ It is known that for square matrices $P,Q$, the eigenvalues of $P \oplus Q$ have the form $\lambda + \mu$, where $\lambda$ is an eigenvalue of $P$ and $\mu$ an eigenvalue of $Q$. So, the eigenvalues of $A$ are given by $$ \lambda_{p,q} = 4 + 2 \cos\left(\frac{\pi p}{m + 1}\right) + 2 \cos \left(\frac{\pi q}{n + 1}\right), \quad p = 1,\dots,m, \quad q = 1,\dots,n. $$

These eigenvalues are non-negative because the minimum possible value of the above expression is $4 - 2\cdot 1 - 2 \cdot 1 = 0$. Thus, $A$ will always be positive semidefinite.

In fact, we see that $A$ is positive definite, since neither of the cosines attains its minimum value of $-1$.
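A short script (assuming NumPy; the sizes $m=4$, $n=6$ are arbitrary test values) can confirm that this closed-form spectrum matches the numerically computed one:

```python
import numpy as np

m, n = 4, 6  # number of blocks and block size (arbitrary test values)
M = lambda k: -np.eye(k, k=1) - np.eye(k, k=-1)  # tridiagonal Toeplitz M_k
A = 4 * np.eye(m * n) + np.kron(M(m), np.eye(n)) + np.kron(np.eye(m), M(n))

# closed-form spectrum from the Kronecker-sum decomposition
closed = np.sort([4 + 2 * np.cos(np.pi * p / (m + 1)) + 2 * np.cos(np.pi * q / (n + 1))
                  for p in range(1, m + 1) for q in range(1, n + 1)])
numeric = np.linalg.eigvalsh(A)   # eigenvalues in ascending order

assert np.allclose(closed, numeric)  # the two spectra agree
assert numeric.min() > 0             # and are strictly positive
```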

Ben Grossmann

Another way to get the desired result is via Perron-Frobenius theory, and with a little cleverness we obtain Taussky's refinement of Gerschgorin discs as a corollary.

$A:=\begin{bmatrix} B & -I & 0 & \dots& 0 \\ -I & B & -I & \ddots & \vdots\\ 0 & -I & \ddots & \ddots & 0\\ \vdots & \ddots & \ddots & \ddots & -I\\ 0 & \dots & 0& - I & B \\ \end{bmatrix}$

Notice that $A$ is irreducible, i.e. it represents a single communicating class in an underlying (possibly weighted and directed) graph.

Direct Argument
Consider $A':=-\big(A-4I\big)$.
Now $A'$ has maximum row sum $4$ and minimum row sum $2$. Since $A'$ is irreducible, real and entrywise non-negative, we may apply Perron-Frobenius theory to observe that $2\lt \lambda_\text{Perron} \lt 4$.
Being symmetric, $\lambda_\text{Perron} = \sigma_{max} = \Big\Vert A'\Big\Vert_2= \Big\Vert -A'\Big\Vert_2$,
and since $-A' = A - 4I\implies A=-A'+4I$, for any $\big \Vert \mathbf x\big \Vert_2=1$ we have

$$\begin{aligned} \min \mathbf x^TA\mathbf x &= \min \Big\{\mathbf x^T\big(-A'+4I\big)\mathbf x\Big\}\\ &= \min\Big\{ \mathbf x^T\big(-A'\big)\mathbf x+\mathbf x^T\big(4I\big)\mathbf x\Big\}\\ &\geq \min\Big\{ \mathbf x^T\big(-A'\big)\mathbf x\Big\}+ \min\Big\{ \mathbf x^T\big(4I\big)\mathbf x\Big\}\\ &= - \max\Big\{ \mathbf x^T A'\mathbf x\Big\}+ 4\\ &\gt -4 +4 = 0, \end{aligned}$$
so $A \succ \mathbf 0$.
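For a concrete check of the row-sum bounds and the Perron root (a sketch assuming NumPy is available; the sizes $m=n=5$ are arbitrary):

```python
import numpy as np

m, n = 5, 5  # arbitrary test sizes
M = lambda k: -np.eye(k, k=1) - np.eye(k, k=-1)
A = 4 * np.eye(m * n) + np.kron(M(m), np.eye(n)) + np.kron(np.eye(m), M(n))

Ap = 4 * np.eye(m * n) - A       # A' = -(A - 4I), entrywise non-negative
row_sums = Ap.sum(axis=1)
perron = np.linalg.eigvalsh(Ap).max()  # A' is symmetric, so its Perron root
                                       # is its largest eigenvalue

assert row_sums.min() == 2 and row_sums.max() == 4
assert 2 < perron < 4            # strictly between the min and max row sums
```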

Optional Second Part
$\text{Gerschgorin discs and Perron-Frobenius theory} \implies \text{Taussky's refinement of Gerschgorin discs}$
In words, the refinement says that for any weakly diagonally dominant irreducible $A$ where the dominance is strict in at least one row,
$\det\big(A\big) \neq 0.$

Arguing by contradiction, suppose $\det\big(A\big)=0$.
Now left-multiply by the invertible diagonal matrix $D$ with $d_{k,k} = -(a_{k,k})^{-1}$ (and we know $a_{k,k}\neq 0$; why?). In other words, $D:=-\big(I\circ A\big)^{-1}$. Note that left-multiplication by $D$ preserves rank, irreducibility and the diagonal dominance structure.
Since $DA$ has its diagonal homogenized to consist solely of $-1$'s, let us write $C-I=DA$.

$\det\big(C-I\big)=\det\big(DA\big)=\det\big(A\big)=0\implies$ there is some $\mathbf x \neq \mathbf 0$ such that
$\big(C-I\big)\mathbf x = \mathbf 0\implies C\mathbf x = \mathbf x \implies \lambda_\text{max modulus}\Big(C\Big)\geq 1$
but
$\lambda_\text{max modulus}\Big(C\Big) \leq \lambda_\text{max modulus}\Big(\big \vert C\big \vert \Big) \lt 1$
which is a contradiction.

notes:
(i.) $\big \vert C\big \vert$ denotes $C$ with each entry replaced by its modulus. The first inequality is proven in the question "$\rho(B) \leq \rho(|B|)$, where $\rho$ is the spectral radius".
(ii.) The second inequality follows because $\big\vert C\big \vert$ is irreducible and substochastic and strictly so in at least one row. So $\big \vert C\big \vert$ cannot have an eigenvalue of 1, by standard Markov chain arguments or directly applying Perron-Frobenius Theorem.
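The construction of $D$ and $C$, and the substochasticity of $\big\vert C\big\vert$, can be checked numerically for the OP's matrix (a sketch assuming NumPy; the sizes $m=n=4$ are arbitrary):

```python
import numpy as np

m, n = 4, 4  # arbitrary test sizes
Mk = lambda k: -np.eye(k, k=1) - np.eye(k, k=-1)
A = 4 * np.eye(m * n) + np.kron(Mk(m), np.eye(n)) + np.kron(np.eye(m), Mk(n))

D = -np.diag(1.0 / np.diag(A))   # D = -(I o A)^{-1}, Hadamard product on the diagonal
C = D @ A + np.eye(m * n)        # C - I = DA, so C has zero diagonal
absC = np.abs(C)

# |C| is substochastic, strictly so in the boundary rows
assert absC.sum(axis=1).max() <= 1 and absC.sum(axis=1).min() < 1

rho = np.abs(np.linalg.eigvals(absC)).max()
assert rho < 1                   # so |C| cannot have an eigenvalue of 1
```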

user8675309
  • for several different proofs of (ii.) ref https://math.stackexchange.com/questions/4888565/markov-chain-transition-matrix-eigenvalues-textbook-exercise/ and let $P:= \Sigma \big \vert C\big \vert$ where $\Sigma\succ \mathbf 0$ is a diagonal matrix chosen such that $P\mathbf 1 = \mathbf 1$ i.e. $P$ is a stochastic matrix – user8675309 Jun 12 '24 at 04:19