
We were asked to find a symmetric idempotent matrix $H$ with rank $n-1$ such that if $X$ is a column vector with $n$ observations, then ${1\over n}X^THX$ is the variance of observations in $X$.

I found the matrix (for $n$ observations) to be $H_n=I_n-{1\over n}A_n$, where $I_n$ is the $n\times n$ identity matrix, $A_n$ is the $n\times n$ matrix with all entries equal to $1$, and $H_n$ is the required matrix.
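
For concreteness, here is a quick numerical check (the data vector `X` and the size `n` below are just illustrative) that this $H_n$ is symmetric, idempotent, and that ${1\over n}X^\top H_n X$ reproduces the population variance:

```python
import numpy as np

n = 5
X = np.array([2.0, 4.0, 4.0, 4.0, 5.0])   # illustrative observations

# H_n = I_n - (1/n) A_n, where A_n is the all-ones matrix
H = np.eye(n) - np.ones((n, n)) / n

print(np.allclose(H, H.T))                  # symmetric
print(np.allclose(H @ H, H))                # idempotent
print(np.isclose(X @ H @ X / n, X.var()))   # (1/n) X^T H X equals the population variance
```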

It was easy to show that this is symmetric and idempotent, but I'm having difficulty showing that its rank is $n-1$. It is, however, easy to see that $R_1+R_2+\dots+R_n=0$, where $R_i$ is the $i^{th}$ row, so its rank is strictly less than $n$.
I also noticed $R_1+R_2+\dots+R_n-R_i\ne0$ for any $i$.
How should I proceed?

StubbornAtom
  • 17,932
Anvit
  • 3,449
  • How many zero eigenvalues does this matrix have? If the matrix is symmetric it should be diagonalizable. – Chaos Nov 20 '18 at 10:22
  • Sorry, haven't studied eigenvectors yet. Is there an approach without them? – Anvit Nov 20 '18 at 10:25
  • 2
    It is known (if you don't know this, then it is a very good exercise) that, for an idempotent $n$-by-$n$ matrix $E$ over a field $\mathbb{K}$, $\ker(E)\oplus\text{im}(E)=\mathbb{K}^n$. (That means $\ker(E)+\text{im}(E)=\mathbb{K}^n$ and $\ker(E)\cap\text{im}(E)=\{0\}$.) So, if you find out that $\ker(E)$ is $r$-dimensional, then $\text{im}(E)$ is $(n-r)$-dimensional, whence $E$ is of rank $n-r$. The same situation applies here. Prove that $\ker(H)$ has dimension $1$. (A short sketch of this decomposition is given just after these comments.) – Batominovski Nov 20 '18 at 10:53
  • @Batominovski $\ker(H)=\{(x,x,\dots,x)^\top \mid x\in\mathbb K\}$, so I think this proves it. Thanks – Anvit Nov 21 '18 at 04:47
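
For reference, a short sketch of the decomposition mentioned in the comments: for an idempotent $E$, every $x\in\mathbb{K}^n$ can be written as $$x = Ex + (x-Ex),$$ where $Ex\in\operatorname{im}(E)$ and $E(x-Ex)=Ex-E^2x=0$, so $x-Ex\in\ker(E)$. Moreover, if $y\in\ker(E)\cap\operatorname{im}(E)$, say $y=Ez$, then $y=Ez=E^2z=Ey=0$. Hence $\ker(E)\oplus\operatorname{im}(E)=\mathbb{K}^n$, and $\operatorname{rank}(E)=n-\dim\ker(E)$.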

3 Answers

3

Denoting the column vector of all $1$s by $\mathbf1$, we have$$H=I_n-\frac{1}{n}\mathbf{11}^\top$$

Indeed, as you say, $H$ is an idempotent matrix, and the rank of an idempotent matrix equals its trace (its eigenvalues can only be $0$ or $1$). Therefore $$\mathrm{rank}(H)=\mathrm{trace}(H)=\mathrm{trace}(I_n)-\mathrm{trace}\left(\frac{1}{n}\mathbf{11}^\top\right)=n-1$$


We can also use some standard rank inequalities, although this is not really necessary to prove the result:

We know that for any two matrices $A$ and $B$ of the same order, $$\mathrm{rank}(A)=\mathrm{rank}\bigl((A-B)+B\bigr)\le \mathrm{rank}(A-B)+\mathrm{rank}(B)$$

Or, $$\mathrm{rank}(A-B)\ge |\mathrm{rank}(A)-\mathrm{rank}(B)|$$

Noting that $\mathbf{11}^\top$ is a rank-$1$ matrix and applying this inequality to $H=I_n-\frac{1}{n}\mathbf{11}^\top$, we get

$$\mathrm{rank}(H)\ge n-1$$

Now we can rule out $\mathrm{rank}(H)=n$ (the only other possibility), since by the matrix determinant lemma $$\det(H)=\det\left(I_n-\frac{1}{n}\mathbf{11}^\top\right)=1-\frac{1}{n}\mathbf1^\top\mathbf1=1-1=0$$

So it must be that $$\mathrm{rank}(H)=n-1$$
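
As a sanity check, both arguments can be verified numerically (a sketch with NumPy; the size `n` below is arbitrary):

```python
import numpy as np

n = 6                                       # arbitrary size for the check
H = np.eye(n) - np.ones((n, n)) / n

print(np.isclose(np.trace(H), n - 1))       # trace(H) = n - 1
print(np.linalg.matrix_rank(H) == n - 1)    # rank(H) = trace(H) = n - 1
print(np.isclose(np.linalg.det(H), 0.0))    # det(H) = 0, so rank(H) < n
```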

StubbornAtom
  • 17,932
1

To prove the rank of $H$ is $n-1$, we may look at the linear system $HX = 0$ and prove that its space of solutions has dimension $1$. The system $HX = 0$ may be written as

\begin{align*} (S)\left\{\begin{matrix} x_1 &+& x_2 &+& \ldots &+& x_n &=& n x_1 \\ x_1 &+& x_2 &+& \ldots &+& x_n &=& n x_2 \\ \vdots &&&&&&\vdots && \vdots \\ x_1 &+& x_2 &+& \ldots &+& x_n &=& n x_n \\ \end{matrix}\right. \end{align*} Now subtract the first equation from all the other equations: \begin{align*} (S) &\Longleftrightarrow & \left\{\begin{matrix} x_1 &+& x_2 &+& \ldots &+& x_n &=& n x_1 \\ &&&&&& 0 &=& n (x_2-x_1) \\ &&&&&&\vdots&& \vdots \\ &&&&&& 0 &=& n (x_n-x_1) \\ \end{matrix}\right.\\ \\ & \Longleftrightarrow & \left\{\begin{matrix} x_1 &+& x_2 &+& \ldots &+& x_n &=& n x_1 \\ &&x_2&&&& &=& x_1 \\ &&&\ddots&&&&& \vdots \\ &&&&&& x_n &=& x_1 \\ \end{matrix}\right. \end{align*} Next, subtract equations $2, \ldots, n$ from equation $1$; it reduces to $0=0$ and can be dropped, leaving \begin{align*} (S) &\Longleftrightarrow & \left\{\begin{matrix} x_2&& &=& x_1 \\ &\ddots &&& \vdots \\ &&x_n &=& x_1 \\ \end{matrix}\right. \end{align*} whose solution set is the $1$-dimensional space spanned by $\begin{pmatrix}1 \\ 1 \\ \vdots \\ 1\end{pmatrix}$.
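
The same conclusion can be checked numerically (a sketch with NumPy; the size `n` and the tolerance below are arbitrary):

```python
import numpy as np

n = 5
H = np.eye(n) - np.ones((n, n)) / n

# Rows of Vt whose singular values are (numerically) zero span the null space of H.
_, s, Vt = np.linalg.svd(H)
null_basis = Vt[s < 1e-10]

print(null_basis.shape[0] == 1)                            # the kernel is 1-dimensional
print(np.allclose(np.abs(null_basis[0]), 1 / np.sqrt(n)))  # spanned by the (normalized) all-ones vector
```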

Joel Cohen
  • 9,444
-1

Let $e_k = (1, \omega^k, \omega^{2k}, \ldots, \omega^{(n-1)k})$ where $\omega=e^{2\pi i/n}$. You have shown that $e_0$ is in the null space of $H$. For $0<k<n$, the entries of $e_k$ sum to $\sum_{j=0}^{n-1}\omega^{jk}=0$, so $e_k$ is an eigenvector of $H$ with eigenvalue $1$. As the $e_k$ are linearly independent (why?), this shows that the rank is $n-1$.
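
A small numerical illustration of this argument (a sketch with NumPy; `n` below is arbitrary):

```python
import numpy as np

n = 5
H = np.eye(n) - np.ones((n, n)) / n
omega = np.exp(2j * np.pi / n)

# Row k is e_k = (1, w^k, w^{2k}, ..., w^{(n-1)k})
E = np.array([[omega ** (j * k) for j in range(n)] for k in range(n)])

print(np.allclose(H @ E[0], 0))                               # e_0 lies in the null space
print(all(np.allclose(H @ E[k], E[k]) for k in range(1, n)))  # H e_k = e_k for 0 < k < n
print(abs(np.linalg.det(E)) > 1e-8)                           # the e_k are linearly independent (DFT matrix)
```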

Richard Martin
  • 1,691
  • Sorry, I should've mentioned. This is in a Statistics course and I haven't studied eigenvectors – Anvit Nov 20 '18 at 10:15
  • Go and study them then. – Richard Martin Nov 20 '18 at 10:18
  • Please tell me if my reasoning is correct: $\{e_0,e_1,\dots,e_{n-1}\}$ form a basis for the vector space $V=\mathbb C^n$ and $H$ is a linear transformation $V\to V$. By the rank–nullity theorem, $\operatorname{rank} = n-\dim(\ker H) = n-1$. If that is correct, I just need to show that the $e_k$ are independent – Anvit Nov 20 '18 at 10:47
  • You can pick a basis orthogonal to $e_0$, but you need to show that the kernel has dimension no higher than 1. This is what I did above. – Richard Martin Nov 20 '18 at 10:52