Consider the matrix $B = \begin{bmatrix} 2 & 2 \\ 1 & 3 \end{bmatrix}$. Find projection matrices $P_1, P_2$ such that (1) $B = \lambda_1 P_1 + \lambda_2 P_2$, where $\lambda_1, \lambda_2$ are the eigenvalues of $B$, (2) $P_1 P_2 = 0$, and (3) $P_1 + P_2 = I_2$, the $2 \times 2$ identity. (Note: a projection matrix $P$ satisfies $P^2 = P$.)
The eigenvalues are $\lambda_1 = 1$ and $\lambda_2 = 4$, with corresponding eigenvectors $\begin{bmatrix} 2 \\ -1 \end{bmatrix}$ and $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$. The problem comes from this past QR exam - https://lsa.umich.edu/content/dam/math-assets/math-document/AIM/DELA/DELA_Sep18%20-%20Differential%20Eqns%20%26%20Linear%20Algebra%20Fall%202018.pdf - and I thought I could work through it for practice, but I haven't been able to solve it. In particular, I'm not familiar with how to decompose a matrix into projection matrices using its eigenvalues. Any help or hints?
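For what it's worth, here is a quick numpy sanity check of the eigenvalues and eigenvectors I computed by hand (the projection decomposition itself is the part I'm stuck on):

```python
import numpy as np

# The matrix from the exam problem.
B = np.array([[2, 2],
              [1, 3]])

# numpy's eigendecomposition, to compare against my hand computation.
eigvals, eigvecs = np.linalg.eig(B)
print(eigvals)  # expect 1 and 4 (possibly in the other order)

# Verify B v = lambda v for my hand-computed eigenvectors.
v1 = np.array([2, -1])  # for lambda_1 = 1
v2 = np.array([1, 1])   # for lambda_2 = 4
print(B @ v1, 1 * v1)   # both should be [ 2 -1]
print(B @ v2, 4 * v2)   # both should be [4 4]
```

Both checks pass, so I'm confident in the eigendata; it's the step from there to the $P_1, P_2$ satisfying (1)-(3) that I don't see.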