For a symmetric $n \times n$ matrix $A$, write $\lambda(A)$ for the vector of its eigenvalues in descending order; that is, $\lambda_1(A) \ge \lambda_2(A) \ge \dots \ge \lambda_n(A)$ are the eigenvalues of $A$. For two vectors $v, w \in \mathbb{R}^n$, we write $v \succeq w$ ($v$ majorizes $w$) if $\sum_{i=1}^k v_i \ge \sum_{i=1}^k w_i$ for all $k = 1, \dots, n-1$, and $\sum_{i=1}^n v_i = \sum_{i=1}^n w_i$. We can now state the problem:
Let $A$ be an $n \times n$ positive semidefinite matrix, and let $P$ be any orthogonal projection (also an $n \times n$ matrix). Is it true that $$ \lambda(A) \preceq \lambda(PAP) + \lambda((I-P)A(I-P))?$$
My thoughts on the problem:
- the majorization condition for $k=1$ follows from the variational characterization of the largest eigenvalue, $\lambda_1(A) = \max_{\|v\|=1} \langle Av, v \rangle$, together with the Cauchy–Schwarz inequality,
- the equality condition for $k=n$ follows quickly from $P^2 = P$ and the cyclicity of the trace: $\operatorname{tr}(A) = \operatorname{tr}(AP) + \operatorname{tr}(A(I-P)) = \operatorname{tr}(PAP) + \operatorname{tr}((I-P)A(I-P))$,
- numerically it seems to be true.
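For concreteness, here is a sketch of the kind of numerical experiment supporting the last point: sample random PSD matrices and random orthogonal projections, and test the majorization inequality entrywise on partial sums (the helper names `eigvals_desc` and `majorizes` are mine, not standard).

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3  # ambient dimension and rank of the projection

def eigvals_desc(M):
    # eigenvalues of a symmetric matrix, sorted in descending order
    return np.sort(np.linalg.eigvalsh(M))[::-1]

def majorizes(v, w, tol=1e-8):
    # v majorizes w: partial sums of v dominate those of w, totals agree
    cv, cw = np.cumsum(v), np.cumsum(w)
    return bool(np.all(cv[:-1] >= cw[:-1] - tol) and abs(cv[-1] - cw[-1]) < tol)

for _ in range(200):
    # random PSD matrix A = B B^T
    B = rng.standard_normal((n, n))
    A = B @ B.T
    # random orthogonal projection onto a k-dimensional subspace
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    P = Q @ Q.T
    lhs = eigvals_desc(A)
    rhs = eigvals_desc(P @ A @ P) + eigvals_desc((np.eye(n) - P) @ A @ (np.eye(n) - P))
    assert majorizes(rhs, lhs), "counterexample found"
print("no counterexample in 200 random trials")
```

In every trial the vector $\lambda(PAP) + \lambda((I-P)A(I-P))$ majorized $\lambda(A)$, which is what the conjectured inequality predicts.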