Consider the following proof:
Theorem 5.5. Let $\mathsf{T}$ be a linear operator on a vector space, and let $\lambda_1, \lambda_2, \ldots, \lambda_k$ be distinct eigenvalues of $\mathsf{T}$. For each $i = 1, 2, \ldots, k$, let $S_i$ be a set of eigenvectors of $\mathsf{T}$ corresponding to $\lambda_i$. If each $S_i \, (i = 1, 2, \ldots, k)$ is linearly independent, then $S_1 \cup S_2 \cup \cdots \cup S_k$ is linearly independent.
Proof. The proof is by mathematical induction on $k$. If $k = 1$, there is nothing to prove. So assume that the theorem holds for $k - 1$ distinct eigenvalues, where $k - 1 \geq 1$, and that we have $k$ distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_k$ of $\mathsf{T}$. For each $i = 1, 2, \ldots, k$, let $S_i = \{v_{i1}, v_{i2}, \ldots, v_{in_i} \}$ be a linearly independent set of eigenvectors of $\mathsf{T}$ corresponding to $\lambda_i$. We wish to show that $S_1 \cup S_2 \cup \cdots \cup S_k$ is linearly independent.
Consider any scalars $\{a_{ij}\}$, where $i = 1, 2, \ldots, k$ and $j = 1, 2, \ldots, n_i$, such that $$ \sum_{i=1}^{k} \sum_{j=1}^{n_i} a_{ij} v_{ij} = 0. \tag{1} $$ Because $v_{ij}$ is an eigenvector of $\mathsf{T}$ corresponding to $\lambda_i$, applying $\mathsf{T} - \lambda_k I$ to both sides of $(1)$ yields $$\sum_{i=1}^{k-1} \sum_{j=1}^{n_i} a_{ij} (\lambda_i - \lambda_k) v_{ij} = 0. \tag{2} $$ But $S_1 \cup S_2 \cup \cdots \cup S_{k-1}$ is linearly independent by the induction hypothesis, so that $(2)$ implies $a_{ij} (\lambda_i - \lambda_k) = 0$ for $i = 1, 2, \ldots, k - 1$ and $j = 1, 2, \ldots, n_i$. Since $\lambda_1, \lambda_2, \ldots, \lambda_k$ are distinct, it follows that $\lambda_i - \lambda_k \neq 0$ for $1 \leq i \leq k-1$. Hence $a_{ij} = 0$ for $i = 1, 2, \ldots, k - 1$ and $j = 1, 2, \ldots, n_i$, and therefore $(1)$ reduces to $\sum_{j=1}^{n_k} a_{kj} v_{kj} = 0$. But $S_k$ is also linearly independent, and so $a_{kj} = 0$ for $j = 1, 2, \ldots, n_k$. Consequently $a_{ij} = 0$ for $i = 1, 2, \ldots, k$ and $j = 1, 2, \ldots, n_i$, proving that $S_1 \cup S_2 \cup \cdots \cup S_k$ is linearly independent.
I've been looking at this proof for too long and don't get how the author went from Equation $(1)$
$$\sum_{i=1}^k \sum_{j=1}^{n_i} a_{ij}v_{ij} = 0 \tag{1}$$
to Equation $(2)$
$$\sum_{i=1}^{k-1} \sum_{j=1}^{n_i} a_{ij}(\lambda_i - \lambda_k) v_{ij} = 0. \tag{2}$$
I think I just don't understand what applying $\mathsf{T} - \lambda_k I$ to the sum actually means.
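For what it's worth, here is my best guess at the missing step, so someone can tell me whether this is what the author intends. Since each $v_{ij}$ is an eigenvector of $\mathsf{T}$ corresponding to $\lambda_i$, I would compute term by term
$$(\mathsf{T} - \lambda_k I)v_{ij} = \mathsf{T}v_{ij} - \lambda_k v_{ij} = \lambda_i v_{ij} - \lambda_k v_{ij} = (\lambda_i - \lambda_k)v_{ij},$$
and then, since $\mathsf{T} - \lambda_k I$ is linear, applying it to the left-hand side of $(1)$ would give
$$\sum_{i=1}^{k} \sum_{j=1}^{n_i} a_{ij}(\lambda_i - \lambda_k) v_{ij} = 0,$$
where the $i = k$ terms drop out because $\lambda_k - \lambda_k = 0$, which would leave exactly $(2)$. Is that the intended reading, or is something more subtle going on?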