Given that $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly independent vectors lying in $\ker(A-\lambda I)^k$ but not in $\ker(A - \lambda I)^{k-1}$, for some eigenvalue $\lambda$ and some $k \geq 2$, how can one prove that every nontrivial linear combination of these vectors also fails to lie in $\ker(A - \lambda I)^{k-1}$? And, more generally, that $$\left(\ker(A-\lambda I)^k\setminus\ker(A-\lambda I)^{k-1}\right)\cup\{\mathbf{0}\}$$ is a subspace?
This question emerged while I was trying to prove that any set of Jordan chains is linearly independent (provided the top vectors are themselves linearly independent). After getting stuck, I found some posts that followed an approach analogous to mine, but at a certain point they concluded something like $$(A-\lambda I)^{k-1}(a_1\mathbf{v}_1 + a_2\mathbf{v}_2 + a_3\mathbf{v}_3) = \mathbf{0} \implies a_1,a_2,a_3=0,$$ which was not obvious to me.
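For concreteness, here is a small numerical sketch I put together (my own toy example, not taken from those posts). It uses a $6\times 6$ matrix built from three Jordan blocks of size $2$ with $\lambda = 0$, so $k = 2$ and $N = A - \lambda I$ is the matrix itself; the standard basis vectors $\mathbf{e}_2, \mathbf{e}_4, \mathbf{e}_6$ play the role of $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$. In this case the quoted implication does hold, and the reason seems to be that the images $N\mathbf{v}_1, N\mathbf{v}_2, N\mathbf{v}_3$ are linearly independent:

```python
import numpy as np

# Toy example: three Jordan blocks of size 2 for lambda = 0,
# so k = 2 and N = A - lambda*I is the matrix itself.
N = np.zeros((6, 6))
for i in (0, 2, 4):
    N[i, i + 1] = 1.0  # each 2x2 block is [[0, 1], [0, 0]]

# v1, v2, v3 lie in ker(N^2) (which is all of R^6) but not in ker(N).
v1, v2, v3 = np.eye(6)[[1, 3, 5]]
assert np.allclose(N @ N, 0)                                 # N^2 = 0
assert all(np.linalg.norm(N @ v) > 0 for v in (v1, v2, v3))  # none in ker(N)

# N(a1*v1 + a2*v2 + a3*v3) = a1*N v1 + a2*N v2 + a3*N v3, so the
# combination maps to 0 only if a1 = a2 = a3 = 0 -- precisely because
# the rows N v1, N v2, N v3 below are linearly independent (rank 3).
images = np.stack([N @ v1, N @ v2, N @ v3])
print(np.linalg.matrix_rank(images))
```

So in this example everything checks out, but I don't see why the independence of $N\mathbf{v}_1, N\mathbf{v}_2, N\mathbf{v}_3$ should follow in general from the hypotheses stated above.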