Let $A(x) = x_1 A_1 + x_2 A_2 + x_3 A_3$, where $x = [ \, x_1 \; x_2 \; x_3 \,]^T\,$ is a real vector and $A_i$, $i=1,2,3$, are known $3\! \times \!3$ real matrices. How can I solve the following problem?
Find scalar $\lambda$ and vector $u$ such that
$[ \, \lambda I_3 - A(u) \, ]\,u = 0, \quad \quad (\,1\,)$
where $I_3$ is the $3\! \times \!3$ identity matrix, and $\lambda$ and $u$ are an eigenvalue and a corresponding eigenvector of $A(u)$, respectively.
Is there, in the literature, a tractable and computationally feasible method for solving this kind of problem?
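
For concreteness, one brute-force view of $(1)$ is as a system of four polynomial equations in the four unknowns $(u_1, u_2, u_3, \lambda)$: the three components of $(1)$ plus a normalization $\|u\|^2 = 1$ to exclude the trivial solution $u = 0$ (note that solutions can be rescaled, since $A(cu)(cu) = c^2 A(u)u$). A minimal numerical sketch of this naive approach, with placeholder random matrices standing in for $A_1, A_2, A_3$ and a generic root finder rather than any method from the literature:

```python
# Naive sketch: treat (1) plus a normalization as a 4x4 nonlinear root-finding
# problem in (u1, u2, u3, lambda). A1, A2, A3 are placeholder matrices here.
import numpy as np
from scipy.optimize import fsolve

rng = np.random.default_rng(0)
A1, A2, A3 = (rng.standard_normal((3, 3)) for _ in range(3))

def A(x):
    """A(x) = x1*A1 + x2*A2 + x3*A3."""
    return x[0] * A1 + x[1] * A2 + x[2] * A3

def residual(z):
    u, lam = z[:3], z[3]
    # First three equations: (lambda*I3 - A(u)) u = 0.
    eigen_eq = (lam * np.eye(3) - A(u)) @ u
    # Fourth equation: ||u||^2 = 1, to rule out the trivial solution u = 0.
    norm_eq = u @ u - 1.0
    return np.append(eigen_eq, norm_eq)

z0 = np.append(rng.standard_normal(3), 0.0)   # random initial guess (u0, lambda0)
z_star, info, ier, msg = fsolve(residual, z0, full_output=True)
if ier == 1:
    u_star, lam_star = z_star[:3], z_star[3]
    print("u =", u_star, "  lambda =", lam_star)
    print("residual norm:", np.linalg.norm(residual(z_star)))
else:
    print("fsolve did not converge:", msg)
```

Of course, this returns at most one solution per initial guess and says nothing about existence or about finding all solutions, which is why I am asking whether a more structured approach exists.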