You’ve computed the characteristic polynomial correctly, but you appear to be getting lost trying to factor it. Its constant term is $b(a^2-1)$ up to sign (the sign depends on your convention for the characteristic polynomial), which is nonzero for most values of $a$ and $b$, so $\lambda$ can’t be a factor. This goes hand in hand with the rank of the matrix: it’s full-rank unless $a=\pm1$ or $b=0$, so zero isn’t an eigenvalue in general.
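(For reference, judging from the computations below, the matrix in question is $$A=\begin{pmatrix}1&a&0\\a&1&0\\0&0&b\end{pmatrix},$$ with $\det A=b(1-a^2)$; that determinant vanishes, and $0$ becomes an eigenvalue, exactly when $b=0$ or $a=\pm1$.)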
You can save yourself a lot of work by expanding the determinant along the last row or column. Two of its entries are zero, so the characteristic polynomial comes out already partially factored. To wit, we get $(b-\lambda)\left((1-\lambda)^2-a^2\right)$. The second factor is a difference of two squares, which I assume you know how to factor, so we get $(b-\lambda)(1-\lambda+a)(1-\lambda-a)$ with hardly any effort. The eigenvalues of the matrix are therefore $b$, $1+a$ and $1-a$. Compute the corresponding eigenvectors in the usual way, taking care with the variables $a$ and $b$ if you’re using Gaussian elimination: you can’t divide by an expression that might vanish without splitting into cases.
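Concretely, expanding $\det(A-\lambda I)$ along the third column gives $$\det\begin{pmatrix}1-\lambda&a&0\\a&1-\lambda&0\\0&0&b-\lambda\end{pmatrix}=(b-\lambda)\det\begin{pmatrix}1-\lambda&a\\a&1-\lambda\end{pmatrix}=(b-\lambda)\left((1-\lambda)^2-a^2\right).$$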
As I noted in a comment, the eigenvectors and eigenvalues of this matrix can be found pretty much by inspection. Recall that the columns of a matrix are the images of the standard basis vectors. The last column is $b$ times $(0,0,1)^T$, so $(0,0,1)^T$ is an eigenvector with eigenvalue $b$.
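In symbols, $$A\begin{pmatrix}0\\0\\1\end{pmatrix}=\begin{pmatrix}0\\0\\b\end{pmatrix}=b\begin{pmatrix}0\\0\\1\end{pmatrix}.$$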
Next, recall that multiplying a vector by this matrix produces a linear combination of the matrix’s columns. Observe that the sum of the first two columns is $(1+a,1+a,0)^T$, which is $(1+a)$ times $(1,1,0)^T$. Summing the first two columns is the same as multiplying the matrix by $(1,1,0)^T$, so $(1,1,0)^T$ is another eigenvector, with eigenvalue $1+a$.
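That is, $$A\begin{pmatrix}1\\1\\0\end{pmatrix}=\begin{pmatrix}1+a\\a+1\\0\end{pmatrix}=(1+a)\begin{pmatrix}1\\1\\0\end{pmatrix}.$$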
You can always get the last eigenvalue “for free,” since the trace of a matrix equals the sum of its eigenvalues. Here, that gives $(1+1+b)-b-(1+a) = 1-a$ for the last eigenvalue. The matrix is symmetric, so eigenvectors corresponding to distinct eigenvalues are orthogonal. Assuming the three eigenvalues are distinct, that means we can take the cross product of the two eigenvectors found so far to get a third, linearly independent eigenvector. Alternatively, observe that subtracting the second column from the first, which amounts to multiplying the matrix by $(1,-1,0)^T$, yields a multiple of $(1,-1,0)^T$, so that’s the remaining eigenvector.
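Either way, $$\begin{pmatrix}1\\1\\0\end{pmatrix}\times\begin{pmatrix}0\\0\\1\end{pmatrix}=\begin{pmatrix}1\\-1\\0\end{pmatrix} \quad\text{and}\quad A\begin{pmatrix}1\\-1\\0\end{pmatrix}=\begin{pmatrix}1-a\\a-1\\0\end{pmatrix}=(1-a)\begin{pmatrix}1\\-1\\0\end{pmatrix}.$$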
A curious thing about this family of matrices is that even though the eigenvalues depend on $a$ and $b$, and for some parameter values you’ll end up with repeated eigenvalues, these matrices share a common orthogonal eigenbasis that’s independent of those parameters.
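If you want to double-check all of this symbolically, here’s a quick SymPy sketch (assuming the matrix $A$ written out above):

```python
import sympy as sp

a, b = sp.symbols('a b')
A = sp.Matrix([[1, a, 0],
               [a, 1, 0],
               [0, 0, b]])

# The common eigenbasis: none of these vectors involves a or b.
pairs = [(sp.Matrix([1, 1, 0]), 1 + a),
         (sp.Matrix([1, -1, 0]), 1 - a),
         (sp.Matrix([0, 0, 1]), b)]

# Verify A*v == lambda*v identically in a and b.
for v, lam in pairs:
    assert sp.simplify(A * v - lam * v) == sp.zeros(3, 1)
```

The three vectors are mutually orthogonal, and the check passes with $a$ and $b$ left symbolic, which confirms that the eigenbasis doesn’t depend on the parameters.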