You need your $\log_n(ca^n)$ not only to be increasing, but to actually go to infinity. Apart from that, your proof looks fine.
Here's how I would prove it. Tweaking the notation a little*, we want to prove that $n^b = o(a^n)$ (for fixed constants $b > 0$ and $a > 1$), or in other words, that for any given constant $c > 0$, we have $n^b < c a^n$ for sufficiently large $n$.
Just solve for $n$. Note that $n^b = e^{b \ln n}$ and $a^n = e^{n \ln a}$. So we want $e^{b \ln n} < e^{\ln c + n \ln a}$, i.e. $b \ln n < \ln c + n \ln a$, or equivalently $n \ln a - b \ln n > -\ln c$. This is clearly true for sufficiently large $n$: the LHS goes to infinity, while the RHS is a fixed constant. That is what you can now prove.
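If it helps to see the growth concretely before writing the formal proof, here is a tiny numerical illustration; the constants $a = 1.01$ and $b = 10$ are arbitrary choices for the sketch, not part of the argument:

```python
import math

# Made-up constants purely for illustration: even a barely-exponential
# base eventually wins over a large polynomial power.
a, b = 1.01, 10.0

for n in (10, 10**2, 10**4, 10**6, 10**8):
    # The LHS of "n*ln(a) - b*ln(n) > -ln(c)": it dips negative at first,
    # then grows without bound, so it eventually exceeds any fixed -ln(c).
    print(f"n = {n:>9}: n*ln(a) - b*ln(n) = {n * math.log(a) - b * math.log(n):,.1f}")
```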
If you wanted a “precise and nice definition for $n_0$” (but really you shouldn't want it: note that the whole point of the $O$ notation is to suppress such information as nonessential), you could probably do the following: pick $n_0$ to be large enough that, for all $n \ge n_0$, both

- $b \ln n < 0.5\, n \ln a$, and
- $0.5\, n \ln a > -\ln c$.
(That is, pick $n_0$ to be the larger of the two points beyond which each condition holds.) Then for $n \ge n_0$, we have
$$ n \ln a - b \ln n > n \ln a - 0.5\,n \ln a = 0.5\,n \ln a > -\ln c$$
and therefore $c a^n > n^b$ as required.
(The way I got from “$n \ln a - b \ln n > -\ln c$” to this trick was to see that the LHS grows like $n$, so the pesky thing being subtracted could be bounded by, say, $0.1n$ or $0.5n$ times the leading factor… and then it's just a matter of carrying the factors of $\ln a$ along.)
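If you want to sanity-check this recipe numerically, something like the sketch below would do; the constants $a$, $b$, $c$ are arbitrary illustrative choices, and it compares logarithms rather than the quantities themselves so that $a^n$ never overflows:

```python
import math
from itertools import count

# Made-up constants purely for illustration.
a, b, c = 1.5, 3.0, 0.01

# First integer n at which both conditions hold:
#   (1) b*ln(n) < 0.5*n*ln(a)   and   (2) 0.5*n*ln(a) > -ln(c).
n0 = next(n for n in count(2)
          if b * math.log(n) < 0.5 * n * math.log(a)
          and 0.5 * n * math.log(a) > -math.log(c))
print("n0 =", n0)

# Check n^b < c*a^n for a long stretch of n >= n0 by comparing logarithms.
assert all(b * math.log(n) < math.log(c) + n * math.log(a)
           for n in range(n0, n0 + 100_000))
```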
If you want something even more explicit: to prove that $n^b < ca^n$ for sufficiently large $n$, assume without loss of generality that $c \le 1$ (if $c > 1$, we could just prove the stronger inequality with $c = 1$ instead), and pick
$$n_0 = e + \left(\frac{b + \ln(1/c)}{\ln a}\right)^2$$
so that, for all $n \ge n_0$, we have:
- $\dfrac{n}{\ln n} > \sqrt{n}$ (true for all $n > 1$, since $\sqrt{n} > \ln n$ there), and
- $\sqrt{n} > \dfrac{b + \ln(1/c)}{\ln a}$
which together give
$$ \frac{n}{\ln n} > \frac{b + \ln(1/c)}{\ln a}$$
or
$$ n \ln a > b \ln n + \ln(1/c)\ln n \ge b \ln n + \ln(1/c) $$

(the last step uses $\ln(1/c) \ge 0$, since $c \le 1$, and $\ln n \ge 1$, since $n \ge n_0 \ge e$)
and therefore
$$a^n > n^b (1/c)$$
which is the inequality $n^b < c a^n$ that we wanted to prove.
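Again purely as a sanity check, one could verify this $n_0$ numerically; the parameter triples below are arbitrary examples with $c \le 1$ (matching the assumption above), and logarithms are compared so that $a^n$ is never computed directly:

```python
import math

def n0(a, b, c):
    # The explicit threshold from above: n0 = e + ((b + ln(1/c)) / ln(a))^2.
    return math.e + ((b + math.log(1 / c)) / math.log(a)) ** 2

# Arbitrary example parameters with c <= 1, as in the WLOG assumption.
for a, b, c in [(2.0, 1.0, 1.0), (1.1, 7.0, 0.5), (1.01, 2.0, 1e-6)]:
    start = math.ceil(n0(a, b, c))
    ok = all(b * math.log(n) < math.log(c) + n * math.log(a)
             for n in range(start, start + 10_000))
    print(f"a={a}, b={b}, c={c}: n0 ≈ {n0(a, b, c):.1f}, n^b < c*a^n beyond n0: {ok}")
```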
But if you don't want a “precise and nice definition for $n_0$”, then the nicer way to prove this (and to think about this) would be to write something like:
$$n^b = e^{O(\ln n)} = e^{o(n)} = o(a^n)$$
after proving that each of these (one-way) equalities is generally valid. That is the whole power of the notation (or the idea of asymptotic analysis): to avoid having to think about the specific constants where things happen, while still being completely formal and rigorous.
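For instance, the last of these one-way equalities (the only one that actually needs $a > 1$) unpacks as follows: if $f(n) = o(n)$, then

$$\frac{e^{f(n)}}{a^n} = e^{f(n) - n \ln a} \to 0,$$

because $f(n) - n \ln a = n\left(\frac{f(n)}{n} - \ln a\right) \to -\infty$; the first two equalities just restate $b \ln n = O(\ln n)$ and $\ln n = o(n)$.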
[* Footnote on notation: I used $b$ instead of $x$ to indicate that it is a constant. And I don't like the subset notation for $O$, so I will not indulge in that misguided pedantry. You can read the treatments by de Bruijn and by Knuth to see that we lose much of the value of $O$ notation by defining $O(x)$ as a set; for example, statements like $e^{O(1)} = O(1)$ become harder to write. See also the “four reasons” for preferring “$=$” over “$\subseteq$”, on pp. 446–447 of Concrete Mathematics: actually, just work through the whole Asymptotics chapter and see if you still prefer thinking of $O(\ldots)$ as sets.]