An optimization problem is convex if you are minimizing a convex function over a convex feasible region. So you must ask yourself:

1. Is the function $-\langle x, v\rangle$ convex?
2. Is the set $C\times\{v:\|v\|\leqslant1\}$ convex? (See the edit below.)
A potentially helpful way to attack 1. is to consider the case when $n=1$ (so that $-\langle x,v\rangle=-xv$, where $x,v\in\mathbb{R}$).
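One way to carry out that suggestion numerically is to probe the midpoint-convexity inequality $f\big(\tfrac{p+q}{2}\big)\leqslant\tfrac{f(p)+f(q)}{2}$, which any convex function must satisfy for all pairs of points. A small sketch in Python (the specific test points are my own choice, not from the question):

```python
# Probe midpoint convexity of f(x, v) = -x*v on R^2 (the n = 1 case).
# A convex f must satisfy f((p+q)/2) <= (f(p) + f(q))/2 for ALL p, q;
# a single violating pair is enough to rule convexity out.

def f(x, v):
    return -x * v

def midpoint_gap(p, q):
    """f(midpoint) minus the average of f at the endpoints.
    A positive gap certifies a violation of convexity."""
    mx, mv = (p[0] + q[0]) / 2, (p[1] + q[1]) / 2
    return f(mx, mv) - (f(*p) + f(*q)) / 2

print(midpoint_gap((1, -1), (-1, 1)))   # -1.0: inequality holds for this pair
print(midpoint_gap((1, 1), (-1, -1)))   # 1.0: positive gap, convexity fails here
```

Note that a negative gap at one pair proves nothing (the inequality must hold everywhere), while a single positive gap is conclusive.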
**Edit**
The formal definition of a convex optimization problem is a problem that can be expressed as
$$
\begin{array}{rl}
\min\ & f(x) \\
\text{s.t.}\ & g_i(x)\leqslant 0\text{ for all }i=1,\dots,m
\end{array}
$$
where $f:\mathbb{R}^n\to\mathbb{R}\cup\{\infty\}$ and $g_i:\mathbb{R}^n\to\mathbb{R}\cup\{\infty\}$ are convex functions for $i=1,\dots,m$. However, given any convex set $C$, the indicator function
$$ I_C(x)=\begin{cases}0, & x\in C, \\ \infty, &\text{else}\end{cases} $$
is convex. Hence, if we are minimizing a convex function over a convex set, we can always (mathematically) represent our problem in the standard form above. Note that we are using the extended reals here, which is common in convex optimization (see e.g. Boyd's book).
Caveat: of course, the indicator function is useless for computation, since in practice you would need an algebraic description of the set $C$. But for the definition (which is all we are concerned with here), it suffices.
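As an illustration of the reduction, here is a small sketch (the objective $f$, the set $C=[0,1]$, and the search grid are hypothetical choices of mine) showing that minimizing $f$ over $C$ and minimizing $f+I_C$ over all of $\mathbb{R}$ pick out the same point:

```python
import math

# Convex objective and a convex set C = [0, 1] (both chosen for illustration).
def f(x):
    return (x - 3) ** 2

def indicator_C(x):
    # I_C(x) = 0 on C, +infinity elsewhere (extended-real-valued, hence convex).
    return 0.0 if 0.0 <= x <= 1.0 else math.inf

# Crude grid search: "unconstrained" minimization of f + I_C ...
grid = [i / 100 for i in range(-200, 501)]
x_penalized = min(grid, key=lambda x: f(x) + indicator_C(x))

# ... agrees with constrained minimization of f over C alone.
x_constrained = min((x for x in grid if 0.0 <= x <= 1.0), key=f)

print(x_penalized, x_constrained)  # both 1.0: the two formulations coincide
```

The grid search only dramatizes the point; the mathematical content is simply that points outside $C$ incur objective value $\infty$ and so can never be minimizers.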