For any bounded operator $T$ on a Hilbert space $\mathcal H$, one has
$$
\lVert T \rVert=\sup \lVert Tv\rVert = \sqrt{\sup \langle Tv, Tv\rangle} = \sqrt{\sup \langle v, T^* Tv\rangle},
$$
where the suprema are over $v\in\mathcal{H}$ with $\lVert v \rVert = 1$.
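For instance, on $\mathcal H = \mathbb{C}^2$ with $T=\operatorname{diag}(2,-1)$ one has $T^*T=\operatorname{diag}(4,1)$, hence for a unit vector $v$
$$
\langle v, T^* Tv\rangle = 4\lvert v_1\rvert^2 + \lvert v_2\rvert^2 \leqslant 4,
$$
with equality at $v=(1,0)$, and indeed $\lVert T \rVert = 2$.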
If $T$ is self-adjoint, i.e. if $T^*=T$, then one has
$$
\lVert T \rVert= \sup \lvert \langle v, Tv\rangle \rvert,\tag{1}
$$
as shown in this answer. $T$ is called anti-self-adjoint if $T^*=-T$, which is equivalent to $iT$ being self-adjoint. In this case (1) holds true as well, since $\lVert T \rVert = \lVert iT \rVert$ and $\lvert \langle v, iTv\rangle \rvert = \lvert \langle v, Tv\rangle \rvert$. Of course, if $T\geqslant 0$, formula (1) simplifies to
$$
\lVert T \rVert= \sup \langle v, Tv\rangle.
$$
For $\sup \langle v, Tv\rangle$ to make sense, the values $\langle v, Tv\rangle$ must be real, which (on a complex Hilbert space) already forces $T$ to be self-adjoint. In this case, we have
$$
\sup \langle v, Tv\rangle = \max_{\lambda\in\sigma(T)} \lambda,
$$
which may or may not coincide with $\lVert T \rVert$.
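For instance, $T=\operatorname{diag}(1,-2)$ on $\mathbb{C}^2$ satisfies
$$
\sup \langle v, Tv\rangle = \sup \left( \lvert v_1\rvert^2 - 2\lvert v_2\rvert^2 \right) = 1 = \max_{\lambda\in\sigma(T)} \lambda,
$$
whereas $\lVert T \rVert = 2$.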
If one only assumes that $T$ is normal, i.e. that $T^*T=TT^*$, then formula (1) still holds true. This can be proven by means of the spectral theorem; see Theorem 12.25 in Rudin's book 'Functional Analysis'.
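In rough outline: for a normal operator the norm equals the spectral radius, so one can pick $\lambda\in\sigma(T)$ with $\lvert\lambda\rvert = \lVert T \rVert$. Since $\lVert (T-\lambda)v \rVert = \lVert (T-\lambda)^* v \rVert$ for normal $T$, every spectral value is an approximate eigenvalue, so there are unit vectors $v_n$ with $\lVert (T-\lambda)v_n \rVert \to 0$. Consequently
$$
\langle v_n, Tv_n\rangle = \lambda + \langle v_n, (T-\lambda)v_n\rangle \to \lambda,
$$
which gives $\sup \lvert \langle v, Tv\rangle \rvert \geqslant \lVert T \rVert$; the reverse inequality follows from Cauchy–Schwarz.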
In general, one has
$$
\lVert T \rVert \leqslant 2 \sup \lvert \langle v, Tv\rangle \rvert.\tag{2}
$$
In order to obtain this bound, note that one can decompose $T$ into the sum of a self-adjoint and an anti-self-adjoint operator:
$$
T=(T+T^*)/2 + (T-T^*)/2.
$$
But this implies
\begin{align*}
\lVert T \rVert &\leqslant \lVert T+T^* \rVert /2 + \lVert T-T^* \rVert /2 \\
&= \sup\lvert \langle v,Tv\rangle + \langle Tv,v\rangle \rvert/2 + \sup\lvert \langle v,Tv\rangle - \langle Tv,v\rangle \rvert/2 \\
&\leqslant 2 \sup\lvert \langle v,Tv\rangle \rvert.
\end{align*}
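Here the equality in the second line uses (1) for the self-adjoint operator $T+T^*$ and the anti-self-adjoint operator $T-T^*$, and the final step combines the triangle inequality with $\lvert \langle Tv,v\rangle \rvert = \lvert \langle v,Tv\rangle \rvert$.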
Finally, the constant $2$ in inequality (2) is optimal, as one may verify by considering the $2\times 2$ matrix
$$
T=\left[\begin{matrix} 0 & 1 \\ 0 & 0\end{matrix}\right].
$$
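Indeed, for this $T$ one has $\lVert T \rVert = 1$, while for any unit vector $v=(v_1,v_2)$
$$
\lvert \langle v, Tv\rangle \rvert = \lvert v_1\rvert \, \lvert v_2\rvert \leqslant \frac{\lvert v_1\rvert^2 + \lvert v_2\rvert^2}{2} = \frac12,
$$
with equality at $v=(1,1)/\sqrt 2$, so that equality holds in (2).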