"Big O" notation crops up everywhere in analytic number theory. Roughly speaking, we say $f(x) = O(g(x)$ if there exists a positive constant $M$ s.t. $\lvert f(x) \rvert \leq M \lvert g(x) \rvert $ for sufficiently large $x$ (we can also define it near other points, such as when $x$ is near $0$, but here, I'm assuming that $x$ is going off to infinity).
Another notation I've met recently is $f(x) \ll g(x)$. In papers, authors seem to use "big O" and the double "less than" sign interchangeably.
Are these notations equivalent, or do they have slightly different meanings?