This is a sub-question of the problem Probabilistic Riemann hypothesis.
Consider a sequence of independent Bernoulli random variables $(X_n)_{n \geq 3}$ with parameters $1 / \log n$ (so $X_n = 1$ with probability $1 / \log n$ and $X_n = 0$ otherwise). The law of the iterated logarithm implies that almost surely, $\sum_{n \leq x}X_n = \textrm{Li}(x) + O(\sqrt{x\log \log x})$, as pointed out in this paper: https://hal.science/hal-03174415/document. Without using the law of the iterated logarithm, I'd like to show that almost surely, one has $\sum_{n \leq x} X_n = \textrm{Li}(x) + O(x^{1/2}\log x)$ for all but finitely many $x$.
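(For what it's worth, and not as part of the argument: a quick simulation of one sample path is consistent with the claimed envelope. This is a minimal sketch; the cutoff `N`, the random seed, and the use of `scipy.special.expi` to evaluate $\textrm{Li}$ are just illustrative choices on my part.)

```python
# Minimal simulation sketch: one sample path of the X_n, compared with Li(x)
# and the claimed envelope x^{1/2} log x.  N, the seed, and the use of
# scipy.special.expi for Li are illustrative choices, not part of the argument.
import numpy as np
from scipy.special import expi

rng = np.random.default_rng(0)
N = 10**6                                 # largest x simulated

n = np.arange(3, N + 1)
p = 1.0 / np.log(n)                       # P(X_n = 1) = 1 / log n
X = rng.random(n.size) < p                # one realisation of the X_n
S = np.cumsum(X)                          # S(x) = sum_{3 <= n <= x} X_n

Li = expi(np.log(n)) - expi(np.log(2))    # Li(x) = int_2^x dt / log t
envelope = np.sqrt(n) * np.log(n)         # claimed bound x^{1/2} log x

print("max |S(x) - Li(x)| / (x^{1/2} log x) along this path:",
      np.max(np.abs(S - Li) / envelope))
```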
Attempt: By the Borel-Cantelli lemma, it suffices, for each fixed choice of ${x}$ (which we may take to be a natural number), to obtain the given estimate with probability ${1-O(x^{-2})}$, say, since $\sum_x x^{-2} < \infty$. Let $Y_n = X_n - 1 / \log n$; the $Y_n$ are independent, have mean zero, and are of size $O(1)$. Thus
$\displaystyle \mathop{\bf E} (\sum_{2 < n \leq x} Y_n)^k = O_k( x^{k/2})$
for any fixed even number ${k}$: by independence and the mean-zero property of the $Y_n$, the only terms in the expansion of ${(\sum_{2 < n \leq x} Y_n)^k}$ that give a non-zero contribution are those in which each ${Y_n}$ that appears does so at least twice, so at most ${k/2}$ distinct indices ${n}$ arise, and hence there are only ${O_k(x^{k/2})}$ such terms, each contributing ${O(1)}$. By Markov's inequality applied to ${(\sum_{2 < n \leq x} Y_n)^k}$, it follows that for any fixed even ${k}$,
$\displaystyle {\bf P}\Bigl(\bigl|\sum_{2 < n \leq x} Y_n\bigr| > x^{1/2}\log x\Bigr) \leq \frac{\mathop{\bf E} (\sum_{2 < n \leq x} Y_n)^k}{(x^{1/2}\log x)^k} = O_k(1 / \log^k x),$
which is summable in $x$ (e.g. taking $k=2$). The claim then follows from the fact that $\sum_{2 < n \leq x}1 / \log n = \textrm{Li}(x) + O(1)$. Is this a valid argument or not?
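(Added, as a numerical sanity check only and not as part of the argument: for $k=2$ the moment bound is just $\mathop{\bf E} (\sum_{2 < n \leq x} Y_n)^2 = \sum_{2 < n \leq x} \frac{1}{\log n}(1 - \frac{1}{\log n}) \leq x$, and a short Monte Carlo estimate agrees with it. The sample size `M` and the cutoff `x` below are arbitrary illustrative choices.)

```python
# Monte Carlo check of the k = 2 moment bound E(sum Y_n)^2 <= x, where
# Y_n = X_n - 1/log n.  M (number of sample paths) and x are arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
x, M = 10**4, 500

n = np.arange(3, x + 1)
p = 1.0 / np.log(n)
# M independent sample paths of (Y_n)_{3 <= n <= x}
Y = (rng.random((M, n.size)) < p) - p
T = Y.sum(axis=1)                          # T = sum_{2 < n <= x} Y_n per path

exact_second_moment = np.sum(p * (1 - p))  # E T^2 = sum_n p_n (1 - p_n)
print("empirical E T^2 :", np.mean(T**2))
print("exact     E T^2 :", exact_second_moment, "<= x =", x)
print("empirical P(|T| > x^{1/2} log x):",
      np.mean(np.abs(T) > np.sqrt(x) * np.log(x)))
```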