
This is a sub-post of the problem Probabilistic Riemann hypothesis.

Consider a sequence of independent Bernoulli random variables $(X_n)_{n \geq 3}$ of parameters $1 / \log n$ (so $X_n = 1$ with probability $1 / \log n$ and $X_n = 0$ otherwise). The law of the iterated logarithm implies that almost surely, $\sum_{n \leq x}X_n = \textrm{Li}(x) + O(\sqrt{x\log \log x})$, as pointed out in this paper https://hal.science/hal-03174415/document. Without using the law of the iterated logarithm, I'd like to show that almost surely, one has $\sum_{n \leq x} X_n = \textrm{Li}(x) + O(x^{1/2}\log x)$ for all but finitely many $x$.
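
For intuition only (this is not part of the argument), here is a quick simulation sketch of the model; it uses scipy.special.expi(log x) for the logarithmic integral, and $\mathrm{li}$ and $\textrm{Li}$ differ only by a bounded constant, which the error term absorbs.

```python
import numpy as np
from scipy.special import expi  # li(x) = Ei(log x); li and Li differ by a bounded constant

rng = np.random.default_rng(0)
x_max = 10**6
n = np.arange(3, x_max + 1)
X = rng.random(n.size) < 1.0 / np.log(n)   # X_n = 1 with probability 1/log n, independently
partial_sums = np.cumsum(X)                # partial_sums[i] = sum of X_m for 3 <= m <= n[i]

for x in (10**3, 10**4, 10**5, 10**6):
    S = int(partial_sums[x - 3])           # sum_{n <= x} X_n
    error = S - expi(np.log(x))            # deviation from the logarithmic integral
    bound = np.sqrt(x) * np.log(x)
    print(f"x = {x:>7}: sum = {S:>6}, error = {error:9.1f}, x^(1/2) log x = {bound:10.1f}")
```

Of course this only shows a single sample path; the question is about the almost-sure statement.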

Attempt: By the Borel-Cantelli lemma, it suffices, for each fixed choice of ${x}$ (which we can take to be a natural number), to obtain the given estimate with probability ${1-O(x^{-2})}$ (say). Let $Y_n = X_n - 1 / \log n$; the $Y_n$ are independent, have mean zero, and are of size $O(1)$. Thus

$\displaystyle \mathop{\bf E} (\sum_{2 < n \leq x} Y_n)^k = O_k( x^{k/2})$

for any fixed even number ${k}$ (the only terms in ${(\sum_{2 < n \leq x} Y_n)^k}$ that give a non-zero contribution are those in which each ${Y_n}$ appears at least twice, so at most ${k/2}$ distinct indices ${n}$ arise, and hence there are only ${O_k(x^{k/2})}$ such terms, each contributing ${O(1)}$). By Markov's inequality applied to the $k$th moment (spelled out below), it follows that $\displaystyle {\bf P}\Big(\Big|\sum_{2 < n \leq x} Y_n\Big| > x^{1/2}\log x\Big) \leq O(1 / \log^k x)$ for $k = 2, 4, \dots$, which is summable in $x$ (e.g. taking $k=2$). The claim then follows from the fact that $\sum_{2 \leq n \leq x} 1 / \log n = \textrm{Li}(x) + O(1)$. Is this a valid argument or not?
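
To make the Markov step explicit: for a fixed even $k$, applying Markov's inequality to the $k$th moment gives

$\displaystyle {\bf P}\Big(\Big|\sum_{2 < n \leq x} Y_n\Big| > x^{1/2}\log x\Big) \leq \frac{\mathop{\bf E} \big(\sum_{2 < n \leq x} Y_n\big)^k}{(x^{1/2}\log x)^k} = \frac{O_k(x^{k/2})}{x^{k/2} \log^k x} = O_k\Big(\frac{1}{\log^k x}\Big),$

and the Borel-Cantelli step requires the right-hand side to be summable over natural numbers $x$.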

shark
  • How is $\frac1{\log^2x}$ summable in $x$? – nejimban May 02 '24 at 09:25
  • @nejimban: You are right, the sum diverges for any power $k$, any suggestion for fixing this issue to get a stronger concentration of measure? – shark May 02 '24 at 18:46
  • Perhaps your $O(x^{k/2})$ is not precise enough. What exactly is $\text{Var}\left(\sum_{n\le x}Y_n\right)$? How should it behave as $x\to\infty$? – nejimban May 02 '24 at 20:44
  • @nejimban: That there are $O_k(x^{k/2})$ terms with non-zero contribution to the $k$th moment ${\bf E}|\sum_{2 < n \leq x} Y_n|^k$ follows the same argument as the proof of Prediction 3 in the note: https://terrytao.wordpress.com/2015/01/04/254a-supplement-4-probabilistic-models-and-heuristics-for-the-primes-optional/comment-page-2/#respond. – shark May 02 '24 at 21:26
  • @nejimban: A direct computation gives $\text{Var}\left(\sum_{n\le x}Y_n\right) = \sum_{n \leq x}\frac{1}{\log n}\left(1 - \frac{1}{\log n}\right)$; are you suggesting something similar to Chebyshev's inequality? – shark May 02 '24 at 21:28
  • From your computation it follows that $\text{Var}(\sum_{n\le x}Y_n)\sim\text{Li}(x)$ as $x\to\infty$, but I'm afraid this is not enough. Although we could probably get what you want in probability (instead of almost surely)… – nejimban May 03 '24 at 08:31
  • @nejimban: Actually the problem in the mother post https://math.stackexchange.com/questions/4908059/probabilistic-riemann-hypothesis of this sub-post is solved with some minor adjustments to the argument presented here. Thanks for the feedback. – shark May 04 '24 at 03:29
  • Right, though this is not as good as $O(\sqrt{x\log\log x})$ :) – nejimban May 04 '24 at 07:32

0 Answers