Let $\pi(n)$ denote the prime-counting function. The prime number theorem famously states that $\pi(n) \underset{n\to+\infty}\sim \dfrac{n}{\log n}$. I'm interested in rougher but easier estimates. I will use the $\ll$ symbol instead of the $O$ notation.
1. Most proofs of the infinitude of the primes give some sort of lower bound. For example, Euclid's well-known proof can be turned into the very weak bound $\pi(n) \gg \log \log n$ (a short derivation is spelled out after this list).
2. One of my favourite proofs is the following: every number $k \leq n$ can be written as the product of a squarefree number (which is a product of distinct primes, all $\leq n$) and a perfect square (and there are at most $\sqrt n$ perfect squares which are $\leq n$). That gives $n \leq 2^{\pi(n)}\,\sqrt n$, so $\pi(n) \gg \log n$ (the last step is made explicit below). Euler-type proofs also give this sort of bound (and, conversely, this counting argument is the basis of Erdős's proof that $\sum_p \dfrac 1p = +\infty$).
3. There are some clever, relatively easy proofs of the Čebyšëv estimates $\dfrac{n}{\log n} \ll \pi(n) \ll \dfrac{n}{\log n}$.
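To make the claim in 1 concrete (this is just one standard way to do the bookkeeping; other variants give the same order): Euclid's construction shows that $p_{k+1} \leq p_1 p_2 \cdots p_k + 1$, and an easy induction then gives $p_k \leq 2^{2^k}$, since
$$p_{k+1} \leq 2^{2^1 + 2^2 + \cdots + 2^k} + 1 = 2^{2^{k+1} - 2} + 1 \leq 2^{2^{k+1}}.$$
Hence, taking $k = \lfloor \log_2 \log_2 n \rfloor$, we get $p_k \leq 2^{2^k} \leq n$, so $\pi(n) \geq k \gg \log \log n$.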
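And the last step in 2, for completeness: dividing $n \leq 2^{\pi(n)}\,\sqrt n$ by $\sqrt n$ gives
$$2^{\pi(n)} \geq \sqrt n \quad\Longrightarrow\quad \pi(n) \geq \frac{\log n}{2\log 2} \gg \log n.$$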
My question is: is there something nontrivial between 2 and 3? Specifically, I'm looking for a simple argument (simpler than all proofs of the Čebyšëv estimates) which would give a significant improvement over the logarithmic bound. Ideally, I would love a proof that I can remember almost as easily as 2 but which would give a lower bound $\pi(n) \gg n^\epsilon$, for some $\epsilon > 0$.