Since $\mathbb{Q}$ is dense in $\mathbb{R}$, given a real $\varepsilon > 0$ and a real $\alpha$, there exist $m \in \mathbb{Z}$ and $n \in \mathbb{Z}_{>0}$ such that $$\left|\frac{m}{n} - \alpha \right| < \varepsilon \,.$$
However, is it true that, given a real $\varepsilon > 0$ and a real $\alpha$, there exist $m \in \mathbb{Z}$ and $n \in \mathbb{Z}_{>0}$ such that $$\left|\frac{m}{n} - \alpha \right| < \frac{\varepsilon}{n} \,?$$
I believe this is true. My intuition is that, for large $n$, the multiples of $1/n$ form a fine enough "grid" to yield $(\varepsilon/n)$-good approximations of $\alpha$. The difficulty is that as $n$ grows, the allowed error $\varepsilon/n$ shrinks as well. I'm trying to write down an actual argument that justifies this, and any help would be appreciated.
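For what it's worth, the claim can at least be checked numerically: for $n > 0$ the condition $|m/n - \alpha| < \varepsilon/n$ is equivalent to $|n\alpha - m| < \varepsilon$, and for a fixed $n$ the best choice of $m$ is the nearest integer to $n\alpha$. A brute-force sketch in Python (the helper name `good_approx` is mine):

```python
import math

def good_approx(alpha, eps, max_n=10**6):
    """Search for m, n with |m/n - alpha| < eps/n, i.e. |n*alpha - m| < eps.

    For each n, the best integer m is round(n * alpha); return the first
    (m, n) that works, or None if none is found up to max_n.
    """
    for n in range(1, max_n + 1):
        m = round(n * alpha)
        if abs(n * alpha - m) < eps:
            return m, n
    return None

# Example: alpha = sqrt(2), eps = 0.01
print(good_approx(math.sqrt(2), 0.01))  # -> (99, 70), since |70*sqrt(2) - 99| < 0.01
```

Every $\alpha$ and $\varepsilon$ I tried this on terminates quickly, which is consistent with the claim being true, though of course it proves nothing about the general case.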