This source (p. 15) claims that a sequence of functions $f_n : \mathbb{R} \to \mathbb{R}$, defined by $f_n(x) = f(x + 1/n)$ for some $f : \mathbb{R} \to \mathbb{R}$, converges to $f$ uniformly on $\mathbb{R}$.
Unless further conditions are placed on $f$, this seems untrue to me. For example, let $f(x) = 0$ if $x = 0$ and $f(x) = 1$ otherwise. Then for every $n$, $f(0) = 0$ while $f_n(0) = f(1/n) = 1$, so the sequence fails to converge to $f$ even pointwise at $x = 0$.
If $f$ is uniformly continuous, the claim is true: given $\epsilon > 0$, choose $\delta > 0$ such that $|y - x| < \delta \implies |f(y) - f(x)| < \epsilon$; then for every $n > 1/\delta$ and every $x$, we have $1/n < \delta$ and hence $|f_n(x) - f(x)| = |f(x + 1/n) - f(x)| < \epsilon$.
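To make the uniformity explicit: $\delta$ depends only on $\epsilon$, not on $x$, so the same threshold works for every $x$ simultaneously, i.e.
$$n > \frac{1}{\delta} \implies \sup_{x \in \mathbb{R}} |f_n(x) - f(x)| \le \epsilon,$$
which is exactly the statement of uniform convergence on $\mathbb{R}$.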
If $f$ is merely continuous, I believe $f_n$ still converges to $f$ pointwise, but not necessarily uniformly (see the example below).
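A standard example for this (my own choice, not from the source): take $f(x) = x^2$, which is continuous but not uniformly continuous on $\mathbb{R}$. Then
$$\sup_{x \in \mathbb{R}} |f_n(x) - f(x)| = \sup_{x \in \mathbb{R}} \left|\frac{2x}{n} + \frac{1}{n^2}\right| = \infty \quad \text{for every } n,$$
so $f_n \to f$ pointwise but not uniformly on $\mathbb{R}$.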
Note: This question is related to, but distinct from, similar discussions (e.g. When does $\lim_{n\to\infty}f(x+\frac{1}{n})=f(x)$ a.e. fail), both because I'm asking about a claim in a specific source and because this question concerns elementary real analysis without measure theory.