
This source (p. 15) claims that a sequence of functions $f_n : \mathbb{R} \to \mathbb{R}$, defined by $f_n(x) = f(x + 1/n)$, converges to $f$ uniformly on $\mathbb{R}$.

Unless further conditions are placed on $f$, this seems untrue to me. For example, let $f(x) = 0$ if $x = 0$ and $f(x) = 1$ otherwise. Then for every $n$, $f(0) = 0$ while $f_n(0) = f(1/n) = 1$.
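In particular $f_n(0) \not\to f(0)$, so the convergence fails even pointwise at $0$; in terms of the uniform norm,

$$\sup_{x \in \mathbb{R}} |f_n(x) - f(x)| \;\ge\; |f_n(0) - f(0)| \;=\; 1 \quad \text{for every } n.$$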

If $f$ is uniformly continuous, the claim is true: given $\epsilon > 0$, uniform continuity gives a $\delta > 0$ such that $|s - t| < \delta$ implies $|f(s) - f(t)| < \epsilon$, and then any $n > 1/\delta$ works for every $x$ at once.
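To spell out the last step: if $n > 1/\delta$, then $|(x + 1/n) - x| = 1/n < \delta$, so

$$|f_n(x) - f(x)| = \left| f\!\left(x + \tfrac{1}{n}\right) - f(x) \right| < \epsilon \quad \text{for all } x \in \mathbb{R},$$

and the threshold $1/\delta$ does not depend on $x$.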

If $f$ is merely continuous, I believe $f_n$ converges to $f$ pointwise, but not necessarily uniformly.
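For instance (the example also suggested in the comments below), take $f(x) = x^2$: then

$$f_n(x) - f(x) = \left(x + \tfrac{1}{n}\right)^2 - x^2 = \frac{2x}{n} + \frac{1}{n^2},$$

which tends to $0$ for each fixed $x$, yet $\sup_{x \in \mathbb{R}} |f_n(x) - f(x)| = \infty$ for every $n$, so the convergence is not uniform.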

Note: This question is related to, but distinct from, similar discussions (e.g. When does $\lim_{n\to\infty}f(x+\frac{1}{n})=f(x)$ a.e. fail), since I'm asking about a claim in a specific source, and since this discussion concerns elementary real analysis without measure theory.

SRobertJames
  • I think you're correct. As for the case where $f$ is merely continuous, you can consider the example $f(x)=x^2$. – Feng Oct 24 '22 at 01:42
  • The source you link to says $f$ is differentiable and $|f'(x)| \leq L$ for all $x \in \mathbb{R}$. This assumption implies $f$ is $L$-Lipschitz on $\mathbb{R}$, but you are correct that the claim is true if $f$ is uniformly continuous. – M A Pelto Oct 24 '22 at 02:03
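
Expanding on the comment above (my own sketch, assuming the source's hypotheses that $f$ is differentiable with $|f'(x)| \le L$ on $\mathbb{R}$): by the mean value theorem there is some $\xi$ between $x$ and $x + 1/n$ with

$$|f_n(x) - f(x)| = \left| f\!\left(x + \tfrac{1}{n}\right) - f(x) \right| = \frac{|f'(\xi)|}{n} \le \frac{L}{n} \to 0,$$

a bound independent of $x$, so under those hypotheses the claimed uniform convergence does hold.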
