11

Let $D\subset\mathbb R$ and let $T\in(0,+\infty)$. A function $f\colon D\longrightarrow\mathbb R$ is called a periodic function with period $T$ if, for each $x\in D$, $x+T\in D$ and $f(x+T)=f(x)$.

If $D\subset\mathbb R$ and $f\colon D\longrightarrow\mathbb R$ is continuous and periodic, must there be, among all periods of $f$, a minimal one?

Questions like this one have been posted here before, but in each case, as far as I can see, the domain of $f$ was $\mathbb R$, which implies that the set $P$ of periods, together with $0$ and $-P$, forms a subgroup of $(\mathbb{R},+)$. Using that (together with continuity), it is easy to see that a minimal period must indeed exist. But I don't know whether this remains true in the general case.
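For reference, the standard argument for the case $D=\mathbb R$ alluded to above runs roughly as follows (a reconstruction, assuming $f$ is not constant):

```latex
% Standard argument for D = \mathbb{R} (reconstruction, not part of the post):
Let $G = P \cup \{0\} \cup (-P)$, a subgroup of $(\mathbb{R},+)$, and let
$a = \inf\{T \in G : T > 0\}$.
\begin{itemize}
  \item If $a = 0$, then $G$ is dense in $\mathbb{R}$; since $f$ is continuous
        and constant on the dense set $x + G$ for each $x$, $f$ is constant.
  \item If $a > 0$, then $G = a\mathbb{Z}$; continuity of $f$ makes $G$ closed,
        so $a \in G$, and $a$ is the minimal period.
\end{itemize}
```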

  • Maybe I misunderstand, but I feel like the argument carries over pretty cleanly. Suppose there is no minimal one. Because periods get arbitrarily small, $D$ must be dense in a positive ray (starting at any $x \in D$), and in fact $f$ is constant on this dense set. Thus $f$ is constant. – Mees de Vries Oct 11 '18 at 10:37
  • @MeesdeVries How do you go from “periods get arbitrarily small” to “$D$ must be dense in a positive ray”? Doesn't that assume that if $T$ and $T^\star$ are periods, with $T>T^\star$, then $T-T^\star$ is a period too? Is that obvious in this context? – José Carlos Santos Oct 11 '18 at 10:40
  • If $T_n$ is a sequence of periods of $f$ which tends to zero, and $x \in D$, then the set of points $x'$ where $f(x) = f(x')$ -- so in particular $x' \in D$ -- includes at least $\{x + kT_n \mid k, n \in \mathbb N\}$, which is certainly dense in $[x, \infty)$. Is this wrong? – Mees de Vries Oct 11 '18 at 10:44
  • @MeesdeVries It looks right, but you are assuming that “there is no minimal period” is equivalent to “there is a sequence of periods which converges to $0$”. Why do you think so? – José Carlos Santos Oct 11 '18 at 10:47
  • Ah, thank you for the correction. I see where the difficulty lies. Interesting question! – Mees de Vries Oct 11 '18 at 10:48
  • One clarification. Is the idea that for $T$ to be a period we must have $x+T\in D$ for all $x\in D$? If we only require $f(x+T)=f(x)$ whenever both $x$ and $x+T$ are elements of $D$ then we get "obnoxious" counterexamples like $f(x)=x$ for all $x\in D=\Bbb{Q}$ which would vacuously have period $T$ for any irrational $T$. – Jyrki Lahtonen Oct 11 '18 at 10:50
  • Maybe $D$ should be an open set? – p4sch Oct 11 '18 at 10:53
  • @JyrkiLahtonen Yes. I wrote in my second sentence that $x+T\in D$. – José Carlos Santos Oct 11 '18 at 10:53
  • @p4sch I made no topological assumption about $D$. – José Carlos Santos Oct 11 '18 at 10:54
  • Thanks. I wasn't sure whether we are to parse it like A) $(x\in D\wedge x+T\in D)\implies f(x+T)=f(x)$ or B) $x\in D\implies (x+T\in D\wedge f(x+T)=f(x))$. Interpretation B obviously excludes the obnoxious examples :-) – Jyrki Lahtonen Oct 11 '18 at 10:56
  • What if we take $D=\{0\}\cup (1,\infty)$ and $f$ constant? Every $T>1$ is a period, but $T=1$ is not, so there is no smallest period. In other words, you may want to explicitly exclude constant functions. Sorry about continuing to nitpick. – Jyrki Lahtonen Oct 11 '18 at 11:12
  • @JyrkiLahtonen That answers my question. Please post your example as an answer. – José Carlos Santos Oct 11 '18 at 11:16
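Mees de Vries's density claim above can be spot-checked numerically. A minimal sketch, where $T_n = 1/2^n$ is an assumed example of a sequence of periods tending to zero:

```python
def closest_point(x, target, n):
    # Best approximation to `target` among the points x + k*T_n (k a natural
    # number), with T_n = 1/2**n an assumed sequence of periods tending to 0.
    T_n = 1.0 / 2**n
    k = max(0, round((target - x) / T_n))
    return x + k * T_n

# Rounding to the nearest k keeps the error at most T_n/2, so the set
# {x + k*T_n : k, n natural} is dense in [x, infinity).
x, target = 0.0, 2.718281828459045
for n in (1, 5, 10, 20):
    print(n, abs(closest_point(x, target, n) - target))
```

The printed errors shrink like $2^{-n}$, matching the claim that the orbit points come arbitrarily close to any target in $[x,\infty)$.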

1 Answer

10

This has the air of exploiting a hole left in the question parameters, but here goes.

Let $D=\{0\}\cup(1,\infty)$ and let $f(x)$ be a constant function. Then $T$ will be a period if and only if $T+D\subseteq D$. In particular:

  • every $T>1$ is a period, but
  • no $T\le 1$ is a period, since $0\in D$ but $0+T=T\notin D$ for $0<T\le 1$,
  • so there is no smallest period.
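The membership condition $T+D\subseteq D$ can be spot-checked numerically; the predicate `in_D` and the finite sample set below are illustrative choices, not part of the original answer:

```python
def in_D(x):
    # D = {0} ∪ (1, ∞)
    return x == 0 or x > 1

def is_period(T, samples):
    # Since f is constant, T is a period iff x + T ∈ D for every x ∈ D;
    # we check this on a finite sample of D.
    return all(in_D(x + T) for x in samples if in_D(x))

samples = [0] + [1 + k / 10 for k in range(1, 100)]
print(is_period(1.5, samples))  # True: every T > 1 shifts D into itself
print(is_period(1.0, samples))  # False: 0 + 1 = 1 ∉ D
```

Of course the finite check only illustrates the argument; the bullet points above prove it for all of $D$.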
Jyrki Lahtonen
  • 140,891