
I ran into this little problem somewhere online:

If $g(x) = f(f(x)) = x^2 - x + 1$, what is $f(0)$?

Plugging first $x=1$ and then $x=0$ into the identity $g(f(x)) = f(g(x))$, it is not hard to see that $f(0) = f(1) = 1$.

But that made me wonder: what else can we really say about $f$? Does $f$ have to be symmetric (around $x = \dfrac{1}{2}$) as $g$ is? Does it have to be continuous?

If $f$ is assumed to be differentiable, the chain rule applied to $f(f(x)) = g(x)$ at $x=1$ and $x=0$ gives $f'(1)^2 = g'(1) = 1$ and $f'(1)\,f'(0) = g'(0) = -1$, so $f'(1) = \pm 1$ and $f'(0) = -f'(1)$; with the sign choice $f'(1) = 1$, $f'(0) = -1$, and $f$ is tangent to $g$ at those points. And it would make sense if $f(x) \sim x^{\sqrt{2}}$ since $g(x) \sim x^2$, as $|x| \rightarrow \infty$. But I don't know how to prove something like this.
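
(A heuristic for the exponent, not a proof: if $f(x) \sim c\,x^{\alpha}$ as $x \rightarrow \infty$ with $c, \alpha > 0$, then $f(f(x)) \sim c^{1+\alpha} x^{\alpha^2}$; matching the leading behaviour $x^2$ of $g$ forces $\alpha^2 = 2$ and $c^{1+\alpha} = 1$, i.e. $\alpha = \sqrt{2}$ and $c = 1$.)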

I'm aware that if $f(f(x)) = x$, there are many different choices for $f$, including discontinuous ones (e.g. $f(x) = 1 / x$ for $x\neq 0, f(0)=0$). But I find it hard to have intuition about the constraints on $f$ given $g(x) = x^2 - x + 1$...

Edit: Perhaps we should start with a simpler example, say $g(x) = x^4$. Then obviously $f(x) = x^2$ is a solution, and so is $f(x) =\dfrac{1}{x^2}$. Are there other solutions?
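
A quick sanity check of these two candidates, as a minimal sympy sketch (the use of sympy and the variable names are my own):

```python
import sympy as sp

x = sp.symbols('x')
# Verify that both f(x) = x**2 and f(x) = 1/x**2 satisfy f(f(x)) = x**4 (the latter for x != 0).
for f in (x**2, 1 / x**2):
    print(f, '->', sp.simplify(f.subs(x, f)))   # both print x**4
```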

Edit 2: Ok, apparently this is super hard: http://reglos.de/lars/ffx.html contains a large number of references and remarks that "it remains an often extremely difficult task to find the iterative roots of even very simple functions", so I'm not really expecting a conclusive answer below...

svangen

5 Answers

1

Here is a different construction principle that produces a continuous solution $f$ on $X:={\mathbb R}_{\geq3}$. Put $a_0:=3$, $b_0:=5$, and define recursively $$a_{k+1}:=g(a_k),\quad b_{k+1}:=g(b_k)\qquad(k\geq0)\ .$$ In this way we obtain two intertwined sequences $$(a_0,b_0,a_1,b_1,a_2,b_2,\ldots)=(3,5,7,21,43,421,1807, 176\,821,\ldots)\ .$$

Let $$I_k:=[a_k,b_k],\quad J_k:=[b_k,a_{k+1}]\qquad(k\geq0)\ .$$ These are consecutive intervals sharing only endpoints; together they form a partition of $X$. Furthermore $$g(I_k)=I_{k+1},\quad g(J_k)=J_{k+1}\qquad(k\geq0)\ .$$

The function $$f_0(x):=x+2\quad(x\in I_0)$$ maps $I_0$ bijectively onto $J_0$. We now define the function $f:X\to{\mathbb R}$ as follows: $$f(x):=\begin{cases}g^k\circ f_0\circ g^{-k}(x)&(x\in I_k,\ k\geq0)\\ g^{k+1}\circ f_0^{-1}\circ g^{-k}(x)&(x\in J_k,\ k\geq0)\ .\end{cases}$$ This $f$ maps each $I_k$ onto $J_k$ and each $J_k$ onto $I_{k+1}$. Furthermore it is easy to check that $f\circ f=g$ on $X$.

Choosing $f_0$ more carefully makes the resulting $f$ even continuously differentiable.
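
A minimal numerical sketch of this construction with the $f_0(x) = x + 2$ above (the helper names `g_inv`, `f0`, `f0_inv` and the spot-check points are my own choices):

```python
import math

def g(x):
    return x * x - x + 1

def g_inv(y):
    # Inverse of g on [3, oo): the larger root of x**2 - x + (1 - y) = 0.
    return (1 + math.sqrt(4 * y - 3)) / 2

def f0(x):       # maps I_0 = [3, 5] onto J_0 = [5, 7]
    return x + 2

def f0_inv(y):   # maps J_0 = [5, 7] back onto I_0
    return y - 2

def f(x):
    """Half-iterate of g on [3, oo), following the interval construction above."""
    assert x >= 3
    # Locate the block I_k u J_k = [a_k, a_{k+1}] containing x, where a_0 = 3 and a_{k+1} = g(a_k).
    a, k = 3.0, 0
    while x > g(a):
        a, k = g(a), k + 1
    # Conjugate down to I_0 u J_0 = [3, 7], apply f0 (or its inverse), conjugate back up.
    t = x
    for _ in range(k):
        t = g_inv(t)
    if t <= 5:                     # x in I_k:  f = g^k o f0 o g^{-k}
        t, m = f0(t), k
    else:                          # x in J_k:  f = g^{k+1} o f0^{-1} o g^{-k}
        t, m = f0_inv(t), k + 1
    for _ in range(m):
        t = g(t)
    return t

# Spot-check f(f(x)) == g(x) (up to floating-point error) at a few points of [3, oo).
for x in (3.0, 4.2, 5.0, 6.7, 10.0, 25.0, 100.0):
    print(x, f(f(x)), g(x))
```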

0

Actually, I wonder whether $f$ has to be symmetric, even though $g$ is.

  1. $g(0)=g(1)=1$
  2. for $x=0$, $f(g(0)) = f(1) = g(f(0))$
  3. for $x = 1$, $f(g(1)) = f(1) = g(f(1))$

According to 3., assume that $f(1)=k$; then $k = g(k)$. The equation $g(k)=k$, i.e. $(k-1)^2=0$, gives $k=1$, that is, $f(1)=1$.

According to 2. and 3., $g(f(0)) = f(1) = 1 = g(1)$. Since $g(y) = 1$ only for $y = 0$ or $y = 1$, $f(0)$ can be $1$ or $0$. If $f(0)=0$, then $g(0)=f(f(0))=f(0)=0\neq 1$, a contradiction, so $f(0)=1=f(1)$.
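
The two facts used above, that $g(k)=k$ forces $k=1$ and that $g(y)=1$ only for $y \in \{0,1\}$, can be confirmed with a short sympy sketch (the variable names are my own):

```python
import sympy as sp

x = sp.symbols('x')
g = x**2 - x + 1

print(sp.solve(sp.Eq(g, x), x))  # [1]    -> g(k) = k forces k = 1, hence f(1) = 1
print(sp.solve(sp.Eq(g, 1), x))  # [0, 1] -> g(f(0)) = 1 leaves f(0) in {0, 1}
```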

0

Let us assume that the functions $g$ and $f$ have the following properties:

  1. $g(s+x) = g(s-x)$, i.e. the function $g$ is symmetric around $s$
  2. $g(x) = f(f(x)) \to g(f(x)) = f(g(x))$

We suppose that $g(s+x_0) = g(s-x_0) = m$; then we get:

  1. for $x_1 = s+x_0$, $f(g(x_1)) = g(f(x_1)) = f(m)$
  2. for $x_2 = s-x_0$, $f(g(x_2)) = g(f(x_2)) = f(m)$
  3. according to 1 and 2, we get $g(f(x_1)) = g(f(x_2))$; then there are three possibilities (for the quadratic $g$ of the question, the dichotomy behind the first two is checked in the sketch after this list):
    • $f(x_1)=f(x_2) \to f(x)$ is also symmetric around $s$
    • $f(x_1) = 2s-f(x_2)=f(x_2) \to f(x_2) = f(x_1) = s$, and so $f(x)$ and $g(x)$ are both constant functions
    • for a specific value $r$ with $g(x) = r$, there may be $2l$ values of $x$ which are also symmetric around $s$, with $l \in \{2,3,4,5,\ldots\}$; I don't know how to analyse this case.
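
For the $g$ of the question, which is symmetric around $s = \dfrac{1}{2}$, only the first two cases can occur, since $g(u) = g(v)$ forces $u = v$ or $u + v = 2s$; a minimal sympy sketch (names are mine):

```python
import sympy as sp

u, v = sp.symbols('u v')
g = lambda x: x**2 - x + 1       # the g of the question, symmetric around s = 1/2

# g(u) = g(v) factors as (u - v)*(u + v - 1) = 0,
# i.e. either f(x_1) = f(x_2) or f(x_1) + f(x_2) = 2s = 1.
print(sp.factor(g(u) - g(v)))    # (u - v)*(u + v - 1)
```
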
0

Note that $g$ has the fixed point $x=1$. We therefore replace the coordinate $x$ on our real line by a new coordinate $t$ via $x:=1+t$. If we express $f$ and $g$ in this new coordinate we obtain functions $\hat f(t)=f(1+t)-1$ and $\hat g(t)=g(1+t)-1$. This "conjugation" leads to $$\hat f\circ \hat f=\hat g,\qquad \hat g(t)=t+t^2\ .$$ Now $t=0$ is a fixed point of $\hat g$. We therefore look for a $\hat f$ having $0$ as a fixed point as well.

The Ansatz $$\hat f(t):=\sum_{k=1}^\infty a_k t^k\ ,$$ a formal power series, leads via $\hat f\bigl(\hat f(t)\bigr)=t+t^2$ to a recursion for the $a_k$. It begins with $a_1^2=1$. The choice $a_1=-1$ already breaks down at the second step, but $a_1=1$ leads to $$\hat f(t)=t + t^2/2 - t^3/4 + t^4/4 - 5 t^5/16 + 27 t^6/64 - 9 t^7/16 + 171 t^8/256 - 69 t^9/128 - 579 t^{10}/2048 + 10689 t^{11}/4096+\ldots$$ Unfortunately the coefficients tend to increase, hence it is not clear that we have found an analytic $\hat f$.
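
A small sympy sketch of this coefficient recursion (the truncation order `N` and all variable names are my own; it fixes $a_1=1$ and solves for one coefficient at a time):

```python
import sympy as sp

t = sp.symbols('t')
N = 10                                   # compute a_1 .. a_{N-1}

coeffs = [sp.Integer(1)]                 # a_1 = 1; the a_1 = -1 branch fails at order 2
for k in range(2, N):
    a_k = sp.Symbol('a%d' % k)
    # Truncated series with the next coefficient left symbolic;
    # the t^k coefficient of fhat(fhat(t)) only involves a_1, ..., a_k.
    fhat = sum(c * t**(i + 1) for i, c in enumerate(coeffs)) + a_k * t**k
    comp = sp.expand(fhat.subs(t, fhat))          # fhat(fhat(t))
    target = 1 if k == 2 else 0                   # ghat(t) = t + t^2
    coeffs.append(sp.solve(sp.Eq(comp.coeff(t, k), target), a_k)[0])

print(coeffs)   # 1, 1/2, -1/4, 1/4, -5/16, 27/64, -9/16, 171/256, -69/128
```

The printed coefficients should reproduce the series displayed above.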

  • Is the change of variables really correct? Sorry if I'm misunderstanding something obvious. The two functions $\hat{g}$ and $g$ have different minimal values for instance, so we do not have $\hat{g}(t) = g(x)$, with $x = 1+t$. I like the approach with the formal power series though! – svangen Apr 27 '17 at 13:33
  • I'm considering $f$ and $g$ as maps of the space ${\mathbb R}$ (or some part of it) to itself. If you change the coordinate in this space, the expressions of the maps $f$ and $g$ change in the described way. Cf. what happens to the matrix of a linear map $A:V\to V$ if you change the basis of $V$. – Christian Blatter Apr 28 '17 at 12:38
0

There are a lot of possibilities for $f$ given $f\circ f$, so not much can be said. However, one non-obvious thing that can be said is, for continuous functions defined on $\mathbb R$, that if $f\circ f$ has a fixed point, then $f$ has a fixed point as well (not necessarily the same one). This is a particular case of Sharkovskii's theorem.

Federico