Recall that a derivation on a commutative algebra $A$ is a linear operator $D:A\to A$ which satisfies the Leibniz rule for products, $D(fg)=fDg+gDf$. The standard differentiation $f\mapsto f'$ is certainly a derivation on $C^{\infty }(\mathbb{R})$, and it is known to be essentially the only one, because every derivation on $C^{\infty }(\mathbb{R})$ is of the form $f\mapsto hf'$ with some $h\in C^{\infty }(\mathbb{R})$. But these maps are not derivations on $C^{k}(\mathbb{R})$ for $k\geqslant 1$, since $f'\notin C^{k}(\mathbb{R})$ for some $f\in C^{k}(\mathbb{R})$, so $f\mapsto hf'$ does not even map $C^{k}(\mathbb{R})$ into itself unless $h=0$. I suspect that on $C^{k}(\mathbb{R})$ only the trivial derivation $D=0$ exists. Is this true? And if so, how can I prove it? I know it is true for $C^{0}(\mathbb{R})$ (proof).
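As a quick sanity check that every map $f\mapsto hf'$ does satisfy the Leibniz rule, here is a small symbolic sketch (my own addition, using sympy; the particular $h$, $f$, $g$ are arbitrary illustrative choices, not from the question):

```python
# Verify that D: f |-> h*f' satisfies D(fg) = f*Dg + g*Df
# for sample smooth functions h, f, g (illustrative choices).
import sympy as sp

x = sp.symbols('x')
h = sp.cos(x)          # coefficient function of the candidate derivation
f = sp.exp(x)
g = x**3 + 1

D = lambda u: h * sp.diff(u, x)

lhs = D(f * g)
rhs = f * D(g) + g * D(f)
print(sp.simplify(lhs - rhs))   # 0, i.e. the Leibniz rule holds
```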
-
Maybe you can look at the proof for $C^0$ and see if it can be adapted? Do you have a reference for this proof? – Captain Lama Sep 18 '24 at 16:30
-
Here is a link to a proof: https://ncatlab.org/nlab/show/derivation#DerOfContFuncts – Udo Zerwas Sep 18 '24 at 18:13
-
So you just need to write any $C^k$ function which vanishes at $0$ as a product of two $C^k$ functions which both vanish at $0$. – Captain Lama Sep 18 '24 at 19:35
-
But how do you prove that this is always possible? I don't think it is. – Udo Zerwas Sep 18 '24 at 19:48
-
It's not always possible. In fact a product of two $C^k$ functions which vanish at $0$ has a Taylor expansion of order at least $k+1$ (which doesn't necessarily mean that it's $C^{k+1}$), but there are $C^k$ functions that don't admit one, like $x^{k+1/2}$. – Matteo Gori Sep 18 '24 at 21:59
-
Right, any function $f_{\alpha}:x\mapsto \left| x \right|^{\alpha}$ (you have omitted the modulus, which is needed for the function to be defined for $x\lt 0$) with a constant $\alpha$ satisfying $k\lt \alpha \lt k+1$ is $k$ times differentiable at $0$ but not $k+1$ times, and it does not admit such a product representation. This can be justified using the order of a function at $0$: the order of such a product would have to be either an integer or at least $k+1$, but both possibilities contradict the fact that the order of $f_{\alpha}$ is $\alpha$. Any idea for an approach to prove my conjecture? – Udo Zerwas Sep 19 '24 at 09:14
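To make the counterexample concrete: for $\alpha = k + \tfrac12$, the $k$-th derivative of $x\mapsto x^{\alpha}$ (on the branch $x>0$) still tends to $0$ at the origin, while the $(k+1)$-th derivative is unbounded there. A small sympy sketch of this (my own addition, with the illustrative choice $k=2$):

```python
# f(x) = x**(k + 1/2) on x > 0: k-th derivative continuous at 0,
# but the (k+1)-th derivative blows up there.
import sympy as sp

x = sp.symbols('x', positive=True)
k = 2
alpha = sp.Rational(2 * k + 1, 2)    # alpha = k + 1/2, so k < alpha < k + 1
f = x**alpha

dk  = sp.diff(f, x, k)        # proportional to x**(1/2)
dk1 = sp.diff(f, x, k + 1)    # proportional to x**(-1/2)

print(sp.limit(dk, x, 0, '+'))    # 0
print(sp.limit(dk1, x, 0, '+'))   # oo
```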
-
I did the proof and am writing it up. It takes quite a long time ;) – needmoremath Sep 19 '24 at 09:14
-
I am really looking forward to your proof. – Udo Zerwas Sep 19 '24 at 09:19
-
@needmoremath Will you still post your proof? – Udo Zerwas Sep 19 '24 at 16:45
-
@needmoremath You said "I did the proof". Are you still busy writing it down to present it here, or have you quit because you have meanwhile detected an error? – Udo Zerwas Sep 21 '24 at 10:30
-
@needmoremath - if you have a really long proof it is okay to just post an outline of it. Perhaps someone else will take it and write up all the details. – JonathanZ Sep 21 '24 at 11:31
-
Seems to be a great-grandson of Pierre de Fermat. – Martin Brandenburg Sep 21 '24 at 11:37
-
So we should ask Andrew Wiles if he can settle my question? – Udo Zerwas Sep 21 '24 at 12:02
-
I do not see a problem. Several of my answers grew into papers, some of which have remained unfinished for years. – Alex Ravsky Sep 21 '24 at 12:10
-
Sorry for the late reply. – needmoremath Sep 21 '24 at 17:03
-
Indeed, my initial proof was completely wrong, so I threw the solution out and finally did a complete proof (hopefully). I also apologize for not responding to the comments; I have been working in my personal space, since this comment window is too inconvenient and slow. – needmoremath Sep 21 '24 at 17:07
-
By the way, I am afraid that my answer is too long (regardless of whether it needs to be or not), so no one will verify or even check it... – needmoremath Sep 21 '24 at 17:10
-
Thanks a lot! I have started checking it out. No worries, others will do the same. – Martin Brandenburg Sep 21 '24 at 17:17
-
Thank you for reviewing it! I will also check throughout again. – needmoremath Sep 21 '24 at 17:33
-
@needmoremath - I am also reviewing your proof, but I am afraid it will take me at least as long as it took you to devise it and write it down in that praiseworthy detail. As was to be expected, a convincing proof of this conjecture is quite involved. – Udo Zerwas Sep 22 '24 at 10:03
-
Sure, I can wait (as you had to... :) ), and I am looking forward to discussing this. – needmoremath Sep 22 '24 at 10:04
-
@Udo Zerwas Since you denied my arguments, I erased my solution and want to discuss it in another space. Good luck to you with showing the existence of non-trivial derivations~ – needmoremath Sep 23 '24 at 10:17
-
@needmoremath I have not denied your arguments but just said I cannot spend the time to check them for correctness. And in no way did I motivate you to withdraw your answer, which is a pity, because others might have reviewed it completely. – Udo Zerwas Sep 23 '24 at 10:49
-
I would really like to know whether somebody out there is still working on this problem, and if so, in the negative direction (my conjecture when I posted it) or the positive one? It seems to be deeper than I had thought at first. Meanwhile I am totally undecided and am looking forward to hearing about new ideas, which need not be complete solutions already. I could not find a single reference in the literature that even mentions this problem. – Udo Zerwas Sep 24 '24 at 10:00
-
Take a look at the paper "Characterizations of derivations on spaces of smooth functions" here: https://link.springer.com/article/10.1007/s00010-022-00934-x If I understand Corollary 2 correctly, then a derivation $D:C^k(\mathbb{R}) \to C(\mathbb{R})$, $k>0$ is of the form $D(f)=df'$ for some $d \in C(\mathbb{R})$. If this is correct and if in addition $D$ maps to $C^k(\mathbb{R})$ maybe it is possible to show $d=0$. – Gerd Sep 25 '24 at 12:14
-
In the book of König and Milman "Operator Relations Characterizing Derivatives" (cited in the paper of my last comment), Theorem 3.1 says that each mapping $D:C^k(\mathbb{R}) \to C(\mathbb{R})$, $k>0$, with $D(fg)=Df \cdot g+f \cdot Dg$ is of the form $Df=cf\log|f| + d f'$ for some functions $c,d \in C(\mathbb{R})$. Now, if $D$ is in addition linear then $c=0$ (by considering constant functions), and if $D$ in addition maps to $C^k(\mathbb{R})$ then $d=D(x \mapsto x) \in C^k(\mathbb{R})$, and then $df' \in C^k(\mathbb{R})$ for each $f \in C^k(\mathbb{R})$. This implies $d=0$. – Gerd Sep 25 '24 at 13:44
-
This is an answer to the question, not just a comment. Comments should only be used to clarify, not to answer the question. See How do comments work for more information. Therefore, please post your answer as an answer. This brings extra visibility to the answer and puts the question off the unanswered list. – Martin Brandenburg Sep 25 '24 at 14:18
-
@Gerd Yes, you have found a relevant reference which finally gives a satisfactory answer to my question, just as you have sketched. Unfortunately the proof of the central theorem you quoted is not given in that paper but in the book it refers to. I have immediately ordered the book because I wonder whether the proof is hard, and I have absolutely no idea what it will look like. Have you? – Udo Zerwas Sep 25 '24 at 14:29
-
@UdoZerwas I skimmed the proof. It seems to me that the methods are elementary but nevertheless I find it quite long, tricky and technical. – Gerd Sep 25 '24 at 14:39
-
@MartinBrandenburg I posted it as comment since I do not intend to work out the proof of König and Milman. But if you think it is worth an answer I will do that. – Gerd Sep 25 '24 at 14:42
2 Answers
The following proof is based on Theorem 3.1 from the book of König and Milman: Operator Relations Characterizing Derivatives, Birkhäuser (2018):
Theorem: Let $k \in \mathbb{N}$ and let $D:C^k(\mathbb{R}) \to C(\mathbb{R})$ be a mapping with the property $$ \forall f,g \in C^k(\mathbb{R}): ~ D(f\cdot g)=Df \cdot g+f\cdot Dg. $$ Then there exist functions $c,d \in C(\mathbb{R})$ such that $$ \forall f \in C^k(\mathbb{R}): ~ Df = c\cdot f\cdot \log|f| + d \cdot f'. $$
Now, if $D$ is in addition linear, then inserting the constant function $g=e$ yields $$ (2e(\log(2)+1))\cdot c=(2e\log(2e))\cdot c = D(2g)=2Dg = (2e\log e)\cdot c = (2e)\cdot c, $$ hence $c=0$. If in addition $D(C^k(\mathbb{R})) \subseteq C^k(\mathbb{R})$, then inserting $h(x)=x$ yields $Dh=d \in C^k(\mathbb{R})$. Now $$ \forall f \in C^k(\mathbb{R}): ~ Df=d\cdot f' \in C^k(\mathbb{R}). $$ If $d(x_0) \not= 0$ for some $x_0 \in \mathbb{R}$, choose $f \in C^k(\mathbb{R})$ such that $f^{(k)}$ is not differentiable at $x_0$. Then $df'$ is in $C^{k-1}(\mathbb{R})$ but not in $C^k(\mathbb{R})$: by the Leibniz rule, $(df')^{(k-1)}$ equals $d\cdot f^{(k)}$ plus terms that are differentiable at $x_0$, and since $d(x_0)\neq 0$ and $d$ is differentiable, differentiability of $d\cdot f^{(k)}$ at $x_0$ would force differentiability of $f^{(k)}=(d\cdot f^{(k)})/d$ there, a contradiction. Summing up, $D=0$.
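The computation forcing $c=0$ amounts to solving a linear equation in $c$; here is a symbolic double-check (my own sketch using sympy, with the same constants as in the displayed equation):

```python
# Linearity gives D(2g) = 2*Dg for the constant function g = e.
# With Dg = c*g*log|g| (the f' term vanishes for constants), solve for c.
import sympy as sp

c = sp.symbols('c')
e = sp.E
D2g = c * (2 * e) * sp.log(2 * e)   # D applied to the constant 2e
Dg  = c * e * sp.log(e)             # D applied to the constant e

print(sp.solve(sp.Eq(D2g, 2 * Dg), c))   # [0], so c must vanish
```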
The fact that no such derivation exists is a special case of theorem 3.1 in Operator Relations Characterizing Derivatives by König and Milman.
Theorem 3.1. Let $k\geq0$ and $I\subset\mathbb{R}$ be open. The general solution to the operator equation $$T[fg]=T[f]g+fT[g]$$ for $T:C^k(I,\mathbb{R})\rightarrow C^0(I,\mathbb{R})$ is given by $$T[f]=\begin{cases}cf\ln|f|+df' & k > 0\\ cf\ln|f| & k=0\end{cases}$$ for any $c,d\in C^0(I,\mathbb{R})$.
A derivation $D$ on $C^k(\mathbb{R})$ is a linear operator satisfying the equation above with image in $C^k(\mathbb{R})$. Linearity forces $c=0$. On the other hand, having image in $C^k(\mathbb{R})$ also forces $d=0$ (as we will see). We can prove a version of this theorem specifically for derivations quite easily using similar ideas to those presented by König and Milman. In what follows, let $D:C^k(\mathbb{R})\rightarrow C^k(\mathbb{R})$ be a derivation with $k>0$.
Lemma 1. If $f_1,f_2\in C^k(\mathbb{R})$ agree on an open set $I\subseteq \mathbb{R}$, then $D[f_1],D[f_2]$ agree on $I$ as well.
For any open $I\subseteq\mathbb{R}$, $p\in I$, and $f_1,f_2\in C^k(\mathbb{R})$ that agree on $I$, we may choose a $g\in C^k(\mathbb{R})$ with $g(p)=1$ and $\text{supp}(g)\subset I$. Then, since $(f_1-f_2)g=0$, the Leibniz rule yields $$(D[f_1]-D[f_2])g+(f_1-f_2)D[g]=0$$ Evaluating at $p$, we get $D[f_1](p)-D[f_2](p)=0$. Since $p$ was arbitrary, $D[f_1]$ and $D[f_2]$ also agree on $I$. $\square$
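The cutoff $g$ used in this proof can be taken to be the standard smooth bump function, rescaled so that $g(p)=1$. A minimal numerical sketch (my own addition, with the illustrative choices $I=(-2,2)$ and $p=0$, so that the support $[-1,1]$ of $g$ lies inside $I$):

```python
# Standard C-infinity bump: exp(-1/(1 - x^2)) on (-1, 1), zero outside.
import math

def bump(x):
    if abs(x) >= 1:
        return 0.0
    return math.exp(-1.0 / (1.0 - x * x))

def g(x, p=0.0):
    """Bump centered at p, normalized so that g(p) = 1; supp(g) = [p-1, p+1]."""
    return bump(x - p) / bump(0.0)

print(g(0.0))   # 1.0  (value at p)
print(g(1.5))   # 0.0  (outside the support)
```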
Lemma 2. $D$ is an order-$k$ linear differential operator.
Let $f\in C^k(\mathbb{R})$, $p\in \mathbb{R}$, and $q\in \mathbb{R}[x]$ be the degree $k$ Taylor polynomial of $f$ at $p$. Define $$g(x)=\begin{cases}f(x)& x\leq p\\ q(x) & x\geq p\end{cases}$$ Note that $g\in C^k(\mathbb{R})$, since $q$ matches the derivatives of $f$ up to order $k$ at $p$. By lemma 1, $D[g](x)=D[f](x)$ for $x<p$ and $D[g](x)=D[q](x)$ for $x>p$. By continuity, $D[f](p)=D[g](p)=D[q](p)$. Thus, $D[f](p)$ depends only on $f(p),\ldots,f^{(k)}(p)$. Since $p$ was arbitrary and $D$ is linear, $D[f]$ is linear in $f,\ldots,f^{(k)}$. $\square$
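The gluing works because the degree-$k$ Taylor polynomial $q$ matches all derivatives of $f$ up to order $k$ at $p$. A sympy sketch of this matching (my own addition, with the illustrative choices $f=\sin$, $k=2$, $p=1$):

```python
# Derivatives 0..k of f and of its degree-k Taylor polynomial q agree at p.
import sympy as sp

x = sp.symbols('x')
k, p = 2, 1
f = sp.sin(x)
q = sum(sp.diff(f, x, n).subs(x, p) * (x - p)**n / sp.factorial(n)
        for n in range(k + 1))

print([sp.simplify(sp.diff(f - q, x, n).subs(x, p)) for n in range(k + 1)])
# [0, 0, 0]
```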
Lemma 3. $D[f]$ is of the form $df'$ for some $d\in C^k(\mathbb{R})$.
In light of lemma 2, we may write $$D[f]=\sum_{n=0}^k\alpha_nf^{(n)}$$ for some functions $(\alpha_n)$. Let $e_p^m$ be the map $x\mapsto (x-p)^m$. Then $$D[e_0^m]=\sum_{n=0}^m\frac{m!}{(m-n)!}\alpha_ne_0^{m-n}\in C^k(\mathbb{R})$$ Induction on $m$ shows that $\alpha_n\in C^k(\mathbb{R})$ for each $n$.
We know $\alpha_0=0$, as $D[1]=D[1^2]=2D[1]$ gives $D[1]=0$, and $D[1]=\alpha_0$. On the other hand, for $m>1$ we have $$m!\alpha_m(p)=D[e_p^m](p)=me_p^{m-1}(p)\,D[e_p^1](p)=0$$ by the power rule (easily derivable from the Leibniz rule), since $e_p^{m-1}(p)=0$. Thus, $\alpha_m=0$ for $m>1$. The only remaining term in the formula for $D[f]$ is $\alpha_1f'$. $\square$
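The expansion of $D[e_0^m]$ used in this proof is just $\sum_n \alpha_n\,(x^m)^{(n)}$ written out; a sympy sketch with symbolic coefficients (my own addition, with the illustrative choices $k=3$, $m=2$):

```python
# Check: sum_n alpha_n * (x^m)^(n) equals sum_{n<=m} m!/(m-n)! * alpha_n * x^(m-n).
import sympy as sp

x = sp.symbols('x')
k, m = 3, 2
alphas = sp.symbols('a0:4')   # a0..a3 stand for alpha_0..alpha_k

Df = sum(a * sp.diff(x**m, x, n) for n, a in enumerate(alphas))
expected = sum(sp.factorial(m) / sp.factorial(m - n) * alphas[n] * x**(m - n)
               for n in range(m + 1))

print(sp.simplify(Df - expected))   # 0
```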
Theorem. $D$ is trivial.
Write $D[f]=df'$ as in lemma 3. If $d$ is not identically $0$, choose a non-degenerate interval $[a,b]$ on which $d$ has no zeros, and take $$f(x)=\int_a^x\frac{g(t)}{d(t)}dt \quad\quad x\in[a,b]$$ for some $g\in C^{k-1}$ that is not $k$ times differentiable at some point of $(a,b)$ (e.g. $g(t)=|t-t_0|^{k-1/2}$). Since $g/d$ is $C^{k-1}$ on $[a,b]$, $f$ is $C^k$ there, and we can extend $f$ to $\mathbb{R}$ by defining it outside of $[a,b]$ to be its degree $k$ Taylor polynomial at the closest endpoint. Then $df'$ and $g$ are identical on $[a,b]$, so $df'\not\in C^k(\mathbb{R})$. $\square$
-
In my opinion it is very deserving that you give a proof of this Theorem 3.1 and do not only refer to it, as I do not yet have the book available. The reasoning is surprisingly easy to follow, but it is a totally different story to devise these ideas, of course. – Udo Zerwas Sep 26 '24 at 13:02