I know that $\frac{d}{dx} f(x) = f(x)$ has a solution $f(x) = e^x$ by letting $y= f(x)$ and using separation of variables. Is there a technique to solve (or at least start on) something like $\frac{d}{dx} f(x) = f(2x)$? Putting the $2$ anywhere else such as $2f(x)$, $f(x)+2$, etc. does not seem to cause much of an issue, but replacing the $x$ with $2x$ stumps me...
- Certainly, there are techniques, but for this you have to specify the function $f(x)$. Suppose $f(x)=\sin x$; then $f(2x)=\sin (2x)$. Now this $f(2x)$ can be written in terms of $f(x)$, and hence the equation can be solved. – nmasanta Mar 05 '20 at 03:51
- Does this answer your question? What function satisfies $F'(x) = F(2x)$? – Gonçalo Dec 30 '23 at 00:19
2 Answers
A partial solution ...
As far as a technique for something like $f'(x) = f(2x)$ goes, the method of Frobenius can provide some assistance. Slightly more generally, you'll find that $y'(x) = y(Ax)$ has the formal solution \begin{equation} y(x) = C\, \sum_{k=0}^\infty\, A^{k(k-1)/2}\,x^k/k!, \end{equation} where $C$ is the constant of integration. (Substituting $y(x)=\sum_k a_k x^k$ into the equation gives the recursion $(k+1)\,a_{k+1} = A^k a_k$, which produces the exponent above.) The ratio test requires $\vert A\vert \leq 1$ for the series to converge; for $\vert A\vert > 1$ the radius of convergence is zero, so this doesn't handle your case of $A=2$.
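As a quick sanity check (a minimal sketch of my own, not part of the original answer; the values of $A$, $C$, and the truncation order are illustrative), one can truncate the series and compare $y'(x)$ with $y(Ax)$ numerically:

```python
# Numerical check of the formal series solution of y'(x) = y(A x),
# assuming |A| < 1 so the series converges. A, C, and N are
# illustrative choices, not from the original answer.
import math

A = 0.5   # scaling parameter, |A| < 1
C = 1.0   # constant of integration
N = 60    # truncation order of the series

def y(x):
    # y(x) = C * sum_k A^(k(k-1)/2) * x^k / k!
    return C * sum(A**(k * (k - 1) // 2) * x**k / math.factorial(k)
                   for k in range(N))

def y_prime(x):
    # term-by-term derivative of the truncated series
    return C * sum(A**(k * (k - 1) // 2) * x**(k - 1) / math.factorial(k - 1)
                   for k in range(1, N))

for x in (0.5, 1.0, 2.0):
    print(x, abs(y_prime(x) - y(A * x)))   # each difference should be ~0
```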
- Unfortunately, since the form is analogous to $A^{k^2}$, this will be pretty challenging to assign any kind of divergent renormalization to. But one day we should be able to renormalize this and find those strange functions. – Sidharth Ghoshal Nov 15 '24 at 16:20
As far as I know, something like this is significantly more difficult than solving "normal" differential equations. You already see why that's the case: in general we just have no idea how to even approach it. Most solution methods are, in one way or another, an integration (you mentioned separation of variables yourself). There are a lot of integration tricks, but essentially none of them lets you integrate something like $\frac{f'(x)}{f(2x)}$ in general, whereas $\frac{f'(x)}{f(x)}$ is very easy to integrate with a substitution.
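To spell out that contrast (a small worked step of my own, not from the original answer): the substitution $u=f(x)$, $du=f'(x)\,dx$ gives \begin{equation} \int \frac{f'(x)}{f(x)}\,dx = \int \frac{du}{u} = \ln\vert f(x)\vert + C, \end{equation} which is exactly the step behind separating $f'(x)=f(x)$. The analogous substitution $u=f(2x)$ fails, because then $du=2f'(2x)\,dx$ involves the derivative at $2x$ rather than the $f'(x)$ in the numerator.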
Additionally, the existence and uniqueness theory of differential equations that most students learn doesn't really cover something like this, so it is unclear whether solutions exist and how we have to choose "initial conditions" to make the solution unique.
A simple example is a "delay equation" like $f'(x)=f(x-1)$ where $f:[0,\infty)\to\mathbb{R}$. Here the change of the function is equal to the value of the function "in the past" (if we see the argument as time). If I think through this correctly, you need as initial conditions the values of $f$ on the whole interval $[0,1]$, and then you can obtain a unique solution; just knowing the value at one specific $x$ would be nowhere near enough to guarantee uniqueness. Once you have the values of $f$ on $[0,1]$ you can integrate them to get the values on $(1,2]$, then integrate those to get the values on $(2,3]$, and so on to define the function on the whole range (assuming the resulting function is differentiable again and so on).

In a little more detail: for $x\in(1,2]$ we can define $f(x)=f(1)+\int_1^x f'(s)\,ds=f(1)+\int_1^x f(s-1)\,ds$. This expression can be computed from the given "initial conditions" because $s-1\in [0,1]$, so $f(s-1)$ is already known by assumption. Repeat this procedure for all following intervals; this is sometimes called the method of steps.
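Here is a minimal sketch of that stepping procedure in code (my own illustration: the constant history $f\equiv 1$ on $[0,1]$, the grid spacing, and all names are assumptions, and the integral is approximated by the trapezoidal rule):

```python
# Method of steps for the delay equation f'(x) = f(x - 1) on [1, x_max],
# with f prescribed on [0, 1]. The history f = 1 on [0, 1] and the step
# size h are illustrative choices.
h = 0.001               # grid spacing; chosen so that 1/h is an integer
n = round(1 / h)        # grid points per unit interval (= per delay)
x_max = 5.0
N = round(x_max / h)    # index of the last grid point

f = [1.0] * (N + 1)     # indices 0..n carry the prescribed history on [0, 1]
for i in range(n + 1, N + 1):
    # f(x_i) = f(x_{i-1}) + integral of f(s - 1) over [x_{i-1}, x_i];
    # f(s - 1) is already known there, so the trapezoidal rule applies.
    f[i] = f[i - 1] + 0.5 * h * (f[i - 1 - n] + f[i - n])

# With f = 1 on [0, 1] the exact values are f(2) = 2 and f(3) = 3.5.
print(f[2 * n], f[3 * n])
```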
You see from this example that uniqueness of the solution needs more than just the value of $f$ at one point, which is all you would need for an ordinary linear first-order differential equation like $f'(x)=f(x)$. Specifying the correct conditions is an important part of dealing with differential equations.
The fact that we could even solve this was pure luck: the value of $f'(x)$ was always a value of the function "we computed already", so we could just integrate without really thinking about it. Something similar happens in the following example: take $f:[\frac{1}{2},\infty)\to\mathbb{R}$, where on $[1,\infty)$ the function $f$ should fulfill the differential equation $f'(x)=f(\frac{x}{2})$, and the values on $[\frac{1}{2},1)$ are already given (the function does not need to fulfill the differential equation there, because that wouldn't even be well defined). This is constructed explicitly to be solvable by the same method I already used: for $x\in[1,2)$ we have $f(x)=f(1)+\int_1^x f'(s)\,ds=f(1)+\int_1^x f(\frac{s}{2})\,ds$, and again $f(\frac{s}{2})$ is already known because $\frac{s}{2}\in[\frac{1}{2},1)$, which is exactly the interval where we already know the function by assumption. Repeat to get the values on $[2,4)$, then $[4,8)$, and so on. Again, this is only solvable because we were "lucky" that the derivative only depended on values we already knew.

For something like $f'(x)=f(f(x))$ or $f'(x)=f(2x)$ this is not as easy. In special cases you might be able to obtain special solutions via an ansatz or other very specific methods, but as mentioned before you wouldn't generally know whether these solutions are unique. For example, $f'(x)=f(f(x))$ can be attacked with a power-law ansatz, i.e. $f(x)=ax^r$. Why? The derivative of a power law is another power law, and the composition of a power law with itself is again a power law. Plugging that ansatz into the functional differential equation gives $arx^{r-1}=a(ax^r)^r=a^{1+r}x^{r^2}$. Assuming $a\ne 0$, this forces $ar=a^{1+r}$ and $r-1=r^2$. Solving the latter gives the two solutions $r=\frac{1}{2}\pm i\frac{\sqrt{3}}{2}$. Attention: our differential equation is not linear, so we can't expect a linear combination of these different solutions to be another solution. Now take one of these values of $r$ and solve $ar=a^{1+r}$ (i.e. $r=a^r$) for $a$. Then you have two solutions of the differential equation, one for each $r$, and both fulfill $f(0)=0$.

Assuming the solution is analytic (i.e. can be written as a power series), you could also use a power series as an ansatz to find equations for the coefficients. I will leave that for you to try yourself (I don't know where it leads, so be careful not to get stuck in a dead end!). The power series ansatz might be the most general method I can spontaneously give you, but of course it only works if you can show that the solution actually is a power series (which might be arbitrarily difficult in general).
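As a quick numerical check of the power-law ansatz above (a sketch of my own; the branch $a=\exp(i\pi/(3r))$ of $r=a^r$ is one choice among many, and Python's complex powers use the principal branch, so the final check is only valid on a branch-safe range of $x$):

```python
# Check that f(x) = a x^r solves f'(x) = f(f(x)) for one root of r - 1 = r^2.
# The branch a = exp(i*pi/(3r)) of r = a^r is my own choice; cmath uses
# principal-branch powers throughout.
import cmath

r = (1 + 1j * cmath.sqrt(3)) / 2        # one root of r - 1 = r^2
a = cmath.exp(1j * cmath.pi / (3 * r))  # then a**r == r (principal branch)

print(abs(r - 1 - r**2))                # ~0: exponent equation r - 1 = r^2
print(abs(a**r - r))                    # ~0: coefficient equation r = a^r

def f(x):
    return a * x**r                     # principal branch of x**r

x = 2.0
lhs = a * r * x**(r - 1)                # f'(x) for the ansatz
rhs = f(f(x))                           # f(f(x))
print(abs(lhs - rhs))                   # ~0 on a branch-safe range of x
```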
If you are interested in something like this, you might want to look up "functional differential equations", which I think is the name for this kind of equation (for example, I just found a text by E. Eder called "The Functional Differential Equation x'(t)=x(x(t))", which treats one of the examples you asked about! The text even claims to show analyticity, i.e. that the function can indeed be written as a power series, which is great because it means we can use the power series ansatz I mentioned earlier). But I would suggest first studying the theory of ordinary differential equations, which on its own is already very interesting (well, that of course depends on whom you ask) and probably a lot more understandable for a beginner (of course I have no idea how experienced you actually are, sorry if that feels patronizing). I would guess that the methods needed to deal with functional differential equations are more advanced, so you should understand the theory of ordinary differential equations first.
Alternatively, you might not want to study the general theory but instead have an explicit problem at hand, say a physical system whose dynamics follow such a functional differential equation. If you have such a specific case that you want to explore, I would suggest asking about that explicit example. There, "physical intuition" might be useful, e.g. to find the necessary form of the initial conditions (or some other requirement like monotonicity), which as discussed above is a necessary and already difficult first step towards the solution. And once you have information about existence, uniqueness, and properties of the solution, you might be able to solve the equation numerically!
I hope this helps!