6

Find the derivative of the solution of the equation $\ddot x = x + A \dot x^2$, with initial conditions $x(0) = 1, \dot x(0) = 0$, with respect to the parameter $A$ for $A = 0$.

-- Vladimir Arnold

This is a great ODE question posed by Vladimir Arnold that takes creativity and ingenuity to solve. My work so far is below.


Solution: The solution is $$ \frac {dx}{dA} \bigg\rvert_{A=0} = \frac 1 6 \cosh 2t - \frac 2 3 \cosh t + \frac 1 2 $$ since $$ x = \frac{\cosh t + A \left(\frac 1 6 \cosh 2t + \frac 1 2 \right)}{1 + \frac 2 3 A} + \mathcal{O}(A^2). $$ Proof: Computation shows that this definition satisfies $x(0) = 1$ and $\dot x(0) = 0$. Furthermore, we can rewrite $x$ as $$ x = \left(1 - \frac 2 3 A\right) \left[ \cosh t + A \left(\frac 1 6 \cosh 2t + \frac 1 2 \right) \right] + \mathcal{O}(A^2) $$ so that $$ \dot x = \left(1 - \frac 2 3 A\right)\left(\sinh t + \frac A 3 \sinh 2t\right) + \mathcal{O}(A^2) \\ \dot x^2 = \left(1 - \frac 4 3 A\right)\left(\sinh^2 t + \frac {2A} 3 \sinh t \sinh 2t\right) + \mathcal{O}(A^2) \\ \ddot x = \left(1 - \frac 2 3 A\right)\left(\cosh t + \frac {2A} 3 \cosh 2t\right) + \mathcal{O}(A^2) $$ which solves the ODE*.

Discussion: I solved this by noting that when $A = 0$, $x = \cosh t$. For $A \neq 0$, we get an extra $A \sinh^2 t$ term, which needs to be corrected for. Since we are taking the derivative at $A = 0$, we can throw out any $A^2$ terms and treat the ODE as if it were linear. Solving $\ddot y - y = \sinh^2 t$ tells us to correct for the extra $A \sinh^2 t$ by adding an $A\left(\frac 1 6 \cosh 2t + \frac 1 2\right)$ term to $x$. This solves the ODE but requires rescaling $x$ by a factor of $1/(1 + \frac 2 3 A)$ to meet the initial conditions.
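One way to machine-check the claim (this code is my own sketch, not part of the original post; it assumes Python with sympy is available) is to form the ODE residual from the exact expression above, Taylor-expand it in $A$, and confirm that every term below $A^2$ vanishes:

```python
import sympy as sp

t, A = sp.symbols('t A')

# Proposed solution (exact expression, before truncating in A)
x = (sp.cosh(t) + A*(sp.cosh(2*t)/6 + sp.Rational(1, 2))) / (1 + sp.Rational(2, 3)*A)

# Residual of the ODE x'' = x + A x'^2
residual = sp.diff(x, t, 2) - x - A*sp.diff(x, t)**2

# Taylor-expand the residual in A and drop the O(A^2) tail;
# the surviving A^0 and A^1 coefficients must vanish.
truncated = sp.series(residual, A, 0, 2).removeO()

# Rewriting in exponentials makes the cancellation purely polynomial.
print(sp.expand(truncated.rewrite(sp.exp)))  # 0
```

Rewriting the hyperbolic functions as exponentials before expanding avoids relying on `simplify` to find the right identities.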


Update & Questions

While alternative solutions are great (if proven!), I ask for help completing and presenting the proof of this solution.

Questions:

  1. How can I actually check (and show!) that, as defined, $x$ satisfies $\ddot x = x + A \dot x^2$? (This is the line marked *.) I believe my derivation is correct, since I solved and checked each step, but the computations for verifying the final claim get too messy for me to carry out: each step involves complicated derivatives combined with hyperbolic identities. Even Wolfram Alpha fails, since it can't work with the $\mathcal{O}(A^2)$ notation and so fails to simplify.

  2. How is this use of $\mathcal{O}(A^2)$ justified? How do I achieve rigor when using this approach?

  3. What's the proper way to present this? As an ODE, the work is in finding a solution; checking it is then just a (large) computation. What type of presentation or exposition is called for?

These questions really boil down to: How do I use this approach, and present it clearly and rigorously? That's why I've used the proof-writing tag.
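As a complement to a symbolic proof, the claimed derivative can also be spot-checked numerically. The following is my own sketch (assuming Python with scipy and numpy), comparing a finite difference in $A$ against the closed form:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, A):
    """First-order form of x'' = x + A x'^2."""
    x, v = y
    return [v, x + A*v**2]

eps = 1e-6
ts = np.linspace(0.0, 2.0, 50)
sol_eps = solve_ivp(rhs, (0, 2), [1, 0], args=(eps,),
                    t_eval=ts, rtol=1e-10, atol=1e-12)
sol_0 = solve_ivp(rhs, (0, 2), [1, 0], args=(0.0,),
                  t_eval=ts, rtol=1e-10, atol=1e-12)

# Finite-difference estimate of dx/dA at A = 0
fd = (sol_eps.y[0] - sol_0.y[0]) / eps

# Claimed closed form
exact = np.cosh(2*ts)/6 - 2*np.cosh(ts)/3 + 0.5

print(np.max(np.abs(fd - exact)))  # ~1e-4 or smaller
```

The discrepancy is dominated by the $\mathcal{O}(\varepsilon)$ finite-difference truncation error, so it shrinks further as `eps` decreases (until solver tolerance takes over).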

SRobertJames

4 Answers

5

Write $x=x(t,A)$ and \begin{equation} x_{tt}=x+Ax_{t}^2\implies x_{Att}=x_{A}+x_{t}^2+2Ax_{t}x_{At}. \end{equation} Letting $x_A(t)=x_{A}(t,0)$ and $\tilde{x}(t)=x(t,0)$, \begin{equation} x_A''(t)=x_A(t)+(\tilde{x}'(t))^2.\tag{1} \end{equation} When $A=0$, $\tilde{x}''(t)=\tilde{x}(t)$ and so $\tilde{x}(t)=\cosh(t)$ to satisfy ICs $\tilde{x}(0)=1$ and $\tilde{x}'(0)=0$. Substituting into $(1)$, \begin{equation} x_A''(t)-x_A(t)=\sinh^2(t), \end{equation} which has solution \begin{equation} x_A(t)=\frac{1}{6}\cosh 2t-\frac{2}{3}\cosh t+\frac{1}{2}=\frac{1}{12}e^{-2t}(e^{t}-1)^4, \end{equation} subject to ICs $x_A(0)=x_A'(0)=0$, after considering the homogeneous equation and then using variation of parameters for the particular solution.

Note: the first PDE can be written \begin{equation} \frac{\partial^3 x}{\partial A \partial t^2} = \frac{\partial x}{\partial A} + \left( \frac{\partial x}{\partial t} \right)^2 + 2A \frac{\partial x}{\partial t} \frac{\partial^2 x}{\partial A \partial t}, \end{equation} if it is clearer. I prefer subscripts to denote partial differentiation, but $x_A(t)$ is a function, so I'll put this here to avoid ambiguity.
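This answer's formulas can be verified by direct substitution; the following sketch (mine, assuming Python with sympy) rewrites everything in exponentials so that plain polynomial expansion suffices:

```python
import sympy as sp

t = sp.symbols('t')
xA = sp.cosh(2*t)/6 - sp.Rational(2, 3)*sp.cosh(t) + sp.Rational(1, 2)

# ODE residual x_A'' - x_A - sinh^2(t) must be identically zero;
# rewriting in exponentials lets expand() cancel it exactly.
assert sp.expand((xA.diff(t, 2) - xA - sp.sinh(t)**2).rewrite(sp.exp)) == 0

# Initial conditions x_A(0) = x_A'(0) = 0
assert xA.subs(t, 0) == 0
assert xA.diff(t).subs(t, 0) == 0

# Equality with the compact form e^{-2t} (e^t - 1)^4 / 12
compact = sp.exp(-2*t)*(sp.exp(t) - 1)**4 / 12
assert sp.expand(xA.rewrite(sp.exp) - compact) == 0
```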

  • Please explain "after considering the homogeneous equation and then using variation of parameters for the particular solution." It's hard to accept that on faith. – SRobertJames Nov 26 '24 at 12:39
  • 1
    @SRobertJames You can write $x_A''(t)-x_A(t)=f(t)$ s.t. $x_A(0)=x_A'(0)=0$ as the superposition of solutions to $x_A''(t)-x_A(t)=0$ s.t. $x_A(s)=0$ and $x_A'(s)=f(s)\mathop{ds}$ over all $s$, by viewing $f$ as an impulse force. The solution to this is $x(t)=\frac{f(s)\mathop{ds}}{2}\left(e^{t-s}-e^{-(t-s)}\right)=f(s)\sinh(t-s)\mathop{ds}$. Hence, $x(t)=\int_{0}^{t}f(s)\sinh(t-s)\mathop{ds}$. In our problem, we have $f(s)=\sinh^2(s)$ and the solution follows. You just need to evaluate that integral. – Jean Daviau Nov 26 '24 at 13:08
  • @SRobertJames See here. – Jean Daviau Nov 26 '24 at 13:09
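The impulse-response formula in the comments can itself be checked symbolically; this sketch (mine, assuming Python with sympy) evaluates the integral the comment asks for and compares it with the answer's closed form:

```python
import sympy as sp

t, s = sp.symbols('t s')

# Duhamel's principle: x_A(t) = \int_0^t sinh^2(s) sinh(t - s) ds.
# Rewriting the integrand in exponentials keeps the integral elementary.
integrand = sp.expand((sp.sinh(s)**2 * sp.sinh(t - s)).rewrite(sp.exp))
xA = sp.integrate(integrand, (s, 0, t))

closed_form = sp.cosh(2*t)/6 - sp.Rational(2, 3)*sp.cosh(t) + sp.Rational(1, 2)
print(sp.expand(xA - closed_form.rewrite(sp.exp)))  # 0
```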
4

Using a series solution, we have $$x=\sum_{n=0}^\infty \frac {P_n(A)}{(2n)!}\,t^{2n}$$ where the first polynomials are $$\left( \begin{array}{cc} n & P_n(A) \\ 0 & 1 \\ 1 & 1 \\ 2 & 2 A+1 \\ 3 & 16 A^2+10 A+1 \\ 4 & 272 A^3+216 A^2+42 A+1 \\ 5 & 7936 A^4+7760 A^3+2232 A^2+170 A+1 \\ 6 & 353792 A^5+412736 A^4+157664 A^3+21232 A^2+682 A+1 \\ \end{array} \right)$$
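These polynomials can be regenerated from the Cauchy-product recurrence: writing $x = \sum_n c_n t^n$, the ODE gives $(n+2)(n+1)\,c_{n+2} = c_n + A\sum_{j=0}^{n}(j+1)c_{j+1}(n-j+1)c_{n-j+1}$. The following sketch (mine, using Python with sympy; not part of the answer) implements it:

```python
import sympy as sp

A = sp.symbols('A')

def taylor_coeffs(N):
    """Coefficients c_n of x(t) = sum c_n t^n for x'' = x + A x'^2,
    x(0) = 1, x'(0) = 0, as exact polynomials in A."""
    c = [sp.Integer(1), sp.Integer(0)]  # c_0 = x(0), c_1 = x'(0)
    for n in range(N - 1):
        # Cauchy product: t^n coefficient of (x')^2
        conv = sum((j + 1)*c[j + 1]*(n - j + 1)*c[n - j + 1]
                   for j in range(n + 1))
        c.append(sp.expand((c[n] + A*conv) / ((n + 2)*(n + 1))))
    return c

c = taylor_coeffs(12)
# P_n(A) = (2n)! c_{2n}; the odd coefficients all vanish
P = [sp.expand(sp.factorial(2*n)*c[2*n]) for n in range(6)]
print(P[:4])  # [1, 1, 2*A + 1, 16*A**2 + 10*A + 1]
```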

This gives

$$\frac {dx}{dA} \bigg\rvert_{A=0} =\frac 1 {12} \sum_{n=1}^\infty \frac {a_n}{b_n}\, t^{2(n+1)}$$ where the $a_n$ and $b_n$ form sequences A002675 and A002676 in the OEIS (have a look at their definitions in the comments).

This is exactly $$\frac {dx}{dA} \bigg\rvert_{A=0} =\frac 1 {12}\,e^{-2 t} \,\left(e^t-1\right)^4$$

already given by @Jean Daviau.

2

Given the dynamical system

$$ \dot x = f(x,t,\theta) $$

with initial conditions $x(0)=x_0$, where $x = (x_1,\cdots,x_n)$ and $\theta=(\theta_1,\cdots,\theta_p)$. Here $\theta$ denotes the unknown parameters.

We know

$$ \frac{\partial\dot x}{\partial\theta} = \frac{\partial f}{\partial x}\frac{\partial x}{\partial \theta}+\frac{\partial f}{\partial\theta} $$

now calling

$$ s = \frac{\partial x}{\partial \theta} $$

we have

$$ \dot s = \frac{\partial f}{\partial x}s+\frac{\partial f}{\partial \theta} $$

  • Case study. Consider the dynamical system

$$ \begin{array}{rcl} \dot x_1 & = & x_2 \\ \dot x_2 & = & x_1 + a x_2^2 \\ \end{array} $$

with $x_1(0)=1,\ x_2(0)=0$ then

$$ \frac{\partial f}{\partial x} = \left( \begin{array}{cc} 0 & 1 \\ 1 & 2ax_2 \\ \end{array} \right) $$

$$ \frac{\partial f}{\partial \theta} = \left( \begin{array}{c} 0 \\ x_2^2 \\ \end{array} \right) $$

The following set of DEs solves the coupled system:

$$ \begin{array}{rcl} \dot x_1 & = & x_2 \\ \dot x_2 & = & x_1+a x_2^2 \\ x_1(0) & = & 1\\ x_2(0) & = & 0\\ \dot s_1 & = &s_2\\ \dot s_2 & = & 2 as_2x_2+s_1+x_2^2\\ s_1(0)& = & 0\\ s_2(0)& = & 0 \end{array} $$

Now solving for $a=0$ we have:

$$ \begin{array}{rcl} x_1 & = & \cosh t\\ x_2 & = & \sinh t\\ s_1 & = & \frac{4}{3}\sinh^4\left(\frac{t}{2}\right) = \frac{1}{12}e^{-2t}(e^t-1)^4\\ s_2 & = & \frac{4}{3}\sinh^2\left(\frac{t}{2}\right) \sinh(t) \end{array} $$
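The coupled system can be integrated numerically to corroborate the closed form for $s_1$; this sketch (mine, assuming Python with scipy and numpy; not part of the answer) is one way:

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, a):
    """State (x1, x2, s1, s2) with the sensitivity equations appended."""
    x1, x2, s1, s2 = y
    return [x2,
            x1 + a*x2**2,
            s2,
            2*a*s2*x2 + s1 + x2**2]

ts = np.linspace(0.0, 2.0, 50)
sol = solve_ivp(rhs, (0, 2), [1, 0, 0, 0], args=(0.0,),
                t_eval=ts, rtol=1e-10, atol=1e-12)

# Closed form s1 = (4/3) sinh^4(t/2)
s1_exact = (4/3)*np.sinh(ts/2)**4
print(np.max(np.abs(sol.y[2] - s1_exact)))  # tiny (limited by solver tolerance)
```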

Cesareo
2

In the spirit of perturbation theory, let's assume that the solution to the ODE $$ \ddot{x}=x+A\dot{x}^2 \tag{1} $$ can be expanded in a power series in $A$: $$ x(t;A)=x_0(t)+Ax_1(t)+A^2x_2(t)+\ldots=\sum_{n=0}^{\infty}A^nx_n(t). \tag{2} $$ It follows from $(2)$ that $$ \left.\frac{\partial x(t;A)}{\partial A}\right|_{A=0}=x_1(t). \tag{3} $$ Since the initial conditions $x(0)=1$ and $\dot{x}(0)=0$ don't depend on $A$, the functions $x_n(t)$ in $(2)$ must satisfy \begin{align} x_0(0)&=1, \tag{4.1} \\ x_n(0)&=0\quad(n\in\mathbb{N}_{>0}), \tag{4.2} \\ \dot{x}_n(0)&=0\quad(n\in\mathbb{N}). \tag{4.3} \end{align} Finding $x_1(t)$ is now straightforward: substituting $(2)$ in $(1)$ and applying the initial conditions $(4)$, we get $$ \ddot{x}_0=x_0 \implies x_0(t)=\cosh t \tag{5} $$ and $$ \ddot{x}_1=x_1+\dot{x}_0^2=x_1+\sinh^2t \implies x_1(t) =\frac{1}{2}-\frac{2}{3}\cosh t+\frac{1}{6}\cosh(2t). \tag{6} $$

Gonçalo