
Consider the following cost function

$$ E[f] = \int_{\mathbb{R}^n} \sum_{|\alpha| = 2} \binom{2}{\alpha} |D^{\alpha}f|^2 \, dx_1 \dots dx_n $$

Here $\alpha$ is a multi-index and $D^{\alpha}f$ denotes the corresponding partial derivative. I am trying to find $f$ such that $E[f] = 0$. Since the integrand is a sum of non-negative contributions, $E[f] = 0$ iff

$$ \left\{ \begin{array}{ll} \frac{\partial^2 f}{\partial x_i^2} = 0 & i = 1,\dots,n \\ \frac{\partial^2 f}{\partial x_i \partial x_j} = 0 & i = 1,\dots,n, \;\; j = i+1, \dots, n \end{array} \right. $$
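
For instance, for $n = 2$ the multi-indices with $|\alpha| = 2$ are $(2,0)$, $(1,1)$ and $(0,2)$, with multinomial coefficients $1$, $2$ and $1$, so the integrand expands to

$$ \left(\frac{\partial^2 f}{\partial x_1^2}\right)^2 + 2\left(\frac{\partial^2 f}{\partial x_1 \partial x_2}\right)^2 + \left(\frac{\partial^2 f}{\partial x_2^2}\right)^2, $$

a sum of squares, which can vanish only if every second derivative vanishes.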

I am a bit confused about how to solve this system, but I know the final result should be

$$ f(x_1,\ldots,x_n) = b + a_1x_1 + \dots + a_n x_n $$

Maybe this is very trivial, but I get confused about how to carry out the calculations. Can you help?
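
As a sanity check (not a proof), here is a minimal SymPy sketch for $n = 2$; the symbol names are just illustrative:

```python
import sympy as sp

x1, x2, a1, a2, b = sp.symbols('x1 x2 a1 a2 b', real=True)

def integrand(f):
    """Integrand of E[f] for n = 2: sum over |alpha| = 2 of
    binom(2, alpha) * (D^alpha f)^2, i.e. the multi-indices
    (2,0), (1,1), (0,2) with multinomial coefficients 1, 2, 1."""
    return (sp.diff(f, x1, 2)**2
            + 2 * sp.diff(f, x1, x2)**2
            + sp.diff(f, x2, 2)**2)

# Linear candidate: every second derivative vanishes, so E[f] = 0.
f_lin = b + a1*x1 + a2*x2
print(sp.simplify(integrand(f_lin)))   # 0

# Non-linear candidate: the integrand is strictly positive.
f_quad = x1*x2
print(sp.simplify(integrand(f_quad)))  # 2
```

The linear candidate gives an identically zero integrand, while $f = x_1 x_2$ gives a strictly positive one, consistent with the claim above.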

user8469759
  • Have you tried the calculation with $n = 2, 3, \dots$ to see what happens? If so, try to generalise the calculation or use induction. Also, the indexing for the cross-derivative term seems to imply that the condition only holds when $i < j$, but by Clairaut's theorem it must then hold for all $j \ne i$. – Matthew Cassell Aug 03 '24 at 03:11
  • I tried to start with the non-mixed derivatives, and for each one I obtain $f = a_i x_i + b_i$ where the coefficients don't depend on the $i$-th variable. But I don't know how to use the mixed derivatives and how to reach the conclusion. – user8469759 Aug 03 '24 at 09:12
  • Try $n=2$. Note that $f\in C^{2} \implies f_{x_{1}x_{2}}=f_{x_{2}x_{1}}$, so $$f_{x_{1} x_{2}}=0 \implies f=a_{1}(x_{1})+a_{2}(x_{2}) \tag 1$$ Further \begin{align} f_{x_{1}x_{1}}=0\implies f&=c_{1}+b_{1}x_{1}+\hat{b}_{1}(x_{2}) \tag 2 \\ f_{x_{2}x_{2}}=0\implies f&=c_{2}+b_{2}x_{2}+\hat{b}_{2}(x_{1}) \tag 3 \end{align} Equating, \begin{align} a_{1}(x_{1})&=c_{1}+b_{1}x_{1}=\hat{b}_{2}(x_{1}) \\ a_{2}(x_{2})&=c_{2}+b_{2}x_{2}=\hat{b}_{1}(x_{2}) \end{align} so from $(1)$ \begin{align} f&=c_{1}+b_{1}x_{1}+c_{2}+b_{2}x_{2} \\ &=c+b_{1}x_{1}+b_{2}x_{2}. \end{align} Now try to generalise. – Matthew Cassell Aug 05 '24 at 01:47
  • Maybe this is silly, but how do you know $(1)$ is true? – user8469759 Aug 06 '24 at 19:58
  • You integrate sequentially: \begin{align} f_{x_{1} x_{2}} = 0 \implies f_{x_{1}} &= a(x_{1}) \\ \implies f &= \int a(x_{1})\, dx_{1} \\ &= a_{1}(x_{1}) + a_{2}(x_{2}), \end{align} where $a_{2}(x_{2})$ is the "constant" of integration with respect to $x_{1}$. – Matthew Cassell Aug 07 '24 at 00:04
  • And for $(2)$ and $(3)$ where do $c_1, c_2$ come from? Wouldn't it just be $f = b_1(x_2) x_1 + b_2(x_2)$ for $(2)$ for example? – user8469759 Aug 07 '24 at 14:28
  • The form of $f$ in $(1)$ precludes solutions of the form $f = x_{1} b_{1}(x_{2}) + b_{2}(x_{2}) = B_{1}(x_{1}, x_{2}) + b_{2}(x_{2})$, while the $c_{i}$ can be absorbed into the $\hat{b}_{i}$ but I prefer to make things explicit. – Matthew Cassell Aug 07 '24 at 23:56
  • You can always start with $(2)$ and $(3)$ instead to get \begin{align} f_{x_{1} x_{1}} = 0 \implies f &= c_{1} + x_{1} b_{1}(x_{2}) + \hat{b}_{1}(x_{2}) \\ f_{x_{2} x_{2}} = 0 \implies f &= c_{2} + x_{2} b_{2}(x_{1}) + \hat{b}_{2}(x_{1}) \end{align} then differentiate \begin{align} f_{x_{1} x_{2}} &= b_{1}(x_{2})' = 0 \implies b_{1}(x_{2}) = b_{1} = \text{constant} \\ f_{x_{2} x_{1}} &= b_{2}(x_{1})' = 0 \implies b_{2}(x_{1}) = b_{2} = \text{constant} \end{align} – Matthew Cassell Aug 07 '24 at 23:56
  • Just out of curiosity, why doesn't such a term pop out of the Euler–Lagrange equations? (I think I understood anyway, I'll post an answer soon.) – user8469759 Aug 12 '24 at 11:31
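
A possible way to organise the general-$n$ version of the argument sketched in the comments above (assuming $f \in C^{2}(\mathbb{R}^n)$): if every second partial derivative of $f$ vanishes, then each first partial derivative $\frac{\partial f}{\partial x_i}$ has zero gradient on the connected set $\mathbb{R}^n$, hence is a constant $a_i$. The function $g(x) = f(x) - \sum_{i=1}^{n} a_i x_i$ then satisfies $\nabla g = 0$, so $g \equiv b$ for some constant $b$, and

$$ f(x_1,\ldots,x_n) = b + a_1 x_1 + \dots + a_n x_n, $$

which is exactly the expected form.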
