
I am trying to minimize the function

$$\int_{0}^{1} y(t)\sqrt{1+(y'(t))^2} dt$$ under the constraints that $y(0)=y(1)=0$ and $\int_{0}^{1}\sqrt{1+(y'(t))^2} dt=2$. My intuition is can you not just set $y(t)=0$ and have the minimum be zero? I feel like there is something I am not understanding. Could someone help me out?

Roo
    If $y(t)=0$, then the last condition, the one containing the definite integral, is not satisfied. Remark: this condition says that the curve $y=y(t)$ between $t=0$ and $t=1$ has length equal to $2$ units. – Marian G. Apr 21 '19 at 16:35
    So if $y'(t)=\sqrt{3}$, that would satisfy the integral constraint, since $\int_{0}^{1}\sqrt{1+3}\,dt=2$. Then I just need to integrate $y'(t)$ to get $y(t)$ and use the boundary conditions $y(0)=y(1)=0$ to find the constant of integration. Does that sound like the correct approach? The only issue I am running into now is that $y(t)=\sqrt{3}\,t+c$, and $y(0)=0$ forces $c=0$, while $y(1)=0$ forces $c=-\sqrt{3}$, a contradiction. – Roo Apr 21 '19 at 16:54
    I don't think that is the correct approach. How do you know that $y(t)$ is linear? A rigorous way would be the variational calculus approach, which requires solving the so-called Euler–Lagrange equation; see e.g. here. – Marian G. Apr 21 '19 at 17:23
    Searching on MSE, I found a problem similar to yours, see here, solved by Han de Bruijn. Unfortunately, he does not solve the derived differential equation $(y')^2-y\cdot y''+1=0$ in general; that part is therefore left to you. I hope the link above can further encourage you to find the complete solution. – Marian G. Apr 22 '19 at 14:30
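As a sketch of where that ODE comes from (this is my own reasoning, not asserted in the thread): the integrand $y\sqrt{1+(y')^2}$ has no explicit $t$-dependence, so Beltrami's identity gives $y/\sqrt{1+(y')^2}=C$; differentiating and eliminating $C$ yields exactly $(y')^2-y\,y''+1=0$, and catenaries $y(t)=C\cosh((t-t_0)/C)$ satisfy it. (Handling the length constraint properly would add a Lagrange multiplier, replacing $y$ by $y+\lambda$ in the integrand.) A quick symbolic check with sympy:

```python
import sympy as sp

t = sp.symbols('t')
C, t0 = sp.symbols('C t0', positive=True)

# Candidate solution of (y')^2 - y*y'' + 1 = 0: a catenary.
# (For the constrained problem, y would be shifted by a multiplier lambda.)
y = C * sp.cosh((t - t0) / C)

# Plug the catenary into the left-hand side of the ODE;
# sinh^2 - cosh^2 + 1 simplifies to 0.
ode_lhs = y.diff(t)**2 - y * y.diff(t, 2) + 1
print(sp.simplify(ode_lhs))  # -> 0
```

This only verifies that the catenary family solves the ODE; fitting $C$, $t_0$ (and the multiplier) to the boundary and length conditions is the part the comment leaves open.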

0 Answers