Saturday 24 March 2018

optimization - Proving the shortest function connecting two points is a straight line WITHOUT assuming the Euler-Lagrange equation



For learning purposes, I'm trying to prove that the shortest curve $y(x)$ passing through the two points $(x_1, y_1)$, $(x_2, y_2)$ is a straight line, without using the Euler-Lagrange equation.



My attempt at a proof is below. I think I have most of it, but I get stuck at the end.
How do I finish the proof?







My attempt:



Assume $y$ is the function of $x$ to be determined, which satisfies $y(x_1) = y_1$ and $y(x_2) = y_2$.
I define the length to be minimized as follows:



$$L(y) = \int_{x_1}^{x_2} \sqrt{1 + \left(\frac{dy}{dx}\right)^2} \,dx$$
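As a quick numerical sanity check of this functional (a minimal sketch; the midpoint-rule integrator and the choice of a parabola as the curved competitor are illustrative assumptions, not part of the question), one can compare $L$ for the straight line and for a curved path joining the same endpoints:

```python
import math

def arc_length(dy_dx, x1, x2, n=50_000):
    """Approximate L(y) = integral of sqrt(1 + y'(x)^2) dx by the midpoint rule."""
    h = (x2 - x1) / n
    total = 0.0
    for i in range(n):
        x = x1 + (i + 0.5) * h
        total += math.sqrt(1.0 + dy_dx(x) ** 2)
    return total * h

# Straight line y = x and parabola y = x^2, both joining (0, 0) to (1, 1).
line = arc_length(lambda x: 1.0, 0.0, 1.0)       # exact value is sqrt(2)
parabola = arc_length(lambda x: 2.0 * x, 0.0, 1.0)
print(line, parabola)  # the straight line is strictly shorter
```

This only confirms the claim for one competitor, of course; the variational argument below is what rules out all of them.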



Consider perturbing $y$ by some multiple $\delta$ of an arbitrary deviation $w$ that vanishes at $x_1$ and $x_2$:



$$L(y, \delta) = \int_{x_1}^{x_2} \sqrt{1 + \left(\frac{d(y + \delta w)}{dx}\right)^2} \,dx$$




$$L(y, \delta) = \int_{x_1}^{x_2} \sqrt{1 + \left(\frac{dy}{dx} + \delta\frac{dw}{dx}\right)^2} \,dx$$



If $y$ minimizes $L$, then for any fixed $w$ the derivative of $L$ with respect to $\delta$ must vanish at $\delta = 0$; equivalently, it must approach zero as $\delta \to 0$:



$$
\frac{d}{d\delta}L(y, \delta)
= \frac{d}{d\delta}\int_{x_1}^{x_2} \sqrt{1 + \left(\frac{dy}{dx} + \delta\frac{dw}{dx}\right)^2} \,dx \\
= \int_{x_1}^{x_2} \frac{d}{d\delta} \sqrt{1 + \left(\frac{dy}{dx} + \delta\frac{dw}{dx}\right)^2} \,dx \\
= \int_{x_1}^{x_2} \frac{\left(\frac{dy}{dx} + \delta\frac{dw}{dx}\right) \frac{dw}{dx}}{\sqrt{1 + \left(\frac{dy}{dx} + \delta\frac{dw}{dx}\right)^2}} \,dx \to 0
$$




Specifically, since this holds true when $\delta = 0$, we have:



$$
\int_{x_1}^{x_2} \frac{\frac{dy}{dx} \frac{dw}{dx}}{\sqrt{1 + \left(\frac{dy}{dx}\right)^2}} \,dx = 0
$$
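This stationarity condition can also be checked numerically (a hedged sketch; the deviation $w(x) = \sin(\pi x)$ and the candidate curves below are my own illustrative choices): the integral vanishes when $y$ is a straight line but generally not otherwise.

```python
import math

def first_variation(dy_dx, dw_dx, x1, x2, n=50_000):
    """Midpoint-rule approximation of integral of y' w' / sqrt(1 + y'^2) dx."""
    h = (x2 - x1) / n
    total = 0.0
    for i in range(n):
        x = x1 + (i + 0.5) * h
        yp = dy_dx(x)
        total += yp * dw_dx(x) / math.sqrt(1.0 + yp ** 2)
    return total * h

# Deviation w = sin(pi x), which vanishes at x = 0 and x = 1 as required.
wp = lambda x: math.pi * math.cos(math.pi * x)

# Straight line y = x (y' = 1): the first variation is zero.
val_line = first_variation(lambda x: 1.0, wp, 0.0, 1.0)
# Curved candidate y = x^2 (y' = 2x): the first variation is not zero.
val_parabola = first_variation(lambda x: 2.0 * x, wp, 0.0, 1.0)
print(val_line, val_parabola)
```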



Now here's where I get stuck:



I need to get rid of $w$ somehow.




Because the equality above needs to hold for all $w$, I would like to pick $w$ to be something convenient that makes my life easier; say, $\frac{dw}{dx} = \sqrt{1 + \left(\frac{dy}{dx}\right)^2}$, to cancel the denominator.
But I cannot assume this is possible, as $w$ must satisfy two boundary conditions:
it must vanish at both $x_1$ and $x_2$.



How am I supposed to proceed?


Answer



Well, the two final steps might be the following.



First, integrate the last expression by parts:



$$\int_{x_1}^{x_2} \frac{y'w'}{\sqrt{1+ (y')^2}} \,dx
= \left. \frac{y'w}{\sqrt{1+ (y')^2}} \right|_{x_1}^{x_2} - \int_{x_1}^{x_2} w \, \frac{y''}{\left(1+ (y')^2\right)^{\frac{3}{2}}} \,dx $$
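The identity can be verified numerically for concrete choices (a sketch; the particular $y = x^2$ and $w = \sin(\pi x)$ below are hypothetical examples, chosen only because $w$ vanishes at both endpoints):

```python
import math

def integrate(g, x1, x2, n=50_000):
    """Midpoint-rule quadrature of g over [x1, x2]."""
    h = (x2 - x1) / n
    return sum(g(x1 + (i + 0.5) * h) for i in range(n)) * h

# Example curve y = x^2 on [0, 1] and deviation w = sin(pi x), w(0) = w(1) = 0.
yp  = lambda x: 2.0 * x                      # y'
ypp = lambda x: 2.0                          # y''
w   = lambda x: math.sin(math.pi * x)
wp  = lambda x: math.pi * math.cos(math.pi * x)

lhs = integrate(lambda x: yp(x) * wp(x) / math.sqrt(1 + yp(x) ** 2), 0.0, 1.0)
# The boundary term vanishes since w(0) = w(1) = 0, leaving only the integral term.
rhs = -integrate(lambda x: w(x) * ypp(x) / (1 + yp(x) ** 2) ** 1.5, 0.0, 1.0)
print(lhs, rhs)  # the two sides agree
```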



Since $w(x)$ vanishes at the endpoints, the boundary term drops out and we have



$$ \int_{x_1}^{x_2} w \, \frac{y''}{\left(1+ (y')^2\right)^{\frac{3}{2}}} \,dx = 0 $$
for any admissible choice of $w$.



The second step is the following. Consider a family of functions $w_{x_0, n}(x)$
(where $x_0$ is a point in $(x_1, x_2)$ and $n$ ranges over the natural numbers large enough that $(x_0 - \frac{1}{n}, x_0 + \frac{1}{n}) \subset (x_1, x_2)$) satisfying:





  • $w_{x_0, n}(x_1) = w_{x_0, n}(x_2) = 0 $

  • $w_{x_0, n}(x)$ is zero outside $(x_0 - \frac{1}{n}, x_0 + \frac{1}{n})$

  • $w_{x_0, n}(x)$ is positive on $( x_0 - \frac{1}{n}, x_0 + \frac{1}{n} )$

  • $\int_{x_1}^{x_2} w_{x_0, n}(x) \,dx = 1$ (a normalization that makes the limit below come out cleanly)



Since $\frac{y''}{\left(1+(y')^2\right)^{3/2}}$ is continuous, it's easy to show that



$$ \lim\limits_{n \rightarrow + \infty} \int_{x_1}^{x_2} w_{x_0, n} \frac{y''}{{(1+ (y')^2)}^{\frac{3}{2}}} dx = \frac{ y''(x_0)}{{(1+ (y'(x_0))^2)}^{\frac{3}{2}}}. $$
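This limit is the usual mollifier/bump-function argument, and it can be illustrated numerically (a sketch under my own assumptions: a smooth bump normalized to unit integral, and an arbitrary continuous $f$ standing in for $\frac{y''}{(1+(y')^2)^{3/2}}$):

```python
import math

def bump(x, x0, n):
    """Smooth bump supported on (x0 - 1/n, x0 + 1/n): positive inside, zero outside."""
    t = n * (x - x0)
    if abs(t) >= 1.0:
        return 0.0
    return math.exp(-1.0 / (1.0 - t * t))

def weighted_average(f, x0, n, samples=20_000):
    """Integral of w_{x0,n} f over its support, with w normalized to unit integral."""
    a, b = x0 - 1.0 / n, x0 + 1.0 / n
    h = (b - a) / samples
    num = den = 0.0
    for i in range(samples):
        x = a + (i + 0.5) * h
        w = bump(x, x0, n)
        num += w * f(x)
        den += w
    return num / den

# As n grows, the weighted averages converge to f(x0), here with f = cos, x0 = 0.3.
f, x0 = math.cos, 0.3
vals = [weighted_average(f, x0, n) for n in (2, 8, 32)]
print(vals, f(x0))
```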




But we know that each of these integrals is equal to 0, so $y''(x_0) = 0$ for every $x_0 \in (x_1, x_2)$. By continuity you obtain that $y'' \equiv 0$ on $\lbrack x_1, x_2 \rbrack$, and thus $y$ is a linear function of $x$.

