== Euler–Lagrange equation ==
{{main|Euler–Lagrange equation}}

Finding the extrema of functionals is similar to finding the maxima and minima of functions. The maxima and minima of a function may be located by finding the points where its derivative vanishes (i.e., is equal to zero). The extrema of functionals may be obtained by finding functions for which the [[functional derivative]] is equal to zero. This leads to solving the associated [[Euler–Lagrange equation]].{{efn|The following derivation of the Euler–Lagrange equation corresponds to the derivation on pp. 184–185 of Courant & Hilbert (1953).<ref>{{cite book |author=Courant, R. |author-link=Richard Courant |author2=Hilbert, D. |author2-link=David Hilbert |title=Methods of Mathematical Physics |volume=I |edition=First English |publisher=Interscience Publishers, Inc. |year=1953 |location=New York |isbn=978-0471504474}}</ref>}}

Consider the functional
<math display="block">J[y] = \int_{x_1}^{x_2} L\left(x,y(x),y'(x)\right)\, dx \, ,</math>
where
*<math>x_1, x_2</math> are [[Constant (mathematics)|constants]],
*<math>y(x)</math> is twice continuously differentiable,
*<math>y'(x) = \frac{dy}{dx},</math>
*<math>L\left(x, y(x), y'(x)\right)</math> is twice continuously differentiable with respect to its arguments <math>x, y,</math> and <math>y'.</math>

If the functional <math>J[y]</math> attains a [[local minimum]] at <math>f,</math> and <math>\eta(x)</math> is an arbitrary function that has at least one derivative and vanishes at the endpoints <math>x_1</math> and <math>x_2,</math> then for any number <math>\varepsilon</math> close to 0,
<math display="block">J[f] \le J[f + \varepsilon \eta] \, .</math>

The term <math>\varepsilon \eta</math> is called the '''variation''' of the function <math>f</math> and is denoted by <math>\delta f.</math><ref name='CourHilb1953P184'/>{{efn|Note that <math>\eta(x)</math> and <math>f(x)</math> are evaluated at the {{em|same}} values of <math>x,</math> which is not valid more generally in variational calculus with non-holonomic constraints.}}

Substituting <math>f + \varepsilon \eta</math> for <math>y</math> in the functional <math>J[y],</math> the result is a function of <math>\varepsilon,</math>
<math display="block">\Phi(\varepsilon) = J[f+\varepsilon\eta] \, .</math>
Since the functional <math>J[y]</math> has a minimum for <math>y = f,</math> the function <math>\Phi(\varepsilon)</math> has a minimum at <math>\varepsilon = 0</math> and thus,{{efn|The product <math>\varepsilon \Phi'(0)</math> is called the first variation of the functional <math>J</math> and is denoted by <math>\delta J.</math> Some references define the [[first variation]] differently by leaving out the <math>\varepsilon</math> factor.}}
<math display="block">\Phi'(0) \equiv \left.\frac{d\Phi}{d\varepsilon}\right|_{\varepsilon = 0} = \int_{x_1}^{x_2} \left.\frac{dL}{d\varepsilon}\right|_{\varepsilon = 0} dx = 0 \, .</math>

Taking the [[total derivative]] of <math>L\left(x, y, y'\right),</math> where <math>y = f + \varepsilon \eta</math> and <math>y' = f' + \varepsilon \eta'</math> are considered as functions of <math>\varepsilon</math> rather than <math>x,</math> yields
<math display="block">\frac{dL}{d\varepsilon}=\frac{\partial L}{\partial y}\frac{dy}{d\varepsilon} + \frac{\partial L}{\partial y'}\frac{dy'}{d\varepsilon}</math>
and because <math>\frac{dy}{d \varepsilon} = \eta</math> and <math>\frac{d y'}{d \varepsilon} = \eta',</math>
<math display="block">\frac{dL}{d\varepsilon}=\frac{\partial L}{\partial y}\eta + \frac{\partial L}{\partial y'}\eta'.</math>

Therefore,
<math display="block">\begin{align} \int_{x_1}^{x_2} \left.\frac{dL}{d\varepsilon}\right|_{\varepsilon = 0} dx & = \int_{x_1}^{x_2} \left(\frac{\partial L}{\partial f} \eta + \frac{\partial L}{\partial f'} \eta'\right)\, dx \\ & = \int_{x_1}^{x_2} \frac{\partial L}{\partial f} \eta \, dx + \left.\frac{\partial L}{\partial f'} \eta \right|_{x_1}^{x_2} - \int_{x_1}^{x_2} \eta \frac{d}{dx}\frac{\partial L}{\partial f'} \, dx \\ & = \int_{x_1}^{x_2} \left(\frac{\partial L}{\partial f} \eta - \eta \frac{d}{dx}\frac{\partial L}{\partial f'} \right)\, dx \, , \end{align}</math>
where <math>L\left(x, y, y'\right) \to L\left(x, f, f'\right)</math> when <math>\varepsilon = 0,</math> and we have used [[integration by parts]] on the second term. The second term on the second line vanishes because <math>\eta = 0</math> at <math>x_1</math> and <math>x_2</math> by definition. Also, as previously mentioned, the left side of the equation is zero, so that
<math display="block">\int_{x_1}^{x_2} \eta (x) \left(\frac{\partial L}{\partial f} - \frac{d}{dx}\frac{\partial L}{\partial f'} \right) \, dx = 0 \, .</math>

According to the [[fundamental lemma of calculus of variations]], the part of the integrand in parentheses is zero, i.e.
<math display="block">\frac{\partial L}{\partial f} -\frac{d}{dx} \frac{\partial L}{\partial f'}=0 \, ,</math>
which is called the '''Euler–Lagrange equation'''. The left hand side of this equation is called the [[functional derivative]] of <math>J[f]</math> and is denoted <math>\delta J/\delta f(x).</math>

In general this gives a second-order [[ordinary differential equation]] which can be solved to obtain the extremal function <math>f(x).</math> The Euler–Lagrange equation is a [[Necessary condition|necessary]], but not [[Sufficient condition|sufficient]], condition for an extremum of <math>J[f].</math> A sufficient condition for a minimum is given in the section [[#Variations and sufficient condition for a minimum|Variations and sufficient condition for a minimum]].
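For a concrete Lagrangian, the Euler–Lagrange equation can also be generated by a computer algebra system. The following minimal sketch uses SymPy's <code>euler_equations</code> helper (assuming SymPy is installed); the Lagrangian <math>L = \tfrac{1}{2}y'^2 - \tfrac{1}{2}y^2</math> is an arbitrary illustrative choice, not one used elsewhere in this article.

<syntaxhighlight lang="python">
from sympy import Function, symbols
from sympy.calculus.euler import euler_equations

x = symbols('x')
y = Function('y')

# Illustrative Lagrangian (an arbitrary choice for this sketch):
# L = y'^2/2 - y^2/2, whose Euler-Lagrange equation is y'' + y = 0.
L = y(x).diff(x)**2 / 2 - y(x)**2 / 2

# euler_equations forms dL/dy - d/dx (dL/dy') = 0 for us.
print(euler_equations(L, y(x), x))
# -> [Eq(-y(x) - Derivative(y(x), (x, 2)), 0)], i.e. y'' + y = 0
</syntaxhighlight>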
=== Example ===
In order to illustrate this process, consider the problem of finding the extremal function <math>y = f(x),</math> which is the shortest curve that connects two points <math>\left(x_1, y_1\right)</math> and <math>\left(x_2, y_2\right).</math> The [[arc length]] of the curve is given by
<math display="block">A[y] = \int_{x_1}^{x_2} \sqrt{1 + [ y'(x) ]^2} \, dx \, ,</math>
with
<math display="block">y'(x) = \frac{dy}{dx} \, , \ \ y_1=f(x_1) \, , \ \ y_2=f(x_2) \, .</math>
Note that assuming {{mvar|y}} is a function of {{mvar|x}} loses generality; ideally both should be functions of some other parameter. This restriction is adopted here solely for instructive purposes.
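Before applying the Euler–Lagrange equation, it is worth checking numerically that perturbing a straight line lengthens it. The following sketch (an illustration only, not part of the classical argument) approximates <math>A[y]</math> by quadrature; the endpoints <math>(0, 0)</math> and <math>(1, 2)</math> and the perturbation <math>\eta(x) = \sin(\pi x)</math> are arbitrary choices made for this sketch.

<syntaxhighlight lang="python">
import numpy as np

# Arbitrary illustrative endpoints: (x1, y1) = (0, 0), (x2, y2) = (1, 2).
x = np.linspace(0.0, 1.0, 10001)
f = 2.0 * x                    # straight line through the endpoints
eta = np.sin(np.pi * x)        # vanishes at x1 and x2, as eta must

def arc_length(y):
    """Approximate A[y], the integral of sqrt(1 + y'^2) dx."""
    dydx = np.gradient(y, x)   # finite-difference derivative y'(x)
    return np.trapz(np.sqrt(1.0 + dydx**2), x)

for eps in (0.0, 0.05, 0.2):
    print(f"eps = {eps:4.2f}:  A[f + eps*eta] = {arc_length(f + eps*eta):.6f}")
# The length is smallest at eps = 0, consistent with J[f] <= J[f + eps*eta].
</syntaxhighlight>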
The Euler–Lagrange equation will now be used to find the extremal function <math>f(x)</math> that minimizes the functional <math>A[y],</math>
<math display="block">\frac{\partial L}{\partial f} -\frac{d}{dx} \frac{\partial L}{\partial f'}=0 \, ,</math>
with
<math display="block">L = \sqrt{1 + [ f'(x) ]^2} \, .</math>
Since <math>f</math> does not appear explicitly in <math>L,</math> the first term in the Euler–Lagrange equation vanishes for all <math>f(x)</math> and thus,
<math display="block">\frac{d}{dx} \frac{\partial L}{\partial f'} = 0 \, .</math>
Substituting for <math>L</math> and taking the derivative,
<math display="block">\frac{d}{dx} \ \frac{f'(x)} {\sqrt{1 + [f'(x)]^2}} \ = 0 \, .</math>

Thus
<math display="block">\frac{f'(x)}{\sqrt{1+[f'(x)]^2}} = c \, ,</math>
for some constant <math>c.</math> Then
<math display="block">\frac{[f'(x)]^2}{1+[f'(x)]^2} = c^2 \, ,</math>
where
<math display="block">0 \le c^2<1.</math>
Solving, we get
<math display="block">[f'(x)]^2=\frac{c^2}{1-c^2} \, ,</math>
which implies that
<math display="block">f'(x)=m</math>
is a constant and therefore that the shortest curve that connects two points <math>\left(x_1, y_1\right)</math> and <math>\left(x_2, y_2\right)</math> is
<math display="block">f(x) = m x + b \qquad \text{with} \ \ m = \frac{y_2 - y_1}{x_2 - x_1} \quad \text{and} \quad b = \frac{x_2 y_1 - x_1 y_2}{x_2 - x_1} \, ,</math>
and we have thus found the extremal function <math>f(x)</math> that minimizes the functional <math>A[y]</math> so that <math>A[f]</math> is a minimum. The equation for a straight line is <math>y = mx+b.</math> In other words, the shortest distance between two points is a straight line.{{efn|name=ArchimedesStraight| As a historical note, this is an axiom of [[Archimedes]]. See e.g. Kelland (1843).<ref>{{cite book |last=Kelland |first=Philip |author-link=Philip Kelland |title=Lectures on the principles of demonstrative mathematics |year=1843 |page=58 |url=https://books.google.com/books?id=yQCFAAAAIAAJ&pg=PA58 |via=Google Books}}</ref>}}
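The same computation can be reproduced symbolically. A minimal sketch, again assuming SymPy is available, applies <code>euler_equations</code> to the arc-length Lagrangian of this example:

<syntaxhighlight lang="python">
from sympy import Function, simplify, sqrt, symbols
from sympy.calculus.euler import euler_equations

x = symbols('x')
f = Function('f')

# Arc-length Lagrangian from the example above.
L = sqrt(1 + f(x).diff(x)**2)

eq, = euler_equations(L, f(x), x)
print(simplify(eq))
# The printed equation reduces to f''(x) = 0 up to a nonzero factor
# (the factor (1 + f'^2)**(3/2) never vanishes), so the extremals
# are exactly the straight lines f(x) = m*x + b.
</syntaxhighlight>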