== Applications of derivatives ==

=== Optimization ===
If {{math|''f''}} is a [[differentiable function]] on {{math|ℝ}} (or an [[open interval]]) and {{math|''x''}} is a [[local maximum]] or a [[local minimum]] of {{math|''f''}}, then the derivative of {{math|''f''}} at {{math|''x''}} is zero. Points where {{math|''f<nowiki>'</nowiki>''(''x'') {{=}} 0}} are called ''[[critical point (mathematics)|critical points]]'' or ''[[stationary point]]s'' (and the value of {{math|''f''}} at {{math|''x''}} is called a '''critical value'''). If {{math|''f''}} is not assumed to be everywhere differentiable, then points at which it fails to be differentiable are also designated critical points.

If {{math|''f''}} is twice differentiable, then conversely, a critical point {{math|''x''}} of {{math|''f''}} can be analysed by considering the [[second derivative]] of {{math|''f''}} at {{math|''x''}}:
* if it is positive, {{math|''x''}} is a local minimum;
* if it is negative, {{math|''x''}} is a local maximum;
* if it is zero, then {{math|''x''}} could be a local minimum, a local maximum, or neither. (For example, {{math|''f''(''x'') {{=}} ''x''<sup>3</sup>}} has a critical point at {{math|''x'' {{=}} 0}}, but it has neither a maximum nor a minimum there, whereas {{math|''f''(''x'') {{=}} ±''x''<sup>4</sup>}} has a critical point at {{math|''x'' {{=}} 0}} and a minimum and a maximum, respectively, there.)

This is called the [[second derivative test]]. An alternative approach, called the [[first derivative test]], involves considering the sign of {{math|''f<nowiki>'</nowiki>''}} on each side of the critical point.

Taking derivatives and solving for critical points is therefore often a simple way to find local minima or maxima, which can be useful in [[Optimization (mathematics)|optimization]]. By the [[extreme value theorem]], a continuous function on a [[closed interval]] must attain its minimum and maximum values at least once. If the function is differentiable, the minima and maxima can only occur at critical points or endpoints. This also has applications in graph sketching: once the local minima and maxima of a differentiable function have been found, a rough plot of the graph can be obtained from the observation that it will be either increasing or decreasing between critical points.

In [[higher dimension]]s, a critical point of a [[Scalar (mathematics)|scalar value]]d function is a point at which the [[gradient]] is zero. The [[second derivative]] test can still be used to analyse critical points by considering the [[eigenvalue]]s of the [[Hessian matrix]] of second partial derivatives of the function at the critical point. If all of the eigenvalues are positive, then the point is a local minimum; if all are negative, it is a local maximum. If there are some positive and some negative eigenvalues, then the critical point is called a "[[saddle point]]", and if none of these cases hold (i.e., some of the eigenvalues are zero), then the test is considered to be inconclusive.
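This procedure can be carried out symbolically. The following minimal sketch (not part of the article; the example function {{math|''f''(''x'') {{=}} ''x''<sup>3</sup> − 3''x''}} and the use of the SymPy library are assumptions made purely for illustration) solves {{math|''f<nowiki>'</nowiki>''(''x'') {{=}} 0}} for the critical points and classifies each one with the second derivative test:

<syntaxhighlight lang="python">
# A minimal sketch of the second derivative test using SymPy.
# The function f(x) = x**3 - 3*x is an assumed example, not from the article.
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x
fp = sp.diff(f, x)        # first derivative: 3*x**2 - 3
fpp = sp.diff(f, x, 2)    # second derivative: 6*x

for c in sp.solve(fp, x):           # critical points satisfy f'(c) = 0
    curvature = fpp.subs(x, c)
    if curvature > 0:
        kind = 'local minimum'
    elif curvature < 0:
        kind = 'local maximum'
    else:
        kind = 'inconclusive'       # e.g. f(x) = x**3 at x = 0
    print(c, kind)
# prints: -1 local maximum, then 1 local minimum
</syntaxhighlight>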
==== Calculus of variations ====
{{Main|Calculus of variations}}
One example of an optimization problem is: Find the shortest curve between two points on a surface, assuming that the curve must also lie on the surface. If the surface is a plane, then the shortest curve is a line. But if the surface is, for example, egg-shaped, then the [[Shortest path problem|shortest path]] is not immediately clear. These paths are called [[geodesic]]s, and one of the most fundamental problems in the calculus of variations is finding geodesics. Another example is: Find the surface of smallest area filling in a closed curve in space. This surface is called a [[minimal surface]] and it, too, can be found using the calculus of variations.

=== Physics ===
Calculus is of vital importance in physics: many physical processes are described by equations involving derivatives, called [[differential equation]]s. Physics is particularly concerned with the way quantities change and develop over time, and the concept of the "'''[[time derivative]]'''" — the rate of change over time — is essential for the precise definition of several important concepts. In particular, the time derivatives of an object's position are significant in [[Newtonian physics]]:
* [[velocity]] is the derivative (with respect to time) of an object's displacement (distance from the original position);
* [[acceleration]] is the derivative (with respect to time) of an object's velocity, that is, the second derivative (with respect to time) of an object's position.

For example, if an object's position on a line is given by
:<math>x(t) = -16t^2 + 16t + 32, \,\!</math>
then the object's velocity is
:<math>\dot x(t) = x'(t) = -32t + 16, \,\!</math>
and the object's acceleration is
:<math>\ddot x(t) = x''(t) = -32, \,\!</math>
which is constant.
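As a check on the worked example above, the same derivatives can be computed symbolically; the sketch below (an illustration assuming SymPy, not part of the article) also solves {{math|''v''(''t'') {{=}} 0}} to find when the object momentarily stops at its highest point:

<syntaxhighlight lang="python">
# Differentiate the position x(t) = -16t^2 + 16t + 32 from the example above.
import sympy as sp

t = sp.symbols('t')
x = -16*t**2 + 16*t + 32
v = sp.diff(x, t)        # velocity: -32*t + 16
a = sp.diff(v, t)        # acceleration: -32 (constant)

print(v, a)
# The object momentarily stops where the velocity vanishes, i.e. at its peak:
print(sp.solve(v, t))    # [1/2]
</syntaxhighlight>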
=== Differential equations ===
{{Main|Differential equation}}
A differential equation is a relation between a collection of functions and their derivatives. An [[ordinary differential equation]] is a differential equation that relates functions of one variable to their derivatives with respect to that variable. A [[partial differential equation]] is a differential equation that relates functions of more than one variable to their [[partial derivative]]s. Differential equations arise naturally in the physical sciences, in mathematical modelling, and within mathematics itself. For example, [[Newton's second law]], which describes the relationship between acceleration and force, can be stated as the ordinary differential equation
:<math>F(t) = m\frac{d^2x}{dt^2}.</math>
The [[heat equation]] in one space variable, which describes how heat diffuses through a straight rod, is the partial differential equation
:<math>\frac{\partial u}{\partial t} = \alpha\frac{\partial^2 u}{\partial x^2}.</math>
Here {{math|''u''(''x'', ''t'')}} is the temperature of the rod at position {{math|''x''}} and time {{math|''t''}}, and {{math|''α''}} is a constant that depends on how fast heat diffuses through the rod.
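To make the heat equation concrete, here is a minimal numerical sketch, assuming a rod of length 1 with its ends held at temperature 0 and a sine-shaped initial profile (all assumptions for illustration, not from the article). It approximates the spatial second derivative {{math|∂<sup>2</sup>''u''/∂''x''<sup>2</sup>}} by finite differences and steps forward in time:

<syntaxhighlight lang="python">
# Explicit finite-difference sketch of u_t = alpha * u_xx on a rod of length 1.
import numpy as np

alpha = 1.0                       # diffusivity (assumed value)
n, steps = 50, 1000               # grid points and time steps
dx = 1.0 / (n - 1)
dt = 0.4 * dx**2 / alpha          # explicit scheme is stable for dt <= dx^2 / (2*alpha)

x = np.linspace(0.0, 1.0, n)
u = np.sin(np.pi * x)             # assumed initial temperature profile

for _ in range(steps):
    # centered second difference approximates the second spatial derivative u_xx
    u_xx = (u[2:] - 2*u[1:-1] + u[:-2]) / dx**2
    u[1:-1] += dt * alpha * u_xx  # forward Euler step in time
    u[0] = u[-1] = 0.0            # boundary condition: ends held at temperature 0

print(u.max())                    # the peak temperature has decayed from 1.0
</syntaxhighlight>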
=== Mean value theorem ===
{{Main|Mean value theorem}}
[[File:Mvt2.svg|thumb|The mean value theorem: For each differentiable function <math>f:[a,b]\to\R</math> with <math>a<b</math> there is a <math>c\in(a,b)</math> with <math>f'(c) = \tfrac{f(b) - f(a)}{b - a}</math>.]]
The mean value theorem gives a relationship between values of the derivative and values of the original function. If {{math|''f''(''x'')}} is a real-valued function and {{math|''a''}} and {{math|''b''}} are numbers with {{math|''a'' < ''b''}}, then the mean value theorem says that under mild hypotheses, the slope between the two points {{math|(''a'', ''f''(''a''))}} and {{math|(''b'', ''f''(''b''))}} is equal to the slope of the tangent line to {{math|''f''}} at some point {{math|''c''}} between {{math|''a''}} and {{math|''b''}}. In other words,
:<math>f'(c) = \frac{f(b) - f(a)}{b - a}.</math>
In practice, what the mean value theorem does is control a function in terms of its derivative. For instance, suppose that {{math|''f''}} has derivative equal to zero at each point. This means that its tangent line is horizontal at every point, so the function should also be horizontal. The mean value theorem proves that this must be true: the slope between any two points on the graph of {{math|''f''}} must equal the slope of one of the tangent lines of {{math|''f''}}. All of those slopes are zero, so any line from one point on the graph to another point will also have slope zero. But that says that the function does not move up or down, so it must be a horizontal line. More complicated conditions on the derivative lead to less precise but still highly useful information about the original function.

=== Taylor polynomials and Taylor series ===
{{Main|Taylor polynomial|Taylor series}}
The derivative gives the best possible linear approximation of a function at a given point, but this can be very different from the original function. One way of improving the approximation is to take a quadratic approximation. That is to say, the linearization of a real-valued function {{math|''f''(''x'')}} at the point {{math|''x''<sub>0</sub>}} is a linear [[polynomial]] {{math|''a'' + ''b''(''x'' − ''x''<sub>0</sub>)}}, and it may be possible to get a better approximation by considering a quadratic polynomial {{math|''a'' + ''b''(''x'' − ''x''<sub>0</sub>) + ''c''(''x'' − ''x''<sub>0</sub>)<sup>2</sup>}}. Still better might be a cubic polynomial {{math|''a'' + ''b''(''x'' − ''x''<sub>0</sub>) + ''c''(''x'' − ''x''<sub>0</sub>)<sup>2</sup> + ''d''(''x'' − ''x''<sub>0</sub>)<sup>3</sup>}}, and this idea can be extended to arbitrarily high-degree polynomials. For each one of these polynomials, there should be a best possible choice of coefficients {{math|''a''}}, {{math|''b''}}, {{math|''c''}}, and {{math|''d''}} that makes the approximation as good as possible.

In the [[Neighbourhood (mathematics)|neighbourhood]] of {{math|''x''<sub>0</sub>}}, the best possible choice for {{math|''a''}} is always {{math|''f''(''x''<sub>0</sub>)}}, and the best possible choice for {{math|''b''}} is always {{math|''f<nowiki>'</nowiki>''(''x''<sub>0</sub>)}}. The coefficients {{math|''c''}}, {{math|''d''}}, and those of higher degree are determined by higher derivatives of {{math|''f''}}: {{math|''c''}} should always be {{math|{{sfrac|''f<nowiki>''</nowiki>''(''x''<sub>0</sub>)|2}}}}, and {{math|''d''}} should always be {{math|{{sfrac|''f<nowiki>'''</nowiki>''(''x''<sub>0</sub>)|3!}}}}. Using these coefficients gives the '''Taylor polynomial''' of {{math|''f''}}. The Taylor polynomial of degree {{math|''d''}} is the polynomial of degree {{math|''d''}} which best approximates {{math|''f''}}, and its coefficients can be found by a generalization of the above formulas. [[Taylor's theorem]] gives a precise bound on how good the approximation is. If {{math|''f''}} is a polynomial of degree less than or equal to {{math|''d''}}, then the Taylor polynomial of degree {{math|''d''}} equals {{math|''f''}}.

The limit of the Taylor polynomials is an infinite series called the '''Taylor series'''. The Taylor series is frequently a very good approximation to the original function. Functions which are equal to their Taylor series are called [[analytic function]]s. It is impossible for functions with discontinuities or sharp corners to be analytic; moreover, there exist [[smooth function]]s which are also not analytic.

=== Implicit function theorem ===
{{Main|Implicit function theorem}}
Some natural geometric shapes, such as [[circle]]s, cannot be drawn as the [[graph of a function]]. For instance, if {{math|''f''(''x'', ''y'') {{=}} ''x''<sup>2</sup> + ''y''<sup>2</sup> − 1}}, then the circle is the set of all pairs {{math|(''x'', ''y'')}} such that {{math|''f''(''x'', ''y'') {{=}} 0}}. This set is called the zero set of {{math|''f''}}, and is not the same as the graph of {{math|''f''}}, which is a [[paraboloid]]. The implicit function theorem converts relations such as {{math|''f''(''x'', ''y'') {{=}} 0}} into functions. It states that if {{math|''f''}} is [[continuously differentiable]], then around most points, the zero set of {{math|''f''}} looks like graphs of functions pasted together. The points where this is not true are determined by a condition on the derivative of {{math|''f''}}. The circle, for instance, can be pasted together from the graphs of the two functions {{math|± {{sqrt|1 − ''x''<sup>2</sup>}}}}. In a neighborhood of every point on the circle except {{nobreak|(−1, 0)}} and {{nobreak|(1, 0)}}, one of these two functions has a graph that looks like the circle. (These two functions also happen to meet at {{nobreak|(−1, 0)}} and {{nobreak|(1, 0)}}, but this is not guaranteed by the implicit function theorem.) The implicit function theorem is closely related to the [[inverse function theorem]], which states when a function looks like graphs of [[invertible function]]s pasted together.
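A short numerical sketch of the circle example (an illustration assuming NumPy, not part of the theorem) checks that the two graphs {{math|''y'' {{=}} ±{{sqrt|1 − ''x''<sup>2</sup>}}}} do lie on the zero set of {{math|''f''}}:

<syntaxhighlight lang="python">
# The unit circle x^2 + y^2 - 1 = 0 covered by the graphs of two functions.
import numpy as np

def upper(x):
    return np.sqrt(1.0 - x**2)    # upper semicircle, y = +sqrt(1 - x^2)

def lower(x):
    return -np.sqrt(1.0 - x**2)   # lower semicircle, y = -sqrt(1 - x^2)

x = np.linspace(-1.0, 1.0, 5)
for branch in (upper, lower):
    y = branch(x)
    # every sampled (x, y) pair lies on the zero set of f(x, y) = x^2 + y^2 - 1
    assert np.allclose(x**2 + y**2 - 1.0, 0.0)

# Near (-1, 0) and (1, 0) the branches have vertical tangents, which is exactly
# where the implicit function theorem's condition on the derivative fails.
</syntaxhighlight>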