===Mean value inequality===
{{see also|Calculus on Euclidean space#Derivative of a map and chain rule}}
[[Jean Dieudonné]], in his classic treatise ''Foundations of Modern Analysis'', discards the mean value theorem and replaces it by the mean value inequality, since the proof is not constructive: one cannot compute the mean value, and in applications one needs only the mean inequality. [[Serge Lang]] in ''Analysis I'' uses the mean value theorem, in integral form, as an instant reflex, but this use requires the continuity of the derivative. If one uses the [[Henstock–Kurzweil integral]], one can have the mean value theorem in integral form without the additional assumption that the derivative be continuous, as every derivative is Henstock–Kurzweil integrable.

The reason there is no analog of mean value ''equality'' is the following: if {{math|''f'' : ''U'' → '''R'''<sup>''m''</sup>}} is a differentiable function (where {{math|''U'' ⊂ '''R'''<sup>''n''</sup>}} is open) and if {{math|''x'' + ''th''}}, {{math|''x'', ''h'' ∈ '''R'''<sup>''n''</sup>, ''t'' ∈ [0, 1]}} is the line segment in question (lying inside {{mvar|U}}), then one can apply the above parametrization procedure to each of the component functions {{math|1=''f<sub>i</sub>'' (''i'' = 1, …, ''m'')}} of ''f'' (in the above notation set {{math|1=''y'' = ''x'' + ''h''}}). In doing so one finds points {{math|''x'' + ''t<sub>i</sub>h''}} on the line segment satisfying
:<math>f_i(x+h) - f_i(x) = \nabla f_i (x + t_ih) \cdot h.</math>
But in general there will not be a ''single'' point {{math|''x'' + ''t''*''h''}} on the line segment satisfying
:<math>f_i(x+h) - f_i(x) = \nabla f_i (x + t^* h) \cdot h</math>
for all {{mvar|i}} ''simultaneously''.
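As an illustration (not part of the article's argument), the following sketch applies the one-dimensional mean value theorem to each component of a hypothetical map {{math|''f''(''t'') {{=}} (''t''², ''t''³)}} on {{math|[0, 1]}} and shows that the intermediate points {{math|''t''₁}} and {{math|''t''₂}} obtained per component differ:

```python
import math

# For f(t) = (t^2, t^3) on [0, 1], the scalar mean value theorem applied to
# each component gives a distinct intermediate point per component:
# component 1: f1(1) - f1(0) = 1 = f1'(t1) = 2*t1      =>  t1 = 1/2
# component 2: f2(1) - f2(0) = 1 = f2'(t2) = 3*t2**2   =>  t2 = 1/sqrt(3)
t1 = 1 / 2
t2 = 1 / math.sqrt(3)

assert math.isclose(2 * t1, 1.0)       # f1'(t1) matches the increment of f1
assert math.isclose(3 * t2 ** 2, 1.0)  # f2'(t2) matches the increment of f2
print(t1, t2)  # the two mean value points are distinct
```

Since {{math|''t''₁ {{=}} 1/2 ≠ 1/√3 {{=}} ''t''₂}}, no single point serves both components at once, consistent with the discussion above.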
For example, define:
:<math>\begin{cases} f : [0, 2 \pi] \to \R^2 \\ f(x) = (\cos(x), \sin(x)) \end{cases}</math>
Then <math>f(2\pi) - f(0) = \mathbf{0} \in \R^2</math>, but <math>f_1'(x) = -\sin (x)</math> and <math>f_2'(x) = \cos (x)</math> are never simultaneously zero as <math>x</math> ranges over <math>\left[0, 2 \pi\right]</math>.

The above theorem implies the following:
{{math_theorem|name=Mean value inequality<ref>{{harvnb|Hörmander|2015|loc=Theorem 1.1.1. and remark following it.}}</ref> |math_statement=For a continuous function <math>\textbf{f} : [a, b] \to \mathbb{R}^k</math>, if <math>\textbf{f}</math> is differentiable on <math>(a, b)</math>, then
:<math>|\textbf{f}(b) - \textbf{f}(a)| \le (b-a)\sup_{(a, b)} |\textbf{f}'|</math>.}}
In fact, the above statement suffices for many applications and can be proved directly as follows. (We shall write <math>f</math> for <math>\textbf{f}</math> for readability.)
{{math proof|First assume <math>f</math> is differentiable at <math>a</math> too. If <math>f'</math> is unbounded on <math>(a, b)</math>, there is nothing to prove. Thus, assume <math>\sup_{(a, b)} |f'| < \infty</math>. Let <math>M > \sup_{(a, b)} |f'|</math> be some real number. Let
<math display="block">E = \{ 0 \le t \le 1 \mid |f(a + t(b-a)) - f(a)| \le Mt(b-a) \}.</math>
We want to show <math>1 \in E</math>. By continuity of <math>f</math>, the set <math>E</math> is closed. It is also nonempty, as <math>0</math> is in it. Hence, the set <math>E</math> has a largest element <math>s</math>. If <math>s = 1</math>, then <math>1 \in E</math> and we are done. Thus suppose otherwise. For <math>1 > t > s</math>,
:<math>\begin{align} &|f(a + t(b-a)) - f(a)| \\ &\le |f(a + t(b-a)) - f(a+s(b - a)) - f'(a + s(b-a))(t-s)(b-a)| + |f'(a+s(b-a))|(t-s)(b-a) \\ &+|f(a + s(b-a)) - f(a)|. \end{align} </math>
Let <math>\epsilon > 0</math> be such that <math>M - \epsilon > \sup_{(a, b)} |f'|</math>.
By the differentiability of <math>f</math> at <math>a + s(b-a)</math> (note <math>s</math> may be 0), if <math>t</math> is sufficiently close to <math>s</math>, the first term is <math>\le \epsilon (t-s)(b-a)</math>. The second term is <math>\le (M - \epsilon) (t-s)(b-a)</math>. The third term is <math>\le Ms(b-a)</math>, since <math>s \in E</math>. Hence, summing the estimates up, we get <math>|f(a + t(b-a)) - f(a)| \le Mt(b-a)</math>; that is, <math>t \in E</math>, a contradiction to the maximality of <math>s</math>. Hence, <math>1 = s \in E</math>, and that means:
:<math>|f(b) - f(a)| \le M(b-a).</math>
Since <math>M > \sup_{(a, b)} |f'|</math> is arbitrary, this implies the assertion. Finally, if <math>f</math> is not differentiable at <math>a</math>, let <math>a' \in (a, b)</math> and apply the first case to <math>f</math> restricted to <math>[a', b]</math>, giving us:
:<math>|f(b) - f(a')| \le (b-a')\sup_{(a, b)} |f'|</math>
since <math>(a', b) \subset (a, b)</math>. Letting <math>a' \to a</math> finishes the proof.}}
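The inequality can be sanity-checked numerically on the circle example above. The sketch below (an illustration, not part of the article) estimates <math>\sup_{(a,b)} |f'|</math> on a grid for <math>f(x) = (\cos x, \sin x)</math> on <math>[0, 2\pi]</math>, where <math>|f'(x)| = 1</math> identically:

```python
import math

# Numerically verify |f(b) - f(a)| <= (b - a) * sup |f'| for
# f(x) = (cos x, sin x) on [0, 2*pi]; here f'(x) = (-sin x, cos x)
# has constant norm 1, so the right-hand side equals 2*pi.
a, b = 0.0, 2 * math.pi

def f(x):
    return (math.cos(x), math.sin(x))

def deriv_norm(x):
    return math.hypot(-math.sin(x), math.cos(x))  # |f'(x)|, always 1

# Estimate sup |f'| on a grid of interior points of (a, b).
sup_norm = max(deriv_norm(a + (b - a) * k / 1000) for k in range(1, 1000))

lhs = math.hypot(f(b)[0] - f(a)[0], f(b)[1] - f(a)[1])  # ~0: f(2*pi) = f(0)
rhs = (b - a) * sup_norm                                 # ~2*pi
assert lhs <= rhs
print(lhs, rhs)
```

The left-hand side is (up to floating-point error) zero while the right-hand side is <math>2\pi</math>, so the bound holds with plenty of slack, mirroring the fact that only an inequality, not an equality, is available for vector-valued functions.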