===Cantelli's inequality===
[[Cantelli's inequality]]<ref name=Cantelli1910>Cantelli F. (1910) "Intorno ad un teorema fondamentale della teoria del rischio". Bollettino dell'Associazione degli Attuari Italiani.</ref> due to [[Francesco Paolo Cantelli]] states that for a real random variable ''X'' with mean ''μ'' and variance ''σ''<sup>2</sup>:

:<math> \Pr(X - \mu \ge a) \le \frac{\sigma^2}{ \sigma^2 + a^2 } </math>

where ''a'' ≥ 0. This inequality can be used to prove a one-tailed variant of Chebyshev's inequality with ''k'' > 0:<ref name=Grimmett00>Grimmett and Stirzaker, problem 7.11.9. Several proofs of this result can be found in [http://www.mcdowella.demon.co.uk/Chebyshev.html Chebyshev's Inequalities] {{Webarchive|url=https://web.archive.org/web/20190224000121/http://www.mcdowella.demon.co.uk/Chebyshev.html |date=2019-02-24 }} by A. G. McDowell.</ref>

:<math> \Pr(X - \mu \geq k \sigma) \leq \frac{ 1 }{ 1 + k^2 }. </math>

The bound for the one-tailed variant is sharp. To see this, consider the random variable ''X'' that takes the values

:<math> X = 1 </math> with probability <math> \frac{ \sigma^2 } { 1 + \sigma^2 }</math>
:<math> X = - \sigma^2 </math> with probability <math> \frac{ 1 } { 1 + \sigma^2 }.</math>

Then E(''X'') = 0 and E(''X''<sup>2</sup>) = ''σ''<sup>2</sup>, and P(''X'' − ''μ'' ≥ 1) = P(''X'' = 1) = ''σ''<sup>2</sup> / (1 + ''σ''<sup>2</sup>), which is exactly the Cantelli bound with ''a'' = 1 (equivalently, the one-tailed bound 1/(1 + ''k''<sup>2</sup>) with ''k'' = 1/''σ'').

==== An application: distance between the mean and the median ====
<!-- This section is linked from [[median]] and [[exponential distribution]]. -->
The one-sided variant can be used to prove the proposition that for [[probability distribution]]s having an [[expected value]] and a [[median]], the mean and the median can never differ from each other by more than one [[standard deviation]]. To express this in symbols, let ''μ'', ''ν'', and ''σ'' be respectively the mean, the median, and the standard deviation. Then

:<math> \left | \mu - \nu \right | \leq \sigma. </math>

There is no need to assume that the variance is finite, because this inequality is trivially true if the variance is infinite.

The proof is as follows. Setting ''k'' = 1 in the statement of the one-sided inequality gives:

:<math>\Pr(X - \mu \geq \sigma) \leq \frac{ 1 }{ 2 } \implies \Pr(X \geq \mu + \sigma) \leq \frac{ 1 }{ 2 }. </math>

Changing the sign of ''X'' and of ''μ'', we get

:<math>\Pr(X \leq \mu - \sigma) \leq \frac{ 1 }{ 2 }. </math>

As the median is by definition any real number ''m'' that satisfies the inequalities

:<math>\Pr(X\leq m) \geq \frac{1}{2}\text{ and }\Pr(X\geq m) \geq \frac{1}{2},</math>

this implies that the median lies within one standard deviation of the mean. A proof using Jensen's inequality also [[Median#Inequality relating means and medians|exists]].
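As an illustration (not part of the original article), both claims above can be checked numerically: that the two-point distribution attains Cantelli's bound with equality, and that the mean and median of a concrete distribution (the exponential with rate 1 is used here as an assumed example) differ by less than one standard deviation. The helper `cantelli_bound` and the parameter choices are purely illustrative.

```python
from math import log

def cantelli_bound(sigma2, a):
    """Right-hand side of Cantelli's inequality: P(X - mu >= a) <= sigma^2 / (sigma^2 + a^2)."""
    return sigma2 / (sigma2 + a * a)

# Sharpness example from the text: X = 1 with probability sigma^2/(1+sigma^2),
# X = -sigma^2 with probability 1/(1+sigma^2).  Here sigma^2 = 0.25 (arbitrary choice).
sigma2 = 0.25
p1 = sigma2 / (1 + sigma2)       # P(X = 1)
p2 = 1 / (1 + sigma2)            # P(X = -sigma^2)

mean = 1 * p1 + (-sigma2) * p2   # E(X), should be 0
ex2 = 1 * p1 + sigma2**2 * p2    # E(X^2), should equal sigma^2
tail = p1                        # P(X - mu >= 1)

assert abs(mean) < 1e-12
assert abs(ex2 - sigma2) < 1e-12
assert abs(tail - cantelli_bound(sigma2, 1.0)) < 1e-12  # bound attained with equality

# Mean-median distance for the exponential distribution with rate 1:
# mean mu = 1, median nu = ln 2, standard deviation sigma = 1.
mu, nu, sigma = 1.0, log(2), 1.0
assert abs(mu - nu) <= sigma     # |mu - nu| = 1 - ln 2 < 1
```

The two-point check confirms the inequality is tight (no smaller bound of the same form can hold for all distributions), while the exponential example shows a typical case where the mean-median gap is strictly below one standard deviation.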