==Sharpened bounds==

Chebyshev's inequality is important because of its applicability to any distribution. As a result of its generality it may not (and usually does not) provide as sharp a bound as alternative methods that can be used if the distribution of the random variable is known. A number of methods have been developed to improve the sharpness of the bounds it provides; for a review see, e.g.,<ref name="Godwin55"/><ref>[http://nvlpubs.nist.gov/nistpubs/jres/65B/jresv65Bn3p211_A1b.pdf Savage, I. Richard. "Probability inequalities of the Tchebycheff type." Journal of Research of the National Bureau of Standards–B. Mathematics and Mathematical Physics B 65 (1961): 211–222.]</ref>

===Cantelli's inequality===

[[Cantelli's inequality]],<ref name=Cantelli1910>Cantelli F. (1910) Intorno ad un teorema fondamentale della teoria del rischio. Bollettino dell'Associazione degli Attuari Italiani</ref> due to [[Francesco Paolo Cantelli]], states that for a real random variable ''X'' with mean ''μ'' and variance ''σ''<sup>2</sup>

: <math> \Pr(X - \mu \ge a) \le \frac{\sigma^2}{ \sigma^2 + a^2 } </math>

where ''a'' ≥ 0. This inequality can be used to prove a one-tailed variant of Chebyshev's inequality with ''k'' > 0:<ref name=Grimmett00>Grimmett and Stirzaker, problem 7.11.9. Several proofs of this result can be found in [http://www.mcdowella.demon.co.uk/Chebyshev.html Chebyshev's Inequalities] {{Webarchive|url=https://web.archive.org/web/20190224000121/http://www.mcdowella.demon.co.uk/Chebyshev.html |date=2019-02-24 }} by A. G. McDowell.</ref>

:<math> \Pr(X - \mu \geq k \sigma) \leq \frac{ 1 }{ 1 + k^2 }. </math>

The bound in the one-tailed variant is known to be sharp. To see this, consider the random variable ''X'' that takes the values

: <math> X = 1 </math> with probability <math> \frac{ \sigma^2 } { 1 + \sigma^2 }</math>
: <math> X = - \sigma^2 </math> with probability <math> \frac{ 1 } { 1 + \sigma^2 }.</math>

Then E(''X'') = 0 and E(''X''<sup>2</sup>) = ''σ''<sup>2</sup>, and P(''X'' < 1) = 1 / (1 + ''σ''<sup>2</sup>), so that Pr(''X'' ≥ 1) = ''σ''<sup>2</sup> / (1 + ''σ''<sup>2</sup>) and the inequality holds with equality at ''a'' = 1.

==== An application: distance between the mean and the median ====
<!-- This section is linked from [[median]] and [[exponential distribution]]. -->

The one-sided variant can be used to prove the proposition that for [[probability distribution]]s having an [[expected value]] and a [[median]], the mean and the median can never differ from each other by more than one [[standard deviation]]. To express this in symbols let ''μ'', ''ν'', and ''σ'' be respectively the mean, the median, and the standard deviation. Then

:<math> \left | \mu - \nu \right | \leq \sigma. </math>

There is no need to assume that the variance is finite because this inequality is trivially true if the variance is infinite.

The proof is as follows. Setting ''k'' = 1 in the statement for the one-sided inequality gives:

:<math>\Pr(X - \mu \geq \sigma) \leq \frac{ 1 }{ 2 } \implies \Pr(X \geq \mu + \sigma) \leq \frac{ 1 }{ 2 }. </math>

Changing the sign of ''X'' and of ''μ'', we get

:<math>\Pr(X \leq \mu - \sigma) \leq \frac{ 1 }{ 2 }. </math>

As the median is by definition any real number ''m'' that satisfies the inequalities

:<math>\Pr(X\leq m) \geq \frac{1}{2}\text{ and }\Pr(X\geq m) \geq \frac{1}{2},</math>

this implies that the median lies within one standard deviation of the mean. A proof using Jensen's inequality also [[Median#Inequality relating means and medians|exists]].
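As a quick numerical check of the sharpness claim, the following sketch (the choice ''σ'' = 2 is arbitrary and purely illustrative) verifies that the two-point distribution above has mean 0 and variance ''σ''<sup>2</sup>, and attains Cantelli's bound with equality at ''a'' = 1:

<syntaxhighlight lang="python">
# Sanity check: the two-point distribution attains Cantelli's one-sided bound.
# sigma = 2 is an arbitrary illustrative choice.
sigma = 2.0

values = [1.0, -sigma**2]                    # X = 1 or X = -sigma^2
probs = [sigma**2 / (1 + sigma**2),          # P(X = 1)
         1.0 / (1 + sigma**2)]               # P(X = -sigma^2)

mean = sum(p * v for p, v in zip(probs, values))
var = sum(p * (v - mean) ** 2 for p, v in zip(probs, values))

a = 1.0
tail = sum(p for p, v in zip(probs, values) if v - mean >= a)
bound = var / (var + a**2)                   # Cantelli's bound

print(mean, var)    # 0.0 4.0  -- mean 0, variance sigma^2
print(tail, bound)  # 0.8 0.8  -- the bound holds with equality
</syntaxhighlight>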
===Bhattacharyya's inequality===

Bhattacharyya<ref name=Bhattacharyya1987>{{cite journal |last=Bhattacharyya |first=B. B. |title=One-sided chebyshev inequality when the first four moments are known |journal=Communications in Statistics – Theory and Methods |year=1987 |volume=16 |issue=9 |pages=2789–91 |doi=10.1080/03610928708829540 |issn=0361-0926}}</ref> extended Cantelli's inequality using the third and fourth moments of the distribution.

Let <math>\mu = 0</math> and let <math>\sigma^2</math> be the variance. Let <math>\gamma = \operatorname{E}[X^3] / \sigma^3</math> be the skewness and <math>\kappa = \operatorname{E}[X^4]/\sigma^4</math> the kurtosis. If <math>k^2 - k \gamma - 1 > 0</math>, then

:<math> \Pr(X > k\sigma) \le \frac{ \kappa - \gamma^2 - 1 }{ (\kappa - \gamma^2 - 1) (1 + k^2) + (k^2 - k\gamma - 1) }.</math>

The condition <math>k^2 - k \gamma - 1 > 0</math> may require <math>k</math> to be reasonably large. In the case <math>\operatorname{E}[X^3]=0</math> this simplifies to

:<math>\Pr(X > k\sigma) \le \frac{\kappa-1}{\kappa \left(k^2+1\right)-2} \quad \text{for } k > 1. </math>

Since <math>\frac{\kappa-1}{\kappa \left(k^2+1\right)-2} = \frac{1}{2}-\frac{\kappa (k-1)}{2 (\kappa-1)}+O\left((k-1)^2\right)</math> for <math>k</math> close to 1, this bound improves slightly over Cantelli's bound <math>\frac{1}{2}-\frac{k-1}{2}+O\left((k-1)^2\right)</math>, because <math>\kappa/(\kappa-1) > 1</math> whenever <math>\kappa > 1</math>. Near <math>k = 1</math> both one-sided bounds are close to 1/2 and so win a factor of about 2 over Chebyshev's inequality, whose bound <math>1/k^2</math> is close to 1 there.
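To give a sense of the size of the improvement, the following sketch compares the two one-sided bounds with the exact upper tail of a standard normal distribution; the normal case (for which ''γ'' = 0 and ''κ'' = 3) is an illustrative choice, not one made in Bhattacharyya's paper:

<syntaxhighlight lang="python">
# Compare Cantelli's and Bhattacharyya's one-sided bounds on P(X > k*sigma)
# with the exact tail of a standard normal (gamma = 0, kappa = 3).
import math

kappa = 3.0  # E[X^4] / sigma^4 for the normal distribution

def cantelli(k):
    """One-sided Chebyshev (Cantelli) bound."""
    return 1.0 / (1.0 + k**2)

def bhattacharyya(k, kappa):
    """Bhattacharyya bound in the symmetric case E[X^3] = 0; requires k > 1."""
    return (kappa - 1.0) / (kappa * (k**2 + 1.0) - 2.0)

for k in (1.5, 2.0, 3.0):
    exact = 0.5 * math.erfc(k / math.sqrt(2))  # exact normal upper tail
    print(f"k={k}: Cantelli={cantelli(k):.4f} "
          f"Bhattacharyya={bhattacharyya(k, kappa):.4f} exact={exact:.4f}")
</syntaxhighlight>

For every <math>k > 1</math> the Bhattacharyya bound is the smaller of the two, though both remain far above the exact normal tail, as expected for bounds that use only a few moments.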
===Gauss's inequality===
{{main|Gauss's inequality}}

In 1823 [[Gauss]] showed that for a [[unimodal distribution|distribution with a unique mode]] at zero,<ref name=Gauss1823>Gauss C. F. Theoria Combinationis Observationum Erroribus Minimis Obnoxiae. Pars Prior. Pars Posterior. Supplementum. Theory of the Combination of Observations Least Subject to Errors. Part One. Part Two. Supplement. 1995. Translated by G. W. Stewart. Classics in Applied Mathematics Series, Society for Industrial and Applied Mathematics, Philadelphia</ref>

: <math> \Pr( | X | \ge k ) \le \frac{ 4 \operatorname{ E }( X^2 ) } { 9k^2 } \quad\text{if} \quad k^2 \ge \frac{ 4 } { 3 } \operatorname{E} (X^2) ,</math>
: <math> \Pr( | X | \ge k ) \le 1 - \frac{ k } { \sqrt{3 \operatorname{ E }( X^2 ) } } \quad \text{if} \quad k^2 \le \frac{ 4 } { 3 } \operatorname{ E }( X^2 ). </math>

===Vysochanskij–Petunin inequality===
{{main|Vysochanskij–Petunin inequality}}

The Vysochanskij–Petunin inequality generalizes Gauss's inequality, which only holds for deviation from the mode of a unimodal distribution, to deviation from the mean, or more generally, any center.<ref name="Pukelsheim94">{{Cite journal|last=Pukelsheim|first=Friedrich|date=May 1994|title=The Three Sigma Rule|url=http://www.tandfonline.com/doi/abs/10.1080/00031305.1994.10476030|journal=The American Statistician|language=en|volume=48|issue=2|pages=88–91|doi=10.1080/00031305.1994.10476030|s2cid=122587510 |issn=0003-1305}}</ref> If ''X'' has a [[unimodal distribution]] with mean ''μ'' and variance ''σ''<sup>2</sup>, then the inequality states that

: <math> \Pr( | X - \mu | \ge k \sigma ) \le \frac{ 4 }{ 9k^2 } \quad \text{if} \quad k \ge \sqrt{8/3} \approx 1.633,</math>
: <math> \Pr( | X - \mu | \ge k \sigma ) \le \frac{ 4 }{ 3k^2 } - \frac13 \quad \text{if} \quad k \le \sqrt{8/3}.</math>

For symmetrical unimodal distributions, the median and the mode are equal, so both the Vysochanskij–Petunin inequality and Gauss's inequality apply to the same center. Further, for symmetrical distributions, one-sided bounds can be obtained by noticing that

:<math> \Pr( X - \mu \ge k \sigma ) = \Pr( X - \mu \le -k \sigma ) = \frac{1}{2} \Pr( |X - \mu| \ge k \sigma ).</math>

The additional factor of <math>4/9</math> in these tail bounds leads to tighter confidence intervals than Chebyshev's inequality. For example, for any unimodal distribution the Vysochanskij–Petunin inequality guarantees that at most 4/(9 × 3<sup>2</sup>) = 4/81 ≈ 4.9% of the distribution lies outside three standard deviations of the mean, whereas Chebyshev's inequality only guarantees at most 1/9 ≈ 11.1%.

===Bounds for specific distributions===

DasGupta has shown that if the distribution is known to be normal,<ref name=DasGupta2000>{{cite journal | last1 = DasGupta | first1 = A | year = 2000 | title = Best constants in Chebychev inequalities with various applications | journal = Metrika | volume = 5 | issue = 1| pages = 185–200 | doi = 10.1007/s184-000-8316-9 | s2cid = 121436601 }}</ref>

: <math> \Pr( | X - \mu | \ge k \sigma ) \le \frac{ 1 }{ 3 k^2 } .</math>

From DasGupta's inequality it follows that for a normal distribution at least 95% of the mass lies within approximately 2.582 standard deviations of the mean (solving 1/(3''k''<sup>2</sup>) = 0.05 gives ''k'' ≈ 2.582). This is less sharp than the true figure (approximately 1.96 standard deviations of the mean).

* DasGupta has determined a set of best possible bounds for a [[normal distribution]] for this inequality.<ref name=DasGupta2000 />
* Steliga and Szynal have extended these bounds to the [[Pareto distribution]].<ref name=Steliga2010>{{cite journal |last1=Steliga |first1=Katarzyna |last2=Szynal |first2=Dominik |title=On Markov-Type Inequalities |journal=International Journal of Pure and Applied Mathematics |year=2010 |volume=58 |issue=2 |pages=137–152 |url=http://ijpam.eu/contents/2010-58-2/2/2.pdf |access-date=10 October 2012 |issn=1311-8080}}</ref>
* Grechuk et al. developed a general method for deriving the best possible bounds in Chebyshev's inequality for any family of distributions, and any [[deviation risk measure]] in place of the standard deviation. In particular, they derived a Chebyshev inequality for distributions with [[Logarithmically concave function|log-concave]] densities.<ref name="cheb">Grechuk, B., Molyboha, A., Zabarankin, M. (2010). [https://www.researchgate.net/publication/231939730_Chebyshev_inequalities_with_law-invariant_deviation_measures Chebyshev Inequalities with Law Invariant Deviation Measures], Probability in the Engineering and Informational Sciences, 24(1), 145–170.</ref>
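The gap between these bounds is easy to see numerically. The following sketch tabulates the two-sided Chebyshev, Vysochanskij–Petunin, and DasGupta bounds against the exact tail probability of a normal distribution (the normal case is chosen only so that an exact value is available for comparison):

<syntaxhighlight lang="python">
# Compare two-sided bounds on P(|X - mu| >= k*sigma): Chebyshev (any
# distribution), Vysochanskij-Petunin (unimodal), DasGupta (normal only),
# against the exact normal tail probability.
import math

for k in (2.0, 2.582, 3.0):
    chebyshev = 1.0 / k**2
    if k >= math.sqrt(8.0 / 3.0):            # ~1.633
        vp = 4.0 / (9.0 * k**2)
    else:
        vp = 4.0 / (3.0 * k**2) - 1.0 / 3.0
    dasgupta = 1.0 / (3.0 * k**2)
    exact = math.erfc(k / math.sqrt(2))      # P(|Z| >= k), Z standard normal
    print(f"k={k}: Chebyshev={chebyshev:.4f} V-P={vp:.4f} "
          f"DasGupta={dasgupta:.4f} exact={exact:.4f}")
</syntaxhighlight>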