Convergence of random variables
==Properties==

Provided the probability space is [[complete measure|complete]]:
* If <math>X_n\ \xrightarrow{\overset{}{p}}\ X</math> and <math>X_n\ \xrightarrow{\overset{}{p}}\ Y</math>, then <math>X=Y</math> [[almost surely]].
* If <math>X_n\ \xrightarrow{\overset{}\text{a.s.}}\ X</math> and <math>X_n\ \xrightarrow{\overset{}\text{a.s.}}\ Y</math>, then <math>X=Y</math> almost surely.
* If <math>X_n\ \xrightarrow{\overset{}{L^r}}\ X</math> and <math>X_n\ \xrightarrow{\overset{}{L^r}}\ Y</math>, then <math>X=Y</math> almost surely.
* If <math>X_n\ \xrightarrow{\overset{}{p}}\ X</math> and <math>Y_n\ \xrightarrow{\overset{}{p}}\ Y</math>, then <math>aX_n+bY_n\ \xrightarrow{\overset{}{p}}\ aX+bY</math> (for any real numbers {{mvar|a}} and {{mvar|b}}) and <math>X_n Y_n\xrightarrow{\overset{}{p}}\ XY</math>.
* If <math>X_n\ \xrightarrow{\overset{}\text{a.s.}}\ X</math> and <math>Y_n\ \xrightarrow{\overset{}\text{a.s.}}\ Y</math>, then <math>aX_n+bY_n\ \xrightarrow{\overset{}\text{a.s.}}\ aX+bY</math> (for any real numbers {{mvar|a}} and {{mvar|b}}) and <math>X_n Y_n\xrightarrow{\overset{}\text{a.s.}}\ XY</math>.
* If <math>X_n\ \xrightarrow{\overset{}{L^r}}\ X</math> and <math>Y_n\ \xrightarrow{\overset{}{L^r}}\ Y</math>, then <math>aX_n+bY_n\ \xrightarrow{\overset{}{L^r}}\ aX+bY</math> (for any real numbers {{mvar|a}} and {{mvar|b}}).
* None of the above statements is true for convergence in distribution.

The chain of implications between the various notions of convergence is noted in their respective sections. Using the arrow notation:
: <math>\begin{matrix} \xrightarrow{\overset{}{L^s}} & \underset{s>r\geq1}{\Rightarrow} & \xrightarrow{\overset{}{L^r}} & & \\ & & \Downarrow & & \\ \xrightarrow{\text{a.s.}} & \Rightarrow & \xrightarrow{p} & \Rightarrow & \xrightarrow{d} \end{matrix}</math>

These properties, together with a number of other special cases, are summarized in the following list:
* {{anchor|propA1}} Almost sure convergence implies convergence in probability:<ref name="vdv2">{{harvnb|van der Vaart|1998|loc=Theorem 2.7}}</ref><sup>[[Proofs of convergence of random variables#propA1|[proof]]]</sup>
*: <math>X_n\ \xrightarrow{\text{a.s.}}\ X \quad\Rightarrow\quad X_n\ \xrightarrow{\overset{}{p}}\ X</math>
* Convergence in probability implies there exists a sub-sequence <math>(n_k)</math> which almost surely converges:<ref>{{cite book|last=Gut|first=Allan|title=Probability: A graduate course|year=2005|publisher=Springer|location=Theorem 3.4|isbn=978-0-387-22833-4}}</ref>
*: <math>X_n\ \xrightarrow{\overset{}{p}}\ X \quad\Rightarrow\quad X_{n_k}\ \xrightarrow{\text{a.s.}}\ X</math>
* {{anchor|propA2}} Convergence in probability implies convergence in distribution:<ref name="vdv2"/><sup>[[Proofs of convergence of random variables#propA2|[proof]]]</sup>
*: <math>X_n\ \xrightarrow{\overset{}{p}}\ X \quad\Rightarrow\quad X_n\ \xrightarrow{\overset{}{d}}\ X</math>
* {{anchor|propA3}} Convergence in ''r''-th order mean implies convergence in probability:
*: <math>X_n\ \xrightarrow{\overset{}{L^r}}\ X \quad\Rightarrow\quad X_n\ \xrightarrow{\overset{}{p}}\ X</math>
* {{anchor|propA4}} Convergence in ''r''-th order mean implies convergence in lower-order mean, assuming that both orders are greater than or equal to one:
*: <math>X_n\ \xrightarrow{\overset{}{L^r}}\ X \quad\Rightarrow\quad X_n\ \xrightarrow{\overset{}{L^s}}\ X,</math> <span style="position:relative;top:.4em;left:2em;">provided ''r'' ≥ ''s'' ≥ 1.</span>
* {{anchor|propB1}} If ''X''<sub>''n''</sub> converges in distribution to a constant ''c'', then ''X''<sub>''n''</sub> converges in probability to ''c'':<ref name="vdv2"/><sup>[[Proofs of convergence of random variables#propB1|[proof]]]</sup>
*: <math>X_n\ \xrightarrow{\overset{}{d}}\ c \quad\Rightarrow\quad X_n\ \xrightarrow{\overset{}{p}}\ c,</math> <span style="position:relative;top:.4em;left:2em;">provided ''c'' is a constant.</span>
* {{anchor|propB2}} If {{mvar|X<sub>n</sub>}} converges in distribution to ''X'' and the difference between ''X<sub>n</sub>'' and ''Y<sub>n</sub>'' converges in probability to zero, then ''Y<sub>n</sub>'' also converges in distribution to ''X'':<ref name="vdv2"/><sup>[[Proofs of convergence of random variables#propB2|[proof]]]</sup>
*: <math>X_n\ \xrightarrow{\overset{}{d}}\ X,\ \ |X_n-Y_n|\ \xrightarrow{\overset{}{p}}\ 0\ \quad\Rightarrow\quad Y_n\ \xrightarrow{\overset{}{d}}\ X</math>
* {{anchor|propB3}} If {{mvar|X<sub>n</sub>}} converges in distribution to ''X'' and ''Y<sub>n</sub>'' converges in distribution to a constant ''c'', then the joint vector {{math|(''X''<sub>''n''</sub>, ''Y''<sub>''n''</sub>)}} converges in distribution to {{tmath|(X, c)}}:<ref name="vdv2"/><sup>[[Proofs of convergence of random variables#propB3|[proof]]]</sup>
*: <math>X_n\ \xrightarrow{\overset{}{d}}\ X,\ \ Y_n\ \xrightarrow{\overset{}{d}}\ c\ \quad\Rightarrow\quad (X_n,Y_n)\ \xrightarrow{\overset{}{d}}\ (X,c)</math> <span style="position:relative;top:.4em;left:2em;">provided ''c'' is a constant.</span>
*: Note that the condition that {{mvar|Y<sub>n</sub>}} converges to a constant is important; if it converged to a random variable ''Y'', we could not conclude that {{math|(''X''<sub>''n''</sub>, ''Y''<sub>''n''</sub>)}} converges to {{tmath|(X, Y)}}.
* {{anchor|propB4}} If ''X<sub>n</sub>'' converges in probability to ''X'' and ''Y<sub>n</sub>'' converges in probability to ''Y'', then the joint vector {{math|(''X''<sub>''n''</sub>, ''Y''<sub>''n''</sub>)}} converges in probability to {{math|(''X'', ''Y'')}}:<ref name="vdv2"/><sup>[[Proofs of convergence of random variables#propB4|[proof]]]</sup>
*: <math>X_n\ \xrightarrow{\overset{}{p}}\ X,\ \ Y_n\ \xrightarrow{\overset{}{p}}\ Y\ \quad\Rightarrow\quad (X_n,Y_n)\ \xrightarrow{\overset{}{p}}\ (X,Y)</math>
* If {{mvar|X<sub>n</sub>}} converges in probability to ''X'', and if {{math|'''P'''({{mabs|''X<sub>n</sub>''}} ≤ ''b'') {{=}} 1}} for all ''n'' and some ''b'', then {{mvar|X<sub>n</sub>}} converges in ''r''-th mean to ''X'' for all {{math|''r'' ≥ 1}}. In other words, if {{mvar|X<sub>n</sub>}} converges in probability to ''X'' and all random variables {{mvar|X<sub>n</sub>}} are almost surely bounded above and below, then {{mvar|X<sub>n</sub>}} converges to ''X'' also in any ''r''-th mean.<ref>{{harvnb|Grimmett|Stirzaker|2020|p=354}}</ref>
* '''Almost sure representation'''. Usually, convergence in distribution does not imply convergence almost surely. However, for a given sequence {''X<sub>n</sub>''} which converges in distribution to ''X''<sub>0</sub>, it is always possible to find a new probability space (Ω, ''F'', P) and random variables {''Y<sub>n</sub>'', ''n'' = 0, 1, ...} defined on it such that ''Y<sub>n</sub>'' is equal in distribution to {{mvar|X<sub>n</sub>}} for each {{math|''n'' ≥ 0}}, and ''Y<sub>n</sub>'' converges to ''Y''<sub>0</sub> almost surely.<ref>{{harvnb|van der Vaart|1998|loc=Th.2.19}}</ref><ref>{{Harvnb|Fristedt|Gray|1997|loc=Theorem 14.5}}</ref>
* If for all ''ε'' > 0,
*:: <math>\sum_n \mathbb{P} \left(|X_n - X| > \varepsilon\right) < \infty,</math>
*: then we say that {{mvar|X<sub>n</sub>}} ''converges almost completely'', or ''almost in probability'', towards ''X''. When {{mvar|X<sub>n</sub>}} converges almost completely towards ''X'', it also converges almost surely to ''X''. In other words, if {{mvar|X<sub>n</sub>}} converges in probability to ''X'' sufficiently quickly (i.e. the above sequence of tail probabilities is summable for all {{math|''ε'' > 0}}), then {{mvar|X<sub>n</sub>}} also converges almost surely to ''X''. This is a direct implication of the [[Borel–Cantelli lemma]].
* If {{mvar|S<sub>n</sub>}} is a sum of ''n'' real independent random variables:
*:: <math>S_n = X_1+\cdots+X_n \, </math>
*: then {{mvar|S<sub>n</sub>}} converges almost surely if and only if {{mvar|S<sub>n</sub>}} converges in probability. The proof can be found on page 126 (Theorem 5.3.4) of the book by [[Kai Lai Chung]].<ref name="Chung">{{cite book|last1=Chung|first1=Kai-lai|title=A Course in Probability Theory|date=2001|page=126}}</ref>
*: However, for a sequence of mutually independent random variables, convergence in probability does not imply almost sure convergence.<ref>{{Cite web |title=Proofs of convergence of random variables |url=https://en.wikipedia.org/wiki/Proofs_of_convergence_of_random_variables |access-date=2024-09-23 |website=Wikipedia}}</ref>{{Circular reference|date=February 2025}}
* The [[dominated convergence theorem]] gives sufficient conditions for almost sure convergence to imply ''L''<sup>1</sup>-convergence:
{{NumBlk|*::|<math>\left. \begin{matrix} X_n\xrightarrow{\overset{}\text{a.s.}} X \\ |X_n| < Y \\ \mathbb{E}[Y] < \infty \end{matrix}\right\} \quad\Rightarrow \quad X_n\xrightarrow{{L^1}} X </math>|{{EquationRef|5}}}}
* A necessary and sufficient condition for ''L''<sup>1</sup> convergence is that <math>X_n\xrightarrow{\overset{}{P}} X</math> and the sequence (''X<sub>n</sub>'') is [[uniformly integrable]].
* If <math>X_n\ \xrightarrow{\overset{}{p}}\ X </math>, the following are equivalent:<ref>{{Cite web |title=real analysis - Generalizing Scheffe's Lemma using only Convergence in Probability |url=https://math.stackexchange.com/questions/4401886/generalizing-scheffes-lemma-using-only-convergence-in-probability |access-date=2022-03-12 |website=Mathematics Stack Exchange}}</ref>
** <math>X_n\ \xrightarrow{\overset{}{L^r}}\ X</math>,
** <math> \mathbb{E}[|X_n|^r] \rightarrow \mathbb{E}[|X|^r] < \infty </math>,
** <math>\{|X_n|^r\}</math> is [[uniformly integrable]].
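The defining tail-probability behaviour of convergence in probability can be illustrated numerically. The following is a minimal Monte Carlo sketch (not part of the article; the construction ''X''<sub>''n''</sub> = ''X'' + ''Z''/√''n'' and all names are illustrative): it estimates P(|''X<sub>n</sub>'' − ''X''| > ''ε'') for increasing ''n'' and shows it shrinking toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_prob(n, eps=0.1, trials=100_000):
    """Monte Carlo estimate of P(|X_n - X| > eps) for the toy sequence
    X_n = X + Z/sqrt(n), with X, Z independent standard normals."""
    x = rng.standard_normal(trials)                      # the limit variable X
    x_n = x + rng.standard_normal(trials) / np.sqrt(n)   # X_n = X + Z/sqrt(n)
    return np.mean(np.abs(x_n - x) > eps)

# Tail probabilities decrease as n grows, which is exactly the
# definition of X_n converging in probability to X.
probs = [tail_prob(n) for n in (10, 100, 10_000)]
print(probs)
```

Since |''X<sub>n</sub>'' − ''X''| = |''Z''|/√''n'' here, the exact tail probability is P(|''Z''| > ''ε''√''n''), so the estimates should fall from roughly 0.75 toward 0; by Borel–Cantelli these tails are also summable, so this toy sequence in fact converges almost surely as well.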