{{Short description|Theory in probability}}
{{more citations needed|date=January 2013}}
In [[probability theory]] and [[statistics]], two real-valued [[random variable]]s, <math>X</math>, <math>Y</math>, are said to be '''uncorrelated''' if their [[covariance]], <math>\operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X] \operatorname{E}[Y]</math>, is zero. If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a [[Pearson correlation coefficient]], when it exists, of zero, except in the trivial case when either variable has zero [[variance]] (is a constant). In this case the correlation is undefined.

In general, uncorrelatedness is not the same as [[orthogonality]], except in the special case where at least one of the two random variables has an expected value of 0. In this case, the [[covariance]] is the expectation of the product, and <math>X</math> and <math>Y</math> are uncorrelated [[if and only if]] <math>\operatorname{E}[XY] = 0</math>.

If <math>X</math> and <math>Y</math> are [[statistical independence|independent]], with finite [[second moment]]s, then they are uncorrelated. However, not all uncorrelated variables are independent.<ref name="Papoulis">{{cite book | last = Papoulis | first = Athanasios | title = Probability, Random Variables and Stochastic Processes | publisher = McGraw-Hill | year = 1991 | isbn = 0-07-048477-5}}</ref>{{rp|p. 155}}

==Definition==

===Definition for two real random variables===
Two random variables <math>X,Y</math> are called uncorrelated if their covariance <math>\operatorname{Cov}[X,Y]=\operatorname{E}[(X-\operatorname{E}[X]) (Y-\operatorname{E}[Y])]</math> is zero.<ref name="Papoulis" />{{rp|p. 153}}<ref name=KunIlPark>Kun Il Park, ''Fundamentals of Probability and Stochastic Processes with Applications to Communications'', Springer, 2018, ISBN 978-3-319-68074-3.</ref>{{rp|p. 121}} Formally:

{{Equation box 1
|indent =
|title=
|equation = <math>X,Y \text{ uncorrelated} \quad \iff \quad \operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]</math>
|cellpadding= 6
|border
|border colour = #0073CF
|background colour=#F5FFFA}}

===Definition for two complex random variables===
Two [[complex random variable]]s <math>Z,W</math> are called uncorrelated if both their covariance <math>\operatorname{K}_{ZW}=\operatorname{E}[(Z-\operatorname{E}[Z])\overline{(W-\operatorname{E}[W])}]</math> and their pseudo-covariance <math>\operatorname{J}_{ZW}=\operatorname{E}[(Z-\operatorname{E}[Z]) (W-\operatorname{E}[W])]</math> are zero, i.e.

:<math>Z,W \text{ uncorrelated} \quad \iff \quad \operatorname{E}[Z\overline{W}] = \operatorname{E}[Z] \cdot \operatorname{E}[\overline{W}] \text{ and } \operatorname{E}[ZW] = \operatorname{E}[Z] \cdot \operatorname{E}[W]</math>

===Definition for more than two random variables===
A set of two or more random variables <math>X_1,\ldots,X_n</math> is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the non-diagonal elements of the [[autocovariance matrix]] <math>\operatorname{K}_{\mathbf{X}\mathbf{X}}</math> of the [[random vector]] <math>\mathbf{X} = [X_1 \ldots X_n]^\mathrm{T}</math> are all zero. The autocovariance matrix is defined as:
:<math>\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{cov}[\mathbf{X},\mathbf{X}] = \operatorname{E}[(\mathbf{X}-\operatorname{E}[\mathbf{X}])(\mathbf{X}-\operatorname{E}[\mathbf{X}])^{\mathrm T}] = \operatorname{E}[\mathbf{X} \mathbf{X}^{\mathrm T}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\mathrm T}</math>
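For illustration, here is a minimal Python sketch using NumPy (the sample size, seed, and the three distributions are arbitrary choices, not part of the definition) that estimates the autocovariance matrix of three independent, hence pairwise uncorrelated, random variables and checks that its off-diagonal entries are approximately zero:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Three mutually independent samples; independence implies
# pairwise uncorrelatedness (all second moments here are finite).
n = 100_000
x1 = rng.standard_normal(n)
x2 = rng.uniform(-1.0, 1.0, size=n)
x3 = rng.integers(0, 2, size=n).astype(float)

# Sample autocovariance matrix of X = [X1, X2, X3]^T;
# np.cov treats each row as one variable.
K = np.cov(np.vstack([x1, x2, x3]))

# Uncorrelatedness of the set corresponds to zero off-diagonal
# entries; a finite sample gives values only approximately zero.
off_diag = K[~np.eye(3, dtype=bool)]
print(K)
print(np.allclose(off_diag, 0.0, atol=0.02))  # True for large n
</syntaxhighlight>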
==Examples of dependence without correlation==
{{main|Correlation and dependence}}

===Example 1===
* Let <math>X</math> be a random variable that takes the value 0 with probability 1/2, and takes the value 1 with probability 1/2.
* Let <math>Y</math> be a random variable, ''independent'' of <math>X</math>, that takes the value −1 with probability 1/2, and takes the value 1 with probability 1/2.
* Let <math>U</math> be a random variable constructed as <math>U=XY</math>.

The claim is that <math>U</math> and <math>X</math> have zero covariance (and thus are uncorrelated), but are not independent.

Proof: Taking into account that
:<math>\operatorname{E}[U] = \operatorname{E}[XY] = \operatorname{E}[X] \operatorname{E}[Y] = \operatorname{E}[X] \cdot 0 = 0,</math>
where the second equality holds because <math>X</math> and <math>Y</math> are independent, one gets
:<math>\begin{align} \operatorname{cov}[U,X] & = \operatorname{E}[(U-\operatorname{E}[U])(X-\operatorname{E}[X])] = \operatorname{E}[U (X-\tfrac12)] \\ & = \operatorname{E}[X^2 Y - \tfrac12 XY] = \operatorname{E}[(X^2-\tfrac12 X)Y] = \operatorname{E}[X^2-\tfrac12 X] \operatorname{E}[Y] = 0 \end{align}</math>

Therefore, <math>U</math> and <math>X</math> are uncorrelated.

Independence of <math>U</math> and <math>X</math> means that for all <math>a</math> and <math>b</math>, <math>\Pr(U=a\mid X=b) = \Pr(U=a)</math>. This is not true, in particular, for <math>a=1</math> and <math>b=0</math>:
* <math>\Pr(U=1\mid X=0) = \Pr(XY=1\mid X=0) = 0</math>
* <math>\Pr(U=1) = \Pr(XY=1) = 1/4</math>

Thus <math>\Pr(U=1\mid X=0)\ne \Pr(U=1)</math>, so <math>U</math> and <math>X</math> are not independent. Q.E.D.

===Example 2===
If <math>X</math> is a continuous random variable [[uniform distribution (continuous)|uniformly distributed]] on <math>[-1,1]</math> and <math>Y = X^2</math>, then <math>X</math> and <math>Y</math> are uncorrelated even though <math>X</math> determines <math>Y</math>, and a particular value of <math>Y</math> can be produced by only one or two values of <math>X</math>. The marginal densities are
:<math>f_X(t) = \tfrac{1}{2}\, I_{[-1,1]}(t); \qquad f_Y(t) = \tfrac{1}{2\sqrt{t}}\, I_{(0,1]}(t).</math>

On the other hand, <math>f_{X,Y}</math> is 0 on the triangle defined by <math>0<x<y<1</math>, although <math>f_X \times f_Y</math> is not zero on this domain. Therefore <math>f_{X,Y}(x,y) \neq f_X(x) f_Y(y)</math> and the variables are not independent.

The expectations are
:<math>\operatorname{E}[X] = \frac{1^2-(-1)^2}{2 \times 2} = 0; \qquad \operatorname{E}[Y] = \frac{1^3-(-1)^3}{3 \times 2} = \frac{1}{3},</math>
and the covariance is
:<math>\operatorname{Cov}[X,Y] = \operatorname{E}\left[(X-\operatorname{E}[X])(Y-\operatorname{E}[Y])\right] = \operatorname{E}\left[X^3 - \frac{X}{3}\right] = \frac{1^4-(-1)^4}{4 \times 2} - \frac{\operatorname{E}[X]}{3} = 0.</math>

Therefore the variables are uncorrelated.
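The dependence without correlation in Example 2 can also be checked numerically. The following Python sketch (illustrative; the sample size and the conditioning threshold 0.9 are arbitrary) shows a sample covariance near zero, while conditioning on <math>|X|</math> visibly shifts the mean of <math>Y</math>:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

# Example 2: X uniform on [-1, 1], Y = X^2.
n = 1_000_000
x = rng.uniform(-1.0, 1.0, size=n)
y = x ** 2

# Sample covariance is close to the exact value Cov[X, Y] = 0.
print(np.cov(x, y)[0, 1])          # ~0

# Yet the variables are dependent: conditioning on X changes the
# distribution of Y. E[Y] = 1/3, but E[Y | |X| > 0.9] is near 0.903.
print(y.mean())                    # ~0.3333
print(y[np.abs(x) > 0.9].mean())   # ~0.903 (= (1 - 0.9**3) / (3 * 0.1))
</syntaxhighlight>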
==When uncorrelatedness implies independence==
There are cases in which uncorrelatedness does imply independence. One of these cases is the one in which both random variables are two-valued (so each can be linearly transformed to have a [[Bernoulli distribution]]).<ref>[http://www.math.uah.edu/stat/expect/Covariance.html Virtual Laboratories in Probability and Statistics: Covariance and Correlation], item 17.</ref> Further, two jointly normally distributed random variables are independent if they are uncorrelated,<ref>{{cite book|chapter=Chapter 5.5 Conditional Expectation|pages=185–186|title=Introduction to Probability and Mathematical Statistics|year=1992|last1=Bain|first1=Lee|last2=Engelhardt|first2=Max|edition=2nd|isbn=0534929303}}</ref> although this does not hold for variables whose marginal distributions are normal and uncorrelated but whose joint distribution is not jointly normal (see [[Normally distributed and uncorrelated does not imply independent]]).

==Generalizations==

===Uncorrelated random vectors===
Two [[random vector]]s <math>\mathbf{X}=(X_1,\ldots,X_m)^{\mathrm T}</math> and <math>\mathbf{Y}=(Y_1,\ldots,Y_n)^{\mathrm T}</math> are called uncorrelated if
:<math>\operatorname{E}[\mathbf{X} \mathbf{Y}^{\mathrm T}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm T}.</math>
They are uncorrelated if and only if their [[cross-covariance matrix]] <math>\operatorname{K}_{\mathbf{X}\mathbf{Y}}</math> is zero.<ref name=Gubner>{{cite book |first=John A. |last=Gubner |year=2006 |title=Probability and Random Processes for Electrical and Computer Engineers |publisher=Cambridge University Press |isbn=978-0-521-86470-1}}</ref>{{rp|p. 337}}

Two complex random vectors <math>\mathbf{Z}</math> and <math>\mathbf{W}</math> are called '''uncorrelated''' if both their cross-covariance matrix and their pseudo-cross-covariance matrix are zero, i.e. if
:<math>\operatorname{K}_{\mathbf{Z}\mathbf{W}}=\operatorname{J}_{\mathbf{Z}\mathbf{W}}=0</math>
where
:<math>\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}]){(\mathbf{W}-\operatorname{E}[\mathbf{W}])}^{\mathrm H}]</math>
and
:<math>\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z}-\operatorname{E}[\mathbf{Z}]){(\mathbf{W}-\operatorname{E}[\mathbf{W}])}^{\mathrm T}].</math>

===Uncorrelated stochastic processes===
Two [[stochastic process]]es <math>\left\{X_t\right\}</math> and <math>\left\{Y_t\right\}</math> are called '''uncorrelated''' if their cross-covariance <math>\operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1,t_2) = \operatorname{E}\left[\left(X(t_1) - \mu_X(t_1)\right)\left(Y(t_2) - \mu_Y(t_2)\right)\right]</math> is zero for all times.<ref name=KunIlPark/>{{rp|p. 142}} Formally:
:<math>\left\{X_t\right\},\left\{Y_t\right\} \text{ uncorrelated} \quad :\iff \quad \forall t_1,t_2 \colon \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_1,t_2) = 0.</math>

==See also==
*[[Correlation and dependence]]
*[[Binomial distribution#Covariance between two binomials|Binomial distribution: Covariance between two binomials]]{{Broken anchor|date=2024-03-24|bot=User:Cewbot/log/20201008/configuration|reason= The anchor (Covariance between two binomials) [[Special:Diff/927506908|has been deleted]].}}
*[[Representative elementary volume|Uncorrelated Volume Element]]

==References==
{{reflist}}

==Further reading==
*''Probability for Statisticians'', [[Galen R. Shorack]], Springer (c2000) {{ISBN|0-387-98953-6}}

[[Category:Covariance and correlation]]
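As a numerical illustration of the jointly-normal caveat above, the following Python sketch uses one standard construction (a random sign flip; it is one example of a non-jointly-normal pair, not the only one): <math>X</math> and <math>Y = SX</math> are each marginally normal and uncorrelated, yet dependent.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)

# X standard normal; S = +/-1 independent of X; Y = S * X.
# Both marginals are N(0, 1) and Cov(X, Y) = E[S] E[X^2] = 0,
# but (X, Y) is not jointly normal and |Y| = |X| always holds,
# so X and Y are uncorrelated yet strongly dependent.
n = 1_000_000
x = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)
y = s * x

print(np.cov(x, y)[0, 1])                 # ~0: uncorrelated
print(np.allclose(np.abs(x), np.abs(y)))  # True: deterministic dependence
</syntaxhighlight>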