===={{visible anchor|Biased sample variance}}====
In many practical situations, the true variance of a population is not known ''a priori'' and must be computed somehow. When dealing with extremely large populations, it is not possible to count every object in the population, so the computation must be performed on a [[sample (statistics)|sample]] of the population.<ref>{{cite book | last = Navidi | first = William | year = 2006 | title = Statistics for Engineers and Scientists | publisher = McGraw-Hill | page = 14 }}</ref> This is generally referred to as '''sample variance''' or '''empirical variance'''. Sample variance can also be applied to the estimation of the variance of a continuous distribution from a sample of that distribution. We take a [[statistical sample|sample with replacement]] of {{mvar|n}} values {{math|''Y''<sub>1</sub>, ..., ''Y''<sub>''n''</sub>}} from the population of size {{mvar|N}}, where {{math|''n'' < ''N''}}, and estimate the variance on the basis of this sample.<ref>Montgomery, D. C. and Runger, G. C. (1994) ''Applied statistics and probability for engineers'', page 201. John Wiley & Sons, New York</ref> Directly taking the variance of the sample data gives the average of the [[squared deviations]]:<ref>{{cite conference | author1 = Yuli Zhang | author2 = Huaiyu Wu | author3 = Lei Cheng | date = June 2012 | title = Some new deformation formulas about variance and covariance | conference = Proceedings of 4th International Conference on Modelling, Identification and Control (ICMIC2012) | pages = 987–992 }}</ref> <math display="block">\tilde{S}_Y^2 = \frac{1}{n} \sum_{i=1}^n \left(Y_i - \overline{Y}\right)^2 = \left(\frac 1n \sum_{i=1}^n Y_i^2\right) - \overline{Y}^2 = \frac{1}{n^2} \sum_{i,j\,:\,i<j}\left(Y_i - Y_j\right)^2. </math> (See the section [[Variance#Population variance|Population variance]] for the derivation of this formula.)
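The equivalence of the three expressions above can be checked numerically. The following sketch (sample values chosen arbitrarily for illustration) computes each form and confirms they agree:

```python
import itertools

# A small sample; the values here are arbitrary, chosen only to illustrate
# that the three formulas for the biased sample variance coincide.
Y = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(Y)
mean = sum(Y) / n

# Form 1: average squared deviation from the sample mean.
v1 = sum((y - mean) ** 2 for y in Y) / n

# Form 2: mean of squares minus square of the mean.
v2 = sum(y * y for y in Y) / n - mean ** 2

# Form 3: sum of squared pairwise differences over i < j, divided by n^2.
v3 = sum((Y[i] - Y[j]) ** 2
         for i, j in itertools.combinations(range(n), 2)) / n ** 2

print(v1, v2, v3)  # all three print 4.0
```

All three forms return the same value, as the algebraic identity guarantees.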
Here, <math>\overline{Y}</math> denotes the [[sample mean]]: <math display="block">\overline{Y} = \frac{1}{n} \sum_{i=1}^n Y_i .</math> Since the {{math|''Y''<sub>''i''</sub>}} are selected randomly, both <math>\overline{Y}</math> and <math>\tilde{S}_Y^2</math> are [[Random variable|random variables]]. Their expected values can be evaluated by averaging over the ensemble of all possible samples {{math|{''Y''<sub>''i''</sub>}<nowiki/>}} of size {{mvar|n}} from the population. For <math>\tilde{S}_Y^2</math> this gives: <math display="block">\begin{align} \operatorname{E}[\tilde{S}_Y^2] &= \operatorname{E}\left[ \frac{1}{n} \sum_{i=1}^n {\left(Y_i - \frac{1}{n} \sum_{j=1}^n Y_j \right)}^2 \right] \\[5pt] &= \frac 1n \sum_{i=1}^n \operatorname{E}\left[ Y_i^2 - \frac{2}{n} Y_i \sum_{j=1}^n Y_j + \frac{1}{n^2} \sum_{j=1}^n Y_j \sum_{k=1}^n Y_k \right] \\[5pt] &= \frac 1n \sum_{i=1}^n \left( \operatorname{E}\left[Y_i^2\right] - \frac{2}{n} \left( \sum_{j \neq i} \operatorname{E}\left[Y_i Y_j\right] + \operatorname{E}\left[Y_i^2\right] \right) + \frac{1}{n^2} \sum_{j=1}^n \sum_{k \neq j}^n \operatorname{E}\left[Y_j Y_k\right] +\frac{1}{n^2} \sum_{j=1}^n \operatorname{E}\left[Y_j^2\right] \right) \\[5pt] &= \frac 1n \sum_{i=1}^n \left( \frac{n - 2}{n} \operatorname{E}\left[Y_i^2\right] - \frac{2}{n} \sum_{j \neq i} \operatorname{E}\left[Y_i Y_j\right] + \frac{1}{n^2} \sum_{j=1}^n \sum_{k \neq j}^n \operatorname{E}\left[Y_j Y_k\right] +\frac{1}{n^2} \sum_{j=1}^n \operatorname{E}\left[Y_j^2\right] \right) \\[5pt] &= \frac 1n \sum_{i=1}^n \left[ \frac{n - 2}{n} \left(\sigma^2 + \mu^2\right) - \frac{2}{n} (n - 1)\mu^2 + \frac{1}{n^2} n(n - 1)\mu^2 + \frac{1}{n} \left(\sigma^2 + \mu^2\right) \right] \\[5pt] &= \frac{n - 1}{n} \sigma^2. 
\end{align}</math> Here <math display="inline">\sigma^2 = \operatorname{E}[Y_i^2] - \mu^2 </math> is the [[Variance#Population variance|population variance]] derived in that section, and <math display="inline">\operatorname{E}[Y_i Y_j] = \operatorname{E}[Y_i] \operatorname{E}[Y_j] = \mu^2</math> because <math display="inline">Y_i</math> and <math display="inline">Y_j</math> are independent for <math display="inline">i \neq j</math>, the sample being drawn with replacement. Hence <math display="inline">\tilde{S}_Y^2</math> gives an estimate of the population variance <math display="inline">\sigma^2</math> that is biased by a factor of <math display="inline">\frac{n - 1}{n}</math>: its expected value is smaller than the true variance by that factor. For this reason, <math display="inline">\tilde{S}_Y^2</math> is referred to as the ''biased sample variance''.
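The bias factor <math display="inline">\frac{n - 1}{n}</math> can be observed empirically by averaging <math display="inline">\tilde{S}_Y^2</math> over many independent samples. The following sketch (sample size, trial count, and distribution chosen for illustration) draws repeated samples of size {{math|1=''n'' = 5}} from a standard normal population, whose true variance is 1, so the average biased sample variance should approach {{math|(''n'' − 1)/''n'' {{=}} 0.8}}:

```python
import random

random.seed(0)        # fixed seed so the run is reproducible
n = 5                 # sample size (illustrative choice)
trials = 200_000      # number of repeated samples
# Population: standard normal, so the true variance sigma^2 is 1.0.

total = 0.0
for _ in range(trials):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    mean = sum(sample) / n
    # Biased sample variance: divide the sum of squared deviations by n.
    total += sum((y - mean) ** 2 for y in sample) / n

avg_biased_var = total / trials
print(avg_biased_var)  # close to (n - 1)/n * sigma^2 = 0.8, not 1.0
```

The simulated average falls near 0.8 rather than 1.0, matching the factor <math display="inline">\frac{n - 1}{n}</math> derived above.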