===Efficiency===
{{Main|Efficiency (statistics)}}
The efficiency of an estimator refers to how well it estimates the quantity of interest in a "minimum error" manner. In reality, there is no single explicitly best estimator; there can only be a better estimator. Whether one estimator is more efficient than another depends on the choice of a particular [[loss function]], and efficiency is reflected by two naturally desirable properties of estimators: being unbiased, <math>\operatorname{E}(\widehat{\theta}) - \theta=0</math>, and having minimal [[mean squared error]] (MSE), <math>\operatorname{E}[(\widehat{\theta} - \theta )^2]</math>. These cannot in general both be satisfied simultaneously: a biased estimator may have a lower mean squared error than any unbiased estimator (see [[estimator bias]]). The following equation relates the mean squared error to the estimator bias:<ref name=Dekker2005 />
: <math> \operatorname{E}[(\widehat{\theta} - \theta )^2]=(\operatorname{E}(\widehat{\theta}) - \theta)^2+\operatorname{Var}(\widehat\theta)</math>
The left-hand side is the mean squared error; the first term on the right is the square of the estimator bias, and the second term is the variance of the estimator. The quality of an estimator can therefore be judged by comparing variances, squared biases, or MSEs: a good estimator (high efficiency) has a smaller variance, a smaller squared bias, and a smaller MSE than a bad estimator (low efficiency). Suppose there are two estimators, where <math>\widehat\theta_1</math> is the good estimator and <math>\widehat\theta_2</math> is the bad estimator. The above relationship can then be expressed by the following formulas.
: <math>\operatorname{Var}(\widehat\theta_1)<\operatorname{Var}(\widehat\theta_2)</math>
: <math>|\operatorname{E}(\widehat\theta_1) - \theta|<\left|\operatorname{E}(\widehat\theta_2) - \theta\right|</math>
: <math>\operatorname{MSE}(\widehat\theta_1)<\operatorname{MSE}(\widehat\theta_2)</math>
Besides using these formulas, the efficiency of an estimator can also be identified graphically. For an efficient estimator, a plot of frequency versus value shows a curve with high frequency at the center and low frequency on the two sides. For example:
[[File:Good estimator.jpg|center|thumb]]
For an inefficient estimator, the frequency-versus-value plot shows a relatively flatter curve.
[[File:Bad estimator.jpg|center|thumb]]
To put it simply, a good estimator has a narrow curve, while a bad estimator has a wide one. Plotting the two curves on one graph with a shared ''y''-axis makes the difference more obvious.
[[File:The comparsion between a good and a bad estimator.jpg|center|thumb|Comparison between a good and a bad estimator.]]
Among unbiased estimators, there often exists one with the lowest variance, called the minimum variance unbiased estimator ([[MVUE]]). In some cases an unbiased [[efficiency (statistics)|efficient estimator]] exists which, in addition to having the lowest variance among unbiased estimators, attains the [[Cramér–Rao bound]], an absolute lower bound on the variance of statistics of a variable. Concerning such "best unbiased estimators", see also [[Cramér–Rao bound]], [[Gauss–Markov theorem]], [[Lehmann–Scheffé theorem]], [[Rao–Blackwell theorem]].
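The bias–variance decomposition of the MSE above can be checked numerically. The following sketch (not part of the article; the choice of estimators, the true variance, and the sample sizes are illustrative assumptions) compares two estimators of a normal population's variance by simulation. It also illustrates the claim that a biased estimator can have a lower MSE than an unbiased one:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0            # true parameter theta: the population variance
n, trials = 10, 200_000  # sample size per trial, number of simulated trials

# Each row is one sample of size n from N(0, sigma2).
samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(trials, n))

# theta_hat_1: unbiased sample variance (divides by n - 1)
theta1 = samples.var(axis=1, ddof=1)
# theta_hat_2: maximum-likelihood variance (divides by n) -- biased, but
# with smaller variance
theta2 = samples.var(axis=1, ddof=0)

for name, est in [("unbiased (n-1)", theta1), ("biased (n)", theta2)]:
    bias = est.mean() - sigma2
    var = est.var()
    mse = np.mean((est - sigma2) ** 2)
    # The decomposition MSE = bias^2 + Var(theta_hat) holds term by term.
    print(f"{name}: bias={bias:+.4f}  var={var:.4f}  "
          f"mse={mse:.4f}  bias^2+var={bias**2 + var:.4f}")
```

Here the biased divide-by-''n'' estimator ends up with the smaller MSE, because its reduction in variance outweighs its squared bias.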