Sufficient statistic
==Fisher–Neyman factorization theorem== ''[[Ronald Fisher|Fisher's]] factorization theorem'' or ''factorization criterion'' provides a convenient '''characterization''' of a sufficient statistic. If the [[probability density function]] is ƒ<sub>''θ''</sub>(''x''), then ''T'' is sufficient for ''θ'' [[if and only if]] nonnegative functions ''g'' and ''h'' can be found such that :<math> f(x;\theta)=h(x) \, g(\theta,T(x)), </math> i.e., the density ƒ can be factored into a product such that one factor, ''h'', does not depend on ''θ'' and the other factor, which does depend on ''θ'', depends on ''x'' only through ''T''(''x''). A general proof of this was given by Halmos and Savage<ref>{{Cite journal |last1=Halmos |first1=P. R. |last2=Savage |first2=L. J. |date=1949 |title=Application of the Radon-Nikodym Theorem to the Theory of Sufficient Statistics |url=http://projecteuclid.org/euclid.aoms/1177730032 |journal=The Annals of Mathematical Statistics |language=en |volume=20 |issue=2 |pages=225–241 |doi=10.1214/aoms/1177730032 |issn=0003-4851|doi-access=free }}</ref> and the theorem is sometimes referred to as the Halmos–Savage factorization theorem.<ref>{{Cite web |title=Factorization theorem - Encyclopedia of Mathematics |url=https://encyclopediaofmath.org/wiki/Factorization_theorem |access-date=2022-09-07 |website=encyclopediaofmath.org}}</ref> The proofs below handle special cases, but an alternative general proof along the same lines can be given.<ref>{{Cite journal |last=Taraldsen |first=G. |date=2022 |title=The Factorization Theorem for Sufficiency |url= |journal=Preprint |language=en |doi=10.13140/RG.2.2.15068.87687}}</ref> In many simple cases the probability density function is fully specified by <math>\theta</math> and <math>T(x)</math>, and <math>h(x)=1</math> (see [[Sufficient statistic#Examples|Examples]]). It is easy to see that if ''F''(''t'') is a one-to-one function and ''T'' is a sufficient statistic, then ''F''(''T'') is a sufficient statistic. 
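As a concrete numerical illustration (a sketch, not drawn from the article's sources): for an i.i.d. Bernoulli(''θ'') sample the joint pmf is ''θ''<sup>''t''</sup>(1 − ''θ'')<sup>''n''−''t''</sup> with ''t'' = Σ''x''<sub>''i''</sub>, so ''T''(''x'') = Σ''x''<sub>''i''</sub> is sufficient with ''h''(''x'') = 1, one of the simple cases mentioned above. The check below verifies the factorization ''f''(''x'';''θ'') = ''h''(''x'')''g''(''θ'',''T''(''x'')) exhaustively for small samples.

```python
from itertools import product

def f(x, theta):
    """Joint pmf of an i.i.d. Bernoulli(theta) sample x."""
    t = sum(x)  # T(x): number of successes, the sufficient statistic
    return theta**t * (1 - theta)**(len(x) - t)

def g(theta, t, n):
    """Factor that depends on theta only through t = T(x)."""
    return theta**t * (1 - theta)**(n - t)

def h(x):
    """Factor free of theta (here identically 1)."""
    return 1.0

# Verify f(x; theta) == h(x) * g(theta, T(x)) for every binary sample of
# length 4 and several parameter values.
n = 4
for x in product([0, 1], repeat=n):
    for theta in (0.1, 0.5, 0.9):
        assert abs(f(x, theta) - h(x) * g(theta, sum(x), n)) < 1e-12
```

In particular, any two samples with the same number of successes have identical likelihood functions in ''θ'', which is exactly the likelihood-principle reading of the theorem.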
In particular we can multiply a sufficient statistic by a nonzero constant and get another sufficient statistic. ===Likelihood principle interpretation=== An implication of the theorem is that when using likelihood-based inference, two sets of data yielding the same value for the sufficient statistic ''T''(''X'') will always yield the same inferences about ''θ''. By the factorization criterion, the likelihood's dependence on ''θ'' is only in conjunction with ''T''(''X''). As this is the same in both cases, the dependence on ''θ'' will be the same as well, leading to identical inferences. ===Proof=== This proof is due to Hogg and Craig.<ref name="HoggCraig">{{cite book | last = Hogg | first = Robert V. |author2=Craig, Allen T. | title = Introduction to Mathematical Statistics | publisher=Prentice Hall | year = 1995 | isbn=978-0-02-355722-4}}</ref> Let <math>X_1, X_2, \ldots, X_n</math> denote a random sample from a distribution having the [[Probability density function|pdf]] ''f''(''x'', ''θ'') for ''γ'' < ''θ'' < ''δ''. Let ''Y''<sub>1</sub> = ''u''<sub>1</sub>(''X''<sub>1</sub>, ''X''<sub>2</sub>, ..., ''X''<sub>''n''</sub>) be a statistic whose pdf is ''g''<sub>1</sub>(''y''<sub>1</sub>; ''θ''). We want to prove that ''Y''<sub>1</sub> = ''u''<sub>1</sub>(''X''<sub>1</sub>, ''X''<sub>2</sub>, ..., ''X''<sub>''n''</sub>) is a sufficient statistic for ''θ'' if and only if, for some function ''H'', :<math> \prod_{i=1}^n f(x_i; \theta) = g_1 \left[u_1 (x_1, x_2, \dots, x_n); \theta \right] H(x_1, x_2, \dots, x_n). </math> First, suppose that :<math> \prod_{i=1}^n f(x_i; \theta) = g_1 \left[u_1 (x_1, x_2, \dots, x_n); \theta \right] H(x_1, x_2, \dots, x_n).
</math> We shall make the transformation ''y''<sub>''i''</sub> = ''u''<sub>''i''</sub>(''x''<sub>1</sub>, ''x''<sub>2</sub>, ..., ''x''<sub>''n''</sub>), for ''i'' = 1, ..., ''n'', having inverse functions ''x''<sub>''i''</sub> = ''w''<sub>''i''</sub>(''y''<sub>1</sub>, ''y''<sub>2</sub>, ..., ''y''<sub>''n''</sub>), for ''i'' = 1, ..., ''n'', and [[Jacobian matrix and determinant|Jacobian]] <math> J = \left[\partial w_i/\partial y_j \right] </math>. Thus, :<math> \prod_{i=1}^n f \left[ w_i(y_1, y_2, \dots, y_n); \theta \right] |J| = |J| g_1 (y_1; \theta) H \left[ w_1(y_1, y_2, \dots, y_n), \dots, w_n(y_1, y_2, \dots, y_n) \right]. </math> The left-hand member is the joint pdf ''g''(''y''<sub>1</sub>, ''y''<sub>2</sub>, ..., ''y''<sub>''n''</sub>; ''θ'') of ''Y''<sub>1</sub> = ''u''<sub>1</sub>(''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub>), ..., ''Y''<sub>''n''</sub> = ''u''<sub>''n''</sub>(''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub>). In the right-hand member, <math>g_1(y_1;\theta)</math> is the pdf of <math>Y_1</math>, so that <math>H[ w_1, \dots , w_n] |J|</math> is the quotient of <math>g(y_1,\dots,y_n;\theta)</math> and <math>g_1(y_1;\theta)</math>; that is, it is the conditional pdf <math>h(y_2, \dots, y_n \mid y_1; \theta)</math> of <math>Y_2,\dots,Y_n</math> given <math>Y_1=y_1</math>. But <math>H(x_1,x_2,\dots,x_n)</math>, and thus <math>H\left[w_1(y_1,\dots,y_n), \dots, w_n(y_1, \dots, y_n)\right]</math>, was given not to depend upon <math>\theta</math>. Since <math>\theta</math> was not introduced in the transformation and accordingly not in the Jacobian <math>J</math>, it follows that <math>h(y_2, \dots, y_n \mid y_1; \theta)</math> does not depend upon <math>\theta</math> and that <math>Y_1</math> is a sufficient statistic for <math>\theta</math>. The converse is proven by taking: :<math>g(y_1,\dots,y_n;\theta)=g_1(y_1; \theta) h(y_2, \dots, y_n \mid y_1),</math> where <math>h(y_2, \dots, y_n \mid y_1)</math> does not depend upon <math>\theta</math> because <math>Y_2, \dots,
Y_n</math> depend only upon <math>X_1, \dots, X_n</math>, which are independent of <math>\theta</math> when conditioned on <math>Y_1</math>, a sufficient statistic by hypothesis. Now divide both members by the absolute value of the non-vanishing Jacobian <math>J</math>, and replace <math>y_1, \dots, y_n</math> by the functions <math>u_1(x_1, \dots, x_n), \dots, u_n(x_1,\dots, x_n)</math> in <math>x_1,\dots, x_n</math>. This yields :<math>\frac{g\left[ u_1(x_1, \dots, x_n), \dots, u_n(x_1, \dots, x_n); \theta \right]}{|J^*|}=g_1\left[u_1(x_1,\dots,x_n); \theta\right] \frac{h(u_2, \dots, u_n \mid u_1)}{|J^*|}</math> where <math>J^*</math> is the Jacobian with <math>y_1,\dots,y_n</math> replaced by their values in terms of <math>x_1, \dots, x_n</math>. The left-hand member is necessarily the joint pdf <math>f(x_1;\theta)\cdots f(x_n;\theta)</math> of <math>X_1,\dots,X_n</math>. Since <math>h(y_2,\dots,y_n\mid y_1)</math>, and thus <math>h(u_2,\dots,u_n\mid u_1)</math>, does not depend upon <math>\theta</math>, :<math>H(x_1,\dots,x_n)=\frac{h(u_2,\dots,u_n\mid u_1)}{|J^*|}</math> is a function that does not depend upon <math>\theta</math>. ===Another proof=== A simpler, more illustrative proof is as follows, although it applies only in the discrete case. We use shorthand notation, denoting the joint probability density of <math>(X, T(X))</math> by <math>f_\theta(x,t)</math>. Since <math>T</math> is a deterministic function of <math>X</math>, we have <math>f_\theta(x,t) = f_\theta(x)</math> when <math>t = T(x)</math>, and zero otherwise. Therefore: :<math> \begin{align} f_\theta(x) & = f_\theta(x,t) \\[5pt] & = f_\theta (x\mid t) f_\theta(t) \\[5pt] & = f(x\mid t) f_\theta(t) \end{align} </math> with the last equality being true by the definition of sufficient statistics. Thus <math>f_\theta(x)=a(x) b_\theta(t)</math> with <math>a(x) = f_{X \mid t}(x)</math> and <math>b_\theta(t) = f_\theta(t)</math>.
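This forward direction can be checked numerically in a small discrete case (a sketch under an assumed Bernoulli model, not taken from the text): with ''T''(''x'') = Σ''x''<sub>''i''</sub>, the conditional pmf ''f''<sub>''θ''</sub>(''x'' | ''t'') = ''f''<sub>''θ''</sub>(''x'')/''f''<sub>''θ''</sub>(''t'') takes the same value for every ''θ'', and for Bernoulli samples it is uniform over the C(''n'',''t'') arrangements of the successes.

```python
from itertools import product
from math import comb

n = 3
samples = list(product([0, 1], repeat=n))

def pmf(x, theta):
    """Joint pmf of an i.i.d. Bernoulli(theta) sample x."""
    t = sum(x)
    return theta**t * (1 - theta)**(n - t)

def conditional(x, theta):
    """f_theta(x | T(x) = t) = f_theta(x) / f_theta(t), with T = sum."""
    t = sum(x)
    f_t = sum(pmf(y, theta) for y in samples if sum(y) == t)
    return pmf(x, theta) / f_t

for x in samples:
    # The conditional pmf takes the same value for every theta ...
    vals = {round(conditional(x, th), 12) for th in (0.2, 0.5, 0.8)}
    assert len(vals) == 1
    # ... namely the uniform probability over the C(n, t) arrangements.
    assert abs(conditional(x, 0.5) - 1 / comb(n, sum(x))) < 1e-12
```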
Conversely, if <math>f_\theta(x)=a(x) b_\theta(t)</math>, we have :<math> \begin{align} f_\theta(t) & = \sum _{x : T(x) = t} f_\theta(x, t) \\[5pt] & = \sum _{x : T(x) = t} f_\theta(x) \\[5pt] & = \sum _{x : T(x) = t} a(x) b_\theta(t) \\[5pt] & = \left( \sum _{x : T(x) = t} a(x) \right) b_\theta(t). \end{align}</math> Here the first equality holds by the [[Probability density function#Densities associated with multiple variables|definition of pdf for multiple variables]], the second by the remark above, the third by hypothesis, and the fourth because the summation is not over <math>t</math>. Let <math>f_{X\mid t}(x)</math> denote the conditional probability density of <math>X</math> given <math>T(X)</math>. Then we can derive an explicit expression for it: :<math> \begin{align} f_{X\mid t}(x) & = \frac{f_\theta(x, t)}{f_\theta(t)} \\[5pt] & = \frac{f_\theta(x)}{f_\theta(t)} \\[5pt] & = \frac{a(x) b_\theta(t)}{\left( \sum _{x : T(x) = t} a(x) \right) b_\theta(t)} \\[5pt] & = \frac{a(x)}{\sum _{x : T(x) = t} a(x)}. \end{align}</math> Here the first equality holds by the definition of conditional probability density, the second by the remark above, the third by the equality proven above, and the fourth by simplification. This expression does not depend on <math>\theta</math>, and thus <math>T</math> is a sufficient statistic.<ref>{{cite web | url=http://cnx.org/content/m11480/1.6/ | title=The Fisher–Neyman Factorization Theorem}}. Webpage at Connexions (cnx.org)</ref>
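The explicit formula ''f''<sub>''X''|''t''</sub>(''x'') = ''a''(''x'') / Σ<sub>''x'':''T''(''x'')=''t''</sub> ''a''(''x'') can be exercised in a less trivial discrete example (a sketch; the Poisson model here is an assumption for illustration, not part of the text): for i.i.d. Poisson(''θ'') data, ''f''<sub>''θ''</sub>(''x'') = ''a''(''x'')''b''<sub>''θ''</sub>(''t'') with ''a''(''x'') = 1/Π''x''<sub>''i''</sub>! and ''b''<sub>''θ''</sub>(''t'') = ''θ''<sup>''t''</sup>e<sup>−''nθ''</sup>, and the conditional law of the sample given ''T'' = Σ''x''<sub>''i''</sub> is multinomial, free of ''θ''.

```python
from itertools import product
from math import factorial, exp, prod

n = 3      # sample size
t = 4      # conditioning value of T(x) = sum(x)

def a(x):
    """theta-free factor in f_theta(x) = a(x) * b_theta(T(x))."""
    return 1 / prod(factorial(xi) for xi in x)

def pmf(x, theta):
    """Joint pmf of an i.i.d. Poisson(theta) sample x."""
    return a(x) * theta**sum(x) * exp(-n * theta)

# All nonnegative integer samples with T(x) = t (each entry is at most t).
level = [x for x in product(range(t + 1), repeat=n) if sum(x) == t]
norm = sum(a(y) for y in level)

for x in level:
    cond = a(x) / norm                        # a(x) / sum_{y: T(y)=t} a(y)
    for theta in (0.5, 1.0, 2.0):
        f_t = sum(pmf(y, theta) for y in level)   # f_theta(t)
        # The ratio f_theta(x) / f_theta(t) is theta-free and equals cond.
        assert abs(pmf(x, theta) / f_t - cond) < 1e-12

# Equivalently, the conditional law is Multinomial(t, uniform over n cells),
# since the multinomial theorem gives norm = n**t / t!.
assert abs(norm - n**t / factorial(t)) < 1e-12
```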