Likelihood function
===Likelihood equations===
If the log-likelihood function is [[Smoothness|smooth]], its [[gradient]] with respect to the parameter, known as the [[Score (statistics)|score]] and written <math display="inline">s_{n}(\theta) \equiv \nabla_{\theta} \ell_{n}(\theta)</math>, exists and allows for the application of [[differential calculus]]. The basic way to maximize a differentiable function is to find the [[stationary point]]s (the points where the [[derivative]] is zero); since the derivative of a sum is just the sum of the derivatives, but the derivative of a product requires the [[product rule]], it is easier to find the stationary points of the log-likelihood of independent events than of their likelihood. The equations defined by the stationary points of the score function serve as [[estimating equations]] for the maximum likelihood estimator. <math display="block">s_{n}(\theta) = \mathbf{0}</math> In that sense, the maximum likelihood estimator is implicitly defined by the value at <math display="inline">\mathbf{0}</math> of the [[inverse function]] <math display="inline">s_{n}^{-1}: \mathbb{E}^{d} \to \Theta</math>, where <math display="inline">\mathbb{E}^{d}</math> is the <var>d</var>-dimensional [[Euclidean space]], and <math display="inline">\Theta</math> is the parameter space. Using the [[inverse function theorem]], it can be shown that <math display="inline">s_{n}^{-1}</math> is [[well-defined]] in an [[open neighborhood]] about <math display="inline">\mathbf{0}</math> with probability going to one, and <math display="inline">\hat{\theta}_{n} = s_{n}^{-1}(\mathbf{0})</math> is a consistent estimate of <math display="inline">\theta_0</math>.
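As a minimal numerical sketch of the estimating equation <math display="inline">s_{n}(\theta) = \mathbf{0}</math>, consider a hypothetical i.i.d. sample assumed to follow an exponential distribution with unknown rate. The score of its log-likelihood <math display="inline">\ell_{n}(\lambda) = n \log \lambda - \lambda \sum_i x_i</math> is <math display="inline">s_{n}(\lambda) = n/\lambda - \sum_i x_i</math>, and a root-finder applied to the score recovers the closed-form MLE <math display="inline">\hat\lambda = 1/\bar{x}</math> (the data values and variable names below are illustrative):

```python
# Hypothetical sample, assumed i.i.d. Exponential(rate) for illustration.
data = [0.8, 1.3, 0.4, 2.1, 0.9, 1.7]
n, total = len(data), sum(data)

def score(rate):
    # Gradient of the log-likelihood l(rate) = n*log(rate) - rate*sum(x_i).
    return n / rate - total

# Solve s(rate) = 0 by bisection: the score is strictly decreasing in rate,
# positive near 0 and negative for large rates, so the root is bracketed.
lo, hi = 1e-6, 100.0
for _ in range(200):
    mid = (lo + hi) / 2
    if score(mid) > 0:
        lo = mid
    else:
        hi = mid
rate_hat = (lo + hi) / 2

# The stationary point of the log-likelihood matches the closed-form MLE 1/mean:
assert abs(rate_hat - n / total) < 1e-9
```

In this one-parameter case the score equation has a unique root; in general, a stationary point found this way must still be checked to be a maximum rather than a saddle point or minimum.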
As a consequence there exists a sequence <math display="inline">\left\{ \hat{\theta}_{n} \right\}</math> such that <math display="inline">s_{n}(\hat{\theta}_{n}) = \mathbf{0}</math> asymptotically [[almost surely]], and <math display="inline">\hat{\theta}_{n} \xrightarrow{\text{p}} \theta_{0}</math>.<ref>{{cite journal |first=Robert V. |last=Foutz |title=On the Unique Consistent Solution to the Likelihood Equations |journal=[[Journal of the American Statistical Association]] |volume=72 |year=1977 |issue=357 |pages=147–148 |doi=10.1080/01621459.1977.10479926 }}</ref> A similar result can be established using [[Rolle's theorem]].<ref>{{cite journal |first1=Robert E. |last1=Tarone |first2=Gary |last2=Gruenhage |title=A Note on the Uniqueness of Roots of the Likelihood Equations for Vector-Valued Parameters |journal=Journal of the American Statistical Association |volume=70 |year=1975 |issue=352 |pages=903–904 |doi=10.1080/01621459.1975.10480321 }}</ref><ref>{{cite journal |first1=Kamta |last1=Rai |first2=John |last2=Van Ryzin |title=A Note on a Multivariate Version of Rolle's Theorem and Uniqueness of Maximum Likelihood Roots |journal=Communications in Statistics |series=Theory and Methods |volume=11 |year=1982 |issue=13 |pages=1505–1510 |doi=10.1080/03610928208828325 }}</ref> The negative of the second derivative of the log-likelihood evaluated at <math display="inline">\hat{\theta}</math>, known as the observed [[Fisher information]], measures the curvature of the likelihood surface,<ref>{{citation |first=B. Raja |last=Rao |title=A formula for the curvature of the likelihood surface of a sample drawn from a distribution admitting sufficient statistics |journal=[[Biometrika]] |volume=47 |issue=1–2 |year=1960 |pages=203–207 |doi=10.1093/biomet/47.1-2.203 |mode=cs1 }}</ref> and thus indicates the [[Precision (statistics)|precision]] of the estimate.<ref>{{citation |first1=Michael D. |last1=Ward |first2=John S. |last2=Ahlquist |title=Maximum Likelihood for Social Science: Strategies for Analysis |publisher=[[Cambridge University Press]] |year=2018 |pages=25–27 |mode=cs1 }}</ref>
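The link between curvature and precision can be sketched for the same hypothetical exponential model: the second derivative of the log-likelihood <math display="inline">\ell_{n}(\lambda) = n \log \lambda - \lambda \sum_i x_i</math> is <math display="inline">\ell_{n}''(\lambda) = -n/\lambda^{2}</math>, so the observed information at the MLE is <math display="inline">n/\hat\lambda^{2}</math>, and its inverse square root gives an approximate standard error (data and names are illustrative, not a definitive recipe):

```python
import math

# Illustrative sample, assumed i.i.d. Exponential(rate); MLE is 1/mean.
data = [0.8, 1.3, 0.4, 2.1, 0.9, 1.7]
n = len(data)
rate_hat = n / sum(data)

# Second derivative of l(rate) = n*log(rate) - rate*sum(x_i) is -n/rate**2;
# the observed Fisher information is its negative, evaluated at the MLE.
observed_info = n / rate_hat**2

# Sharper curvature (larger information) yields a tighter estimate: the usual
# large-sample approximation takes the standard error as 1/sqrt(information).
std_error = 1.0 / math.sqrt(observed_info)
```

For this model the expression simplifies to <math display="inline">\hat\lambda/\sqrt{n}</math>: with a fixed estimate, quadrupling the sample size halves the approximate standard error.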