=== Restricted parameter space === {{Distinguish|restricted maximum likelihood}} While the domain of the likelihood function—the [[parameter space]]—is generally a finite-dimensional subset of [[Euclidean space]], additional [[Restriction (mathematics)|restriction]]s sometimes need to be incorporated into the estimation process. The parameter space can be expressed as <math display="block">\Theta = \left\{ \theta : \theta \in \mathbb{R}^{k},\; h(\theta) = 0 \right\} ~,</math> where <math>\; h(\theta) = \left[ h_{1}(\theta), h_{2}(\theta), \ldots, h_{r}(\theta) \right] \;</math> is a [[vector-valued function]] mapping <math>\, \mathbb{R}^{k} \,</math> into <math>\; \mathbb{R}^{r} ~.</math> Estimating the true parameter <math>\theta</math> belonging to <math>\Theta</math> then, as a practical matter, means finding the maximum of the likelihood function subject to the [[Constraint (mathematics)|constraint]] <math>~h(\theta) = 0 ~.</math> Theoretically, the most natural approach to this [[constrained optimization]] problem is the method of substitution, that is, "filling out" the restrictions <math>\; h_{1}, h_{2}, \ldots, h_{r} \;</math> to a set <math>\; h_{1}, h_{2}, \ldots, h_{r}, h_{r+1}, \ldots, h_{k} \;</math> in such a way that <math>\; h^{\ast} = \left[ h_{1}, h_{2}, \ldots, h_{k} \right] \;</math> is a [[one-to-one function]] from <math>\mathbb{R}^{k}</math> to itself, and reparameterizing the likelihood function by setting <math>\; \phi_{i} = h_{i}(\theta_{1}, \theta_{2}, \ldots, \theta_{k}) ~.</math><ref name="Silvey p79">{{cite book |first=S. D. 
|last=Silvey |year=1975 |title=Statistical Inference |location=London, UK |publisher=Chapman and Hall |isbn=0-412-13820-4 |page=79 |url=https://books.google.com/books?id=qIKLejbVMf4C&pg=PA79 }}</ref> Because of the equivariance of the maximum likelihood estimator, the properties of the MLE apply to the restricted estimates also.<ref>{{cite web |first=David |last=Olive |year=2004 |title=Does the MLE maximize the likelihood? |website=Southern Illinois University |url=http://lagrange.math.siu.edu/Olive/simle.pdf }}</ref> For instance, in a [[multivariate normal distribution]] the [[covariance matrix]] <math>\, \Sigma \,</math> must be [[Positive-definite matrix|positive-definite]]; this restriction can be imposed by substituting <math>\; \Sigma = \Gamma^{\mathsf{T}} \Gamma \;,</math> where <math>\Gamma</math> is a real [[upper triangular matrix]] and <math>\Gamma^{\mathsf{T}}</math> is its [[transpose]].<ref>{{cite journal |first=Daniel P. |last=Schwallie |year=1985 |title=Positive definite maximum likelihood covariance estimators |journal=Economics Letters |volume=17 |issue=1–2 |pages=115–117 |doi=10.1016/0165-1765(85)90139-9 }}</ref> In practice, restrictions are usually imposed using the method of Lagrange which, given the constraints as defined above, leads to the ''restricted likelihood equations'' <math display="block">\frac{\partial \ell}{\partial \theta} - \frac{\partial h(\theta)^\mathsf{T}}{\partial \theta} \lambda = 0</math> and <math>h(\theta) = 0 \;,</math> where <math>~ \lambda = \left[ \lambda_{1}, \lambda_{2}, \ldots, \lambda_{r}\right]^\mathsf{T} ~</math> is a column vector of [[Lagrange multiplier]]s and <math>\; \frac{\partial h(\theta)^\mathsf{T}}{\partial \theta} \;</math> is the {{mvar|k × r}} [[Jacobian matrix]] of partial derivatives.<ref name="Silvey p79"/> Naturally, if the constraints are not binding at the maximum, the Lagrange multipliers should be zero.<ref>{{cite book |first=Jan R. 
|last=Magnus |year=2017 |title=Introduction to the Theory of Econometrics |location=Amsterdam |publisher=VU University Press |pages=64–65 |isbn=978-90-8659-766-6}}</ref> This in turn allows for a statistical test of the "validity" of the constraint, known as the [[Lagrange multiplier test]].
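The substitution approach above can be sketched numerically. The following Python example (using NumPy and SciPy; an illustrative sketch, not part of the cited sources) fits the covariance matrix of a zero-mean bivariate normal by maximizing the log-likelihood over the three free entries of an upper triangular <math>\Gamma</math>, so that <math>\Sigma = \Gamma^{\mathsf{T}} \Gamma</math> is positive (semi-)definite by construction and the constrained problem becomes an unconstrained one. All variable names here are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate zero-mean bivariate normal data
rng = np.random.default_rng(0)
true_cov = np.array([[2.0, 0.6],
                     [0.6, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], true_cov, size=2000)
n = X.shape[0]

def gamma_from(params):
    # Pack the 3 free parameters into an upper triangular Gamma
    g = np.zeros((2, 2))
    g[0, 0], g[0, 1], g[1, 1] = params
    return g

def neg_log_lik(params):
    G = gamma_from(params)
    sigma = G.T @ G                    # positive semi-definite by construction
    sign, logdet = np.linalg.slogdet(sigma)
    if sign <= 0:                      # guard against a singular Gamma
        return np.inf
    # Sum of quadratic forms x_i' Sigma^{-1} x_i over the sample
    quad = np.einsum('ij,jk,ik->', X, np.linalg.inv(sigma), X)
    return 0.5 * (n * logdet + quad)   # additive constants dropped

res = minimize(neg_log_lik, x0=[1.0, 0.0, 1.0], method='Nelder-Mead')
sigma_hat = gamma_from(res.x).T @ gamma_from(res.x)
```

Because the mean is fixed at zero, the unrestricted MLE of <math>\Sigma</math> is the empirical second-moment matrix <math>\tfrac{1}{n}\sum_i x_i x_i^{\mathsf{T}}</math>, and `sigma_hat` recovers it while remaining positive definite for any parameter values the optimizer tries.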