== Marginal likelihood ==

Related to the recursive Bayesian interpretation described above, the Kalman filter can be viewed as a [[generative model]], i.e., a process for ''generating'' a stream of random observations '''z''' = ('''z'''<sub>0</sub>, '''z'''<sub>1</sub>, '''z'''<sub>2</sub>, ...). Specifically, the process is

# Sample a hidden state <math>\mathbf{x}_0</math> from the Gaussian prior distribution <math>p\left(\mathbf{x}_0\right) = \mathcal{N}\left(\hat{\mathbf{x}}_{0 \mid 0}, \mathbf{P}_{0 \mid 0}\right)</math>.
# Sample an observation <math>\mathbf{z}_0</math> from the observation model <math>p\left(\mathbf{z}_0 \mid \mathbf{x}_0\right) = \mathcal{N}\left(\mathbf{H}_0\mathbf{x}_0, \mathbf{R}_0\right)</math>.
# For <math>k = 1, 2, 3, \ldots</math>, do
## Sample the next hidden state <math>\mathbf{x}_k</math> from the transition model <math>p\left(\mathbf{x}_k \mid \mathbf{x}_{k-1}\right) = \mathcal{N}\left(\mathbf{F}_k \mathbf{x}_{k-1} + \mathbf{B}_k\mathbf{u}_k, \mathbf{Q}_k\right).</math>
## Sample an observation <math>\mathbf{z}_k</math> from the observation model <math>p\left(\mathbf{z}_k \mid \mathbf{x}_k\right) = \mathcal{N}\left(\mathbf{H}_k\mathbf{x}_k, \mathbf{R}_k\right).</math>

This process has identical structure to the [[hidden Markov model]], except that the discrete state and observations are replaced with continuous variables sampled from Gaussian distributions.

In some applications, it is useful to compute the ''probability'' that a Kalman filter with a given set of parameters (prior distribution, transition and observation models, and control inputs) would generate a particular observed signal. This probability is known as the [[marginal likelihood]] because it integrates over ("marginalizes out") the values of the hidden state variables, so it can be computed using only the observed signal. The marginal likelihood can be useful to evaluate different parameter choices, or to compare the Kalman filter against other models using [[Bayesian model comparison]]. It is straightforward to compute the marginal likelihood as a side effect of the recursive filtering computation.
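For concreteness, the numbered generative process above can be sketched in Python. This is only an illustrative sketch, not part of the standard presentation: it assumes time-invariant matrices '''F''', '''B''', '''H''', '''Q''', '''R''', uses NumPy, and the function name <code>simulate_kalman_process</code> is purely hypothetical.

<syntaxhighlight lang="python">
import numpy as np

def simulate_kalman_process(F, B, H, Q, R, u, x0_mean, P0, T, rng=None):
    """Draw hidden states x_0..x_T and observations z_0..z_T from the
    generative model above (time-invariant F, B, H, Q, R assumed).
    u is a sequence of control inputs, with u[k] used at step k (u[0] unused)."""
    rng = np.random.default_rng() if rng is None else rng
    xs, zs = [], []
    # Step 1: sample the initial hidden state from the Gaussian prior N(x0_mean, P0).
    x = rng.multivariate_normal(x0_mean, P0)
    for k in range(T + 1):
        if k > 0:
            # Step 3a: sample the next hidden state from N(F x_{k-1} + B u_k, Q).
            x = rng.multivariate_normal(F @ x + B @ u[k], Q)
        xs.append(x)
        # Steps 2 and 3b: sample an observation from N(H x_k, R).
        zs.append(rng.multivariate_normal(H @ x, R))
    return np.array(xs), np.array(zs)
</syntaxhighlight>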
By the [[Chain rule (probability)|chain rule]], the likelihood can be factored as the product of the probability of each observation given previous observations,

:<math>p(\mathbf{z}) = \prod_{k=0}^T p\left(\mathbf{z}_k \mid \mathbf{z}_{k-1}, \ldots, \mathbf{z}_0\right),</math>

and because the Kalman filter describes a Markov process, all relevant information from previous observations is contained in the current state estimate <math>\hat{\mathbf{x}}_{k \mid k-1}, \mathbf{P}_{k \mid k-1}.</math> Thus the marginal likelihood is given by

:<math>\begin{align}
p(\mathbf{z}) &= \prod_{k=0}^T \int p\left(\mathbf{z}_k \mid \mathbf{x}_k\right) p\left(\mathbf{x}_k \mid \mathbf{z}_{k-1}, \ldots, \mathbf{z}_0\right) d\mathbf{x}_k \\
&= \prod_{k=0}^T \int \mathcal{N}\left(\mathbf{z}_k; \mathbf{H}_k\mathbf{x}_k, \mathbf{R}_k\right) \mathcal{N}\left(\mathbf{x}_k; \hat{\mathbf{x}}_{k \mid k-1}, \mathbf{P}_{k \mid k-1}\right) d\mathbf{x}_k \\
&= \prod_{k=0}^T \mathcal{N}\left(\mathbf{z}_k; \mathbf{H}_k\hat{\mathbf{x}}_{k \mid k-1}, \mathbf{R}_k + \mathbf{H}_k \mathbf{P}_{k \mid k-1} \mathbf{H}_k^\textsf{T}\right) \\
&= \prod_{k=0}^T \mathcal{N}\left(\mathbf{z}_k; \mathbf{H}_k\hat{\mathbf{x}}_{k \mid k-1}, \mathbf{S}_k\right),
\end{align}</math>

i.e., a product of Gaussian densities, each corresponding to the density of one observation '''z'''<sub>''k''</sub> under the current predictive distribution <math>\mathcal{N}\left(\mathbf{H}_k\hat{\mathbf{x}}_{k \mid k-1}, \mathbf{S}_k\right)</math>. This can easily be computed as a simple recursive update; however, to avoid [[Arithmetic underflow|numeric underflow]], in a practical implementation it is usually desirable to compute the ''log'' marginal likelihood <math>\ell = \log p(\mathbf{z})</math> instead. Adopting the convention <math>\ell^{(-1)} = 0</math>, this can be done via the recursive update rule

:<math>\ell^{(k)} = \ell^{(k-1)} - \frac{1}{2} \left(\tilde{\mathbf{y}}_k^\textsf{T} \mathbf{S}^{-1}_k \tilde{\mathbf{y}}_k + \log \left|\mathbf{S}_k\right| + d_y \log 2\pi \right),</math>

where <math>d_y</math> is the dimension of the measurement vector.<ref>{{Cite book|last=Lütkepohl|first=Helmut|title=Introduction to Multiple Time Series Analysis|publisher=Springer-Verlag Berlin|location=Heidelberg|year=1991|page=435}}</ref>

An important application where such a (log) likelihood of the observations (given the filter parameters) is used is multi-target tracking. For example, consider an object tracking scenario where a stream of observations is the input, but it is unknown how many objects are in the scene (or the number of objects is known but is greater than one). In such a scenario, it can be unknown a priori which observations/measurements were generated by which object. A multiple hypothesis tracker (MHT) typically forms different track association hypotheses, where each hypothesis can be viewed as a Kalman filter (in the linear Gaussian case) with a specific set of parameters associated with the hypothesized object. Thus, it is important to compute the likelihood of the observations for the different hypotheses under consideration, so that the most likely one can be found.
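As an illustration of how the log marginal likelihood can be accumulated alongside the usual predict/update recursion, the following sketch applies the update rule above at each step. It again assumes a time-invariant model and NumPy, and the function name <code>kalman_log_marginal_likelihood</code> is purely illustrative.

<syntaxhighlight lang="python">
import numpy as np

def kalman_log_marginal_likelihood(F, B, H, Q, R, u, x0_mean, P0, zs):
    """Standard Kalman filter recursion that also accumulates the log
    marginal likelihood ell = log p(z_0, ..., z_T); zs has shape (T+1, d_y)."""
    x, P = np.asarray(x0_mean, dtype=float), np.asarray(P0, dtype=float)
    d_y = zs.shape[1]
    ell = 0.0  # convention: ell^{(-1)} = 0
    for k, z in enumerate(zs):
        if k > 0:
            # Predict step; for k = 0 the prior (x0_mean, P0) plays this role.
            x = F @ x + B @ u[k]
            P = F @ P @ F.T + Q
        # Innovation y_k and its covariance S_k = H P H^T + R.
        y = z - H @ x
        S = H @ P @ H.T + R
        # Add the Gaussian log-density of z_k under N(H x, S):
        # -1/2 * (y^T S^{-1} y + log|S| + d_y log 2 pi).
        _, logdet = np.linalg.slogdet(S)
        ell -= 0.5 * (y @ np.linalg.solve(S, y) + logdet + d_y * np.log(2.0 * np.pi))
        # Update step (Kalman gain, posterior mean and covariance).
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
    return ell
</syntaxhighlight>

Using <code>np.linalg.slogdet</code> and <code>np.linalg.solve</code> for the log-density term avoids explicitly forming <math>\mathbf{S}_k^{-1}</math> and the (possibly underflowing) determinant, in line with the motivation for working in the log domain.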