== Autocorrelation of stochastic processes ==

In [[statistics]], the autocorrelation of a real or complex [[random process]] is the [[Pearson correlation coefficient|Pearson correlation]] between values of the process at different times, as a function of the two times or of the time lag. Let <math>\left\{ X_t \right\}</math> be a random process, and <math>t</math> be any point in time (<math>t</math> may be an [[integer]] for a [[discrete-time]] process or a [[real number]] for a [[continuous-time]] process). Then <math>X_t</math> is the value (or [[Realization (probability)|realization]]) produced by a given [[Execution (computing)|run]] of the process at time <math>t</math>. Suppose that the process has [[mean]] <math>\mu_t</math> and [[variance]] <math>\sigma_t^2</math> at time <math>t</math>, for each <math>t</math>. Then the definition of the '''autocorrelation function''' between times <math>t_1</math> and <math>t_2</math> is<ref name=Gubner>{{cite book |first=John A. |last=Gubner |year=2006 |title=Probability and Random Processes for Electrical and Computer Engineers |publisher=Cambridge University Press |isbn=978-0-521-86470-1}}</ref>{{rp|p.388}}<ref name=KunIlPark>Kun Il Park, Fundamentals of Probability and Stochastic Processes with Applications to Communications, Springer, 2018, {{ISBN|978-3-319-68074-3}}</ref>{{rp|p.165}}

{{Equation box 1
|indent = :
|title=
|equation = <math>\operatorname{R}_{XX}(t_1,t_2) = \operatorname{E} \left[ X_{t_1} \overline{X}_{t_2}\right]</math>
|cellpadding= 6
|border colour = #0073CF
|background colour=#F5FFFA}}

where <math>\operatorname{E}</math> is the [[expected value]] operator and the bar represents [[complex conjugation]]. Note that the expectation may not be [[well defined]].

Subtracting the mean before multiplication yields the '''auto-covariance function''' between times <math>t_1</math> and <math>t_2</math>:<ref name=Gubner/>{{rp|p.392}}<ref name=KunIlPark/>{{rp|p.168}}

{{Equation box 1
|indent = :
|title=
|equation = <math>\begin{align}
\operatorname{K}_{XX}(t_1,t_2) &= \operatorname{E} \left[ (X_{t_1} - \mu_{t_1})\overline{(X_{t_2} - \mu_{t_2})} \right] \\
&= \operatorname{E}\left[X_{t_1} \overline{X}_{t_2} \right] - \mu_{t_1}\overline{\mu}_{t_2} \\
&= \operatorname{R}_{XX}(t_1,t_2) - \mu_{t_1}\overline{\mu}_{t_2}
\end{align}</math>
|cellpadding= 6
|border colour = #0073CF
|background colour=#F5FFFA}}

Note that this expression is not well defined for all time series or processes, because the mean may not exist, or the variance may be zero (for a constant process) or infinite (for processes with distributions lacking well-behaved moments, such as certain types of [[power law]]).
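As a numerical illustration of these definitions (not drawn from the cited sources), the ensemble expectations can be approximated by averaging over many independent realizations of a process. The following Python sketch assumes [[NumPy]] is available and uses an arbitrary random-walk example; all variable names are illustrative.

<syntaxhighlight lang="python">
# Illustrative sketch: estimating R_XX(t1, t2) and K_XX(t1, t2) by averaging
# over independent realizations of a process (the example process and all
# names here are assumptions for illustration only).
import numpy as np

rng = np.random.default_rng(0)

n_runs, n_times = 10_000, 50          # number of realizations and time points
t1, t2 = 10, 30                       # the two time indices being compared

# Example non-stationary process: a random walk (cumulative sum of white noise).
X = np.cumsum(rng.normal(size=(n_runs, n_times)), axis=1)

# Autocorrelation function R_XX(t1, t2) = E[X_{t1} * conj(X_{t2})],
# approximated by the sample mean over realizations.
R = np.mean(X[:, t1] * np.conj(X[:, t2]))

# Auto-covariance K_XX(t1, t2) = R_XX(t1, t2) - mu_{t1} * conj(mu_{t2}).
mu1, mu2 = X[:, t1].mean(), X[:, t2].mean()
K = R - mu1 * np.conj(mu2)

# For this zero-mean random walk, K_XX(t1, t2) is approximately min(t1, t2) + 1
# under this indexing, since the walk includes the step at index 0.
print(R, K)
</syntaxhighlight>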
=== Definition for wide-sense stationary stochastic process ===

If <math>\left\{ X_t \right\}</math> is a [[wide-sense stationary process]] then the mean <math>\mu</math> and the variance <math>\sigma^2</math> are time-independent, and further the autocovariance function depends only on the lag between <math>t_1</math> and <math>t_2</math>: the autocovariance depends only on the time-distance between the pair of values but not on their position in time. This further implies that the autocovariance and autocorrelation can be expressed as a function of the time-lag, and that this would be an [[even function]] of the lag <math>\tau=t_2-t_1</math>.

This gives the more familiar forms for the '''autocorrelation function'''<ref name=Gubner/>{{rp|p.395}}

{{Equation box 1
|indent = :
|title=
|equation = <math>\operatorname{R}_{XX}(\tau) = \operatorname{E}\left[X_{t+\tau} \overline{X}_{t} \right]</math>
|cellpadding= 6
|border colour = #0073CF
|background colour=#F5FFFA}}

and the '''auto-covariance function''':

{{Equation box 1
|indent = :
|title=
|equation = <math>\begin{align}
\operatorname{K}_{XX}(\tau) &= \operatorname{E}\left[ (X_{t+\tau} - \mu)\overline{(X_{t} - \mu)} \right] \\
&= \operatorname{E} \left[ X_{t+\tau} \overline{X}_{t} \right] - \mu\overline{\mu} \\
&= \operatorname{R}_{XX}(\tau) - \mu\overline{\mu}
\end{align}</math>
|cellpadding= 6
|border colour = #0073CF
|background colour=#F5FFFA}}

In particular, note that
<math display=block>\operatorname{K}_{XX}(0) = \sigma^2 .</math>

=== Normalization ===

It is common practice in some disciplines (e.g. statistics and [[time series analysis]]) to normalize the autocovariance function to get a time-dependent [[Pearson correlation coefficient]]. However, in other disciplines (e.g. engineering) the normalization is usually dropped and the terms "autocorrelation" and "autocovariance" are used interchangeably.

The definition of the autocorrelation coefficient of a stochastic process is<ref name=KunIlPark/>{{rp|p.169}}
<math display=block>\rho_{XX}(t_1,t_2) = \frac{\operatorname{K}_{XX}(t_1,t_2)}{\sigma_{t_1}\sigma_{t_2}} = \frac{\operatorname{E}\left[(X_{t_1} - \mu_{t_1}) \overline{(X_{t_2} - \mu_{t_2})} \right]}{\sigma_{t_1}\sigma_{t_2}} .</math>

If the function <math>\rho_{XX}</math> is well defined, its value must lie in the range <math>[-1,1]</math>, with 1 indicating perfect correlation and −1 indicating perfect [[anti-correlation]].

For a [[Stationary process#wide-sense stationarity|wide-sense stationary]] (WSS) process, the definition is
<math display=block>\rho_{XX}(\tau) = \frac{\operatorname{K}_{XX}(\tau)}{\sigma^2} = \frac{\operatorname{E} \left[(X_{t+\tau} - \mu)\overline{(X_{t} - \mu)}\right]}{\sigma^2} .</math>

The normalization is important both because the interpretation of the autocorrelation as a correlation provides a scale-free measure of the strength of [[statistical dependence]], and because the normalization has an effect on the statistical properties of the estimated autocorrelations.
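For a WSS process observed through a single long realization, the autocorrelation coefficient is commonly estimated by a time average rather than an ensemble average. The sketch below is again a Python/NumPy illustration, not taken from the cited references; the AR(1) example, the estimator details, and all names are assumptions.

<syntaxhighlight lang="python">
# Illustrative sketch of the normalized autocorrelation coefficient rho_XX(tau)
# for a (roughly) wide-sense stationary series, estimated by a time average.
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) process x_t = phi * x_{t-1} + e_t; the stationary AR(1)
# with |phi| < 1 is WSS, and this finite simulation is approximately so.
phi, n = 0.8, 100_000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

def acf(x, max_lag):
    """Sample autocorrelation coefficients rho(0..max_lag), normalized by K(0)."""
    x = x - x.mean()
    k0 = np.dot(x, x) / len(x)                      # sample variance K_XX(0)
    return np.array([np.dot(x[: len(x) - h], x[h:]) / len(x) / k0
                     for h in range(max_lag + 1)])

rho = acf(x, max_lag=5)
# For a stationary AR(1) process the theoretical value is rho(tau) = phi**|tau|,
# so the estimates should be close to [1, 0.8, 0.64, 0.512, ...].
print(rho)
</syntaxhighlight>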
=== Properties ===

==== Symmetry property ====

The fact that the autocorrelation function <math>\operatorname{R}_{XX}</math> is an [[even function]] can be stated as<ref name=KunIlPark/>{{rp|p.171}}
<math display=block>\operatorname{R}_{XX}(t_1,t_2) = \overline{\operatorname{R}_{XX}(t_2,t_1)}</math>
respectively for a WSS process:<ref name=KunIlPark/>{{rp|p.173}}
<math display=block>\operatorname{R}_{XX}(\tau) = \overline{\operatorname{R}_{XX}(-\tau)} .</math>

==== Maximum at zero ====

For a WSS process:<ref name=KunIlPark/>{{rp|p.174}}
<math display=block>\left|\operatorname{R}_{XX}(\tau)\right| \leq \operatorname{R}_{XX}(0)</math>
Notice that <math>\operatorname{R}_{XX}(0)</math> is always real.

==== Cauchy–Schwarz inequality ====

The [[Cauchy–Schwarz inequality]] holds for stochastic processes:<ref name=Gubner/>{{rp|p.392}}
<math display=block>\left|\operatorname{R}_{XX}(t_1,t_2)\right|^2 \leq \operatorname{E}\left[ |X_{t_1}|^2\right] \operatorname{E}\left[|X_{t_2}|^2\right]</math>

==== Autocorrelation of white noise ====

The autocorrelation of a continuous-time [[white noise]] signal will have a strong peak (represented by a [[Dirac delta function]]) at <math>\tau=0</math> and will be exactly <math>0</math> for all other <math>\tau</math>.

==== Wiener–Khinchin theorem ====

The [[Wiener–Khinchin theorem]] relates the autocorrelation function <math>\operatorname{R}_{XX}</math> to the [[spectral density|power spectral density]] <math>S_{XX}</math> via the [[Fourier transform]]:
<math display=block>\operatorname{R}_{XX}(\tau) = \int_{-\infty}^\infty S_{XX}(f) e^{i 2 \pi f \tau} \, {\rm d}f</math>
<math display=block>S_{XX}(f) = \int_{-\infty}^\infty \operatorname{R}_{XX}(\tau) e^{- i 2 \pi f \tau} \, {\rm d}\tau .</math>

For real-valued functions, the symmetric autocorrelation function has a real symmetric transform, so the [[Wiener–Khinchin theorem]] can be re-expressed in terms of real cosines only:
<math display=block>\operatorname{R}_{XX}(\tau) = \int_{-\infty}^\infty S_{XX}(f) \cos(2 \pi f \tau) \, {\rm d}f</math>
<math display=block>S_{XX}(f) = \int_{-\infty}^\infty \operatorname{R}_{XX}(\tau) \cos(2 \pi f \tau) \, {\rm d}\tau .</math>
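The discrete-time analogue of this relation can be checked numerically: the Fourier transform of the biased sample autocorrelation sequence of a finite sequence coincides with its periodogram. The following Python/NumPy sketch, an illustration under these assumptions and not part of the cited sources, verifies this for a white-noise sample path.

<syntaxhighlight lang="python">
# Numerical illustration of the Wiener–Khinchin relation: the periodogram of a
# finite sequence equals the Fourier transform of its biased sample
# autocorrelation (no mean subtraction, matching the engineering convention).
import numpy as np

rng = np.random.default_rng(2)
n = 1024
x = rng.normal(size=n)                  # zero-mean white noise sample path

# Biased sample autocorrelation over lags -(n-1) .. (n-1).
r = np.correlate(x, x, mode="full") / n

# Fourier-transform the autocorrelation sequence ...
lags = np.arange(-(n - 1), n)
freqs = np.fft.rfftfreq(n)              # frequencies in cycles per sample
S_from_r = np.array([np.sum(r * np.exp(-2j * np.pi * f * lags)) for f in freqs])

# ... and compare with the periodogram |X(f)|^2 / n computed directly from x.
S_periodogram = np.abs(np.fft.rfft(x)) ** 2 / n

# The two spectra agree up to floating-point rounding.
print(np.max(np.abs(S_from_r.real - S_periodogram)))
</syntaxhighlight>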