==Mathematical definitions==

===White noise vector===
A [[random vector]] (that is, a random variable with values in ''R<sup>n</sup>'') is said to be a white noise vector or white random vector if its components each have a [[probability distribution]] with zero mean and finite [[variance]],{{what|reason=Why aren't the variances required to be identical, like in the Gaussian case (two paragraphs below) and the discrete-time case (next section)?|date=October 2023}} and are [[statistically independent]]: that is, their [[joint probability distribution]] must be the product of the distributions of the individual components.<ref name="fessler">Jeffrey A. Fessler (1998), ''[https://web.archive.org/web/20131218214647/http://andywilliamson.org/_/wp-content/uploads/2010/04/White-Noise.pdf On Transformations of Random Vectors.]'' Technical report 314, Dept. of Electrical Engineering and Computer Science, Univ. of Michigan. ([[PDF]])</ref>

A necessary (but, [[normally distributed and uncorrelated does not imply independent|in general, not sufficient]]) condition for statistical independence of two variables is that they be [[correlation|statistically uncorrelated]]; that is, their [[covariance]] is zero. Therefore, the [[covariance matrix]] ''R'' of the components of a white noise vector ''w'' with ''n'' elements must be an ''n'' by ''n'' [[diagonal matrix]], where each diagonal element ''R<sub>ii</sub>'' is the [[variance]] of component ''w<sub>i</sub>''; and the [[Correlation and dependence#Correlation matrices|correlation]] matrix must be the ''n'' by ''n'' identity matrix.

If, in addition to being independent, every variable in ''w'' also has a [[normal distribution]] with zero mean and the same variance <math>\sigma^2</math>, ''w'' is said to be a Gaussian white noise vector. In that case, the joint distribution of ''w'' is a [[multivariate normal distribution]]; the independence between the variables then implies that the distribution has [[elliptical distribution|spherical symmetry]] in ''n''-dimensional space. Therefore, any [[orthogonal transformation]] of the vector will result in a Gaussian white random vector. In particular, under most types of [[discrete Fourier transform]], such as the [[FFT]] and the [[discrete Hartley transform|Hartley transform]], the transform ''W'' of ''w'' will be a Gaussian white noise vector, too; that is, the ''n'' Fourier coefficients of ''w'' will be independent Gaussian variables with zero mean and the same variance <math>\sigma^2</math>.

The [[power spectrum]] ''P'' of a random vector ''w'' can be defined as the expected value of the [[squared modulus]] of each coefficient of its Fourier transform ''W'', that is, ''P<sub>i</sub>'' = E(|''W<sub>i</sub>''|<sup>2</sup>). Under that definition, a Gaussian white noise vector will have a perfectly flat power spectrum, with ''P<sub>i</sub>'' = ''σ''<sup>2</sup> for all ''i''.
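This flatness is easy to check numerically. The following is a minimal sketch (assuming Python with NumPy; the vector length, variance, and trial count are arbitrary illustrative choices) that draws many Gaussian white noise vectors and averages the squared moduli of their unitary DFT coefficients:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n = 64           # vector length
sigma = 2.0      # common standard deviation of the components
trials = 100_000

# Draw many independent Gaussian white noise vectors w ~ N(0, sigma^2 I).
w = rng.normal(0.0, sigma, size=(trials, n))

# Use the unitary DFT (divide by sqrt(n)) so that each coefficient keeps variance sigma^2.
W = np.fft.fft(w, axis=1) / np.sqrt(n)

# Estimated power spectrum P_i = E(|W_i|^2); every entry is close to sigma^2 = 4.
P = np.mean(np.abs(W) ** 2, axis=0)
print(P.min(), P.max())
</syntaxhighlight>

Note that the unitary normalization (dividing the FFT output by <math>\sqrt{n}</math>) matches the orthogonal-transformation convention used above; the unnormalized FFT would instead give a flat spectrum at <math>n\sigma^2</math>.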
If ''w'' is a white random vector, but not a Gaussian one, its Fourier coefficients ''W<sub>i</sub>'' will not be completely independent of each other, although for large ''n'' and common probability distributions the dependencies are very subtle and their pairwise correlations can be assumed to be zero.

Often the weaker condition statistically uncorrelated is used in the definition of white noise, instead of statistically independent. However, some of the commonly expected properties of white noise (such as a flat power spectrum) may not hold for this weaker version. Under the weaker definition, the stricter version can be referred to explicitly as an independent white noise vector.<ref name="ezivot">Eric Zivot and Jiahui Wang (2006), ''[http://faculty.washington.edu/ezivot/econ584/notes/timeSeriesConcepts.pdf Modeling Financial Time Series with S-PLUS].'' Second Edition. ([[PDF]])</ref>{{rp|p.60}} Other authors use strongly white and weakly white instead.<ref name="diebold">[[Francis X. Diebold]] (2007), ''[https://www.sas.upenn.edu/~fdiebold/Teaching221/FullBook.pdf Elements of Forecasting],'' 4th edition. ([[PDF]])</ref>

An example of a random vector that is Gaussian white noise in the weak but not in the strong sense is <math>x=[x_1,x_2]</math>, where <math>x_1</math> is a normal random variable with zero mean and <math>x_2</math> is equal to <math>+x_1</math> or to <math>-x_1</math>, with equal probability. These two variables are uncorrelated and individually normally distributed, but they are not jointly normally distributed and are not independent. If <math>x</math> is rotated by 45 degrees, its two components will still be uncorrelated, but their distribution will no longer be normal.

In some situations, one may relax the definition by allowing each component of a white random vector <math>w</math> to have non-zero expected value <math>\mu</math>. In [[image processing]] especially, where samples are typically restricted to positive values, one often takes <math>\mu</math> to be one half of the maximum sample value. In that case, the Fourier coefficient <math>W_0</math> corresponding to the zero-frequency component (essentially, the average of the <math>w_i</math>) will also have a non-zero expected value <math>\mu\sqrt{n}</math>; and the power spectrum <math>P</math> will be flat only over the non-zero frequencies.

===Discrete-time white noise===
A discrete-time [[stochastic process]] <math>W(n)</math> is a generalization of a random vector with a finite number of components to infinitely many components. A discrete-time stochastic process <math>W(n)</math> is called white noise if its mean is equal to zero for all <math>n</math>, i.e. <math>\operatorname{E}[W(n)] = 0</math>, and if the autocorrelation function <math>R_{W}(n) = \operatorname{E}[W(k+n)W(k)]</math> has a nonzero value only for <math>n = 0</math>, i.e. <math>R_{W}(n) = \sigma^2 \delta(n)</math>.{{citation needed|date=February 2024}}{{what|reason=Why aren't the W(n) required to be statistically independent, as for the finite-component white noise of the last section?|date=October 2023}}
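The defining property <math>R_{W}(n) = \sigma^2 \delta(n)</math> can be illustrated by estimating the autocorrelation of a long simulated realization. The following is a minimal sketch (again assuming Python with NumPy; i.i.d. Gaussian samples are used merely as one convenient example of a white sequence, and the sample length is an arbitrary choice):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.5
N = 1_000_000                  # length of the simulated realization

# i.i.d. samples with zero mean are white by construction.
w = rng.normal(0.0, sigma, N)

def autocorr(x, lag):
    """Sample estimate of R_W(lag) = E[W(k + lag) W(k)]."""
    if lag == 0:
        return np.mean(x * x)
    return np.mean(x[lag:] * x[:-lag])

for lag in range(4):
    # Close to sigma^2 = 2.25 at lag 0 and close to 0 at every other lag.
    print(lag, autocorr(w, lag))
</syntaxhighlight>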
===Continuous-time white noise===
In order to define the notion of white noise in the theory of [[continuous-time]] signals, one must replace the concept of a random vector by a continuous-time random signal; that is, a random process that generates a function <math>w</math> of a real-valued parameter <math>t</math>. Such a process is said to be white noise in the strongest sense if the value <math>w(t)</math> for any time <math>t</math> is a random variable that is statistically independent of its entire history before <math>t</math>. A weaker definition requires independence only between the values <math>w(t_1)</math> and <math>w(t_2)</math> at every pair of distinct times <math>t_1</math> and <math>t_2</math>. An even weaker definition requires only that such pairs <math>w(t_1)</math> and <math>w(t_2)</math> be uncorrelated.<ref name="econterms">[http://economics.about.com/od/economicsglossary/g/whitenoise.htm ''White noise process''] {{Webarchive|url=https://web.archive.org/web/20160911134507/http://economics.about.com/od/economicsglossary/g/whitenoise.htm |date=2016-09-11 }}. By Econterms via About.com. Accessed on 2013-02-12.</ref> As in the discrete case, some authors adopt the weaker definition for white noise, and use the qualifier independent to refer to either of the stronger definitions. Others use weakly white and strongly white to distinguish between them.

However, a precise definition of these concepts is not trivial, because some quantities that are finite sums in the finite discrete case must be replaced by integrals that may not converge. Indeed, the set of all possible instances of a signal <math>w</math> is no longer a finite-dimensional space <math>\mathbb{R}^n</math>, but an infinite-dimensional [[function space]]. Moreover, by any definition a white noise signal <math>w</math> would have to be essentially discontinuous at every point; therefore even the simplest operations on <math>w</math>, like integration over a finite interval, require advanced mathematical machinery.

Some authors{{citation needed|date=October 2023}}{{what|reason=Since the definition proposed in this section is not remotely workable in a mathematical sense, I doubt that any authors do this. Instead, we are looking at a heuristic only.|date=October 2023}} require each value <math>w(t)</math> to be a real-valued random variable with expectation <math>\mu</math> and some finite variance <math>\sigma^2</math>. Then the covariance <math>\mathrm{E}(w(t_1)\cdot w(t_2))</math> between the values at two times <math>t_1</math> and <math>t_2</math> is well-defined: it is zero if the times are distinct, and <math>\sigma^2</math> if they are equal. However, by this definition, the integral
: <math>W_{[a,a+r]} = \int_a^{a+r} w(t)\, dt</math>
over any interval with positive width <math>r</math> would be simply the width times the expectation: <math>r\mu</math>.{{what|reason=The *expectation value* of the mean is zero. And this is not a problem.|date=October 2023}} This property renders the concept inadequate as a model of white noise signals either in a physical or mathematical sense.{{what|reason=Why?|date=October 2023}}

Therefore, most authors define the signal <math>w</math> indirectly by specifying random values for the integrals of <math>w(t)</math> and <math>|w(t)|^2</math> over each interval <math>[a,a+r]</math>. In this approach, however, the value of <math>w(t)</math> at an isolated time cannot be defined as a real-valued random variable.{{citation needed|reason=an authoritative work on white noise giving one such example should be given|date=January 2017}} Also, the covariance <math>\mathrm{E}(w(t_1)\cdot w(t_2))</math> becomes infinite when <math>t_1=t_2</math>; and the [[autocorrelation]] function <math>\mathrm{R}(t_1,t_2)</math> must be defined as <math>N \delta(t_1-t_2)</math>, where <math>N</math> is some real constant and <math>\delta</math> is the [[Dirac delta function]].{{what|reason=Correlation can only take on values in [0,1], so N must be 1 and delta must take the value 1 for t_1 = t_2; it is not the dirac measure here. However, all these concepts are fishy.|date=October 2023}}

In this approach, one usually specifies that the integral <math>W_I</math> of <math>w(t)</math> over an interval <math>I=[a,b]</math> is a real random variable with normal distribution, zero mean, and variance <math>(b-a)\sigma^2</math>; and also that the covariance <math>\mathrm{E}(W_I\cdot W_J)</math> of the integrals <math>W_I</math>, <math>W_J</math> is <math>r\sigma^2</math>, where <math>r</math> is the width of the intersection <math>I\cap J</math> of the two intervals <math>I,J</math>. This model is called a Gaussian white noise signal (or process).
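Both properties can be checked against a discretized simulation. The following is a minimal sketch (assuming Python with NumPy) that approximates the integrals <math>W_I</math> by increments of a scaled Brownian motion on a fixed grid; the grid step, intensity, and trial count are arbitrary illustrative choices:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
sigma = 0.8     # white noise intensity
dt = 0.01       # grid step
T = 4.0         # time horizon
trials = 20_000

# Over one grid step, the integral of Gaussian white noise behaves like an
# independent N(0, sigma^2 * dt) increment.
steps = int(T / dt)
incr = rng.normal(0.0, sigma * np.sqrt(dt), size=(trials, steps))
B = np.cumsum(incr, axis=1)   # B[:, k] approximates the integral of w over [0, (k+1)*dt]

def integral(a, b):
    """Approximate W_[a,b] from the cumulative sums."""
    ia, ib = int(a / dt), int(b / dt)
    return B[:, ib - 1] - (B[:, ia - 1] if ia > 0 else 0.0)

W_I = integral(0.0, 2.0)      # I = [0, 2]
W_J = integral(1.0, 4.0)      # J = [1, 4], so the overlap width is r = 1
print(np.var(W_I))            # close to (b - a) * sigma^2 = 2 * 0.64 = 1.28
print(np.mean(W_I * W_J))     # close to r * sigma^2 = 0.64
</syntaxhighlight>

In this discretization, the cumulative sums play the role of a Brownian motion whose increments realize the integrals <math>W_I</math>.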
In the mathematical field known as [[white noise analysis]], a Gaussian white noise <math>w</math> is defined as a stochastic tempered distribution, i.e. a random variable with values in the space <math>\mathcal S'(\mathbb R)</math> of [[Distribution (mathematics)#Tempered distribution|tempered distributions]]. Analogous to the case for finite-dimensional random vectors, a probability law on the infinite-dimensional space <math>\mathcal S'(\mathbb R)</math> can be defined via its characteristic function (existence and uniqueness are guaranteed by an extension of the Bochner–Minlos theorem, which goes under the name Bochner–Minlos–Sazonov theorem). Analogously to the case of the multivariate normal distribution <math>X \sim \mathcal N_n (\mu , \Sigma )</math>, which has characteristic function
: <math>\forall k \in \mathbb R^n: \quad \mathrm{E}(\mathrm e^{\mathrm{i} \langle k, X \rangle }) = \mathrm e^{\mathrm i \langle k, \mu \rangle - \frac 1 2 \langle k, \Sigma k \rangle } ,</math>
the white noise <math>w : \Omega \to \mathcal S'(\mathbb R)</math> must satisfy
: <math>\forall \varphi \in \mathcal S (\mathbb R) : \quad \mathrm{E}(\mathrm e^{\mathrm{i} \langle w, \varphi \rangle }) = \mathrm e^{- \frac 1 2 \| \varphi \|_2^2},</math>
where <math>\langle w, \varphi \rangle</math> is the natural pairing of the tempered distribution <math>w(\omega)</math> with the Schwartz function <math>\varphi</math>, evaluated scenariowise, i.e. for each <math>\omega \in \Omega</math>, and <math>\| \varphi \|_2^2 = \int_{\mathbb R} \vert \varphi (x) \vert^2\,\mathrm d x</math>.
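For instance, the covariance structure of the integrals <math>W_I</math> above can be recovered from this characteristic function (a standard computation, sketched here as a consistency check, with <math>\sigma = 1</math>). Since <math>t\varphi \in \mathcal S(\mathbb R)</math> for every real <math>t</math>,
: <math>\mathrm{E}(\mathrm e^{\mathrm{i} t \langle w, \varphi \rangle}) = \mathrm e^{- \frac 1 2 t^2 \| \varphi \|_2^2},</math>
which is the characteristic function of a normal random variable; hence <math>\langle w, \varphi \rangle \sim \mathcal N(0, \| \varphi \|_2^2)</math>. Applying this to <math>\varphi + \psi</math> and expanding <math>\| \varphi + \psi \|_2^2</math> (polarization) gives
: <math>\mathrm{E}(\langle w, \varphi \rangle \langle w, \psi \rangle) = \int_{\mathbb R} \varphi(x)\, \psi(x)\,\mathrm d x ,</math>
so if <math>\varphi</math> and <math>\psi</math> approximate the indicator functions of intervals <math>I</math> and <math>J</math>, the covariance approaches the width of <math>I \cap J</math>, matching the property <math>\mathrm{E}(W_I \cdot W_J) = r\sigma^2</math> stated above.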