==== Probabilistic uncertainty principle ====
If the random variable {{math|''X''<sub>''k''</sub>}} is constrained by
:<math>\sum_{n=0}^{N-1} |X_n|^2 = 1 ,</math>
then
:<math>P_n=|X_n|^2</math>
may be considered to represent a discrete [[probability mass function]] of {{mvar|n}}, with an associated probability mass function constructed from the transformed variable,
:<math>Q_m = N |x_m|^2 .</math>
For the case of continuous functions <math>P(x)</math> and <math>Q(k)</math>, the [[Heisenberg uncertainty principle]] states that
:<math>D_0(X)D_0(x)\ge\frac{1}{16\pi^2}</math>
where <math>D_0(X)</math> and <math>D_0(x)</math> are the variances of <math>|X|^2</math> and <math>|x|^2</math> respectively, with the equality attained in the case of a suitably normalized [[Gaussian distribution]]. Although the variances may be analogously defined for the DFT, an analogous uncertainty principle is not useful, because the uncertainty will not be shift-invariant. Still, a meaningful uncertainty principle has been introduced by Massar and Spindel.<ref name=Massar/>

However, the Hirschman [[entropic uncertainty]] does have a useful analog for the case of the DFT.<ref name=DeBrunner/> The Hirschman uncertainty principle is expressed in terms of the [[Entropy (information theory)|Shannon entropy]] of the two probability functions. In the discrete case, the Shannon entropies are defined as
:<math>H(X)=-\sum_{n=0}^{N-1} P_n\ln P_n</math>
and
:<math>H(x)=-\sum_{m=0}^{N-1} Q_m\ln Q_m ,</math>
and the [[entropic uncertainty]] principle becomes<ref name=DeBrunner/>
:<math>H(X)+H(x) \ge \ln(N) .</math>
The equality is obtained for <math>P_n</math> equal to translations and modulations of a suitably normalized [[Kronecker comb]] of period <math>A</math>, where <math>A</math> is any exact integer divisor of <math>N</math>. The probability mass function <math>Q_m</math> is then proportional to a suitably translated [[Kronecker comb]] of period <math>B=N/A</math>.<ref name=DeBrunner/>
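The bound above can be checked numerically. The following sketch (an illustration, not part of the cited references) uses NumPy's <code>ifft</code>, whose <math>1/N</math> normalization matches the convention under which <math>\textstyle\sum_m Q_m = 1</math>; it verifies the inequality for a random unit-norm sequence and the equality case for a Kronecker comb. All variable names are illustrative.

```python
import numpy as np

N = 16

# Random unit-norm sequence: sum_n |X_n|^2 = 1, so P_n = |X_n|^2 is a pmf.
rng = np.random.default_rng(0)
X = rng.normal(size=N) + 1j * rng.normal(size=N)
X /= np.linalg.norm(X)

# Inverse DFT; by Parseval, sum_m |x_m|^2 = 1/N, so Q_m = N |x_m|^2 is a pmf.
x = np.fft.ifft(X)
P = np.abs(X) ** 2
Q = N * np.abs(x) ** 2

def shannon_entropy(p):
    p = p[p > 0]  # take 0 * ln 0 = 0
    return -np.sum(p * np.log(p))

# Entropic uncertainty principle: H(X) + H(x) >= ln N
H_sum = shannon_entropy(P) + shannon_entropy(Q)
assert H_sum >= np.log(N) - 1e-12

# Equality case: a normalized Kronecker comb of period A, with A dividing N.
A = 4
comb = np.zeros(N, dtype=complex)
comb[::A] = np.sqrt(A / N)  # N/A spikes of mass A/N each -> unit norm
x_comb = np.fft.ifft(comb)  # again a comb, of period B = N/A
H_comb = (shannon_entropy(np.abs(comb) ** 2)
          + shannon_entropy(N * np.abs(x_comb) ** 2))
assert abs(H_comb - np.log(N)) < 1e-9
```

For the comb, <math>H(X)=\ln(N/A)</math> and <math>H(x)=\ln A</math>, so the two entropies sum exactly to <math>\ln N</math>, saturating the bound.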