==Historical background==
{{Main|History of information theory}}

The landmark event ''establishing'' the discipline of information theory and bringing it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the ''[[Bell System Technical Journal]]'' in July and October 1948. Historian [[James Gleick]] rated the paper as the most important development of 1948, noting that it was "even more profound and more fundamental" than the [[transistor]].{{sfn|Gleick|2011|pp=3–4}} Shannon came to be known as the "father of information theory".<ref>{{Cite web |last=Horgan |first=John |date=2016-04-27 |title=Claude Shannon: Tinkerer, Prankster, and Father of Information Theory |url=https://spectrum.ieee.org/claude-shannon-tinkerer-prankster-and-father-of-information-theory |access-date=2023-09-30 |website=[[IEEE]] |language=en}}</ref><ref>{{Cite magazine |last=Roberts |first=Siobhan |date=2016-04-30 |title=The Forgotten Father of the Information Age |language=en-US |magazine=The New Yorker |url=https://www.newyorker.com/tech/annals-of-technology/claude-shannon-the-father-of-the-information-age-turns-1100100 |access-date=2023-09-30 |issn=0028-792X}}</ref><ref name=":1">{{Cite web |last=Tse |first=David |date=2020-12-22 |title=How Claude Shannon Invented the Future |url=https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/ |access-date=2023-09-30 |website=Quanta Magazine}}</ref> Shannon outlined some of his initial ideas of information theory as early as 1939 in a letter to [[Vannevar Bush]].<ref name=":1" />

Prior to this paper, limited information-theoretic ideas had been developed at [[Bell Labs]], all implicitly assuming events of equal probability. [[Harry Nyquist]]'s 1924 paper, ''Certain Factors Affecting Telegraph Speed'', contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation {{math|1=''W'' = ''K'' log ''m''}} (recalling the [[Boltzmann constant]]), where ''W'' is the speed of transmission of intelligence, ''m'' is the number of different voltage levels to choose from at each time step, and ''K'' is a constant. [[Ralph Hartley]]'s 1928 paper, ''Transmission of Information'', uses the word ''information'' as a measurable quantity, reflecting the receiver's ability to distinguish one [[sequence of symbols]] from any other, thus quantifying information as {{math|1=''H'' = log ''S''<sup>''n''</sup> = ''n'' log ''S''}}, where ''S'' is the number of possible symbols and ''n'' the number of symbols in a transmission. The unit of information was therefore the [[decimal digit]], which has since sometimes been called the [[Hartley (unit)|hartley]] in his honor. [[Alan Turing]] in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War [[Cryptanalysis of the Enigma|Enigma]] ciphers.{{citation needed|date=April 2024}}

Much of the mathematics behind information theory with events of different probabilities was developed for the field of [[thermodynamics]] by [[Ludwig Boltzmann]] and [[J. Willard Gibbs]].
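The following is a minimal sketch, not code from either paper, of the two pre-Shannon measures quantified above: Nyquist's {{math|1=''W'' = ''K'' log ''m''}} and Hartley's {{math|1=''H'' = ''n'' log ''S''}}. Base-10 logarithms are assumed so that Hartley's measure comes out in decimal digits (hartleys); the alphabet size, message length, and the constant ''K'' are made-up illustrative values.

<syntaxhighlight lang="python">
import math

# Illustrative sketch: Hartley's and Nyquist's measures with
# made-up parameter values (not taken from the 1924/1928 papers).

def hartley_information(S: int, n: int) -> float:
    """Hartley's H = n * log10(S): information in a message of n symbols
    drawn from an alphabet of S symbols, in decimal digits (hartleys)."""
    return n * math.log10(S)

def nyquist_line_speed(m: int, K: float = 1.0) -> float:
    """Nyquist's W = K * log(m) for m voltage levels per time step;
    K is an unspecified constant, taken here as 1 for illustration."""
    return K * math.log10(m)

# A 10-symbol message over a 26-letter alphabet:
print(hartley_information(S=26, n=10))  # ~14.15 hartleys
# Four voltage levels per time step, K = 1:
print(nyquist_line_speed(m=4))          # ~0.602
</syntaxhighlight>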
Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by [[Rolf Landauer]] in the 1960s, are explored in ''[[Entropy in thermodynamics and information theory]]''.{{citation needed|date=April 2024}}

In his revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion:

:"''The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point.''"

With it came the ideas of:
* the [[information entropy]] and [[redundancy (information theory)|redundancy]] of a source, and its relevance through the [[source coding theorem]];
* the [[mutual information]], and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
* the practical result of the [[Shannon–Hartley law]] for the channel capacity of a [[Gaussian channel]]; as well as
* the [[bit]]—a new way of seeing the most fundamental unit of information.{{citation needed|date=April 2024}}
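A minimal sketch, with made-up source probabilities and channel parameters, of two of the quantities listed above: the entropy of a discrete source in bits, and the [[Shannon–Hartley law|Shannon–Hartley]] capacity of a band-limited Gaussian channel.

<syntaxhighlight lang="python">
import math

# Illustrative sketch (not code from Shannon's paper): source entropy
# in bits and Shannon–Hartley channel capacity, with made-up numbers.

def entropy_bits(probs: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)) of a discrete source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def shannon_hartley_capacity(bandwidth_hz: float, snr: float) -> float:
    """Capacity C = B * log2(1 + S/N) in bits per second, for
    bandwidth B in hertz and a linear signal-to-noise ratio S/N."""
    return bandwidth_hz * math.log2(1 + snr)

# A fair coin carries exactly 1 bit per toss; a biased one carries less.
print(entropy_bits([0.5, 0.5]))  # 1.0
print(entropy_bits([0.9, 0.1]))  # ~0.469

# A 3 kHz channel at 30 dB SNR (linear ratio 1000):
print(shannon_hartley_capacity(3000, 1000))  # ~29902 bits per second
</syntaxhighlight>

Changing the logarithm base only rescales the unit: base 2 gives bits, base 10 gives hartleys.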