===Hartley's law===
In 1928, Hartley formulated a way to quantify information and its [[line rate]] (also known as the [[data signalling rate]] ''R'', in bits per second).<ref>{{cite journal | first = R. V. L. | last = Hartley | title = Transmission of Information | url = http://www.dotrose.com/etext/90_Miscellaneous/transmission_of_information_1928b.pdf | journal = Bell System Technical Journal | date = July 1928 | volume = 7 | issue = 3 | pages = 535–563 | doi = 10.1002/j.1538-7305.1928.tb01236.x }}</ref> This method, later known as Hartley's law, became an important precursor to Shannon's more sophisticated notion of channel capacity.

Hartley argued that the maximum number of distinguishable pulse levels that can be transmitted and received reliably over a communications channel is limited by the dynamic range of the signal amplitude and the precision with which the receiver can distinguish amplitude levels. Specifically, if the amplitude of the transmitted signal is restricted to the range [−''A'' ... +''A''] volts, and the precision of the receiver is ±Δ''V'' volts, then the maximum number of distinct pulses ''M'' is given by
:<math>M = 1 + { A \over \Delta V }.</math>

By taking the information per pulse, in bit/pulse, to be the base-2 [[logarithm]] of the number of distinct messages ''M'' that could be sent, Hartley<ref>{{cite book | title = Information Theory and its Engineering Applications | first = D. A. | last = Bell | edition = 3rd | year = 1962 | publisher = Pitman | location = New York | isbn = 9780273417576 | url = https://books.google.com/books?id=_O1SAAAAMAAJ }}</ref> constructed a measure of the line rate ''R'' as
:<math>R = f_p \log_2(M),</math>
where <math>f_p</math> is the pulse rate, also known as the symbol rate, in symbols/second or [[baud]].

Hartley then combined the above quantification with Nyquist's observation that the number of independent pulses that could be put through a channel of one-sided bandwidth <math>B</math> [[hertz]] was <math>2B</math> pulses per second, to arrive at his quantitative measure for achievable line rate.

Hartley's law is sometimes quoted as just a proportionality between the [[bandwidth (signal processing)|analog bandwidth]], <math>B</math>, in hertz and what today is called the [[bandwidth (computing)|digital bandwidth]], <math>R</math>, in bit/s.<ref>{{cite book | title = Introduction to Telecommunications | edition = 2nd | first = Anu A. | last = Gokhale | publisher = Thomson Delmar Learning | year = 2004 | isbn = 1-4018-5648-9 | url = https://books.google.com/books?id=QowmxWAOEtYC&q=%22hartley%27s+law%22+proportional&pg=PA37 }}</ref> Other times it is quoted in this more quantitative form, as an achievable line rate of <math>R</math> bits per second:<ref>{{cite book | title = Telecommunications Engineering | first1 = John | last1 = Dunlop | first2 = D. Geoffrey | last2 = Smith | publisher = CRC Press | year = 1998 | url = https://books.google.com/books?id=-kyPyn3Dst8C&q=%22hartley%27s+law%22&pg=RA4-PA30 | isbn = 0-7487-4044-9 }}</ref>
:<math>R \le 2B \log_2(M).</math>

Hartley did not work out exactly how the number ''M'' should depend on the noise statistics of the channel, or how the communication could be made reliable even when individual symbol pulses could not be reliably distinguished to ''M'' levels; with Gaussian noise statistics, system designers had to choose a very conservative value of <math>M</math> to achieve a low error rate.
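As a worked example with assumed figures (they are illustrative only and do not come from Hartley's paper): a signal confined to ±7 volts received with ±1 volt precision gives <math>M = 1 + 7/1 = 8</math> distinguishable levels, i.e. <math>\log_2(8) = 3</math> bits per pulse; over a channel of bandwidth <math>B = 3000</math> hertz, Nyquist's <math>2B = 6000</math> pulses per second then yield an achievable line rate of <math>R \le 6000 \times 3 = 18{,}000</math> bit/s.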
The concept of an error-free capacity awaited Claude Shannon, who built on Hartley's observations about a logarithmic measure of information and Nyquist's observations about the effect of bandwidth limitations. Hartley's rate result can be viewed as the capacity of an errorless ''M''-ary channel of <math>2B</math> symbols per second. Some authors refer to it as a capacity, but such an errorless channel is an idealization: if ''M'' is chosen small enough to make the noisy channel nearly errorless, the result is necessarily less than the Shannon capacity of the noisy channel of bandwidth <math>B</math>, which is the Hartley–Shannon result that followed later.
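A minimal numerical sketch of this comparison, under assumed values (the 3 kHz bandwidth, the conservative choice <math>M = 4</math>, and the 20 dB signal-to-noise ratio are illustrative, not taken from the sources cited above, and the helper functions are hypothetical):

<syntaxhighlight lang="python">
import math

def hartley_rate(bandwidth_hz: float, m_levels: int) -> float:
    """Hartley's achievable line rate R = 2B log2(M), in bit/s."""
    return 2 * bandwidth_hz * math.log2(m_levels)

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity C = B log2(1 + S/N), in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 3000.0                       # assumed channel bandwidth, hertz
M = 4                            # assumed conservative number of pulse levels
snr = 10 ** (20 / 10)            # assumed 20 dB SNR, converted to a linear ratio

print(hartley_rate(B, M))        # 12000.0 bit/s for the errorless M-ary design
print(shannon_capacity(B, snr))  # ~19974.6 bit/s: the noisy-channel capacity
                                 # exceeds the rate of the conservative design
</syntaxhighlight>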