
Shannon–Hartley theorem


In information theory, the Shannon–Hartley theorem states the maximum rate at which error-free digital data (that is, information) can be transmitted over a communication link of a specified bandwidth in the presence of noise. The law is named after Claude Shannon and Ralph Hartley. The Shannon limit or Shannon capacity of a communications channel is the theoretical maximum information transfer rate of the channel.

Theorem

Proved by Claude Shannon in 1948, the theorem describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption. The theorem does not describe how to construct the error-correcting method; it only tells us how good the best possible method can be. Shannon's theorem has wide-ranging applications in both communications and data storage. It is of foundational importance to the modern field of information theory.

If we had such a thing as an infinite-bandwidth, noise-free analog channel, we could transmit unlimited amounts of error-free data over it per unit of time. Real channels, however, are subject to limitations imposed by both finite bandwidth and noise.

Shannon and Hartley asked: How do bandwidth and noise affect the rate at which information can be transmitted over an analog channel? Surprisingly, bandwidth limitations alone do not impose a cap on maximum information transfer. This is because it is still possible (at least in a thought-experiment model) for the signal to take on an infinite number of different voltage levels on each cycle, with each slightly different level being assigned a different meaning or bit sequence. If we combine both noise and bandwidth limitations, however, we do find there is a limit to the amount of information that can be transferred, even when clever multi-level encoding techniques are used. This is because the noise signal obliterates the fine differences that distinguish the various signal levels, limiting in practice the number of detection levels we can use in our scheme.

The Shannon theorem states that, given a channel with information capacity C over which information is transmitted at a rate R, then if

R < C

there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small. This means that, theoretically, it is possible to transmit information essentially without error at any rate up to the limit C.

The converse is also important. If

R > C

then an arbitrarily small probability of error cannot be achieved, and the error probability grows as the rate is increased further. This implies that no useful information can be transmitted beyond the channel capacity.

Capacity of an analog channel with additive white Gaussian noise

Considering all possible multi-level and multi-phase encoding techniques, the Shannon–Hartley theorem gives the theoretical maximum rate of clean (or arbitrarily low bit-error-rate) data C, with a given average signal power, that can be sent through an analog communication channel subject to additive white Gaussian noise:

C = BW log2(1 + S/N)

where

C is the channel capacity in bits per second, inclusive of error correction;
BW is the bandwidth of the channel in hertz; and
S/N is the signal-to-noise ratio of the communication signal to the Gaussian noise interference, expressed as a straight power ratio (not in decibels).
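The formula can be sketched in a few lines of Python; the function name shannon_capacity and the sample values below are illustrative and not part of the theorem:

    import math

    def shannon_capacity(bandwidth_hz, snr_linear):
        # C = BW * log2(1 + S/N), with S/N given as a straight power ratio
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Illustrative values: a 4 kHz channel at an S/N power ratio of 100 (20 dB)
    print(shannon_capacity(4000, 100))   # about 26630 bit/s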

For very large or very small signal-to-noise ratios, this formula can be approximated.

If S/N >> 1, then C ≈ 0.332 · BW · S/N(dB), where S/N(dB) is the signal-to-noise ratio expressed in decibels.

If S/N << 1, then C ≈ 1.44 · BW · S/N, where S/N is the straight power ratio.
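A minimal numerical check of these approximations (Python; the bandwidth and S/N values are chosen arbitrarily for illustration):

    import math

    def capacity_exact(bw_hz, snr):
        return bw_hz * math.log2(1 + snr)

    def capacity_high_snr(bw_hz, snr):
        # S/N >> 1: C ~ 0.332 * BW * S/N expressed in decibels
        return 0.332 * bw_hz * 10 * math.log10(snr)

    def capacity_low_snr(bw_hz, snr):
        # S/N << 1: C ~ 1.44 * BW * S/N as a straight power ratio
        return 1.44 * bw_hz * snr

    bw = 1.0e6  # 1 MHz
    print(capacity_exact(bw, 1000), capacity_high_snr(bw, 1000))   # ~9.97e6 vs ~9.96e6 bit/s
    print(capacity_exact(bw, 0.01), capacity_low_snr(bw, 0.01))    # ~14355 vs 14400 bit/s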

Simple schemes such as "send the message three times and use a best-two-out-of-three voting scheme if the copies differ" are inefficient uses of bandwidth, and thus are far from the Shannon limit. Advanced techniques such as Reed–Solomon codes and, more recently, turbo codes come much closer to the theoretical Shannon limit, but at the cost of high computational complexity. With turbo codes and the computing power in today's digital signal processors, it is now possible to come within a tenth of a decibel of the Shannon limit.
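As an illustration of why the three-copy voting scheme is far from the limit, the sketch below estimates its residual bit error rate over a binary symmetric channel with crossover probability p; the channel model, the function name, and the value p = 0.01 are assumptions made for this example:

    import random

    def repetition3_error_rate(p, trials=100_000, seed=1):
        # Each bit is sent three times over a binary symmetric channel that flips
        # a copy with probability p; the receiver takes a majority vote.
        rng = random.Random(seed)
        errors = 0
        for _ in range(trials):
            flips = sum(rng.random() < p for _ in range(3))
            if flips >= 2:      # two or more corrupted copies defeat the vote
                errors += 1
        return errors / trials

    print(repetition3_error_rate(0.01))   # roughly 3e-4 (about 3*p**2), bought at 1/3 the data rate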

The V.34 modem standard advertises a rate of 33.6 kbit/s, and V.90 claims a rate of 56 kbit/s, apparently in excess of the Shannon limit for a telephone channel (whose bandwidth is about 3.3 kHz). In fact, neither standard actually reaches the Shannon limit; both only approach it closely. The speed improvement of V.90 was made possible by eliminating one analog-to-digital conversion step, using fully digital equipment at the far end of the modem connection. This improves the signal-to-noise ratio, which in turn provides the headroom needed to exceed 33.6 kbit/s, a rate that was otherwise near the Shannon limit.
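A rough sanity check of these figures (a sketch only: the 3.3 kHz value is the telephone bandwidth quoted above, and the function simply solves C = BW log2(1 + S/N) for the required S/N):

    import math

    def required_snr_db(rate_bps, bw_hz):
        # Invert C = BW * log2(1 + S/N) to find the S/N needed for a given rate
        snr = 2 ** (rate_bps / bw_hz) - 1
        return 10 * math.log10(snr)

    for rate in (33_600, 56_000):
        print(rate, "bit/s needs about", round(required_snr_db(rate, 3300), 1), "dB S/N")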

Examples

  1. If the S/N is 20 dB and the bandwidth available is 4 kHz, which is appropriate for telephone communications, then C = 4 log2(1 + 100) = 4 log2(101) = 26.63 kbit/s. Note that the power ratio of 100 corresponds to an S/N of 20 dB.
  2. If it is required to transmit at 50 kbit/s, and a bandwidth of 1 MHz is used, then the minimum S/N required is given by 50 = 1000 log2(1 + S/N), so S/N = 2^(C/BW) − 1 ≈ 0.035, corresponding to an S/N of −14.5 dB. This shows that it is possible to transmit using signals which are actually much weaker than the background noise level, as in spread-spectrum communications (both examples are worked through in the sketch below).
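Both examples can be reproduced directly from the formula; a short sketch with the values given above:

    import math

    # Example 1: 4 kHz bandwidth, S/N of 20 dB (power ratio 100)
    c = 4000 * math.log2(1 + 100)
    print(c)                      # ~26632 bit/s, i.e. about 26.63 kbit/s

    # Example 2: 50 kbit/s in 1 MHz of bandwidth
    snr = 2 ** (50_000 / 1_000_000) - 1
    print(snr)                    # ~0.035
    print(10 * math.log10(snr))   # ~ -14.5 dB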

References

  • C. E. Shannon, The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press, 1949 (reprinted 1998).
  • Herbert Taub and Donald L. Schilling, Principles of Communication Systems. McGraw-Hill, 1986.

See also