Wednesday, 30 January 2013

The signal-to-noise ratio

The signal-to-noise ratio is important in the transmission of digital data
because it sets the upper bound on the achievable data rate. Shannon's result is that
the maximum channel capacity, in bits per second, obeys the equation
C = B log2(1 + SNR)
where C is the capacity of the channel in bits per second, B is the bandwidth of
the channel in hertz, and SNR is the signal-to-noise ratio expressed as a plain ratio
rather than in decibels. The Shannon formula represents the theoretical maximum
that can be achieved. In practice, however, only much lower rates are achieved. One
reason for this is that the formula assumes white noise (thermal noise). Impulse
noise is not accounted for, nor are attenuation distortion or delay distortion. Various
types of noise and distortion are discussed in Chapter 5.
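
As a quick illustration of the formula, the short Python sketch below computes the Shannon capacity for a channel whose signal-to-noise ratio is quoted in decibels; the dB value has to be converted back to a plain ratio before it is used in the equation. The bandwidth and SNR figures are illustrative and are not taken from the text.

    import math

    def shannon_capacity(bandwidth_hz, snr_db):
        # C = B * log2(1 + SNR), with SNR supplied in dB and converted
        # to a plain ratio first.
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Illustrative figures: a voice-grade line with roughly 3100 Hz of
    # bandwidth and a 30 dB signal-to-noise ratio.
    print(shannon_capacity(3100, 30))   # about 30,900 bit/s

For these nominal values the formula gives a little under 31 kbit/s of error-free capacity; practical schemes on such a channel fall short of this for the reasons given above.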
The capacity indicated in the preceding equation is referred to as the error-free
capacity. Shannon proved that if the actual information rate on a channel is less than
the error-free capacity, then it is theoretically possible to use a suitable signal code to
achieve error-free transmission through the channel. Shannon's theorem unfortunately
does not suggest a means for finding such codes, but it does provide a yardstick
by which the performance of practical communication schemes may be measured.
Several other observations concerning the preceding equation may be instructive.
For a given level of noise, it would appear that the data rate could be increased
by increasing either signal strength or bandwidth. However, as the signal strength
increases, so do the effects of nonlinearities in the system, leading to an increase in
intermodulation noise. Note also that, because noise is assumed to be white, the
wider the bandwidth, the more noise is admitted to the system. Thus, as B increases,
SNR decreases.
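
To make the last point concrete: with white noise the total noise power grows in proportion to the bandwidth, so SNR = S / (N0 B), where S is the signal power and N0 is the noise power spectral density (symbols introduced here only for illustration; they do not appear in the text above). Substituting this into the capacity formula shows that C does not grow without bound as B increases. The rough Python sketch below, with made-up values of S and N0, shows capacity levelling off toward a finite limit.

    import math

    S = 1e-6     # signal power in watts (made-up value)
    N0 = 1e-12   # noise power spectral density in W/Hz (made-up value)

    for B in (1e3, 1e4, 1e5, 1e6, 1e7):
        snr = S / (N0 * B)              # white noise: noise power grows with B
        c = B * math.log2(1 + snr)      # capacity in bit/s
        print(f"B = {B:>10.0f} Hz   SNR = {snr:>8.1f}   C = {c / 1e3:>8.1f} kbit/s")

    # In the limit of very large B, C approaches S / (N0 * ln 2),
    # which is about 1.44 * S / N0 bit/s.
    print(f"limit = {S / (N0 * math.log(2)) / 1e3:.1f} kbit/s")

With these numbers the capacity climbs from about 10 kbit/s at 1 kHz of bandwidth to roughly 1.4 Mbit/s at 10 MHz, and no amount of additional bandwidth pushes it past the limit of about 1.44 S/N0.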
