The Shannon–Hartley theorem says that the limit of reliable information rate (data rate exclusive of error-correcting codes) of a channel depends on bandwidth and signal-to-noise ratio according to:

I < B \log_2\left(1 + \frac{S}{N}\right)

where I is the information rate in bits per second excluding error-correcting codes, B is the bandwidth of the channel in hertz, S is the total signal power (equivalent to the carrier power C), and N is the total noise power in the bandwidth.
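As a quick numeric illustration of the theorem (a minimal Python sketch; the function name and the 3 kHz, 30 dB SNR example values are illustrative assumptions, not values from the text):

    import math

    def shannon_hartley_limit(bandwidth_hz, signal_power, noise_power):
        # Upper bound on reliable information rate, in bits per second:
        # I < B * log2(1 + S/N)
        return bandwidth_hz * math.log2(1 + signal_power / noise_power)

    # A 3 kHz channel at 30 dB SNR (S/N = 1000) supports a little
    # under ~29.9 kbit/s of reliable information rate.
    print(shannon_hartley_limit(3000.0, 1000.0, 1.0))  # ~29901.7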
This equation can be used to establish a bound on E_b/N_0 for any system that achieves reliable communication, by considering a gross bit rate R equal to the net bit rate I and therefore an average energy per bit of E_b = S/R, with noise spectral density of N_0 = N/B. For this calculation, it is conventional to define a normalized rate R_l = R/(2B), a bandwidth utilization parameter of bits per second per half hertz, or bits per dimension (a signal of bandwidth B can be encoded with 2B dimensions, according to the Nyquist–Shannon sampling theorem).
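Spelling out the substitution (a short worked step using the definitions just given, E_b = S/R, N_0 = N/B, and R_l = R/(2B); not an addition to the theorem):

\frac{S}{N} = \frac{E_\text{b} R}{N_0 B} = \frac{R}{B} \cdot \frac{E_\text{b}}{N_0} = 2 R_l \frac{E_\text{b}}{N_0}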
Making appropriate substitutions, the Shannon limit is:

\frac{R}{B} = 2 R_l < \log_2\left(1 + 2 R_l \frac{E_\text{b}}{N_0}\right)

which can be solved to get the Shannon-limit bound on E_b/N_0:

\frac{E_\text{b}}{N_0} > \frac{2^{2 R_l} - 1}{2 R_l}
When the data rate is small compared to the bandwidth, so that R_l is near zero, the bound, sometimes called the ultimate Shannon limit, is:

\frac{E_\text{b}}{N_0} > \ln(2)

which corresponds to −1.59 dB.
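A minimal Python sketch to evaluate the bound numerically (the function name is an illustrative assumption); it shows the bound falling toward −1.59 dB as R_l shrinks:

    import math

    def ebn0_shannon_bound_db(r_l):
        # Eb/N0 > (2**(2*R_l) - 1) / (2*R_l), expressed in dB
        bound = (2.0 ** (2.0 * r_l) - 1.0) / (2.0 * r_l)
        return 10.0 * math.log10(bound)

    for r_l in (2.0, 1.0, 0.5, 0.01, 1e-6):
        print(f"R_l = {r_l:g}: Eb/N0 > {ebn0_shannon_bound_db(r_l):.2f} dB")

    # As R_l -> 0 the bound approaches 10*log10(ln 2), about -1.59 dB;
    # every finite R_l (finite bandwidth) gives a strictly higher bound.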
This often-quoted limit of −1.59 dB applies only to the theoretical case of infinite bandwidth. The Shannon limit for finite-bandwidth signals is always higher.

== Cutoff rate ==