Currently, most ADSL communication is
full-duplex. Full-duplex ADSL communication is usually achieved on a wire pair by either frequency-division duplex (FDD),
echo-cancelling duplex (ECD), or
time-division duplex (TDD). FDD uses two separate frequency bands, referred to as the upstream and downstream bands. The
upstream band is used for communication from the end user to the telephone central office. The
downstream band is used for communication from the central office to the end user. In the frequency plan for ADSL Annex A, the lowest band is used by normal voice telephony (PSTN), while separate upstream and downstream bands above it are used for ADSL. With commonly deployed ADSL over
POTS (Annex A), the band from 26.075
kHz to 137.825 kHz is used for upstream communication, while 138–1104 kHz is used for downstream communication. Under the usual
discrete multitone modulation (DMT) scheme, each of these is further divided into smaller frequency channels of 4.3125 kHz. These frequency channels are sometimes termed
bins. During initial training to optimize transmission quality and speed, the
ADSL modem tests each of the bins to determine the
signal-to-noise ratio at each bin's frequency. Distance from the
telephone exchange, cable characteristics, interference from
AM radio stations, and local interference and electrical noise at the modem's location can adversely affect the
signal-to-noise ratio at particular frequencies. Bins for frequencies exhibiting a reduced signal-to-noise ratio will be used at a lower throughput rate or not at all; this reduces the maximum link capacity but allows the modem to maintain an adequate connection.

The DSL modem makes a plan for how to exploit each of the bins, sometimes termed "bits per bin" allocation. Bins with a good signal-to-noise ratio (SNR) are chosen to transmit signals drawn from a larger set of possible encoded values (this larger range of possibilities equating to more bits of data sent) in each main clock cycle. The number of possibilities must not be so large that the receiver might incorrectly decode which one was intended in the presence of noise. Noisy bins may be required to carry as few as two bits (a choice from only four possible patterns), or only one bit per bin in the case of ADSL2+, and very noisy bins are not used at all. If the pattern of noise versus frequency heard in the bins changes, the DSL modem can alter the bits-per-bin allocations in a process called "bitswap": bins that have become noisier are required to carry fewer bits, and other bins are given a higher burden.

The data transfer capacity the DSL modem reports is determined by the total of the bits-per-bin allocations of all the bins combined. Higher signal-to-noise ratios and more bins in use give a higher total link capacity, while lower signal-to-noise ratios or fewer bins in use give a lower link capacity. The maximum capacity derived from summing the bits per bin is reported by DSL modems and is sometimes termed the sync rate. This figure can be misleading: the true maximum link capacity for user data transfer will be significantly lower, because extra data termed protocol overhead are also transmitted; usable figures of around 84–87 percent of the sync rate, at most, are common for PPPoA connections. In addition, some ISPs apply traffic policies that limit maximum transfer rates further in the networks beyond the exchange, and traffic congestion on the Internet, heavy loading on servers, and slowness or inefficiency in customers' computers may all contribute to reductions below the maximum attainable. When a wireless access point is used, low or unstable wireless signal quality can also cause reduction or fluctuation of the actual speed.

In fixed-rate mode, the sync rate is predefined by the operator, and the DSL modem chooses a bits-per-bin allocation that yields an approximately equal error rate in each bin. In variable-rate mode, the bits per bin are chosen to maximize the sync rate, subject to a tolerable risk of error. These choices can be conservative, where the modem allocates fewer bits per bin than it could, making for a slower connection, or less conservative, where more bits per bin are allocated at a greater risk of error should the signal-to-noise ratio later deteriorate to the point where the chosen allocations are too high to cope with the increased noise. This conservatism, the choice of fewer bits per bin as a safeguard against future noise increases, is reported as the signal-to-noise ratio margin or SNR margin.

The telephone exchange can indicate a suggested SNR margin to the customer's DSL modem when it initially connects, and the modem may make its bits-per-bin allocation plan accordingly. A high SNR margin means a reduced maximum throughput but greater reliability and stability of the connection. A low SNR margin means high speeds, provided the noise level does not increase too much; otherwise, the connection has to be dropped and renegotiated (resynced). ADSL2+ can better accommodate such circumstances, offering a feature termed
seamless rate adaptation (SRA), which can accommodate changes in total link capacity with less disruption to communications.

Vendors may support the use of higher frequencies as a proprietary extension to the standard. However, this requires matching vendor-supplied equipment on both ends of the line, and will likely result in crosstalk problems affecting other lines in the same bundle.

There is a direct relationship between the number of channels available and the throughput capacity of the ADSL connection. The exact data capacity per channel depends on the
modulation method used. ADSL initially existed in two versions (similar to
VDSL), namely
CAP and DMT. CAP was the
de facto standard for ADSL deployments up until 1996, deployed in 90 percent of ADSL installations at the time. However, DMT was chosen for the first ITU-T ADSL standards, G.992.1 and G.992.2 (also called
G.dmt and
G.lite respectively). Therefore, all modern installations of ADSL are based on the DMT modulation scheme.
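Since all modern ADSL uses DMT, the bits-per-bin and SNR-margin behaviour described above can be sketched numerically. The following Python snippet is a simplified illustration, not the algorithm any real modem runs: it uses the common SNR-gap approximation (bits ≈ log2(1 + SNR/Γ)) with an assumed 9.8 dB gap, an illustrative downstream SNR profile, and hypothetical helper names (`bits_for_bin`, `sync_rate`); only the 4.3125 kHz bin spacing, the 4000-symbols-per-second DMT rate, and the 15-bit-per-bin cap come from the standards.

```python
import math

SYMBOL_RATE = 4000   # DMT symbol rate: 4000 symbols/s in each 4.3125 kHz bin
GAP_DB = 9.8         # assumed SNR gap for the target error rate (illustrative)
MAX_BITS = 15        # up to 15 bits per bin in ADSL2/ADSL2+

def bits_for_bin(snr_db, margin_db, min_bits=1):
    """Bits a bin can carry, via the SNR-gap approximation.

    The margin is subtracted from the measured SNR, so a larger
    margin loads fewer bits as a safeguard against future noise.
    """
    effective_db = snr_db - margin_db - GAP_DB
    if effective_db <= 0:
        return 0
    bits = int(math.log2(1 + 10 ** (effective_db / 10)))
    return 0 if bits < min_bits else min(bits, MAX_BITS)

def sync_rate(snr_profile_db, margin_db):
    """Total link capacity (bit/s) from summing the bits per bin."""
    total_bits = sum(bits_for_bin(s, margin_db) for s in snr_profile_db)
    return total_bits * SYMBOL_RATE

# Illustrative downstream SNR profile: good at low frequencies,
# degrading with frequency, with one bin wiped out by narrowband
# interference (e.g. an AM radio station).
profile = [55 - 0.15 * n for n in range(32, 256)]
profile[100] = 5

low_margin = sync_rate(profile, margin_db=3)    # aggressive: fast but fragile
high_margin = sync_rate(profile, margin_db=12)  # conservative: slower, stable
assert low_margin > high_margin  # higher SNR margin trades speed for stability
print(low_margin, high_margin)
```

Note the trade-off the text describes falls straight out of the arithmetic: raising the margin lowers every bin's effective SNR, so each bin is loaded with fewer bits and the summed sync rate drops, while the very noisy bin carries nothing in either case.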
==Interleaving and fastpath==
ISPs have the option to use
interleaving of packets to counter the effects of
burst noise on the telephone line. An interleaved line has a depth, usually 8 to 64, which describes how many
Reed–Solomon codewords are accumulated before they are sent. As they can all be sent together, their
forward error correction codes can be made more resilient. Interleaving adds
latency, as all the packets have to be gathered first (or replaced by empty packets), and they all take time to transmit. Eight-frame interleaving adds 5 ms of round-trip time, while 64-deep interleaving adds 25 ms; other possible depths are 16 and 32.

"Fastpath" connections have an interleaving depth of 1: one packet is sent at a time. This gives a low latency, usually around 10 ms (interleaving only adds to this baseline, so fastpath latency is never greater than interleaved latency), but the connection is extremely prone to errors, as any burst of noise can corrupt an entire packet and so require it all to be retransmitted. The same burst hitting a large interleaved block only blanks part of it, and the damage can be recovered from the error-correction information in the rest of the block. A "fastpath" connection can therefore exhibit very high effective latency on a poor line, as each packet may need many retries.

==Transport protocols==