== Error Location Analysis ==

Error location analysis became a cornerstone technique for diagnosing problems in digital communications by identifying the precise location of bit errors during a bit error rate testing session. Fundamental bit error rate testing is accomplished by comparing a communicated bit-stream with a reference (correct) bit-stream. The correct bits may be stored in a memory device or synthesized in real time as the communicated bit-stream is received. These bit-streams commonly contain pseudo-random binary sequence (PRBS) patterns that permit easy synchronization after just a few correctly received consecutive bits. This self-synchronizing feature of PRBS patterns makes them ideal for BERT testing. Traditional BERT devices possess generators that produce test bit-streams for output, and detectors that receive test bit-streams, count the number of bits received, and count the number of mismatches between the received and reference bit-streams. Using the simple formula,
BER = Number of Errors / Number of Bits Transmitted, the BER is calculated. Error location analysis extends these fundamental features by employing a hardware memory device that stores the value of the received-bit counter whenever a mismatch error is detected. This produces a stream of error locations that can be processed in real time and/or stored for post-processing. SyntheSys Research developed many types of statistical processing algorithms, producing a comprehensive set of error location analysis tools:

• Basic BER
• Burst BER Statistics
  • Define a burst using 'Minimum Error Free Interval' and 'Minimum Burst Length' parameters
• Burst Length Histogram
• Error Free Interval Histogram
• Errors Modulo-N Histogram
• Errors Modulo External Hardware Trigger Histogram
• Pattern-Sensitivity Histogram
• 2D Error Mapping
• Simulated Error Correction Coding
  • Define a hypothetical multidimensional Reed-Solomon error-correcting architecture to act as a filter and view all other analyses after correctable errors are removed from the error location stream

One of the earliest uses of 2D Error Mapping was applied to the recorded errors from a transverse-scan digital tape recording device made by Ampex Corporation. The accompanying
Media Scan image demonstrates that error locations are key to recognizing that multiple error-producing syndromes can occur simultaneously and together affect the total bit error rate experienced during the test recording session. This technique is described in U.S. Patent US6636994 B1.
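The error-location capture and burst segmentation described above can be sketched in Python. This is an illustrative model only, not SyntheSys Research's hardware implementation; the function names and the burst-grouping rule are assumptions based on the 'Minimum Error Free Interval' parameter described above.

```python
# Illustrative sketch of error location analysis: compare a received
# bit-stream against a reference, record the bit index of every mismatch,
# then derive BER and burst statistics from the error-location stream.

def error_locations(received, reference):
    """Return the indices at which the two bit-streams disagree."""
    return [i for i, (r, c) in enumerate(zip(received, reference)) if r != c]

def bit_error_rate(received, reference):
    return len(error_locations(received, reference)) / len(received)

def bursts(locations, min_error_free_interval):
    """Group error locations into bursts: a new burst starts whenever the
    gap since the previous error is at least min_error_free_interval bits."""
    groups = []
    for loc in locations:
        if groups and loc - groups[-1][-1] < min_error_free_interval:
            groups[-1].append(loc)
        else:
            groups.append([loc])
    return groups

reference = [0, 1] * 16                      # 32 reference bits
received = list(reference)
for i in (4, 5, 6, 20):                      # inject a 3-bit burst and an isolated error
    received[i] ^= 1

locs = error_locations(received, reference)
print(locs)                                  # [4, 5, 6, 20]
print(bit_error_rate(received, reference))   # 0.125
print([len(b) for b in bursts(locs, 8)])     # burst-length histogram input: [3, 1]
```

The burst lengths feed a Burst Length Histogram, and the gaps between bursts feed an Error Free Interval Histogram.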
== Eye-diagramming on a BERT ==

BERT instruments are often found near
oscilloscopes because they complement each other. The BERT provides the bottom-line capability of determining whether digital bits are being communicated effectively; if they are not, the oscilloscope provides the ability to see the underlying analog waveform used to communicate them. It is common for underlying analog problems, such as timing jitter, amplitude noise, or slow rise time, to translate into digital bit errors. Beginning with a digital television product called the DVA184-C, continuing with the high-definition video product HDVA-292, and in every BERT product afterwards, SyntheSys Research engineers integrated the ability to display the analog
eye-diagram of the signal being evaluated. The notable advantage of this approach is two-fold. First, precisely the same electronic circuits and components that produce the BER measurements also produce the eye-diagram display. This eliminates the differences in how two separate circuits would interpret the signal, owing to their unique frequency responses, and yields perfect correlation between BER results and eye-diagram results, which would be impossible to achieve otherwise. Second, the mechanism invented by Tom Waschura to create the eye diagram utilizes a counter and a variable-threshold comparator (in some instances, two comparators) to image the eye. By selecting different threshold levels in combination with certain time delays, individual pixels of the eye diagram are evaluated by counting the number of bits that exceed the threshold, or, in the case of dual thresholds, fall between the two thresholds. This capability is important because the speed of the evaluation depends on the clocking speed of the transmission, which is often hundreds of times faster than the sampling rate of an oscilloscope. This is an important advantage in signal integrity test applications, where it is difficult to capture low-probability events. Based on
deep measurements across a two-dimensional grid, the BERT instrument was capable of making common waveform measurements such as rise time, jitter, and amplitude. This technique is described in European Patent EP 1315327 A3.
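The counting approach to eye imaging can be illustrated with a toy model. Everything below is an assumption for demonstration purposes (the NRZ waveform model, noise level, and grid sizes), not the patented circuit: at each sampling phase a variable-threshold comparator counts how many bits exceed the threshold, and differencing the counts at adjacent thresholds (the dual-comparator case) yields the fraction of samples falling between the two thresholds, i.e. one pixel of the eye-diagram density image.

```python
import math, random

random.seed(1)
BITS, PHASES = 2000, 16
bits = [random.randint(0, 1) for _ in range(BITS)]

def sample(bit_index, phase):
    # Toy NRZ waveform: +/-1 V levels, sinusoidal edge shaping, Gaussian noise.
    level = 1.0 if bits[bit_index] else -1.0
    opening = math.sin(math.pi * phase / PHASES)  # 0 at bit edges, 1 at mid-eye
    return level * opening + random.gauss(0.0, 0.1)

thresholds = [t / 10 for t in range(-15, 16)]     # comparator scan, -1.5..1.5 V
image = []
for phase in range(PHASES):
    samples = [sample(i, phase) for i in range(BITS)]
    # Counter output per threshold: how many sampled bits exceed it.
    counts = [sum(s > th for s in samples) for th in thresholds]
    # Pixel column: number of samples falling between adjacent thresholds.
    image.append([a - b for a, b in zip(counts[:-1], counts[1:])])

# At mid-eye the density should concentrate near one of the +/-1 V rails.
peak = thresholds[image[PHASES // 2].index(max(image[PHASES // 2]))]
print(round(peak, 1))
```

Because each pixel is produced by counting at the full transmission clock rate, deep pixel depths accumulate far faster than an oscilloscope's sampling could achieve, which is the advantage described above.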
== BER Contour ==

By combining eye-diagramming techniques with precisely positioned BER measurements, Jim Waschura invented a technique of sweeping the interior of an eye diagram, curve-fitting the measurements to mathematical models representing the interior slopes of the eye diagram for random and deterministic effects. This technique is notable because many performance criteria are defined in terms of the
eye opening at a specified probability level (e.g. 1E-12), which would require an impossible amount of data acquisition to measure directly. This technique allows modest extrapolations to be performed to evaluate the 1E-12 eye opening from measurements acquired in only a few minutes. It also became a very powerful means to communicate the strength of the BERTScope's eye-diagramming capability, since it demonstrates how eyes that appear open when sampled to a shallow depth, such as those sampled by contemporary oscilloscopes, can quickly close when the underlying problem is a random process such as jitter or noise. See the accompanying picture and compare how wide open the middle eye diagram appears versus how quickly the eye width shrinks with longer test intervals because of the random jitter and the low slope of the contour it produces.
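One common way to perform this kind of extrapolation for a random (Gaussian) effect can be sketched as follows. This is only an illustrative sketch, not the patented curve-fitting procedure; the measurement values and the 10 ps/2 ps edge model are invented for the example. Shallow BER-versus-delay measurements are straightened on a normal-quantile (Q) scale, where a Gaussian edge becomes a straight line, and the line fit is extended to the delay at which BER would reach 1E-12.

```python
from statistics import NormalDist

nd = NormalDist()

# Hypothetical shallow measurements: (sampling delay in ps, measured BER),
# synthesized here from BER(t) = Phi(-(t - mu)/sigma) with mu=10 ps, sigma=2 ps.
measurements = [(14.0, nd.cdf(-2.0)), (16.0, nd.cdf(-3.0)), (18.0, nd.cdf(-4.0))]

# On the Q scale, z(t) = Phi^-1(BER(t)) is linear in t for a Gaussian edge,
# so an ordinary least-squares line fit recovers the model.
ts = [t for t, _ in measurements]
zs = [nd.inv_cdf(ber) for _, ber in measurements]
n = len(ts)
slope = (n * sum(t * z for t, z in zip(ts, zs)) - sum(ts) * sum(zs)) / \
        (n * sum(t * t for t in ts) - sum(ts) ** 2)
intercept = (sum(zs) - slope * sum(ts)) / n

# Extrapolate: delay at which BER falls to 1E-12 (z target is about -7.03).
z_target = nd.inv_cdf(1e-12)
t_12 = (z_target - intercept) / slope
print(round(t_12, 2))
```

The measurements span only BER levels down to about 3E-5, which take seconds to acquire, yet the fit predicts the 1E-12 crossing that direct measurement would need days or weeks to observe.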
== Stressed-Eye Testing ==

During the analog-to-digital transition that occurred during the life-span of SyntheSys Research, many techniques were employed by engineers to evaluate digital transmission. Analog transmission had been characterized by modestly simple measurements of the analog waveform, such as rise time, signal-to-noise ratio, and frequency response. Digital transmission, however, employed sophisticated coding, error correction, and other mechanisms that could perfectly recreate digital bits even if the underlying analog waveform was quite bad, which made it much more difficult to establish a single figure of merit for the performance of the digital transmission. BER was a very useful measure, but it did not degrade gracefully like analog signal-to-noise measurements. In digital communications, a
cliff-effect occurs: communication is often perfect until it fails completely. Early digital testing focused on the transmitter, ensuring that the waveform met a standard set of characteristics for parameters such as rise time, amplitude, and jitter. It was assumed that if the signal originated correctly and was communicated well, the receiver would interpret it properly and reproduce the correct bits. This was often augmented with modest frame-based checksum approaches for the receiver to finally validate the BER of what it had received. As transmission rates increased, however, it became important to allocate the system performance margin between the transmitter and the receiver, and this produced new requirements to evaluate the receivers in the communications system. Each standards body and group of engineers approached this problem differently. Early digital television standards were the first to employ testing with a 100-meter length of cable (later replaced by an electronic circuit called a
cable clone) to validate that the signal could be received properly under stress. 10 Gb Ethernet and other standards employed jitter- and noise-producing circuits to electronically degrade the transmission and produce a
Stressed-Eye for testing. It became very important to have multiple sources of different types of amplitude noise and time-domain modulated jitter to produce a calibrated cocktail of degradations that would close the transmitted eye for receiver testing. Making calibrated jitter and noise sources, however, required difficult engineering that had to be reproduced exactly each time receiver testing was performed. In 2005, with the introduction of the BERTScope 12500-S, SyntheSys Research provided the first built-in stressed-eye sources, greatly assisting engineers in performing popular stressed-eye receiver testing.
== ECC Emulation ==

To increase communications channel data rates, common techniques are applied such as speeding up clocking rates, increasing the dimensions of modulation, and increasing coding efficiencies. Ultimately, channels become so optimized that they operate at the limits of their physical capabilities, and in these cases, improving reliability is done by adding
forward error correction (FEC), also known as error-correcting code (ECC), capabilities that trade the overhead of transmitting extra information for the ability to correct errors during transmission. To design efficient FEC strategies, it is important to know the profile of the raw errors in the underlying channel, and Error Location Analysis proved very helpful for this purpose. Features like the Burst Length Histogram helped engineers choose FEC interleave depths, and features like the Block Error Histogram indicated the correction strengths required for full correction. The Error Location Analysis feature aimed at the FEC application was called ECC Emulation. Using this feature, the system could be configured to produce full-range error analysis of a raw communications channel as if it had a specified FEC architecture operating on it. FEC architectures were specified by configuring the number of rows, columns, and tables in a hypothetical three-dimensional interleaved FEC system, together with correction strengths in each dimension. The system also enabled activity monitoring of each layer of correction capability to observe headroom when operating on live or pre-recorded raw error sessions.

== Key roles ==