When motion picture film was developed, the movie screen had to be illuminated at a high rate to prevent visible
flicker. The exact rate necessary varies by brightness — 50 Hz is (barely) acceptable for small, low brightness displays in dimly lit rooms, whilst 80 Hz or more may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame of film three times using a three-bladed shutter: a movie shot at 16 frames per second illuminated the screen 48 times per second. Later, when sound film became available, the higher projection speed of 24 frames per second enabled a two-bladed shutter to produce 48 times per second illumination—but only in projectors incapable of projecting at the lower speed. This solution could not be used for television. To store a full video frame and display it twice requires a
frame buffer—electronic memory (RAM)—large enough to hold it. This method did not become feasible until the late 1980s, with the arrival of digital technology. In addition, avoiding on-screen
interference patterns caused by studio lighting and the limits of
vacuum tube technology required that CRTs for TV be scanned at
AC line frequency. (This was 60 Hz in the US, 50 Hz in Europe.)

Several interlacing schemes had been patented since 1914 in the context of still or moving image transmission, but few of them were practicable. In 1930, German
Telefunken engineer Fritz Schröter first formulated and patented the concept of breaking a single image frame into successive interlaced lines, based on his earlier experiments with phototelegraphy. In the US,
RCA engineer
Randall C. Ballard patented the same idea in 1932, initially for the purpose of reformatting sound film to television rather than for the transmission of live images. Commercial implementation began in 1934 as cathode-ray tube screens became brighter, increasing the level of flicker caused by
progressive (sequential) scanning.

In 1936, when the UK was setting analog standards, early
thermionic valve based CRT drive electronics could only scan at around 200 lines in 1/50 of a second (i.e. approximately a 10 kHz repetition rate for the sawtooth horizontal deflection waveform). Using interlace, a pair of 202.5-line fields could be superimposed to become a sharper
405 line frame (with around 377 used for the actual image, and yet fewer visible within the screen bezel; in modern parlance, the standard would be "377i"). The vertical scan frequency remained 50 Hz, but visible detail was noticeably improved. As a result, this system supplanted John Logie Baird's 240 line mechanical progressive scan system that was also being trialled at the time.
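The arithmetic behind these figures is straightforward; the sketch below is a rough illustration that ignores the blanking intervals and other overheads of the real standard:

<syntaxhighlight lang="python">
# Rough timing arithmetic for the 1936 UK 405 line interlaced standard.
# Blanking intervals and other overheads are ignored in this sketch.
lines_per_frame = 405      # one full frame, built from two interlaced fields
field_rate_hz = 50         # vertical scans per second, locked to the AC mains

frame_rate_hz = field_rate_hz / 2                  # 25 complete frames per second
lines_per_field = lines_per_frame / 2              # 202.5 lines in each field
line_rate_hz = lines_per_frame * frame_rate_hz     # sawtooth horizontal deflection rate

print(frame_rate_hz)     # 25.0
print(lines_per_field)   # 202.5
print(line_rate_hz)      # 10125.0, i.e. roughly the 10 kHz quoted above
</syntaxhighlight>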
From the 1940s onward, improvements in technology allowed the US and the rest of Europe to adopt systems using increasingly higher line-scan frequencies and more radio signal bandwidth to produce higher line counts at the same frame rate, thus achieving better picture quality. However, the fundamentals of interlaced scanning were at the heart of all of these systems. The US adopted the
525 line system, later incorporating the composite color standard known as
NTSC; Europe adopted the
625 line system; and the UK switched from its idiosyncratic 405 line system to (the much more US-like) 625 to avoid having to develop a (wholly) unique method of color TV. France switched from its similarly unique
819 line monochrome system to the more European standard of 625. Europe in general, including the UK, then adopted the
PAL color encoding standard, which was essentially based on NTSC, but inverted the color carrier phase with each line (and frame) in order to cancel out the hue-distorting phase shifts that dogged NTSC broadcasts. France instead adopted its own unique, twin-FM-carrier based
SECAM system, which offered improved quality at the cost of greater electronic complexity, and was also used by some other countries, notably Russia and its satellite states. Though the color standards are often used as synonyms for the underlying video standard - NTSC for 525i/60, PAL/SECAM for 625i/50 - there are several cases of inversions or other modifications; e.g. PAL color is used on otherwise "NTSC" (that is, 525i/60) broadcasts in
Brazil, as well as vice versa elsewhere, along with cases of the PAL color subcarrier being shifted down to around 3.58 MHz to fit within an NTSC-style channel allocation, or NTSC encoding being used with PAL's 4.43 MHz subcarrier.

Interlacing was ubiquitous in displays until the 1970s, when the needs of
computer monitors resulted in the reintroduction of progressive scan, including on regular TVs or simple monitors based on the same circuitry. Most CRT-based displays are entirely capable of displaying both progressive and interlaced video regardless of their original intended use, so long as the horizontal and vertical frequencies match, because the technical difference is simply whether the vertical sync cycle starts/ends halfway along a scanline every other frame (interlace) or always right at the start/end of a line (progressive).

Interlace is still used for most standard definition TVs, and the
1080i HDTV broadcast standard, but not for
LCD, micromirror (
DLP), or most
plasma displays; these displays do not use a
raster scan to create an image (their panels may still be updated in a left-to-right, top-to-bottom scanning fashion, but always in a progressive fashion, and not necessarily at the same rate as the input signal), and so cannot benefit from interlacing (where older LCDs use a "dual scan" system to provide higher resolution with slower-updating technology, the panel is instead divided into two
adjacent halves that are updated
simultaneously): in practice, they have to be driven with a progressive scan signal. The
deinterlacing circuitry to get progressive scan from a normal interlaced broadcast television signal can add to the cost of a television set using such displays. Currently, progressive displays dominate the HDTV market.
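As a rough illustration of what such deinterlacing circuitry has to do, the sketch below shows the two simplest strategies, commonly called "weave" (combine two fields into one frame) and "bob" (line-double a single field); real sets use more sophisticated, often motion-adaptive methods, and the data and function names here are purely illustrative:

<syntaxhighlight lang="python">
# Minimal deinterlacing sketch; scan lines are represented as plain strings.
# Real hardware works on pixel data and usually blends, or switches between,
# these two approaches depending on detected motion.

def weave(top_field, bottom_field):
    """Interleave the lines of two fields into one full frame.
    Full vertical detail, but moving objects show "combing" artifacts."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    """Line-double a single field into a full frame.
    No combing on motion, but only half the vertical resolution."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)    # repeat (or interpolate) the missing line
    return frame

top = ["line 0", "line 2"]        # one field carries every other scan line
bottom = ["line 1", "line 3"]     # the other field carries the lines in between
print(weave(top, bottom))         # ['line 0', 'line 1', 'line 2', 'line 3']
print(bob(top))                   # ['line 0', 'line 0', 'line 2', 'line 2']
</syntaxhighlight>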
==Interlace and computers==
In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line
NTSC signal was well beyond the graphics abilities of low cost computers, so these systems used a simplified video signal that made each video field scan directly on top of the previous one, rather than each line between two lines of the previous field, along with relatively low horizontal pixel counts. This marked the return of
progressive scanning not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this
240p on NTSC sets, and
288p on
PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video like this.
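The difference between this simplified signal and standard interlaced timing comes down to half a scan line per field, as the rough sketch below illustrates; the figures are the usual nominal NTSC values, and the 262-line choice is typical of home computers and consoles rather than part of any formal standard:

<syntaxhighlight lang="python">
# Why dropping the half line turns interlace into "240p".
# Nominal NTSC horizontal scan rate; blanking details are ignored.
line_rate_hz = 15734.26

interlaced_lines_per_field = 262.5    # the half-line offsets alternate fields vertically
progressive_lines_per_field = 262     # whole lines only: every field lands on the
                                      # same scan lines as the one before it

print(line_rate_hz / interlaced_lines_per_field)    # ~59.94 Hz, two fields per frame
print(line_rate_hz / progressive_lines_per_field)   # ~60.05 Hz, each field is a full frame
</syntaxhighlight>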
Computer monitor standards such as the TTL-RGB mode available on the CGA and e.g.
BBC Micro were further simplifications to NTSC, which improved picture quality by omitting color modulation and allowing a more direct connection between the computer's graphics system and the CRT.

By the mid-1980s, computers had outgrown these video systems and needed better displays. Most home and basic office computers suffered from the use of the old scanning method, with the highest display resolution being around 640x200 (or sometimes 640x256 in 625-line/50 Hz regions), resulting in a severely distorted tall narrow
pixel shape, making the display of high resolution text alongside realistically proportioned images difficult (logical "square pixel" modes were possible but only at low resolutions of 320x200 or less).
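A back-of-envelope calculation shows how distorted such pixels were; the sketch below assumes the mode fills a 4:3 picture area and ignores overscan:

<syntaxhighlight lang="python">
# Back-of-envelope pixel aspect ratio (pixel width divided by pixel height)
# for a graphics mode assumed to fill a 4:3 display, ignoring overscan.

def pixel_aspect(width_px, height_px, display_aspect=4 / 3):
    return display_aspect * height_px / width_px

print(round(pixel_aspect(640, 200), 2))   # 0.42 -> pixels roughly 2.4x taller than wide
print(round(pixel_aspect(640, 256), 2))   # 0.53
print(round(pixel_aspect(320, 240), 2))   # 1.0  -> a "square pixel" mode
</syntaxhighlight>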
Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7 and 8 MHz of bandwidth that NTSC and PAL signals were confined to. IBM's
Monochrome Display Adapter and
Enhanced Graphics Adapter as well as the
Hercules Graphics Card and the original
Macintosh computer generated video signals of 342 to 350p, at 50 to 60 Hz, with approximately 16 MHz of bandwidth; some enhanced
PC clones such as the
AT&T 6300 (aka
Olivetti M24) as well as computers made for the Japanese home market managed 480p instead at around 24 MHz, and the
Atari ST pushed that to 71 Hz with 32 MHz bandwidth - all of which required dedicated high-frequency (and usually single-mode, i.e. not "video"-compatible) monitors due to their increased line rates. The
Commodore Amiga instead created a true interlaced 480i60/576i50
RGB signal at broadcast video rates (and with a 7 or 14 MHz bandwidth), suitable for NTSC/PAL encoding (where it was smoothly decimated to 3.5~4.5 MHz). This ability (plus built-in
genlocking) resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail was required, with "flicker-fixer" scan-doubler peripherals plus high-frequency RGB monitors (or Commodore's own specialist scan-conversion A2024 monitor) being popular, if expensive, purchases amongst power users.

1987 saw the introduction of
VGA, on which PCs soon standardized, as well as Apple's
Macintosh II range, which offered displays of similar, then superior resolution and color depth, with rivalry between the two standards (and later PC quasi-standards such as XGA and SVGA) rapidly pushing up the quality of display available to both professional and home users.

In the late 1980s and early 1990s, monitor and graphics card manufacturers introduced newer high resolution standards that once again included interlace. These monitors ran at higher scanning frequencies, typically allowing a 75 to 90 Hz field rate (i.e. 37.5 to 45 Hz frame rate), and tended to use longer-persistence phosphors in their CRTs, all of which was intended to alleviate flicker and shimmer problems. Such monitors proved generally unpopular, outside of specialist ultra-high-resolution applications such as
CAD and
DTP, which demanded as many pixels as possible, with interlace being a necessary evil and better than trying to use the progressive-scan equivalents. Whilst flicker was often not immediately obvious on these displays, eyestrain and lack of focus nevertheless became a serious problem, and the trade-off for a longer afterglow was reduced brightness and poor response to moving images, leaving visible and often off-colored trails behind. These trails were a minor annoyance on monochrome displays and on the generally slower-updating screens used for design or database-query purposes, but much more troublesome for color displays and the faster motion inherent in the increasingly popular window-based operating systems, as well as for the full-screen scrolling in WYSIWYG word-processors and spreadsheets, and of course for high-action games. Additionally, the regular, thin horizontal lines common to early GUIs, combined with low color depth that meant window elements were generally high-contrast (indeed, frequently stark black-and-white), made shimmer even more obvious than in lower field-rate video applications.

Barely a decade after the first ultra-high-resolution interlaced upgrades appeared for the IBM PC, rapid technological advancement made it practical and affordable to provide pixel clocks and horizontal scan rates high enough for high-resolution progressive-scan modes, first in professional and then in consumer-grade displays, and the practice was soon abandoned. For the rest of the 1990s, monitors and graphics cards instead made great play of their highest stated resolutions being "non-interlaced", even where the overall framerate was barely any higher than it had been for the interlaced modes (e.g. SVGA at 56p versus 43i to 47i), and usually included a top mode technically exceeding the CRT's actual resolution (the number of color-phosphor triads), which meant there was no additional image clarity to be gained through interlacing and/or increasing the signal bandwidth still further.

This experience is why the PC industry today remains against interlace in HDTV, and lobbied for the 720p standard, and continues to push for the adoption of 1080p (at 60 Hz for NTSC legacy countries, and 50 Hz for PAL); however, 1080i remains the most common HD broadcast resolution, if only for reasons of backward compatibility with older HDTV hardware that cannot support 1080p - and sometimes not even 720p - without the addition of an external scaler, similar to how and why most SD-focussed digital broadcasting still relies on the otherwise obsolete
MPEG2 standard embedded into e.g.
DVB-T.

==See also==