==Choosing a color process==
Color broadcast studio television cameras in the 1960s, such as the RCA TK-41, were large, heavy, and power-hungry. They used three imaging tubes to generate red, green, and blue (RGB) video signals, which were combined to produce a
composite color picture. These cameras required complex optics to keep the tubes aligned. Since temperature variations and vibration would easily put a three-tube system out of alignment, a more robust system was needed for lunar surface operations. In the 1940s,
CBS Laboratories invented an early color system in which a wheel containing six color filters rotated in front of a single video camera tube to generate the RGB signal. Called a field-sequential color system, it used
interlaced video, with sequentially alternating color
video fields to produce one complete video frame: the first field was red, the second blue, and the third green, matching the color filters on the wheel. This system was simpler, more reliable, and more power-efficient than a standard three-tube color camera.
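The field sequencing lends itself to a short illustration. The following is a minimal sketch of the field-to-color mapping described above; the names and the modulo indexing are illustrative, not flight hardware logic:

<syntaxhighlight lang="python">
# Field-sequential color: each successive interlaced field carries one
# primary color, following the filter order on the spinning wheel.
FILTER_SEQUENCE = ("red", "blue", "green")

def field_color(field_index: int) -> str:
    """Return the primary color captured by the n-th video field."""
    return FILTER_SEQUENCE[field_index % len(FILTER_SEQUENCE)]

# One wheel revolution (six filter segments) covers six fields:
print([field_color(n) for n in range(6)])
# ['red', 'blue', 'green', 'red', 'blue', 'green']
</syntaxhighlight>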
==The camera==
Stanley Lebar and his Westinghouse team wanted to add color to their camera as early as 1967, and they knew that the CBS system would likely be the best system to study. The
Westinghouse lunar color camera used a modified version of CBS's field-sequential color system. A color wheel with six filter segments was placed behind the lens mount; it rotated at 9.99 revolutions per second, producing a scan rate of 59.94 fields per second, the same as NTSC video. A magnet on the wheel kept it synchronized with the pickup tube's scan rate by controlling the sync pulse generator that governed the tube's timing. The color camera used the same SEC video imaging tube as the monochrome lunar camera flown on Apollo 9, but it was larger and longer, in part because of the new zoom lens. The zoom lens had a
focal length variable from 25 mm to 150 mm, i.e. a zoom ratio of 6:1. At its widest angle, it had a 43-degree field of view, while in its extreme telephoto mode, it had a 7-degree field of view. The
aperture ranged from f/4 to f/44, with a T5
light transmittance rating.
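Some back-of-envelope arithmetic, sketched below, is consistent with the quoted figures. The ~20 mm target width is an assumption chosen so the quoted angles agree, not a published specification, and the T-stop check uses the standard relation T = N/√(transmittance):

<syntaxhighlight lang="python">
import math

# Wheel speed times six filter segments gives the NTSC field rate.
fields_per_second = 9.99 * 6
print(round(fields_per_second, 2))   # 59.94 fields per second

# Horizontal field of view from focal length, for an assumed ~20 mm
# usable target width (illustrative, chosen to match the quoted angles).
def fov_deg(focal_mm: float, target_mm: float = 20.0) -> float:
    return math.degrees(2 * math.atan(target_mm / (2 * focal_mm)))

print(round(fov_deg(25), 1))    # ~43.6 degrees at the 25 mm wide end
print(round(fov_deg(150), 1))   # ~7.6 degrees at the 150 mm telephoto end

# Light loss implied by the f/4 geometric aperture rated at T5,
# using T = N / sqrt(transmittance):
print(round((4.0 / 5.0) ** 2, 2))   # 0.64: ~64% of the light reaches the tube
</syntaxhighlight>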
==Color decoding and signal processing==
Signal processing was needed at the Earth receiving ground stations to compensate for the
Doppler effect, caused by the spacecraft moving toward or away from the Earth. Because the Doppler shift would distort the color, a system was developed that employed two videotape recorders (VTRs) with a tape-loop delay to compensate for the effect. The cleaned signal was then transmitted to Houston in
NTSC-compatible black and white. Unlike the CBS system, which required a special mechanical receiver on the TV set to decode the color, the signal was decoded at Houston's Mission Control Center. This video processing occurred in real time. The decoder separately recorded each red, blue, and green field onto an analog magnetic disk recorder. Acting as a framebuffer, the recorder then sent the coordinated color information to an encoder, which produced an NTSC color video signal that was released to the broadcast pool feed. Once the color was decoded, scan conversion was not necessary, because the color camera ran at the same 60-fields-per-second interlace rate as the NTSC standard.
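The framebuffer's grouping of fields into frames can be sketched as follows. This is a digital analogy to the analog disk-recorder process, with hypothetical field labels, not the actual decoder design:

<syntaxhighlight lang="python">
# Group each run of three successive monochrome fields (red, blue,
# green) into one color frame, as the disk recorder did before the
# result was encoded to NTSC.
def decode_fields(fields):
    """Yield (red, blue, green) triples from a field-sequential stream."""
    buffer = []
    for field in fields:
        buffer.append(field)
        if len(buffer) == 3:
            yield tuple(buffer)   # one complete color picture
            buffer = []

# Each string stands in for one recorded monochrome field.
for frame in decode_fields(["R1", "B1", "G1", "R2", "B2", "G2"]):
    print(frame)   # ('R1', 'B1', 'G1') then ('R2', 'B2', 'G2')
</syntaxhighlight>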
==Operational history==
The camera was first used on the
Apollo 10 mission. It used the command module's extra S-band channel and large S-band antenna to accommodate its larger bandwidth, and it could be used in the lunar module only while the two spacecraft were docked. Unlike the earlier cameras, it came with a portable video monitor that could be attached directly to the camera or float separately; combined with the new zoom lens, the monitor let the astronauts frame their shots more precisely. Apollo 12 was the first mission to use the color camera on the lunar surface. About 42 minutes into telecasting the first EVA, astronaut
Alan Bean inadvertently pointed the camera at the Sun while preparing to mount it on the tripod. The Sun's extreme brightness burned out the video pickup tube, rendering the camera useless. When the camera was returned to Earth, it was shipped to Westinghouse, which was able to get an image from the undamaged section of the tube. Procedures were rewritten to prevent such damage in the future, including the addition of a lens cap to protect the tube when the camera was repositioned off the MESA. The color camera successfully covered the lunar operations during the
Apollo 14 mission in 1971. Image quality issues appeared because the camera's
automatic gain control (AGC) struggled to set the proper exposure when the astronauts were in high-contrast lighting, which caused the white spacesuits to be overexposed, or "
bloom". The camera did not have a
gamma correction circuit. This resulted in the image's mid-tones losing detail. After Apollo 14, it was only used in the command module, as the new RCA-built camera replaced it for lunar surface operations. The Westinghouse color camera continued to be used throughout the 1970s on all three Skylab missions and the
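Gamma correction raises normalized luminance to a power of 1/γ so that mid-tones survive the display's power-law response. The γ = 2.2 value below is an assumed, typical CRT-era figure, not a documented camera parameter:

<syntaxhighlight lang="python">
# Gamma pre-correction: raise normalized (0.0-1.0) luminance to 1/gamma
# so the display's power-law response does not crush the mid-tones.
def gamma_correct(luminance: float, gamma: float = 2.2) -> float:
    return luminance ** (1.0 / gamma)

for v in (0.1, 0.5, 0.9):
    print(f"{v:.1f} -> {gamma_correct(v):.2f}")
# 0.1 -> 0.35, 0.5 -> 0.73, 0.9 -> 0.95: mid-tones shift the most,
# which is the detail lost without this stage.
</syntaxhighlight>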
After Apollo 14, the Westinghouse color camera was used only in the command module, as the new RCA-built camera replaced it for lunar surface operations. It continued to be used throughout the 1970s on all three Skylab missions and the Apollo–Soyuz Test Project. The 1969–1970 Emmy Awards for Outstanding Achievement in Technical/Engineering Development were awarded to NASA for the conceptual aspects of the color Apollo television camera and to Westinghouse Electric Corporation for the development of the camera.

==Later use==