DVDs and Blu-ray Discs NTSC DVDs may carry closed captions in data packets of the MPEG-2 video streams inside the VIDEO_TS folder. Once played out of the analog outputs of a set-top DVD player, the caption data is converted to the Line 21 format. The player outputs it on the composite video (or an available RF connector) for a connected TV's built-in decoder or a set-top decoder as usual. It cannot be output on S-Video or component video outputs due to the lack of a color burst signal on Line 21. (However, if the DVD player is in interlaced rather than progressive mode, closed captioning will be displayed on the TV over the component video input if the TV's captioning is turned on and set to CC1.) When viewed on a personal computer, caption data can be viewed by software that can read and decode the caption data packets in the MPEG-2 streams of the DVD-Video disc.
Windows Media Player before Windows 7 (e.g., in Windows Vista) supported only closed caption channels 1 and 2 (not 3 or 4).
Apple's DVD Player does not have the ability to read and decode Line 21 caption data recorded on a DVD made from an over-the-air broadcast, though it can display some movie DVD captions. In addition to Line 21 closed captions, video DVDs may also carry subtitles, which are generally rendered from the
EIA-608 captions as a bitmap overlay that can be turned on and off via a set-top DVD player or DVD player software, just like the textual captions. This type of captioning is usually carried in a subtitle track labeled either "English for the hearing impaired" or, more recently, "SDH" (subtitles for the deaf and hard of hearing). Many popular Hollywood DVD-Videos carry both subtitles and closed captions (e.g. the
Stepmom DVD by Columbia Pictures). On some DVDs, the Line 21 captions may contain the same text as the subtitles; on others, only the Line 21 captions include the additional non-speech information (sometimes even song lyrics) needed for deaf and hard-of-hearing viewers. European Region 2 DVDs do not carry Line 21 captions, and instead list the subtitle languages available; English is often listed twice, once representing the dialogue alone, and a second subtitle set carrying additional information for the deaf and hard-of-hearing audience. (Many subtitle files for the deaf and hard of hearing on DVDs are reworkings of original teletext subtitle files.)
Blu-ray media typically cannot carry any
VBI data such as Line 21 closed captioning, due to the design of the
DVI-based
High-Definition Multimedia Interface (HDMI) specification, which was extended only for synchronized digital audio, replacing older analog standards such as
VGA, S-Video, component video, and
SCART. However, a few early titles from 20th Century Fox Home Entertainment carried Line 21 closed captions that are output when using the analog outputs (typically composite video) of a few Blu-ray players. Both Blu-ray and DVD can use either PNG bitmap subtitles or "advanced subtitles" to carry SDH-type subtitling, the latter being an XML-based textual format which includes font, styling, and positioning information as well as a Unicode representation of the text. Advanced subtitling can also include additional media accessibility features such as "descriptive audio".
Movies There are several competing technologies used to provide captioning for movies in theaters. Cinema captioning falls into the categories of open and closed. The definition of "closed" captioning in this context is different from television, as it refers to any technology that allows as few as one member of the audience to view the captions. Open captioning in a film theater can be accomplished through burned-in captions, projected text or
bitmaps, or (rarely) a display located above or below the movie screen. Typically, this display is a large LED sign. In a digital theater, open caption display capability is built into the digital projector. Closed caption capability is also available, with the ability for third-party closed caption devices to plug into the digital cinema server. Probably the best-known closed captioning option for film theaters is the
Rear Window Captioning System from the
National Center for Accessible Media. Upon entering the theater, viewers requiring captions are given a panel of flat translucent glass or plastic on a gooseneck stalk, which can be mounted in front of the viewer's seat. In the back of the theater is an
LED display that shows the captions in mirror image. The panel reflects captions for the viewer but is nearly invisible to surrounding patrons. The panel can be positioned so that the viewer watches the movie through the panel, and captions appear either on or near the movie image. A company called Cinematic Captioning Systems has a similar reflective system called Bounce Back. A major problem for distributors has been that these systems are each proprietary, and require separate distributions to the theater to enable them to work. Proprietary systems also incur license fees. For film projection systems,
Digital Theater Systems, the company behind the DTS
surround sound standard, has created a digital captioning device called the DTS-CSS (Cinema Subtitling System). It is a combination of a laser projector which places the captioning (words, sounds) anywhere on the screen and a thin playback device with a
CD that holds many languages. If the Rear Window Captioning System is used, the DTS-CSS player is also required for sending caption text to the Rear Window sign located in the rear of the theater. Special effort has been made to build accessibility features into digital projection systems (see
digital cinema). Through
SMPTE, standards now exist that dictate how open and closed captions, as well as hearing-impaired and visually impaired narrative audio, are packaged with the rest of the digital movie. This eliminates the proprietary caption distributions required for film, and the associated royalties. SMPTE has also standardized the communication of closed caption content between the digital cinema server and third-party closed caption systems (the CSP/RPL protocol). As a result, new, competitive closed caption systems for digital cinema are now emerging that will work with any standards-compliant digital cinema server. These newer closed caption devices include cupholder-mounted electronic displays and wireless glasses which display caption text in front of the wearer's eyes. Bridge devices are also available to enable the use of Rear Window systems. As of mid-2010, the remaining challenge to the wide introduction of accessibility in digital cinema is the industry-wide transition to SMPTE DCP, the standardized packaging method for very high quality, secure distribution of digital movies.
Sports venues Captioning systems have also been adopted by most major league and high-profile college
stadiums and
arenas, typically through dedicated portions of their main
scoreboards or as part of balcony
fascia LED boards. These screens display captions of the
public address announcer and other spoken content, such as that contained within in-game segments, public service announcements, and lyrics of songs played in-stadium. In some facilities, these systems were added as a result of discrimination lawsuits. Following a lawsuit under the
Americans with Disabilities Act,
FedExField added caption screens in 2006. Some stadiums utilize on-site captioners while others outsource them to external providers who caption remotely.
Video games The infrequent appearance of closed captioning in
video games became a problem in the 1990s as games began to commonly feature voice tracks, which in some cases contained information which the player needed in order to know how to progress in the game. Closed captioning of video games is becoming more common. One of the first video game companies to feature closed captioning was
Bethesda Softworks in their 1990 release of
Hockey League Simulator and
The Terminator 2029. Infocom also offered
Zork Grand Inquisitor in 1997. Many games since then have at least offered subtitles for spoken dialog during
cutscenes, and many include significant in-game dialog and sound effects in the captions as well; for example, with subtitles turned on in the
Metal Gear Solid series of stealth games, not only are subtitles available during cutscenes, but any dialog spoken during real-time gameplay is captioned as well, allowing players who cannot hear the dialog to know what enemy guards are saying and when the main character has been detected. Also, in many of developer
Valve's video games (such as
Half-Life 2 or
Left 4 Dead), when closed captions are activated, dialog and nearly all sound effects either made by the player or from other sources (e.g. gunfire, explosions) will be captioned. Video games do not offer Line 21 captioning, decoded and displayed by the television itself, but rather a built-in subtitle display, more akin to that of a DVD. The game systems themselves have no role in the captioning either; each game must have its subtitle display programmed individually. Reid Kimball, a game designer who is hearing impaired, is attempting to educate game developers about closed captioning for games. Kimball started the Games[CC] group to closed-caption games and serve as a research and development team to aid the industry. He designed the Dynamic Closed Captioning system, writes articles, and speaks at developer conferences. Games[CC]'s first closed captioning project, Doom3[CC], was nominated for an award as Best Doom3 Mod of the Year at IGDA's Choice Awards 2006 show.
Online video streaming Internet video streaming service
YouTube offers captioning services for videos. The author of a video can upload a SubViewer (*.SUB),
SubRip (*.SRT) or *.SBV file. As a beta feature, the site also added the ability to automatically transcribe and generate captioning on videos, with varying degrees of success depending on the content of the video. On August 30, 2020, the company announced that community captions would end on September 28. The automatic captioning is often inaccurate on videos with background music or exaggerated emotion in speaking. Variations in volume can also result in nonsensical machine-generated captions. Additional problems arise with strong accents,
sarcasm, differing contexts, or
homonyms. On June 30, 2010, YouTube announced a new "YouTube Ready" designation for professional caption vendors in the United States. The initial list included twelve companies that had passed a caption quality evaluation administered by the Described and Captioned Media Program, had a website and a YouTube channel where customers could learn more about their services, and had agreed to post rates for the range of services that they offer for YouTube content.
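For reference, the SubRip (*.SRT) format mentioned above is a plain-text format: each cue consists of an index line, a start and end timecode separated by an arrow, one or more lines of caption text, and a blank line. A minimal Python sketch that assembles such a file (the cue text and timings here are invented for illustration):

```python
def make_srt(cues):
    """Build SubRip (.srt) text from (start, end, text) tuples.

    start and end are "HH:MM:SS,mmm" timecode strings; each cue is
    written as an index line, a timing line, the caption text, and a
    trailing blank line, as the SubRip format requires.
    """
    blocks = []
    for i, (start, end, text) in enumerate(cues, start=1):
        blocks.append(f"{i}\n{start} --> {end}\n{text}\n")
    return "\n".join(blocks)

# Hypothetical cues for illustration:
srt = make_srt([
    ("00:00:01,000", "00:00:03,500", "[door slams]"),
    ("00:00:04,000", "00:00:06,000", "Where have you been?"),
])
print(srt)
```

Note that SubRip timecodes use a comma before the milliseconds, unlike the period used by most XML-based timed-text formats.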
Flash Video also supports captions using the Distribution Exchange profile (DFXP) of W3C
timed text format. The latest Flash authoring software adds free player skins and caption components that enable viewers to turn captions on and off during playback from a web page. Previous versions of Flash relied on the third-party Captionate component and skin to caption Flash Video. Custom Flash players designed in Flex can be tailored to support the timed-text exchange profile, Captionate
.XML, or
SAMI file (e.g.
Hulu captioning). This is the preferred method for most
US broadcast and cable networks that are mandated by the U.S.
Federal Communications Commission to provide captioned on-demand content. The media encoding firms generally use software such as
MacCaption to convert
EIA-608 captions to this format. The
Silverlight Media Framework also includes support for the timed-text exchange profile for both download and adaptive streaming media.
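As an illustration, a minimal document in the W3C timed text (DFXP/TTML) format mentioned above pairs each caption with begin and end times. The timings and text below are hypothetical, and the exact root namespace differs between the older DFXP drafts and the final TTML recommendation:

```xml
<tt xmlns="http://www.w3.org/ns/ttml" xml:lang="en">
  <body>
    <div>
      <!-- each <p> is one caption cue with start and end times -->
      <p begin="00:00:01.000" end="00:00:03.500">[door slams]</p>
      <p begin="00:00:04.000" end="00:00:06.000">Where have you been?</p>
    </div>
  </body>
</tt>
```

Styling and positioning, when needed, are expressed through additional attributes and a `<head>` section rather than embedded markup, which is what allows players to restyle captions to viewer preferences.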
Windows Media Video can support closed captions in both video-on-demand and live streaming scenarios. Typically, Windows Media captions support the
SAMI file format but can also carry embedded closed caption data. The EBU-TT-D distribution format supports multiple players across multiple platforms. QuickTime video supports raw
EIA-608 caption data via a proprietary closed caption track, which consists of
EIA-608 byte pairs wrapped in a
QuickTime packet container with different IDs for the two Line 21 fields. These captions can be turned on and off and appear in the same style as TV closed captions, with all the standard formatting (pop-on, roll-up, paint-on), and can be positioned and split anywhere on the video screen. QuickTime closed caption tracks can be viewed in
macOS or
Windows versions of
QuickTime Player,
iTunes (via QuickTime),
iPod Nano,
iPod Classic,
iPod Touch,
iPhone, and
iPad. Modern browsers, such as
Edge and
Chrome, and modern operating systems, such as
iOS 12+,
Android 10+, and
Windows 10+, may manage the closed caption subtitle style.
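The EIA-608 byte pairs described above carry 7-bit data with an odd parity bit in the most significant bit, so a decoder first verifies parity and strips that bit before interpreting the values. A minimal Python sketch of that parity step (for printable basic characters the stripped values mostly match ASCII, though EIA-608 remaps a few code points):

```python
def has_odd_parity(byte: int) -> bool:
    # EIA-608 bytes must contain an odd number of 1 bits.
    return bin(byte).count("1") % 2 == 1

def strip_parity(byte: int) -> int:
    # Drop the parity bit (MSB), leaving the 7-bit payload.
    if not has_odd_parity(byte):
        raise ValueError(f"parity error in byte {byte:#04x}")
    return byte & 0x7F

# 0xC1 is 'A' (0x41) with its parity bit set; 0x54 ('T') already
# has an odd number of 1 bits, so its parity bit stays clear.
pair = (0xC1, 0x54)
decoded = "".join(chr(strip_parity(b)) for b in pair)
print(decoded)  # → AT
```

A full decoder must additionally distinguish control-code pairs (used for pop-on, roll-up, and paint-on modes and for positioning) from character pairs; this sketch covers only the parity layer common to both.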
Theatre Live plays can be open captioned by a captioner who displays lines from the
script, including non-speech elements, on a large
display screen near the stage. Software is also now available that automatically generates the captioning and streams the captioning to individuals sitting in the theater, with that captioning being viewed using heads-up glasses or on a smartphone or computer tablet.
Telephones A captioned telephone is a
telephone that displays real-time captions of the current conversation. The captions are typically displayed on a screen embedded into the telephone base.
Video conferencing Some online video conferencing services, such as
Google Meet, offer the ability to display captions in real time of the current conversation.
Media monitoring services In the United States especially, most
media monitoring services capture and index closed captioning text from news and public affairs programs, allowing them to search the text for client references. The use of closed captioning for television news monitoring was pioneered by Universal Press Clipping Bureau (Universal Information Services) in 1992, and later in 1993 by Tulsa-based NewsTrak of Oklahoma (later known as Broadcast News of Mid-America, acquired by
video news release pioneer Medialink Worldwide Incorporated in 1997). US patent 7,009,657 describes a "method and system for the automatic collection and conditioning of closed caption text originating from multiple geographic locations" as used by news monitoring services.
Conversations Software programs are available that automatically generate closed captioning of conversations. Examples of such conversations include discussions in conference rooms, classroom lectures, and religious services.
Non-linear video editing systems and closed captioning In 2010,
Vegas Pro, the professional non-linear editor, was updated to support importing, editing, and delivering
CEA-608 closed captions. Vegas Pro 10, released on October 11, 2010, added several enhancements to the closed captioning support. TV-like CEA-608 closed captioning can now be displayed as an overlay when played back in the Preview and Trimmer windows, making it easy to check placement, edits, and timing of CC information. CEA-708-style closed captioning is automatically created when the CEA-608 data is created. Line 21 closed captioning is now supported, as well as HD-SDI closed captioning capture and print from AJA and
Blackmagic Design cards. Line 21 support provides a workflow for existing legacy media. Other improvements include increased support for multiple closed captioning file types, as well as the ability to export closed caption data for DVD Architect, YouTube, RealPlayer, QuickTime, and Windows Media Player. In mid-2009,
Apple released
Final Cut Pro version 7 and began supporting the insertion of closed caption data into SD and HD tape masters via
FireWire and compatible video capture cards. Up until this time, it was not possible for video editors to insert caption data with both
CEA-608 and
CEA-708 into their tape masters. The typical workflow involved first printing the SD or HD video to tape and sending it to a professional closed caption service company that had a stand-alone closed caption hardware encoder. This new closed captioning workflow, known as
e-Captioning involves making a proxy video from the non-linear system to import into third-party non-linear closed captioning software. Once the closed captioning software project is completed, it must export a closed caption file compatible with the
non-linear editing system. In the case of Final Cut Pro 7, three different file formats can be accepted: a .SCC file (Scenarist Closed Caption file) for standard-definition video, a
QuickTime 608 closed caption track (a special 608 coded track in the .mov file wrapper) for standard-definition video, and finally a QuickTime 708 closed caption track (a special 708 coded track in the .mov file wrapper) for high-definition video output. Alternatively,
Matrox video systems devised another mechanism for inserting closed caption data by allowing the video editor to include CEA-608 and CEA-708 in a discrete audio channel on the video editing timeline. This allows real-time preview of the captions while editing and is compatible with Final Cut Pro 6 and 7. Other non-linear editing systems indirectly support closed captioning only in standard-definition Line 21. Video files on the editing timeline must be composited with a Line 21 VBI graphic layer known in the industry as a "blackmovie" with closed caption data. Alternatively, video editors working with the DV25 and DV50 FireWire workflows must encode their DV .avi or .mov file with VAUX data that includes CEA-608 closed caption data.
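For illustration, the .SCC (Scenarist Closed Caption) files mentioned above are plain text: a version header followed by lines pairing a drop-frame timecode with hex-encoded, parity-set EIA-608 byte pairs. In the hypothetical fragment below, the control-code pairs shown (9420 resume caption loading, 94ae erase non-displayed memory, 942f end of caption, 942c erase displayed memory) are standard EIA-608 commands, but the timings and caption text are invented:

```
Scenarist_SCC V1.0

00:00:01:00	9420 9420 94ae 94ae 9470 9470 c8e9 ae80 942f 942f

00:00:03:15	942c 942c
```

Here c8e9 ae80 spells "Hi." with parity bits applied (0x80 is a parity-set null used as padding), and the second line clears the caption from the screen at the given timecode.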