[Image: A graphics card with the main heatsink removed, showing the major components: the large, tilted silver GPU die, surrounded by RAM chips under extruded aluminum heatsinks, with power delivery circuitry mounted next to the RAM near the right side of the card.]
A modern graphics card consists of a printed circuit board on which the components are mounted. These include:
Graphics processing unit
A graphics processing unit (GPU), also occasionally called a visual processing unit (VPU), is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the building of images in a frame buffer intended for output to a display. Because of the large degree of programmable computational complexity for such a task, a modern graphics card is also a computer unto itself.
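The size of the frame buffer mentioned above follows directly from the display resolution and color depth. As a minimal illustrative sketch (the function name and the 32-bit depth are assumptions for the example, not from the original text):

```python
def framebuffer_bytes(width, height, bits_per_pixel=32):
    """Size in bytes of one uncompressed frame at the given pixel depth."""
    return width * height * bits_per_pixel // 8

# A 1920x1080 frame at 32 bits per pixel:
size = framebuffer_bytes(1920, 1080)        # 8,294,400 bytes
print(f"{size / 2**20:.1f} MiB per frame")  # about 7.9 MiB
```

Double- or triple-buffered rendering multiplies this figure accordingly, which is one reason even 2D display tasks reserve a slice of video memory.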
Heat sink
A heat sink is mounted on most modern graphics cards. A heat sink spreads the heat produced by the graphics processing unit evenly throughout the heat sink and the unit itself, and commonly has a fan mounted on it to cool both the heat sink and the graphics processing unit. Not all cards have heat sinks: some cards are liquid-cooled and instead have a water block, and cards from the 1980s and early 1990s produced so little heat that they did not require them. Most modern graphics cards need a proper thermal solution, either water cooling or a heat sink with attached heat pipes, usually made of copper for the best thermal transfer.
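Why modern cards need these thermal solutions can be seen from a simple steady-state model: die temperature is roughly ambient temperature plus dissipated power times the cooler's thermal resistance. The figures below (200 W, 0.15 °C/W) are hypothetical round numbers for illustration, not specifications from the text:

```python
def die_temperature(power_w, theta_ca, ambient_c=25.0):
    """Steady-state temperature estimate: ambient plus dissipated power
    times case-to-ambient thermal resistance (degrees C per watt)."""
    return ambient_c + power_w * theta_ca

# Hypothetical 200 W GPU with a 0.15 C/W heatsink-and-fan combination:
print(die_temperature(200, 0.15))  # 55.0 degrees C
```

The same 200 W through a passive heat sink with several times the thermal resistance would push the die well past safe operating temperatures, which is why heat pipes, fans, or water blocks are standard.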
Video BIOS
The video BIOS or firmware contains a minimal program for the initial setup and control of the graphics card. It may contain information on the memory and memory timing, operating speeds and voltages of the graphics processor, and other details which can sometimes be changed. Modern video BIOSes do not support the full functionality of the graphics card; they are only sufficient to identify and initialize the card to display one of a few frame buffer or text display modes. The BIOS does not support YUV to RGB translation, video scaling, pixel copying, compositing, or any of the multitude of other 2D and 3D features of the graphics card, which must be accessed by software drivers.
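The YUV-to-RGB translation that drivers (rather than the BIOS) expose can be sketched with the standard full-range BT.601 conversion formulas; this is a generic textbook conversion, not any particular driver's implementation:

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample to RGB (all values 0-255).
    U and V are centered on 128; the constants are the BT.601 coefficients."""
    d, e = u - 128, v - 128
    r = y + 1.402 * e
    g = y - 0.344136 * d - 0.714136 * e
    b = y + 1.772 * d
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))  # mid-gray -> (128, 128, 128)
```

On real hardware this per-pixel arithmetic runs in fixed-function or shader units at billions of samples per second, which is exactly the kind of workload a minimal video BIOS cannot handle.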
Video memory
The memory capacity of most modern graphics cards ranges from 2 to 24 GB, with some cards offering up to 32 GB as of the late 2010s, as the applications for graphics processing have become more demanding and widespread. Since video memory needs to be accessed by the GPU and the display circuitry, it often uses special high-speed or multi-port memory, such as VRAM, WRAM, SGRAM, etc. Around 2003, the video memory was typically based on DDR technology. During and after that year, manufacturers moved towards DDR2, GDDR3, GDDR4, GDDR5, GDDR5X, and GDDR6. The effective memory clock rate in modern cards is generally between 2 and 15 GHz. Video memory may be used for storing other data as well as the screen image, such as the Z-buffer, which manages the depth coordinates in 3D graphics, as well as textures, vertex buffers, and compiled shader programs.
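The practical consequence of those high effective clock rates is memory bandwidth: peak bandwidth is the per-pin data rate multiplied by the bus width in bits, divided by eight. The 14 Gbit/s and 256-bit figures below are a hypothetical but typical GDDR6 configuration, chosen only to illustrate the arithmetic:

```python
def memory_bandwidth_gbs(effective_gbps_per_pin, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbit/s)
    times bus width in bits, converted from bits to bytes."""
    return effective_gbps_per_pin * bus_width_bits / 8

# Hypothetical GDDR6 card: 14 Gbit/s per pin on a 256-bit bus:
print(memory_bandwidth_gbs(14, 256))  # 448.0 GB/s
```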
RAMDAC
The RAMDAC, or random-access-memory digital-to-analog converter, converts digital signals to analog signals for use by a computer display that uses analog inputs, such as cathode-ray tube (CRT) displays. The RAMDAC is a kind of RAM chip that regulates the functioning of the graphics card. Depending on the number of bits used and the RAMDAC data-transfer rate, the converter will be able to support different computer-display refresh rates. With CRT displays, it is best to work above 75 Hz and never below 60 Hz, to minimize flicker. (This is not a problem with liquid-crystal displays, as they have little to no flicker.) Due to the growing popularity of digital computer displays and the integration of the RAMDAC onto the GPU die, it has mostly disappeared as a discrete component. All current LCD/plasma monitors and TVs and projectors with only digital connections work in the digital domain and do not require a RAMDAC for those connections. There are displays that feature analog inputs (VGA, component, SCART, etc.) only. These require a RAMDAC, but they reconvert the analog signal back to digital before they can display it, with the unavoidable loss of quality stemming from this digital-to-analog-to-digital conversion. With the VGA standard being phased out in favor of digital formats, RAMDACs have started to disappear from graphics cards.
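The link between RAMDAC data-transfer rate and supported refresh rates can be made concrete: the achievable refresh rate is the RAMDAC pixel clock divided by the total pixels per frame, including blanking intervals. The 400 MHz clock and the ~25% blanking overhead below are illustrative assumptions, not figures from the text:

```python
def max_refresh_hz(pixel_clock_mhz, width, height, blanking=1.25):
    """Highest refresh rate a RAMDAC can drive at a given resolution.
    'blanking' approximates horizontal/vertical blanking overhead."""
    return pixel_clock_mhz * 1e6 / (width * height * blanking)

# An assumed 400 MHz RAMDAC at 1600x1200 with ~25% blanking overhead:
print(round(max_refresh_hz(400, 1600, 1200)))  # 167 Hz
```

This is why high resolutions on CRTs demanded fast RAMDACs: halving the pixel clock at the same resolution would drop the card below the flicker-free 75 Hz target mentioned above.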
Output interfaces
[Image: Output interfaces: video-in video-out (VIVO) for S-Video (TV-out), Digital Visual Interface (DVI) for high-definition television (HDTV), and DE-15 for Video Graphics Array (VGA).]
The most common connection systems between the graphics card and the computer display are:
Video Graphics Array (VGA) (DE-15)
Also known as D-sub, VGA is an analog-based standard adopted in the late 1980s designed for CRT displays, also called the VGA connector. Today, the VGA analog interface is used for high-definition video resolutions including 1080p and higher. Some problems of this standard are electrical noise, image distortion, and sampling error in evaluating pixels. While the VGA transmission bandwidth is high enough to support even higher-resolution playback, the picture quality can degrade depending on cable quality and length. The extent of the quality difference depends on the individual's eyesight and the display; compared with a DVI or HDMI connection, especially on larger LCD/LED monitors or TVs, the degradation, if present, is prominently visible. Blu-ray playback at 1080p is possible via the VGA analog interface if the Image Constraint Token (ICT) is not enabled on the Blu-ray disc.
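The claim that VGA bandwidth suffices for 1080p can be checked with a rough estimate: the required pixel clock is resolution times refresh rate, plus blanking overhead. The ~25% blanking factor below is an illustrative assumption:

```python
def vga_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    """Approximate pixel clock (MHz) an analog VGA link must carry,
    including an assumed blanking-interval overhead."""
    return width * height * refresh_hz * blanking / 1e6

# 1080p at 60 Hz needs roughly a 156 MHz pixel clock:
print(round(vga_pixel_clock_mhz(1920, 1080, 60)))  # 156
```

That is well within the reach of late-era analog hardware, so the limiting factor in practice is analog signal integrity over the cable rather than raw bandwidth.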
Digital Visual Interface (DVI) (DVI-I)
Digital Visual Interface is a digital-based standard designed for displays such as flat-panel displays (LCDs, plasma screens, wide high-definition television displays) and video projectors. Some rare high-end CRT monitors also used DVI. It avoids image distortion and electrical noise by mapping each pixel from the computer to a display pixel, using the display's native resolution. Most manufacturers include a DVI-I connector, allowing (via a simple adapter) standard RGB signal output to an old CRT or LCD monitor with VGA input.
Video-in video-out (VIVO) for S-Video, composite video and component video
These connectors are included to allow connection with televisions, DVD players, video recorders and video game consoles. They often come in two 10-pin mini-DIN connector variations, and the VIVO splitter cable generally comes with either 4 connectors (S-Video in and out plus composite video in and out) or 6 connectors (S-Video in and out, component YPbPr out and composite in and out).
High-Definition Multimedia Interface (HDMI)
HDMI is a compact audio/video interface for transferring uncompressed video data and compressed/uncompressed digital audio data from an HDMI-compliant device ("the source device") to a compatible digital audio device, computer monitor, video projector, or digital television. HDMI is a digital replacement for existing analog video standards. HDMI supports copy protection through HDCP.
DisplayPort
DisplayPort is a digital display interface developed by the Video Electronics Standards Association (VESA). The interface is primarily used to connect a video source to a display device such as a computer monitor, though it can also be used to transmit audio, USB, and other forms of data. The VESA specification is royalty-free. VESA designed it to replace VGA, DVI, and LVDS. Backward compatibility with VGA and DVI through adapter dongles enables consumers to use DisplayPort-fitted video sources without replacing existing display devices. Although DisplayPort offers much of the same functionality as HDMI with greater throughput, it is expected to complement the interface, not replace it.
USB-C
USB-C is an extensible connector used for USB, DisplayPort, Thunderbolt, and power delivery. USB-C is a 24-pin reversible connector that supersedes previous USB connectors. Some newer graphics cards use USB-C ports for versatility.
Other types of connection systems

Motherboard interfaces
[Image: An early graphics card. As can be seen from the PCB, the layout was done in 1985, whereas the marking "8639" on the central chip CW16800-A indicates that the chip was manufactured in week 39 of 1986. The card uses the ISA 8-bit (XT) interface.]
Chronologically, connection systems between graphics card and motherboard were, mainly:
• S-100 bus: Designed in 1974 as a part of the Altair 8800, it is the first industry-standard bus for the microcomputer industry.
• ISA: Introduced in 1981 by IBM, it became dominant in the marketplace in the 1980s. It is an 8- or 16-bit bus clocked at 8 MHz.
• NuBus: Used in the Macintosh II, it is a 32-bit bus with an average bandwidth of 10 to 20 MB/s.
• MCA: Introduced in 1987 by IBM, it is a 32-bit bus clocked at 10 MHz.
• EISA: Released in 1988 to compete with IBM's MCA, it was compatible with the earlier ISA bus. It is a 32-bit bus clocked at 8.33 MHz.
• VLB: An extension of ISA, it is a 32-bit bus clocked at 33 MHz. Also referred to as VESA.
• PCI: Replaced the EISA, ISA, MCA and VESA buses from 1993 onwards. PCI allowed dynamic connectivity between devices, avoiding the manual adjustments required with jumpers. It is a 32-bit bus clocked at 33 MHz.
• UPA: An interconnect bus architecture introduced by Sun Microsystems in 1995. It is a 64-bit bus clocked at 67 or 83 MHz.
• USB: Although mostly used for miscellaneous devices, such as secondary storage devices, peripherals and toys, USB displays and display adapters exist. It was first used in 1996.
• AGP: First used in 1997, it is a dedicated-to-graphics bus. It is a 32-bit bus clocked at 66 MHz.
• PCI-X: An extension of the PCI bus, it was introduced in 1998. It improves upon PCI by extending the width of the bus to 64 bits and the clock frequency to up to 133 MHz.
• PCI Express: Abbreviated as PCIe, it is a point-to-point interface released in 2004. In 2006, it provided a data-transfer rate that is double that of AGP. It should not be confused with PCI-X, an enhanced version of the original PCI specification. This is the standard for most modern graphics cards.

The following table is a comparison between features of some interfaces listed above.

== See also ==