==GeForce 8800 GTX==
The 8800 series, codenamed G80, was launched on November 8, 2006, with the release of the GeForce 8800 GTX and GTS for the high-end market. A 320 MB GTS was released on February 12, 2007, and the Ultra was released on May 2, 2007. The cards are larger than their predecessors, with the 8800 GTX measuring 10.6 in (~26.9 cm) in length and the 8800 GTS measuring 9 in (~23 cm). Both cards have two dual-link DVI connectors and an HDTV/S-Video out connector. The 8800 GTX requires two PCIe power inputs to keep within the PCIe standard, while the GTS requires just one.
==8800 GS==
The 8800 GS is a trimmed-down 8800 GT with 96 stream processors and either 384 or 768 MB of RAM on a 192-bit bus. In May 2008, it was rebranded as the 9600 GSO in an attempt to spur sales. The early 2008 iMac models, announced by Apple on April 28, 2008, featured a GPU branded as the 8800 GS that is actually a slightly higher-clocked version of the 8800M GTS, a mobile GPU normally found in high-end laptops. It uses 512 MB of GDDR3 video memory clocked at 800 MHz, 64 unified stream processors, a 500 MHz core clock, a 256-bit memory bus, and a 1250 MHz shader clock: specifications nearly identical to those of the 8800M GTS on which it is based.
==8800 GTX / 8800 Ultra==
The 8800 GTX is equipped with 768 MB of GDDR3 RAM. The 8800 series replaced the GeForce 7900 series as Nvidia's top-performing consumer GPU. The GeForce 8800 GTX and GTS use identical GPU cores, but the GTS model disables parts of the GPU and reduces RAM size and bus width to lower production cost. At the time, the G80 was the largest commercial GPU ever constructed, with 681 million transistors covering a 480 mm² die built on a 90 nm process. (The complete G80 design actually totals roughly 686 million transistors, but process limitations and yield feasibility on the 90 nm node forced Nvidia to split it into two chips: the main shader core at 681 million transistors and the NV I/O core at roughly 5 million.) A minor manufacturing defect related to a
resistor of improper value caused a recall of the 8800 GTX models just two days before the product launch, though the launch itself was unaffected. The GeForce 8800 GTX was by far the fastest GPU when first released, and 13 months after its initial debut it still remained one of the fastest. The GTX has 128 stream processors clocked at 1.35 GHz, a core clock of 575 MHz, and 768 MB of 384-bit GDDR3 memory at 1.8 GHz, giving it a memory bandwidth of 86.4 GB/s. The card performs faster than a single Radeon HD 2900 XT, and faster than two Radeon X1950 XTXs in Crossfire or two GeForce 7900 GTXs in SLI. The 8800 GTX also supports
HDCP, but one major flaw is its older-generation PureVideo processor, which relies more heavily on the CPU for video decoding. Originally retailing for around US$600, prices came down to under US$400 before it was discontinued. The 8800 GTX was also very power-hungry for its time, demanding up to 155 watts and requiring two 6-pin PCIe power connectors to operate. It also has two SLI connector ports, allowing it to support Nvidia 3-way SLI for users who run demanding games at extreme resolutions such as 2560x1600.

The 8800 Ultra, retailing at a higher price than the 8800 GTX, is architecturally identical to the GTX but features higher-clocked shaders, core, and memory. Nvidia told the media in May 2007 that the 8800 Ultra was a new stepping that ran cooler and could therefore be clocked higher. Originally retailing from $829, the card was widely considered poor value, offering only about 10% more performance than the GTX while costing hundreds of dollars more. Prices dropped to as low as $200 before the card was discontinued on January 23, 2008. The Ultra's core clock runs at 612 MHz, its shaders at 1.5 GHz, and its memory at 2.16 GHz, giving it a theoretical memory bandwidth of 103.7 GB/s. It has two SLI connector ports, allowing it to support Nvidia 3-way SLI. An updated dual-slot cooler was also implemented, allowing for quieter and cooler operation at higher clock speeds.
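The bandwidth figures quoted for the GTX and Ultra follow directly from bus width and effective memory clock. A minimal sketch of that arithmetic, using only the numbers stated above (the function name is illustrative, not from any vendor tool):

```python
def memory_bandwidth_gbps(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes moved per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8  # a 384-bit bus moves 48 bytes per transfer
    return bytes_per_transfer * effective_clock_ghz

# 8800 GTX: 384-bit bus, GDDR3 at an effective 1.8 GHz
print(f"{memory_bandwidth_gbps(384, 1.8):.1f} GB/s")   # 86.4 GB/s
# 8800 Ultra: same 384-bit bus, memory at an effective 2.16 GHz
print(f"{memory_bandwidth_gbps(384, 2.16):.1f} GB/s")  # 103.7 GB/s
```

The same formula explains why the 8800 GT's 256-bit bus at 1.8 GHz (57.6 GB/s) trails the GTX despite comparable core power.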
==8800 GT==
The 8800 GT, codenamed G92, was released on October 29, 2007. It was the first card to transition to the 65 nm process, and it supports PCI Express 2.0. It has a single-slot cooler, as opposed to the dual-slot coolers on the 8800 GTS and GTX, and uses less power than either thanks to the 65 nm process. While its core processing power is comparable to that of the GTX, its 256-bit memory interface and 512 MB of GDDR3 memory often hinder its performance at very high resolutions and graphics settings. The 8800 GT, unlike the other 8800 cards, is equipped with the PureVideo HD VP2 engine for GPU-assisted decoding of the H.264 and VC-1 codecs. The release of this card presented an odd dynamic to the graphics processing industry. With an initial projected street price of around $300, it outperformed ATI's flagship HD 2900 XT in most situations, as well as Nvidia's own 8800 GTS 640 MB (previously priced at an MSRP of $400). The card, while only marginally slower than the 8800 GTX in synthetic and gaming benchmarks, also took much of the value away from Nvidia's own high-end card. Performance benchmarks at stock speeds place it above the 8800 GTS (640 MB and 320 MB versions) and slightly below the 8800 GTX. A 256 MB version of the 8800 GT with lower stock memory speeds (1.4 GHz as opposed to 1.8 GHz) but the same core is also available. Performance benchmarks have shown that the 256 MB version has a considerable disadvantage compared to its 512 MB counterpart, especially in newer games such as Crysis. Some manufacturers also make models with 1 GB of memory, which show a significant benchmark advantage at large resolutions with big textures; these models are more likely to occupy two slots because they use dual-slot coolers instead of the single-slot cooler found on other models. The performance (at the time) and popularity of this card is demonstrated by the fact that even as late as 2014, the 8800 GT was often listed as the minimum requirement for modern games developed for much more powerful hardware.
==8800 GTS==
===8800 GTS 640 MB===
The first releases of the 8800 GTS line, in November 2006, came in 640 MB and 320 MB configurations of GDDR3 RAM and utilized Nvidia's G80 GPU. While the 8800 GTX has 128 stream processors and a 384-bit memory bus, these versions of the 8800 GTS feature 96 stream processors and a 320-bit bus. With respect to features, however, they are identical to the GTX because they use the same GPU. Around the same release date as the 8800 GT, Nvidia released a new 640 MB version of the 8800 GTS. While still based on the 90 nm G80 core, this version has 7 of the 8 clusters of 16 stream processors enabled (as opposed to 6 of 8 on the older GTS cards), giving it a total of 112 stream processors instead of 96. Most other aspects of the card remain unchanged. However, because the only two add-in partners producing this card (BFG and EVGA) decided to overclock it, this version of the 8800 GTS actually ran slightly faster than a stock GTX in most scenarios, especially at higher resolutions, due to the increased clock speeds.

Nvidia released a new 8800 GTS 512 MB based on the 65 nm G92 GPU on December 10, 2007. This 8800 GTS has 128 stream processors, compared to the 96 of the original GTS models. It is equipped with 512 MB of GDDR3 on a 256-bit bus. Combined with a 650 MHz core clock and architectural enhancements, this gives the card raw GPU performance exceeding that of the 8800 GTX, but it is constrained by the narrower 256-bit memory bus. Its performance can match the 8800 GTX in some situations, and it outperforms the older GTS cards in all situations.
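The various stream-processor counts above all fall out of the same layout: eight clusters of 16 processors, with clusters disabled to segment the product line. A small illustrative sketch of that count (the cluster framing follows the text; Nvidia's own naming for these units varies):

```python
SP_PER_CLUSTER = 16  # stream processors per cluster on G80/G92
TOTAL_CLUSTERS = 8   # full chip

def stream_processors(enabled_clusters: int) -> int:
    """Stream-processor count for a part with the given number of enabled clusters."""
    assert 0 < enabled_clusters <= TOTAL_CLUSTERS
    return enabled_clusters * SP_PER_CLUSTER

print(stream_processors(6))  # original 8800 GTS: 96
print(stream_processors(7))  # revised 640 MB 8800 GTS: 112
print(stream_processors(8))  # 8800 GTX and G92-based 8800 GTS 512: 128
```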
==Compatibility issues with PCI Express 1.0a on GeForce 8800 GT/8800 GTS 512 MB cards==
Shortly after their release, an incompatibility issue with older PCI Express 1.0a motherboards surfaced. When the PCI Express 2.0 compliant 8800 GT or 8800 GTS 512 was used in some motherboards with PCI Express 1.0a slots, the card would not produce any display image, but the computer would often boot (with the fan on the video card spinning at a constant 100%). The incompatibility has been confirmed on motherboards with VIA PT880Pro/Ultra, Intel 925 and Intel 5000P PCI Express 1.0a chipsets. Some graphics cards had a workaround that involved re-flashing the graphics card's BIOS with an older Gen 1 BIOS; however, this effectively turned it into a PCI Express 1.0 card, unable to use PCI Express 2.0 functions. This was largely a non-issue for performance: since the card could not even saturate a regular PCI Express 1.0 slot, there was no noticeable reduction. Flashing the video card's BIOS, however, usually voided the warranty of most (if not all) video card manufacturers, making it a less-than-optimal way of getting the card to work properly. A proper workaround is to flash the motherboard's BIOS to the latest version, which, depending on the manufacturer, may contain a fix. In relation to this, the high numbers of cards reported as DOA (as much as 13–15%) were believed to be inaccurate. When it was revealed that the G92 8800 GT and 8800 GTS 512 MB were going to be designed with PCI Express 2.0 connections, Nvidia claimed that all cards would have full backwards compatibility, but failed to mention that this was only true for PCI Express 1.1 motherboards. The BIOS-flash workaround did not come from Nvidia or any of its partners, but rather from ASRock, a motherboard producer, which mentioned the fix in one of its motherboard FAQs.
ASUSTeK, which sells the 8800 GT under its own branding, posted a newer version of its 8800 GT BIOS on its website, but did not mention that it fixed this issue. EVGA also posted a new BIOS to fix the issue.

==Technical summary==