== Architecture ==
Though its lineage was of the past-generation GeForce2 (NV11 and NV15), the GeForce4 MX (NV17) did incorporate bandwidth- and fill-rate-saving techniques, dual-monitor support, and a multi-sampling anti-aliasing unit from the Ti series (NV25); the improved 128-bit DDR memory controller was crucial to solving the bandwidth limitations that plagued the
GeForce 256 (NV10) and GeForce2 lines. It also owed some of its design heritage to Nvidia's high-end CAD products, and in performance-critical non-game applications it was remarkably effective. The most notable example is
AutoCAD, in which the GeForce4 MX returned results within a single-digit percentage of GeForce4 Ti cards costing several times as much. Many criticized the GeForce4 MX name as a misleading marketing ploy, since it was less advanced than the preceding GeForce3 (NV20). The feature-comparison chart between the Ti and MX lines showed that the only "feature" missing on the MX was the
nfiniteFX II engine: the DirectX 8 programmable vertex and pixel shaders. However, the GeForce4 MX was not simply a GeForce4 Ti with the shader hardware removed, as the MX's performance in games that did not use shaders was considerably behind that of the Ti.
In motion-video applications, the GeForce4 MX offered new functionality. It (and not the GeForce4 Ti) was the first GeForce member to feature the Nvidia
VPE (video processing engine). It was also the first GeForce to offer
hardware iDCT and VLC (variable-length code) decoding, making VPE a major upgrade from Nvidia's previous
HDVP. In MPEG-2 playback, VPE could finally compete head-to-head with ATI's video engine.
== Lineup ==
There were three initial models: the MX420, the MX440, and the MX460. The MX420 had only Single Data Rate (SDR) memory and was designed for very low-end PCs, replacing the GeForce2 MX100 and MX200. The GeForce4 MX440 was a mass-market OEM solution, replacing the GeForce2 MX/MX400 and GeForce2 Ti. The GeForce4 MX460 was initially meant to slot in between the MX440 and the Ti4400, while the release of the Ti4200 was held back. In terms of 3D performance, the MX420 performed only slightly better than the
GeForce2 MX400 and below the
GeForce2 GTS. However, this was never really much of a problem, considering its target audience. The nearest thing to a direct competitor the MX420 had was ATI's Radeon 7000. In practice its main competitors were chipset-integrated graphics solutions, such as Intel's 845G and Nvidia's own nForce 2, but its main advantage over those was multiple-monitor support; Intel's solutions did not have this at all, and the nForce 2's multi-monitor support was much inferior to what the MX series offered. The MX440 performed reasonably well for its intended audience, outperforming its closest competitor, the ATI
Radeon 7500, as well as the discontinued
GeForce2 Ti. Despite harsh criticism by gaming enthusiasts, the MX440 was successful in the PC OEM market as a replacement for the GeForce2 MX. Priced about 30% above the GeForce2 MX, it provided better performance and the ability to play a number of popular games that the GeForce2 could not run well; above all else, to the average non-specialist it sounded as if it were a "real" GeForce4, i.e., a GeForce4 Ti. While
John Carmack initially warned gamers not to purchase the GeForce4 MX440, its somewhat widespread adoption compelled
id Software to make it the only DirectX 7.0 GPU supported by
Doom 3. When ATI launched its Radeon 9000 Pro in September 2002, it performed about the same as the MX440, but had crucial advantages with better single-texturing performance and proper support of DirectX 8 shaders. However, the 9000 was unable to break the MX440's entrenched hold on the OEM market. Nvidia's eventual answer to the Radeon 9000 was the
GeForce FX 5200, but despite the 5200's DirectX 9 features it did not offer a significant performance increase over the MX440 even in DirectX 7.0 games. This kept the MX440 in production while the 5200 was discontinued.
One of the fastest DirectX 7.0-compliant GPUs, the MX460 performed similarly to the discontinued GeForce2 Ultra and the existing GeForce3 Ti200 (the remaining available member of the GeForce3 family). However, ATI released the Radeon 8500LE, which outperformed its price competitors, the GeForce3 Ti200 and GeForce4 MX460. ATI's move in turn compelled Nvidia to roll out the Ti4200 earlier than planned, also at a similar price to the MX460, and to discontinue the Ti200 soon afterwards. The Ti200, 8500LE, and Ti4200 were all DirectX 8.0 compliant and similarly priced to the MX460, and the 8500LE and Ti4200 also delivered significantly better performance; this prevented the MX460 from ever becoming popular relative to the other GeForce4 MX releases.
The
GeForce4 Go was derived from the MX line and it was announced along with the rest of the GeForce4 Ti and MX lineup in early 2002. There was the 420 Go, 440 Go, and 460 Go. However, ATI had beaten them to the market with the similarly performing Mobility
Radeon 7500, and later the DirectX 8.0 compliant Mobility Radeon 9000. (Despite its name, the short-lived 4200 Go (NV28M) is not part of this lineup; it was instead derived from the Ti line.)
Like the Ti series, the MX was also updated in late 2002 to support AGP-8X with the NV18 core. The two new models were the MX440-8X, which was clocked slightly faster than the original MX440, and the MX440SE, which had a narrower memory bus and was intended as a replacement of sorts for the MX420. The MX460, which had been discontinued by this point, was never replaced. Another variant followed in late 2003: the MX 4000, a GeForce4 MX440SE with a slightly higher memory clock. The GeForce4 MX line received a third and final update in 2004, with the PCX 4300, which was functionally equivalent to the MX 4000 but with support for
PCI Express. In spite of its new codename (NV19), the PCX 4300 was in fact simply an NV18 core with a
BR02 chip bridging the NV18's native AGP interface with the PCI-Express bus.

== GeForce4 model information ==