GeForce 256 through GeForce FX are fine cards for old games for a number of reasons. Their DOS game compatibility and GUI performance are top notch. Their Direct3D drivers support two critical legacy features: fog table and 8-bit palettized textures. OpenGL compatibility and performance are second to none, and some games utilize proprietary NVIDIA extensions.
One problem with NVIDIA cards prior to the GeForce4 is that some card vendors built their cards too cheaply. The most noticeable result is poor analog signal quality, which causes problems such as blurriness, loss of color saturation and color bleed, particularly at higher resolutions and refresh rates. Determining which cards are high quality is difficult, but the GeForce4 and newer cards are the most likely to be good.
Perhaps the finest choices are the GeForce FX series, because they offer the most refined quality-enhancing features and the high-end models have the performance to use these features at higher resolutions. GeForce 6 drops support for palettized textures, which is a problem with a few games, but otherwise these cards are also great choices. GeForce 7 drops support for Windows 9x but still supports Windows 2000. Some users, however, have reportedly gotten GeForce 7800 cards to work under Windows 98 and ME using an unofficial, modified driver; this modified driver does not support Windows 95. The GeForce 7 series is also the last supported by NVIDIA's Stereo3D driver extension for shutter glasses such as the ELSA Revelator. GeForce 8 and above support only Windows XP and newer.
NV1 / STG2000
Released in 1995, NVIDIA's first 3D accelerator was an all-in-one product with audio, GUI, VGA, 3D and Sega Saturn gamepad support. It uses a type of 3D rendering called quadratic texture mapping that is compatible with neither Direct3D nor OpenGL, so it is only useful with games that use its proprietary API. This is the same technology used by the Sega Saturn, and as such various games were ported to use the NV1. Its audio consists of wavetable MIDI and DirectSound support, but there is very little DOS support, and DOS VGA compatibility is likewise limited.
It comes in various memory configurations with up to 4 MB maximum. The final drivers support Direct3D, but it is a software-only implementation.
Games with support: Battle Arena Toshinden, Descent: Destination Saturn, NASCAR Racing, Panzer Dragoon, Virtua Cop, Virtua Fighter Remix.
NV3 | RIVA 128 & 128 ZX
The RIVA 128, released in late 1997, is the first Direct3D-compatible GPU from NVIDIA. RIVA stands for Real-time Interactive Video and Animation, and 128 refers to the internal 128-bit pipeline and memory interface. It has a 206 MHz RAMDAC and supports DDC2 and VBE3. The RIVA 128 has all the hardware features required for Direct3D 5 and also has good OpenGL 1.0 compatibility. It renders at 16-bit color depth and supports 3D-accelerated resolutions up to 960x720 with a Z-buffer. Its 3D performance is competitive with Voodoo Graphics (Voodoo1). The RIVA 128 shows a few rendering quality issues, such as visible texture seams and a very apparent dithering pattern. Overall, the rendered image appears more saturated than the output of Glide games from this time. A special feature of the PCI version of the card is the ability to load textures over the PCI bus, which was advertised as "AGP content on PCI". The 2D features include video scaling and color conversion capabilities. The chip also features PAL/NTSC output, so many cards were released with TV-out, some even with TV-in.
In early 1998, NVIDIA refreshed the NV3 architecture by releasing the RIVA 128 ZX. The RIVA 128 ZX is an upgraded chip that has a 250 MHz RAMDAC and supports up to 8 MB SGRAM. For texture-intensive games, the increased memory results in higher performance.
NV4-6 | RIVA TNT & TNT2
Released in 1998, NVIDIA's RIVA TNT is a drastically improved Direct3D 6-compliant GPU with much better image quality and performance compared to its predecessors. It is competitive with the Voodoo2 but offers more flexibility, such as 32-bit color rendering and 1024x1024 texture support. It also supports AGP 2x with execute mode (AGP texturing). 32-bit color rendering comes with a significant speed penalty, though, and large textures are of very limited use since the chip lacks any form of texture compression. 16 MB of RAM allows for very high resolutions, and OpenGL support is great. All NVIDIA cards have the advantage of separately clocked memory, which makes for more flexible overclocking than with 3dfx cards.
NVIDIA further refined the RIVA TNT architecture in early 1999 with the RIVA TNT2. The RIVA TNT2 is very similar to the original TNT but adds support for AGP 4x and 2048x2048 texture maps. It also typically has 32 MB of RAM and is clocked much higher, so it performs noticeably better. Some popular budget variants include the Vanta and the TNT2 Model 64 (M64). The M64 uses a 64-bit memory interface rather than the 128-bit interface of the regular variant, effectively halving its memory bandwidth. The M64, however, is still faster than the original TNT in some situations.
Acer Laboratories Inc. (ALi) later integrated the RIVA TNT2 core into the northbridge of their M1631 motherboard chipset, commonly known as the Aladdin TNT2. The Aladdin TNT2 could also use shared system memory (up to 32 MB total) in addition to its onboard graphics memory. Most motherboard manufacturers opted to equip it with 8 MB of local memory. The Aladdin TNT2 is on par with the TNT2 M64 in gaming performance.
The RIVA TNT and TNT2 do not support 8-bit palettized textures. This is a problem with, for example, Final Fantasy VII.
The TNT2 is also NVIDIA's last chip with Windows 3.1x support. The TNT/TNT2 Windows 3.1x drivers were only released in beta form and therefore have stability issues and occasionally produce screen corruption; for example, video playback is fickle and will usually crash the system. Furthermore, the same drivers are locked to a 60 Hz refresh rate, making them undesirable for use with CRT monitors under this OS. The TNT/TNT2 Windows 3.1x drivers will not work on any GeForce cards.
NV1x | GeForce 256 / 2 / 4 MX
Released in October 1999, GeForce 256 / NV10 was the first Direct3D 7-compliant GPU. It was initially released with standard SDRAM, but a DDR version later followed. The DDR version is roughly twice as fast as TNT2. It introduced ordered grid super-sampling anti-aliasing, anisotropic filtering support (up to 2x level), cubic environment mapping, and support for hardware dot product bump mapping.
The next refinement of the architecture came in April 2000 in the form of the GeForce 2 / NV15. The GeForce 2 has twice the texture fillrate per clock compared to the NV10 and uses a smaller manufacturing process, allowing higher clock rates while reducing power consumption at the same time. The GeForce 2 GTS is about 40% faster than the GeForce 256. GeForce 2 performance is mostly limited by memory bandwidth. Some card manufacturers used low-quality analog circuit designs that produce blurry image output.
GeForce 2 MX / NV11 is the low-end series of the GeForce 2, released in September 2000. These cards have half the pixel pipelines and half the memory interface of the NV15. It is the first NV chip with two integrated TMDS channels, providing dual display output (called "TwinView"). It also has "Digital Vibrance Control", which allows calibration of various image output aspects. The 3D performance of the GeForce 2 MX at 16-bit color depth is slightly faster than a GeForce 256 SDR. Its relatively low price and solid performance made it a popular card. The GeForce 2 MX core was later integrated into NVIDIA's nForce IGP motherboard chipset northbridge.
GeForce 4 MX / NV17 replaced the GeForce 2 series in January 2002. The NV17 core is a hybrid of the NV11 and NV25. The integration of various efficiency and bandwidth-improving features, combined with a significantly higher clock speed than the NV11, allows it to match NV15 performance. These features were advertised as "Lightspeed Memory Architecture II," a refinement of the Lightspeed Memory Architecture introduced with the GeForce 3 (NV20). It also gained the "AccuView" anti-aliasing capabilities, which are considerably advanced in quality and performance over the NV11 and NV15. However, the GeForce 4 MX lacks hardware pixel and vertex shaders, as well as the environment mapped bump mapping and higher-level anisotropic filtering support found in the NV2x architecture.
NVIDIA further refreshed the GeForce 4 MX line in late 2002 with the NV18, which added AGP 8x support. In 2004 the series received another refresh with PCI Express x16 support via a bridge chip. The first such product introduced was the GeForce PCX 4300. The GeForce 4 MX core was also integrated into NVIDIA's nForce2 IGP motherboard chipset northbridge.
Estimated model performance ranking:
GF2 MX100 < GF2 MX200 < GF2 MX < GF2 MX400 < GF4 MX420 < GF2 GTS < GF2 Pro < GF2 Ti VX < GF2 Ti < GF2 Ultra < GF4 MX440
NV2x | GeForce 3 & 4
Released in March 2001, the GeForce 3 / NV20 is the first Direct3D 8.0-compliant GPU. It is more efficient than the GeForce 2 because of improvements in memory bandwidth utilization and the addition of hidden surface removal (HSR) functions similar to those of the ATI Radeon. NVIDIA called this "Lightspeed Memory Architecture". Despite a lower fillrate than the GeForce 2 Ultra/Pro/Ti, in some cases the GeForce 3 can outperform those cards by up to 50%, namely in situations with anti-aliasing or when the HSR features save considerable fillrate. In other cases, however, it loses to the GeForce 2 Ultra. In the latter half of 2001, NVIDIA released the GeForce 3 Ti 200 and Ti 500. The GeForce 3 Ti 200 is clocked lower than the original GeForce 3, while the Ti 500 is the fastest. The original GeForce 3 and GeForce 3 Ti 500 were only released in 64 MB configurations, while the GeForce 3 Ti 200 was released in 64 MB and 128 MB configurations. The GeForce 3, however, benefited very little from 128 MB of memory.
GeForce 3 is the first GeForce GPU with environment mapped bump mapping (EMBM) support, multi-sample anti-aliasing (MSAA) and complete anisotropic filtering support (up to 8x level). MSAA is considerably less demanding of fillrate than SSAA. 2X and 4X MSAA modes are available. There is also an anti-aliasing mode called "Quincunx" that uses a combination of 2X MSAA and a RAMDAC-based filter. This mode was intended to allow better quality anti-aliasing than 2X MSAA/SSAA but without the performance hit of 4X MSAA/SSAA. Higher anisotropic filtering levels (4x and 8x), however, incur a heavy performance decrease (sometimes by as much as 50%). However, compared to its main competitor, ATI's R200 series, GeForce3's anisotropic filtering implementation is less angle-dependent and can work simultaneously with trilinear filtering, yielding better quality when observed in motion.
The next evolution of the NV2x architecture came in early 2002 in the form of the NV25 / GeForce 4 Ti series. The GeForce 4 Ti is quite similar to the GeForce 3 in general. Changes include a higher clock speed, pixel shader 1.3 support, Direct3D 8.1 compliance, an additional vertex processor for better geometry performance, and dual RAMDACs for dual display output. Later in 2002, NVIDIA released the NV28, which added AGP 8x support. With the NV28 release, the AGP 8x versions of the Ti 4400 and 4600 were respectively rebranded as the Ti 4800 SE and Ti 4800. The AGP 8x variant of the Ti 4200 was just known as the "Ti 4200 with AGP 8x."
In the GeForce4 Ti line, the Ti 4200 is slowest and the Ti 4800 is fastest.
NV3x | GeForce FX
These are NVIDIA's first Direct3D 9 GPUs. They have excellent Direct3D 5-8 compatibility and performance but are of limited value for Direct3D 9. They are very useful for old games because they still support palettized textures and fog table. Their anti-aliasing and anisotropic filtering features are similar to those of earlier GeForce models, but performance with these features is improved. High-end models like the 5700 Ultra, 5800 Ultra and 59x0 Ultra allow one to run old games at high resolution with anti-aliasing and anisotropic filtering.
Avoid models with 64-bit bus and naming suffixes like LE, XT or VE because they have been crippled in some way. There were some PCIe models made, named GeForce PCX 5xxx.
5200 < 5500 < 5200 Ultra < 5600 < 5600 Ultra < 5700 < 5700 Ultra < 5800 < 5800 Ultra < 59x0 < 59x0 Ultra
NV4x | GeForce 6
The GeForce 6 series (NV4x) was released in 2004 and comprises the first Direct3D 9.0c-compliant GPUs. It introduced support for Shader Model 3.0 and for PCI Express x16, though the initially released cards were AGP-only. Compared to the GeForce FX series, the NV4x has dramatically improved performance all-around but drops palettized texture support, so it is incompatible with a few old games. This is NVIDIA's final generation of GPUs with Windows 9x and NT 4.0 support.
- With Intel 440BX motherboards, drivers newer than 56.64 may be unstable.