History

Matrox had been known for years as a significant player in the high-end 2D graphics accelerator market. Its cards were excellent Windows accelerators, and some of the later models, such as the Millennium and Mystique, excelled under MS-DOS as well. Matrox stepped forward in 1994 with the Impression Plus, one of the first 3D accelerator boards, but that card could accelerate only a very limited feature set (no texture mapping) and was aimed primarily at CAD applications.

Seeing the slow but steady growth of interest in 3D graphics on PCs spurred by new cards from NVIDIA, Rendition, and ATI, Matrox began experimenting with 3D acceleration more aggressively and produced the Mystique. The Mystique was its most feature-rich 3D accelerator in 1997, but it still lacked key features, including bilinear filtering. Then, in early 1998, Matrox teamed up with PowerVR to produce an add-in 3D board called the Matrox m3D, based on the PowerVR PCX2 chipset. This was one of the very few occasions on which Matrox outsourced its graphics processor, and the board was clearly a stop-gap measure to hold out until the G200 project was ready.

Overview

With the G200, Matrox aimed to combine the competent 2D and video acceleration of its past products with a full-featured 3D accelerator. The G200 chip was used on several boards, most notably the Millennium G200 and the Mystique G200. The Millennium G200 received the new SGRAM memory and a faster RAMDAC, while the cheaper Mystique G200 was equipped with slower SDRAM memory but gained a TV-out port. Most G200 boards shipped with 8 MB RAM and were expandable to 16 MB with an add-on module. The cards also had connectors for special add-on boards, such as the Rainbow Runner, which added capabilities such as video capture.

G200 was Matrox's first fully AGP-compliant graphics processor. While the earlier Millennium II had been adapted to AGP, it did not support the full AGP feature set. G200 takes advantage of DIME (Direct Memory Execute) to speed texture transfers to and from main system RAM. This allows G200 to use system RAM as texture storage if the card's local RAM is of insufficient size for the task at hand. G200 was one of the first cards to support this feature.

The chip is a 128-bit core containing dual 64-bit buses in what Matrox called a "DualBus" organization. Each bus is unidirectional and is designed to speed data transfer to and from the functional units within the chip. By splitting the internal data path into two separate buses rather than a single wider bus, Matrox improved overall bus efficiency and reduced data-transfer latency. The external memory interface was 64-bit.

The G200 supported full 32-bit color depth rendering, which substantially improved image quality by eliminating the dithering artifacts caused by the then-more-typical 16-bit color depth. Matrox called this technology Vibrant Color Quality (VCQ). The chip also supported features such as trilinear mip-map filtering and anti-aliasing, though the latter was rarely used. The G200 could render 3D at all resolutions supported in 2D. Architecturally, the 3D pipeline was laid out as a single pixel pipeline with a single texture management unit, and the core contained a RISC processor, the "WARP core", which implemented the triangle setup engine in microcode.
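To illustrate why 32-bit rendering avoids the banding that 16-bit modes must mask with dithering, the following sketch (illustrative only, not Matrox code) round-trips a true-color value through the common 16-bit RGB565 format, where red and blue keep only 5 bits per channel and green keeps 6:

```python
def quantize_565(r, g, b):
    """Round-trip an 8-bit-per-channel RGB color through 16-bit RGB565 storage."""
    r5 = r >> 3          # red keeps 5 of 8 bits
    g6 = g >> 2          # green keeps 6 of 8 bits
    b5 = b >> 3          # blue keeps 5 of 8 bits
    # Expand back to 8 bits per channel using simple bit replication.
    return (r5 << 3 | r5 >> 2,
            g6 << 2 | g6 >> 4,
            b5 << 3 | b5 >> 2)

print(quantize_565(200, 100, 50))  # → (206, 101, 49)
```

The stored color differs slightly from the original, and adjacent shades collapse onto the same 16-bit value; at 32-bit depth each channel retains its full 8 bits, so no such rounding (and hence no dithering to hide it) is needed.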

G200 was Matrox's first graphics processor to require added cooling in the form of a heatsink.

Performance

In 2D, the G200 was excellent in speed and delivered Matrox's renowned analog signal quality, besting the older Millennium II in almost every area except extremely high resolutions. In 3D, it scored similarly to, but generally behind, a single Voodoo2 in Direct3D, and was slower than NVIDIA's RIVA TNT and S3's Savage3D. However, it was not far behind and was certainly competitive. The G200's 3D image quality was considered among the best thanks to its support for 32-bit color depth, assuming driver bugs did not interfere.

The G200's biggest problem was its OpenGL support. Throughout most of its life, the G200 had to get by in popular games such as Quake II with a slow OpenGL-to-Direct3D wrapper, a driver layer that translated OpenGL calls into Direct3D. This hurt the G200's performance dramatically in these games and caused considerable controversy over Matrox's continuing delays and promises. In fact, it would not be until well into the life of the G200's successor, the G400, that the OpenGL driver finally became mature and fast.

Early drivers had some problems with Direct3D as well. In Unreal, for example, there were distortions on ground textures caused by a bug in the board's subpixel accuracy function, and mip-mapping issues caused flickering in textures. As drivers matured, these problems disappeared.

G200A & G250

Around 1999, Matrox introduced a newer version of the G200 called the G200A, built on a 250 nm manufacturing process instead of the G200's original 350 nm. This allowed Matrox to produce more graphics processors per wafer and reduced the chip's heat output; G200A boards shipped without even a heatsink. Some G200A boards were sold as the G250, clocked slightly higher than the normal G200 and offered only to OEMs, with Hewlett-Packard perhaps being the only buyer.