Our first stop for the day was on the 11th floor of the Grand Hyatt, where we visited XGI Technology, a two month old graphics chip maker spun off from SiS, another relatively large Taiwanese chipset manufacturer. XGI is hoping to take on nVidia and ATi directly at every product level with its DX9 compatible Volari series, which covers the enthusiast segment with the Volari Duo, high-end with the Volari V8 and V8 Ultra, mainstream with the Volari V5 Ultra and V5, and mobile with the XP5.

The full range of graphics chips will become available in quantity in November through XGI's growing list of add-in partners, which currently includes Power Color and Gigabyte. Right now XGI is still working on beta 2 silicon and refining its drivers for optimal performance and compatibility. Pricing is expected to match nVidia and ATi at every level, ranging from $99 US to $499 US.

We were given the opportunity to run a stock standard 3DMark 2003 test with the Volari Duo, which, in the simplest terms, is a pair of Volari V8 Ultra GPUs linked by a 16GB/s bus with 256MB of DDR-2 memory (128MB dedicated to each GPU) - a configuration which will most likely be very power hungry. Before you jump up and down like we almost did, XGI learnt from nVidia's mistakes with the GeForce FX 5800 Ultra and designed the IC so that add-in partners can use a maximum of 512MB of either DDR or DDR-2 memory, depending on their memory clock speed requirements and the supply of DDR-2 memory, which is still in its early phases. With DDR memory, the memory is designed to run at around 375MHz (750MHz DDR effective); with DDR-2 memory, around 500MHz (1GHz effective), with the GPU running at 350MHz.

As we posted in the news today, the results were quite pleasing for such an early product. The Volari Duo consistently scored 5,370 in 3DMark 2003, which is a tad slower than the GeForce FX 5900 Ultra and Radeon 9800 Pro. We were told that the high-end Volari V8 Ultra is scoring in the 3k range, with the mainstream Volari V5 Ultra in the 2k range. As XGI's drivers mature, we expect performance to increase to levels close to nVidia and ATi.

To finish off, I wanted to mention an interesting comment from XGI about its dual GPU approach. You may assume it exists purely to deliver blisteringly fast performance, but as XGI explained to us, it is actually cheaper to produce a system with two simpler GPUs than to manufacture a single, highly complex GPU within the allocated die space. According to XGI, this approach will become commonplace among GPU makers as technologies continue to advance.