Although Matrox has been hyping its new Parhelia-512 graphics card and the technology behind it for over a month now, the big question has always been … when do the benchmarks arrive? Those benchmarks arrived today, as several feature articles on Parhelia were made public.

The Parhelia comes with 128 MB of 275MHz DDR SGRAM (250MHz in the OEM version) and a core speed of 220MHz. There are also two DVI video outputs that can be adapted with included cables into three outputs for tri-head/triple-monitor support. Future versions will include support for up to 256 MB of on-card memory, faster core speeds, and video-in support.

The US$399 Parhelia was most often compared to the Nvidia GeForce4 Ti 4600, also with a $399 suggested retail price, but went up against the Radeon 8500 as well, which retails for $199. Overall, the Parhelia performs fairly well in 3D gaming, but that performance takes some explanation. With image quality at lower levels, the GeForce4 whipped the Parhelia. With image quality at maximum levels (anisotropic filtering enabled, highest anti-aliasing and resolutions), the Parhelia was able to slightly beat out the GeForce4 Ti 4600 in gaming benchmarks. Most gamers today shy away from such image quality levels due to the slowdown in performance, but this may change with ATI, Nvidia, and Matrox all competing to bring higher performance with better image quality.

Parhelia's anisotropic filtering didn't quite live up to that of the GeForce4 Ti 4600 or Radeon 8500. Where the Parhelia shone was its 16x “fragment” anti-aliasing (FAA), as opposed to full-screen anti-aliasing (FSAA). Parhelia can run in the high 16x anti-aliasing mode but only anti-aliases (smooths out jagged lines) the edges of objects, making its algorithm more efficient, since anti-aliasing values don't need to be calculated for the entire scene.
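The efficiency argument can be sketched in a few lines of Python. This is only an illustration of the edge-only idea, not Matrox's actual hardware algorithm: the toy scene, the `is_polygon_edge` test, and the 4x4 sample grid are all assumptions made for the sketch.

```python
# Simplified sketch of edge-only ("fragment") anti-aliasing versus
# brute-force full-scene anti-aliasing. NOT Matrox's actual algorithm:
# the scene, the edge test, and the 4x4 grid are illustrative only.

def shade(x, y):
    """Toy scene: a diagonal edge, white above the line y = x, black below."""
    return 1.0 if y > x else 0.0

def is_polygon_edge(px, py):
    """Toy edge test. A real rasterizer would flag fragments with partial
    coverage; here we just check proximity of the pixel to the line."""
    return abs(py - px) < 1.0

def render(width, height, samples_per_axis=4):
    """Anti-alias only edge pixels with a 4x4 (16-sample) grid; interior
    pixels get a single sample, which is where the work is saved."""
    image = []
    edge_pixels = 0
    for py in range(height):
        row = []
        for px in range(width):
            if is_polygon_edge(px, py):
                edge_pixels += 1
                n = samples_per_axis
                total = 0.0
                for sy in range(n):
                    for sx in range(n):
                        total += shade(px + (sx + 0.5) / n,
                                       py + (sy + 0.5) / n)
                row.append(total / (n * n))   # 16 samples, blended edge
            else:
                row.append(shade(px + 0.5, py + 0.5))  # one sample
        image.append(row)
    return image, edge_pixels

image, edges = render(8, 8)
```

In this 8x8 toy frame only the 8 pixels straddling the diagonal pay the 16-sample cost; FSAA would pay it for all 64. That ratio is the whole point of the fragment approach.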

Get the full story on image quality and benchmarks at AnandTech and Tom's Hardware. Both sites provide image comparisons of specific anisotropic and anti-aliasing settings from different games with different results. Tom's offers a 15 MB download of uncompressed images to compare the quality for yourself. There is also a review at ExtremeTech which has good detail on the triple-monitor setup that Parhelia allows.

ROB'S OPINION
For now, it looks like Matrox may have succeeded in doing what it was intending to: it has a card available that, on some levels, beats out Nvidia's current best, and has features that set it apart from the pack. Still, the high price and inconsistencies (the 16x Fragment Anti-Aliasing didn't work very well at all on Max Payne) make this a tough call for all but die-hard Matrox fans, or those looking for specific performance and quality levels in specific games.

True to form, Matrox uses high quality components in the Parhelia to ensure that the analog outputs are crisp and free from annoying effects. This is important if you are going with the three-way monitor setup, as two of the outputs will be analog. A cable is included with Parhelia that splits one of the DVI outputs into two analog outputs.

Overall, the reviewers seemed to feel very rushed, with Matrox just barely getting them samples in time for review. Both AnandTech and Tom's promise additional feature articles on Parhelia once they have some extra time. There seemed to be some inconsistency in the Parhelia drivers here and there as well, making the card seem like a beta product at times. Matrox will have to work hard to improve the quality of its drivers quickly, as Nvidia and ATI are both committed to high-quality drivers at this point and have been updating frequently.

The Parhelia, while not an outright grand slam and quite expensive, has features that make it worth looking at for some. I have a two-monitor setup right now, and I look forward to someday adding a third monitor, all powered from the same card. To remain competitive, Matrox has to beef up its core clock speed quickly, make sure that FAA works everywhere, ensure that its drivers are solid, and work with video game programmers to use the extra features where Parhelia shines, such as quad-texturing, Matrox's high-performance vertex shading, and triple-monitor setups. That's a tough job, and good luck to Matrox.

USER COMMENTS 50 comment(s)

$$$$(2:28pm EST Tue Jun 25 2002)Who has the money for a 3 monitor “surround-gaming” setup? Who has the room for the 3 monitor “surround-gaming” setup?

It is refreshing to see a new video card company hop into the fray. It's even more refreshing to see one that isn't just concerned with getting 300fps in Quake 3.

nVidia and ATi have 2 teams working on products at the same time. One team works on the up-and-coming card. The second team works on the iteration after that. Matrox does not. So it could be a very long time before we see another card come out of their “high quality” fabs.

No, we don't need a new video card every 6 months like nVidia would lead us to believe. But it's kinda hard to be a “die-hard Matrox fan” (as Rob referred to them) when they release a new product years after the previous one.

I do hope Matrox does well for themselves, but I personally will not buy one of their cards (or anymore from ATi for that matter). – by RyGuy

Is this the same card?(2:32pm EST Tue Jun 25 2002)In benchmarks, this card was very disappointing. It was being mopped on the floor next to the Nvidia and ATi cards. It may have the features and picture quality, but I think it falls a lot short for any gamer that wants FPS and performance. I personally think graphics are overrated. I prefer a game that runs smooth to one that has good graphics. – by Very Dissapointing

Very Dissapointing(2:51pm EST Tue Jun 25 2002)

Gamers.

Rhymes with lamers. – by Get a life eh?

hmm(2:52pm EST Tue Jun 25 2002)I think it brings a lot of interesting features to the table, like 10-bit color. Of course, it doesn't seem like some of these features are released on the original card here… who knows how one of the cards released later, with all the features they've boasted on their site and 256 megs of RAM, stacks up. I hope they hurry up and do a release. I for one would love to see graphics cards as a whole expand on WHAT they can do, not just their FPS abilities. – by Visiting reality

RyGuy(2:52pm EST Tue Jun 25 2002)“It is refreshing to see a new video card company hop into the fray”

Did I somehow misunderstand that? I mean, where have you been for the last decade? I bought a Matrox Mystaque in (IIRC) 1995.

Yes, that “typo” was deliberate. Not all games would work on that particular card at the time. – by Odo Asong

Not interested(2:57pm EST Tue Jun 25 2002)I have been gaming and doing multimedia development with my ATI All-In Wonder for over a year and it suits me just fine.

It's time to draw the line. My 1.2 GHz Athlon, gig of RAM, and 40 gig hard drive are fine. I declare officially that I am done with upgrading until it is absolutely necessary. This includes all development tools (I will have to get by with Photoshop 6 and Illustrator 9 for a while). I am done giving my money to these people. – by James The Lesser

Get a life eh?(2:59pm EST Tue Jun 25 2002)“lamers” isn't a word.

Better luck next time, nerdboy. – by Get A Dictionary, ed

benchmarks(3:34pm EST Tue Jun 25 2002)Matrox is doomed.

Everyone who was waiting for this card to power up their dual-monitor setups is most likely going to go for a Ti4200, which is like $200 less and performs better. Matrox is also late with stable drivers. I had the G400 when it first came out and it was months till they got the drivers right: glitches in OpenGL & that sort.

Dual-head gaming never really took off & now they are trying to sell you surround gaming??? what a fucking joke. Not sure about the demand for triple head as there isn't much of a demand for dual head, and for most multimonitor setups, dual head is good enough.

Oh well… Matrox is doomed. – by p00kie

Re: Odo Asong(3:38pm EST Tue Jun 25 2002)My apologies.

It should have said “It is refreshing to see another video card company jump into the current 3D fray with a competitive product.” Hope that clears it up.

Re: Very Dissapointing: FPS vs picture quality. It depends. For a flight simulator (ZZzz) or an RPG, you may want better picture quality. You are able to take your time and look at the surrounding environment. Matrox's hyped FAA would do real well in this area.

Whereas in an FPS deathmatch, all you want to do is avoid the incoming rocket. You don't really care what it looks like. I don't care that the FAA has gotten rid of the jaggies on the sniper rifle that is aimed at my forehead. Or that the anisotropic filtering has allowed a clearer texture on an object that is 150 yards out.

Considering the popularity of FPS's (is that punctuation correct?), it is no wonder nVidia and ATi push more for frames-per-second rather than picture quality. – by RyGuy

Numbers-obsessed?(3:48pm EST Tue Jun 25 2002)I love people who put down a card because it “only” gets 150 fps in Quake3. Without a framerate counter in the corner, none of these goobers could make a distinction between the Parhelia at 150fps and a GF4 at 250, except that the Parhelia's better graphics would give it away.

Sure, criticize about lower framerates with all the goodies turned on, or about less overclocking potential, or whatever. But when some guy with 270 degrees of visibility kicks your ass at Q3, be prepared to be a little bit jealous.

I'm an nVidia user myself, but I love neat new technology, and I hate morons who judge everything off framerate. – by TeamNutmeg

WTF w/ Creative Labs(4:18pm EST Tue Jun 25 2002)Or are they out of the race? I've been playing with an Annihilator 2 Ultra for some time now. I wonder if they're working on a GeForce4 model.

The kind of guys who can afford that triple rig are most likely young admins who still live in their parents' houses. No expenses, no bills except for maybe their cable modem.

Ah, the good ole days. Too bad my parents lived there too, hehe. – by jv

Re: jv(4:27pm EST Tue Jun 25 2002)jv – Creative Labs, which owns a portion of 3D Labs (if I am not mistaken), will be coming out with their own 3D accelerator sometime soon. It will be dubbed a “VPU” rather than a “GPU”. Supposed to stand for Visual Processing Unit, I believe.

Re: TeamNutmeg – I do agree with you. And it is 150 degrees of visibility, not 270. Just nitpicking. – by RyGuy

RyGuy(4:43pm EST Tue Jun 25 2002)Yup, the VPU will also let games use op codes, like MMX/SSE/SSE2 but for the video card.

Interesting idea; we'll see if games include it and if the AMD fans will call it skewing the benches.

Yes, it's flame bait, but I couldn't help it. – by Nataku

Does the tri-monitor mode take more processing overhead?(4:51pm EST Tue Jun 25 2002)None of the reviews really offered any insight into this. I'm guessing it will take slightly more processing, since naturally a 150-degree viewing angle is going to include more triangles and textures than 90 degrees, but not too much more. Has there been any info on this? – by ScratchMan

With the Millenniums…(4:58pm EST Tue Jun 25 2002)Matrox had a history of bringing out the hardware first, then fine-tuning the drivers for performance.

Wait a few months. – by LaughALot

…blah blah blah(5:07pm EST Tue Jun 25 2002)Well, I have to admit I figured the performance would be better. I am amazed at the difference in image quality, though.

I'd say the card is a success. As many people have stated, frame rate isn't everything. If I can play my Soul Reaver 2 at 1600 X 1200 X 32 with 16X Fragmented AA, I will be happy. – by Jewsh

RE: jv(5:54pm EST Tue Jun 25 2002)They have GeForce4 models all the way up to the Ti4600. I've seen it at the Creative UK site and I've seen it on the shelves of the local computer shop. – by DX2

A stupid question….(6:00pm EST Tue Jun 25 2002)I am not a gamer, so this question may have an obvious answer I am not aware of, but I hope somebody can answer it for me anyway…

The refresh rate of my monitor @ 1280 x 1024 is 120 Hz.

If I have a video card which will do, say, 240 frames per second, what advantage does this give me, given my monitor can only redraw the screen at half this speed?

Thanks for the answer!! – by The Waaaaaabit

D'oh!(6:25pm EST Tue Jun 25 2002)Yep, you're right – I have a tendency to overexaggerate things a bit. Almost as bad as my tendency to cause trouble. Hehe.

Gotta admit, I'm not as impressed with the anisotropic results for the Parhelia as I thought I'd be – ATi's still got that one sewn up by a huge margin, in both lack of framerate loss and in image quality. But for the Parhelia to run 16x antialiasing and BEAT a GF4Ti4600 only running at 4x AA is pretty damned impressive.

Like you noted, RyGuy, it's gonna be the flight sim crowd who ends up buying this thing. A triple-monitor setup simulating the entire front cockpit and ALL of the instrumentation in one view, looking as good as it does at those kinds of AA/Aniso levels, is going to be quite a draw. Slap a Thrustmaster HOTAS Cougar in front of that and it's a sim-geek's vision of heaven. – by TeamNutmeg

In reply to Waaabit(6:36pm EST Tue Jun 25 2002)Waaabit, if you disable vsync, then your 120Hz monitor will draw the top half of the screen with one frame, and then as it's drawing the bottom half of the screen it will use the updated info from the “new” frame.

So there will be a slight tear in the screen image, but the info in the bottom half of the screen will be more up to date.

It could save you a split second if there is something interesting coming at you in the bottom half of the screen.

Gamers need every split second they can get. – by Milliseconds count
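The latency arithmetic behind this thread is easy to sanity-check. A quick sketch with illustrative numbers (the 120 Hz monitor is from the question above; the other rates are assumptions for comparison):

```python
# Rough scale of the vsync-off latency argument: how often a new frame
# becomes available at various render rates. Illustrative numbers only.

def frame_time_ms(fps):
    """Time between finished frames, in milliseconds."""
    return 1000.0 / fps

monitor_hz = 120  # the monitor from the question above
for fps in (60, 120, 240):
    print(f"{fps} fps -> a new frame every {frame_time_ms(fps):.2f} ms")

# With vsync off, the bottom half of a refresh can show data from a
# frame that started one frame-time later. Rendering at 240 fps on a
# 120 Hz monitor, the freshest data can be at most this much newer:
saving_ms = frame_time_ms(monitor_hz) - frame_time_ms(240)
```

That works out to roughly 4 ms, which matches the figure quoted a few comments down, and it comes at the cost of a visible tear line.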

Matrox misses the boat…again(7:13pm EST Tue Jun 25 2002)How the mighty have fallen. Remember when Matrox was THE card to have for your system? Top-notch image quality AND incredible speed in Windows. Too bad I'm referring to, oh, around the 1997–1998 timeframe, because Matrox sure as hell hasn't given anyone who's serious about gaming any reason whatsoever to buy their cards.

First we had the G200, a lackluster card that shipped without any kind of OpenGL support at all, and when it did finally get it, the performance was pitiful.

Then we get the G400Max, a respectable if not outstanding card. Image quality was again very good (hell, I bought one) but performance was about 30% behind nVidia and 3Dfx (remember them?). If it weren't for the dual head support that was second to none, the G400 would've been useless.

After several years of…well, NOTHING, Matrox decides to grace us with the G550 — a G400 core shrunk down with no new features to speak of. Oh, they had something stupid called “headcasting” technology that was supposed to revolutionize VoIP communications. Anyone heard of headcasting lately? Didn't think so. Yawn!

Now we've got this Parhelia thingamadoodle. It's got top-notch image quality but is slow. Does this sound like something you've heard before, folks? Yep, it's the same old deja vu all over again. Really nice DACs, nice TMDS, awesome multihead support, and crappy performance by any stretch of the imagination. Who's with me in calling this thing a dud for the gaming market? – by U.S. Marine

Apps(9:45pm EST Tue Jun 25 2002)Everyone is talking about gaming on this thing, but wouldn't it be a cheap tri-monitor card if you were working in applications? Hell, I'd like to have After Effects stretched over 3 monitors! – by blastermaster

So would I…(12:29am EST Wed Jun 26 2002)…but how many After Effects folks are there in the world besides you and me? How many people can TRULY make a business case for multimonitor setups? Sure, it's not a handful, but it's nothing compared to the teeming throngs of gamers out there. And gamers, maligned lot that they are, spend a lot of money to have the biggest, the baddest, and the best on the block. Usually those bragging rights are centered around incredible FPS numbers, something the Parhelia simply ISN'T going to deliver.

BTW, if you want 3 monitors NOW, you can have them NOW with another product from Matrox. They've got setups running on their old G200 cards that support up to 8 monitors. Stock traders love 'em. – by U.S. Marine

iq vs frame rate(6:03am EST Wed Jun 26 2002)id rather have image quality than frame rate in games. all the games i play (online/offline), EVERY iq option is maxed. sure, i get my ass handed to me in moh.. but not because im only getting 35 fps.. its just that i suck. and it looks damn good.

if the parhelia can keep me above 30fps, and make it look better than my ti4400, im sold.

i have dreams of f1 2002 on 3 monitors, cockpit view. maybe then ill be able to see my mirrors entirely.

the only downfall is price. im unwilling to pay $400 for a video card, plus the expense of the extra monitors to really get the “full” effect. – by porn loader

Average(9:03am EST Wed Jun 26 2002)The Matrox Parhelia seems to be very good, not the best, at everything.

Best for people that do something other than play Dooohhhmmm 25 hours a day. – by Manza

In reply to Milliseconds count(11:15am EST Wed Jun 26 2002)Hey… You are absolutely right… A refresh rate of 240fps could potentially save you 4 milliseconds!! Too bad, though, your eye's refresh rate is approximately 12fps… Let me give you some friendly advice: try to spend more time on your studies and less on Quake… it won't do any harm. – by Cost

The Waaaaaabit(11:22am EST Wed Jun 26 2002)I'll be fried by the other gamers here for saying this, but if you're using your (non-hdtv) TV set as a monitor, then anything more than 30fps is wasted. A TV can only draw 30 frames in a second regardless of what the computer feeds it.

JTL(11:58am EST Wed Jun 26 2002)lol. nah, you gotta smoke 2 joints and then play a video game (the Bob Marley way).

Cheech – a gravity bong is a real stoner invention. So stoned, in fact, it doesn't even use gravity! Take a 2 litre pop (THATS RIGHT I SAID POP, NOT SODA POP POP POP POP POP POP) bottle, cut the bottom off, get a bigger bucket, fill it with water, have a bowl at the top of the 2 litre, place the 2 litre cut-end first into the bucket of water, then pack the bowl, and light the bowl while slowly pulling the 2 litre up. Then remove the bowl and put your mouth over the opening… and then push it down; the suction pushes all the smoke into your lungs (2 litres of condensed smoke if you do it right) and then you start gasping for air and choking. – by somewhat pissed off

in reply to COST(1:03pm EST Wed Jun 26 2002)Maybe you should spend a little more time studying yourself. Your eyes, with your mind in a conscious state, refresh around 70FPS… 60 if your muscles are stretched from age, or too many LANs. BUT, if you're attentive enough to let yourself slip into a subconscious state (it's possible. ever felt like you were floating while doing something on a computer?), at that point your eyes refresh some 150,000+ times per second. You take snapshots of everything you see, whether or not your body is in a subconscious state. Hence the way subliminal imaging works. It takes a special projector to split one frame into 1,000 frames, or rather, fit 1,000 frames into 1 frame. One of those 1,000 will have the subliminal question in it; another will have the answer. Fascinating study. But for most gamers, anything over 100FPS isn't noticeable, and is therefore… FOR BRAGGING RIGHTS (uhh, *gasp*) NO…..WAY!!!! – by a subc. gravity bong

Look to the Future(10:28pm EST Wed Jun 26 2002)Unified Driver Architecture, Cg software to aid game development, new flagship GPUs every 6 months, making the last flagship model a steal at 50% off. I would say, with the costs to develop these 100M-transistor GPUs, I would rather show brand loyalty to Nvidia so I can have a company around in a few years than waste money on a card from a company like Matrox, which comes out with something “new” every few years at best. This might be their last product. – by gamerpro28

It's all lies………(1:42am EST Thu Jun 27 2002)Before I start, Matrox is a private company; they just need to sell their products, not impress people the way stock-market companies do, so they will never suddenly go bankrupt like other stock-market companies.

It's all lies when the reviewers say 2D quality doesn't matter. I'll tell you why. I want to get a Parhelia but I can't afford the 128MB version, so I am waiting for the 64MB version. I have a TNT2-M64 (value) that is currently not used, and a GF2MX that is used. However, the GF2MX costs more, so I have to sell that to get more money. I put the TNT2-M64 back in and guess what? My Win98 desktop suddenly gets blurry enough to make my eyes hurt.

Yeah, you may say I am a liar, but use a card with better 2D quality, then go use one with worse 2D quality, and you'll notice the difference. I am not putting my MX back in because I will sell it soon (a normal MX), and it's a hassle installing and uninstalling drivers for a video card I will sell in a couple of days. Use your computer for a couple of hours, till your eyes hurt (or until you get eye strain), and you'll see the difference in 2D quality between the 2 cards. The GF2's 2D quality is improved over the TNT2-M64's, but not by as much as Matrox's is over Nvidia's (or, for that matter, ATI's, which is better than Nvidia's). My eyes seem to be more sensitive than other people's, and I want to be on a computer that isn't blurry so my eyes don't hurt. Matrox's quality is the best, people, and you can't argue about that after you see it for yourself. – by Geek Reader

Monster Card(3:27pm EST Thu Jun 27 2002)I might be the only one using 3 screens. I shelled out for a Mass Multiples black flat-screen 18″ triple-wide for business purposes last year. It's awesome and clients are impressed. Triple Head is exactly what I need for after hours. I'd like to see Nvidia put out a card that could handle my system. I don't know how the 256MB Parhelia will test when it is introduced (late summer?) but I'll probably wait till then. Why settle?

On the same topic, Parhelia only supports 2-screen DVD. That should be 3, even if there has to be a border to get the right dimensions. Does anybody know if Nvidia or ATI has any plans to reply with a 3-screen gaming card of their own? – by 3ScreenUser

multi-monitor(9:56pm EST Thu Jun 27 2002)You could try using an AGP card and a PCI card, but it would also take a combination of displays able to take either analog or digital. For instance, my TI4600 card has a DVI and analog output and I have run 2 LCD monitors side by side, one an analog NEC 18.1 and the other an NEC digital.

I could add a PCI card in the available slots and using nView, I could use 3 monitors. – by gamerpro28

Close, but try this: if I get 240 fps in Q3A demo001 (or four.dm_67, or whatever), this is an average rate. Super-high numbers mean that even in the nastiest, craziest firefights (where you really need the speed), you'll be refreshing at least close to your monitor's refresh speed.

PS (don't laugh): my PC gets 30 fps on four, dropping to about 8 in the firefights. :( That's how I know. – by Not telling now!
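The distinction drawn here, average versus worst-case framerate, is worth making concrete. A small sketch with made-up per-second samples (the numbers are hypothetical, chosen to echo the 30-fps-average, 8-fps-firefight PC described above):

```python
# The point above: a benchmark's *average* fps can hide the firefight
# dips that actually matter. The samples below are hypothetical
# per-second readings from one demo run.

def summarize(fps_samples):
    """Return (average, minimum) frame rate over a run."""
    avg = sum(fps_samples) / len(fps_samples)
    return avg, min(fps_samples)

# A run that averages a respectable number but craters in heavy scenes.
run = [38, 35, 40, 36, 8, 9, 37, 39, 34, 24]
avg, worst = summarize(run)
print(f"average {avg:.0f} fps, worst {worst} fps")
# The headline average is 30 fps; the firefights ran at 8.
```

This is why a card with a huge average can still stutter, and why reviewers increasingly report minimum framerates alongside averages.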

NEC 1830 LCD(7:02pm EST Sat Sep 21 2002)Any advice on getting 3D effects to work on this screen? The chip we are using is a GeForce4 Ti 4600. – by Diofantine

Parhelia(7:04pm EST Wed Oct 16 2002)If you're interested in playing unrealistic shoot-'em-ups with maneuvers impossible for a human, then the Parhelia is not your card. For flight sims/racing games this card is superb. This card is more for the professional user than the frag junkie. Personally, I'm looking at both cards, the Parhelia and the GF4. Both have their pluses and minuses. If you want to do some real shooting and turn those flabs into abs, join the army. – by SnowBear

Parhelia good!?(11:09pm EST Sat Nov 23 2002)There has just been some interesting news that came out on the Parhelia, performance-wise; look here: could be competing for information. Or do almost all the hardware news sites share, like Robin Hood and his merry men, to phrase it in a creative and queer term. :D – by Just another Person!

Fashion??(11:22pm EST Mon May 24 2004)Do you even KNOW Fashion you geeks!? – by Ashley