ATI New Board With 32 Pipelines

The latest information released from Canada regarding ATI's monstrous R520 board, also known as Fudo, mentions the magic 'pipeline' word, a key element of last year's battle for graphical supremacy. Many will remember how the number 16 seemed to be the holy grail of both manufacturers, and how stunned we all were that the companies involved could produce such monstrous boards, carrying so many transistors, without being responsible for the melting of the polar ice caps.

Well, this time ATI claims that its new range of boards will carry in excess of 300 million transistors, meaning that, at least in theory, it will be able to feature 32 pipelines. It's time for gamer jaws to drop again, since the boards will feature twice as many pipelines as their predecessors. The current line of thinking at ATI is to build all boards with 32 pipelines but to restrict the first few versions to 24 functioning ones, gradually introducing fully enabled 32-pipeline versions when market conditions and competitor products demand them.
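
To put those pipeline counts in rough perspective, here is a minimal back-of-the-envelope sketch in Python. The pipelines-times-clock formula is the usual theoretical fillrate approximation; the fillrate_gpixels helper is hypothetical, and the 540 MHz clock is borrowed from the X850 XT purely as a placeholder, since R520 clocks are unannounced.

```python
# Back-of-the-envelope sketch of why pipeline count matters:
# theoretical pixel fillrate = pipelines x core clock.
# The clock speed is an illustrative guess, not a confirmed R520 spec.

def fillrate_gpixels(pipelines: int, core_mhz: float) -> float:
    """Theoretical pixel fillrate in gigapixels per second."""
    return pipelines * core_mhz * 1e6 / 1e9

x850_like = fillrate_gpixels(16, 540)  # X850 XT class: 16 pipes @ 540 MHz
r520_24 = fillrate_gpixels(24, 540)    # first R520s: 24 of 32 pipes enabled
r520_32 = fillrate_gpixels(32, 540)    # fully enabled 32-pipe part

print(f"16 pipes: {x850_like:.2f} Gpix/s")
print(f"24 pipes: {r520_24:.2f} Gpix/s")
print(f"32 pipes: {r520_32:.2f} Gpix/s")
```

Even at an identical clock, the jump from 16 to 32 pipes doubles the theoretical fillrate, which is the basis for the jaw-dropping claims above.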

ATI is rather keen to begin making its presence felt in the Top 20 of the 3DMark hall of fame, as its recent failed attempt in Texas clearly demonstrated. The R520 could be the card to bring that success to ATI, since it should be capable of performing at least twice as fast as the X850, meaning it should outperform even current SLI setups.

If ATI also manages to introduce its dual-GPU configuration this summer as expected, the resulting performance should run away with the 3DMark crown. I will leave you with the thought of two 32-pixel-pipeline R520s running a game of your choice in tandem; how's that for starters?

Amiga? Yeah igraz, I still have mine and it's still running great. I upgraded the CPU and HD way back when the cost per MB for hard disks was about £1 per MB, that's about 2 US dollars. I bought a HUGE 360 MB HD :)! Was a great machine at the time, but things move on... SOB

Sounds pretty impressive. 32 pipelines will definitely be a cool thing to see. Of course I won't be able to afford it, and I don't think it will be worth buying for a while to come, seeing as how my 6800 GT runs everything great so far, but forward progress is always nice to see. Will be interesting to see the finished product.

Don't listen to Damien Cain. He's my uncle. He doesn't really talk to Michael Myers like he says he does. I hate his guts; he ignored me throughout my childhood and I had no one. He left me with a void that can never be filled. Don't listen to Damien Cain.

Excuse me but this lady is clearly deranged. Yes, she is my niece but she is delusional if she thinks I don't converse with Michael Myers. He's starring in my movie! I'M DIRECTING HALLOWEEN 9! WTF Candi? Stay out of my affairs if you know what's best for you...

AJAX at 6:30 2/4/2005: "...R520 will have because 90nm is like twice as small as 130nm." Er... wtf... yeah, I own an ATi card, but what makes u think 90nm is twice as small as 130nm? Do u even know what nm stands for? It's nanometre, 1x10^-9 m. 130/90 = 2? What kinda maths is that...

Man, everyone on this thread is totally braindead... at first you start talking about the card, but after like 10 posts it becomes a raging war between ATI and Nvidia fanatics... poor losers, haven't you got better things to do than talk shit all day... retards, every last one of you... and by the way, you're all a bunch of nerds and geeks. SUCK ASS!!

nVIDIA WILL GO 90nm, just not with the next refresh of the 6800 series. You will see them migrating towards 110nm before 90nm. Nvidia has nothing until late 2005-early 2006 to compete with the R520; they have nothing taped out for a new architecture. Nvidia was banking on SLI to carry them towards the end of the year, partly because the 6800 series is one fine GPU. However, the soon to be released R520, and ATI's own dual-GPU capabilities for those chips, puts major pressure on Nvidia. Not only will ATI now have an answer to SLI, but they will have, at least on paper, a new GPU with twice the power of this generation's GPUs. And if that turns out to be correct, Nvidia's SLI will not cut it.

WTF ARE U PPL TALKING ABOUT, NVIDIA'S PS3.0 ISN'T PROPER? LMFAO. U PPL KNOW NOTHING ABOUT SHADERS, BLAH BLAH. WHAT YOU HAVE TO REALISE IS THAT NVIDIA'S HARDWARE, AS OPPOSED TO ATI'S, IS NEW!! AND I'VE BEEN TOLD BY A VERY CLOSE SOURCE THAT NVIDIA'S DRIVERS AIN'T EVEN USING 65% OF THE 6800'S POWER. SO BEFORE U START GOING ON THAT ATI OWNS THIS AND THAT, GIVE IT A FEW MONTHS AND YOU'LL SEE WHAT I MEAN.

I'm sorry, but ATI multi-rendering + R520 is gonna RAPE SLI. Up to 34 cards plugged in at once? If u got the money and a motherboard that supports that many cards, u won't have to upgrade ur gfx card for years :P

If an ATI R520 costs less than an SLI setup and is 3x faster, we all know who's gonna be switching to ATI; that would be all but a few Nvidia customers. They would have to be really stupid to sit there with their gay little 6800s worshipping Nvidia...

"AND I'VE BEEN TOLD BY A VERY CLOSE SOURCE THAT, NVIDIAS DRIVERS AINT EVEN USING 65% OF THE 6800'S POWER"Ok buddy. lmao. Ya like drivers increase that were like the ones for the 5800? Full of cheats...pleaseeeeeee you fanboy

ATI: the more the better. I think ATI will really start the new revolution by introducing the 1024 MB version of the 32-pipe card for like $700-800 and the 512 MB version for $450-550. You're not buying a computer, just a video card; prices should drop, and a realistic price for this should be around the $400 mark, no more.

I would imagine anyone willing to pay $500 for a video card they will replace in 2 years would be willing to pay a good amount for a screen that should last them at LEAST 5 years. So why don't the monitor/screen manufacturers go gaga about making something gamers and graphic developers would want?

I would imagine I'm not the only person who would like something, say, about 24" that has great colors, high resolution (1600), a fast refresh (8ms or less), and in a standard monitor format. The wide screen models are nice, but how about the standard format too? Frankly, I'd buy a CRT if I could get one designed for a PC (not a TV, HDTV or otherwise, pretending to be a monitor). The only reason is that I've tested the $1000 Dell LCD everyone is mentioning, and while it is nice, it doesn't compare in colors to a CRT. Also, I know some people would like the wide screen format, but since I do work on it as well, a standard format would be better for me.

Just me whining, not very productive, but I figure if just three people started jumping up and down, it would be a movement. BTW, I've e-mailed Philips, LG, Samsung, Viewsonic, BenQ and Sony (not whining) to find out if they have anything on the way that meets these specs. I'll be sure to post back if I hear anything. Of course, if enough people were to e-mail them, they might realize the market niche is ready. I understand they work together on some screens, so asking each individually certainly doesn't hurt.

[url removed] - if any of you are thinking about this model, you are correct, it does come close but the refresh ra

"ATI went the 90nm route, Nvidia already said that they won't take the risk."And with good reason. Look what happened when Intel went 90nm - TDP went THROUGH THE ROOF. They were the first to break past 100W power consumption for a CPU (whereas an athlon64 4000+ peaks at around 75W). I can imagine it now - you start playing a game....and the lights dim from the drain caused by the gfx card LOL.

And about 90nm... it is less than half the size of 130nm. And here is the math. Unless I'm mistaken, the sizes refer to the width of pathways in the electronic circuitry. Thus the size depends on the cross-sectional area of the pathway (which is proportional to the square of the radius of the pathway). 130/90 = 1.444, and 1.444 squared = 2.09. Thus 130nm is more than twice the size of 90nm.
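
Written out, the commenter's scaling argument is simply that area goes as the square of the linear ratio (whether a node's nanometre figure really maps onto feature area that cleanly is a separate question):

$$\frac{A_{130}}{A_{90}} = \left(\frac{130\ \text{nm}}{90\ \text{nm}}\right)^{2} \approx 1.444^{2} \approx 2.09$$

So AJAX's "twice as small" is roughly right for area, but well off if you read it as the linear ratio, which is only about 1.44.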

I owned several ATI and Nvidia cards, and you can say what you want: NV is more stable than ATI, because with NV I never had a single issue (GF4 Ti, 6800). However, ATI kept giving me some of the worst image quality and BSODs I have encountered yet. So even if the R520 is 1000x as fast as Nvidia's 6800, I will stick with Nvidia, because I prefer good performance and stability to uber-performance and no stability. ATI sucks.

My friend, you got it the other way around. Nvidia's image and performance quality is shit compared to ATI's. Why? Because Nvidia hardware does not run at full precision and ATI cards do. I have owned several cards over the years and clearly ATI is the best. I am not the only one to think so; just look at the stock market, ATI is higher there than Nvidia, because people prefer to buy ATI. Microsoft kicked Nvidia off the Xbox 2 project and put ATI in, because ATI has better technology. ATI still has better AA and AF; they use gamma correction. So stop making shit up, you homosexual!

I know more than anyone else in this forum or at Beyond3D; I am a hardware engineer. Nvidia has a few problems with its Shader Model 3.0 implementation, mainly poor flow control, which is why there are performance drops when this feature is enabled. Nvidia is actually going to refresh its current PS 3.0 feature set with its next GPU release (G70).
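
For what it's worth, the flow-control complaint has a concrete mechanical reading: early SM3.0 hardware evaluated dynamic branches over large batches of pixels, so a single divergent pixel forces a whole batch down both sides of the branch. Below is a toy Python model of that effect; the batch sizes and per-path instruction costs are invented for illustration and are not actual NV40 or R520 numbers.

```python
# Toy model of why coarse flow control hurts: pixels are shaded in
# SIMD batches, and if a branch diverges within a batch, the hardware
# effectively pays for BOTH sides of the branch for every pixel in it.

import random

def shading_cost(pixels: list, batch_size: int,
                 cost_then: int = 10, cost_else: int = 1) -> int:
    """Total instruction cost for a frame of pixels, where each bool
    says whether that pixel takes the expensive branch."""
    total = 0
    for i in range(0, len(pixels), batch_size):
        batch = pixels[i:i + batch_size]
        if all(batch):            # coherent: only the expensive path runs
            total += cost_then * len(batch)
        elif not any(batch):      # coherent: only the cheap path runs
            total += cost_else * len(batch)
        else:                     # divergent: batch pays for both paths
            total += (cost_then + cost_else) * len(batch)
    return total

random.seed(0)
frame = [random.random() < 0.1 for _ in range(1 << 16)]  # 10% take the slow path
for size in (16, 1024, 4096):
    print(f"batch {size:5d}: cost {shading_cost(frame, size)}")
```

Smaller batches diverge less often, so the printout shows the total cost climbing as the batch size grows; shrinking branch granularity is exactly the direction later GPUs took.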

"AND I'VE BEEN TOLD BY A VERY CLOSE SOURCE THAT, NVIDIAS DRIVERS AINT EVEN USING 65% OF THE 6800'S POWER"So whats this "VERY CLOSE SOURCE"? someone who wroks at nvidia? if thats the case theyre lying cause they have to make excuses why their cards suck. If its from a friend or something, thats a pretty damn bad "close source"If the drivers only used 65% of the card, nvidia would have been smart enough to make it use a better percent by now

"I owned several ATI and nvidia cards, and you can say what you want nv is more stable than ati, because with nv i never had a single issue (gf4ti, 6800). However, ATI kept giving me some of the worst image quality and bsod's i have encountered yet. so even if r520 is 1000x as fast as nvidias 6800 i will stick with it, because i prefer good performance and stability to uber-performeance and no stability"That is the biggest F*ckin' lie yet. Ive got like 8 computers and each have gone through several cards. All my ati cards have been fine, and all my nv cards have givin me blue screen of death and more of multiple driver versions. Ive only had better stability, quality, and perfectness with ATi

Maybe you were using Nvidia drivers for an ATI card, you dumb f**k. I also have a close source from ATI; this source tells me that ATI has only been using 1% of its true power on current hardware. I was also told by this source that you are a f**king homo, and you suck gay donkey balls, and that you were one of the kids sexually abused by Michael Jackson. Only 65%, eh? WTF are they waiting for, you to lose your virginity, you bastard f**k!

In Halloween 9 I will tell the story of how Michael Myers went on his murderous rampage not because of the Curse of Thorn but because Michael Myers was in fact molested by Michael Jackson. Apparently, Michael Jackson's woodpecker is the only piece of brown skin he has left, and that's only because he didn't wash it afterward. Michael Myers is an ATI fanboy.

WTF are u talking about, you faggot? Don't use caps, you c**t muscle. 35% to boot up? U are an idiot; Windows does all the booting up, you dumb shit. Go back to f**king your brother in the ass. Another week and you will have your entire one inch p***s inside him. You see, all of my Megagames friends, 'NVIDIA FOR ME' is the reason why cousins shouldn't marry.

@AJAX: How old are you? 11? 12? If you are any older than that, I really feel sorry for you. Or have all the other forums banned you, so you come and post here? Get yourself a life! Then you can think about getting a graphics card. And if you do know anything about hardware, you know that even if 2 cards are built on the same day in the same factory, one might be worse than the other. And if someone gets 2 Radeons and both are of this worse kind of graphics card, then gets 2 quality GeForce cards, guess what he will buy next. Idiot.