
115 Comments

I love this piece. Not sure if you'll get notified, but while doing some research on the performance of Hybrid Crossfire, I came back. It was interesting to see the tone of the piece, and to hear about the guys at ATI talking vaguely about what would become the 5870. Fascinating stuff; I've got to put a bookmark in my calendar to remind me to come back to this next year when RV970 is released (pending no further difficulties).

The GPU industry is squeezing in more and more transistors (SPs or whatever). It would be more energy efficient if it could disable some cores under light load, rather than just reducing clock frequency in 2D mode, just like the latest AMD processors. An HD 4350 would consume less power than an HD 4850 at idle, right?

In 2006, when the legendary X1900XTX took the world by surprise, actually beating the scarce and coveted 7800GTX-512, I bought it. It was king of the hill from January 2006 until the 7950GX2 stole the crown back for the fastest "single-slot" solution about 6 months later around June 2006, only a few months after the smaller 90nm 7900GTX was *finally* released in April 2006. Everybody started hailing Nvidia again although it was really an SLI dual-gpu solution sandwiched into one PCI-E slot. Perhaps it was the quad-gpu thingy that sounded so cool. It was obviously over-hyped but really took the attention away from ATI.

GDDR4 on the X1950XTX hardly did any good, since it was a bit late (Sept 2006) with only like a 3-4% performance increase over the X1900. Well, then the 8800GTX came in Nov 2006 and had a similar impact to the one the 9700Pro had.

As everybody waited to see how the R600 would do, it was delayed, and it disappointed hugely in June 2007. The 8800GTX/Ultra kept on selling for around $600 for nearly 12 months straight, making history. 80nm just did not cut it for the R600, so ATI wanted to have its dual-GPU single card REVENGE against Nvidia. And it would be even better this time since it's done on a single PCB, not a sandwiched solution like Nvidia's 7900GX2. Hence the tiny RV670 chips made on the unexpected 55nm process! The 3870X2 did beat the 8800GTX in most reviews, but had to use Crossfire just like with SLI. Also, the 3870X2 only used GDDR3, unlike the single 3870 with fast GDDR4.

But Nvidia still took the attention away from the 3870 series by tossing an 8800GT up for grabs. When the 3870X2 came out in Jan 2008, Nvidia touted its upcoming 9800GX2 (to be released one month afterwards). So, Nvidia stopped ATI with an ace up its sleeve.

Round 2 for ATI's revenge: the 4870X2. And it worked this time! There was no way that Nvidia could expect the 4870 to be *that much* better than the 3870. Everybody was saying the 4870 would be 50% faster, and Nvidia yawned at that, thinking that the 4870 still couldn't touch the 9800GTX, or the 9800GX2 when crossfired. Plus Nvidia expected the 4870 to still have the "AA bug", since the 3870 did not fix it from the 2900XT and the 4870 had a similar architecture. Boy, was Nvidia wrong there! The 4870 actually ended up being *50%* faster than the 9800GTX in some games.

So, now ATI has earned its vengeance with its single-slot dual-GPU solution that Nvidia had with its 7900GX2 and 9800GX2 a while ago. With the 4870X2 destroying the GTX 280, ATI does indeed have its crown or "halo".

Unfortunately, Quad-Crossfire hardly does well against the GTX 280 in SLI. We now know that quad-GPU solutions give a far lower "bang-per-GPU" due to poor driver optimizations, etc. So most enthusiast gamers with the money and a 2560x1600 monitor are running two GTX 280's right now instead of two 4870X2's. Oh well!

One thing not mentioned about GDDR5 is that it eats power like mad! The memory alone consumes 40W, even at idle, and that is one of the reasons why the 4870 does not idle so well. If ATI reduces the speed low enough, it messes up the Aero graphics in Vista. It would have been nice if ATI released an intermediate 4860 version with GDDR4 memory at 2800+MHz effective.

Now, I cannot even start to expect what the RV870 will be like. I think Nvidia is going to really want its own revenge this time around, being so financially hurt with the whole 9800 - GTX 200 range plus being unable to release a 55nm version of G200 to this day. Nvidia just cannot beat the 4870X2 with a dual G200 on 55nm, and this is the reason for the re-spins (delays) with an attempt to reduce the power consumption while maintaining the necessary clock speed. Pardon me for pointing out the obvious...

Hope my mini-article was a nice supplement to the main article! :)

True, and nowhere in the article was it pointed out that since the AA algorithm relied on the shaders, simply upping the shader units from 320 to a whopping 800 completely solved the weak AA performance that plagued the 2900s and 3870s. It did not cost too much chip die size or power consumption either. ATI certainly did design the R600 with the future in mind (by moving AA to the shader units, with future expansion). Now the 4870 does amazingly well with 8x FSAA, even beating the GTX 280 in some games.

I wanted to edit my above post by saying that the dual G200 needed to have low enough power consumption so that it could still be cooled effectively in a single-slot sandwich cooling solution. The 4870X2 has a dual-slot cooler, but Nvidia just cannot engineer the G200 on a single PCB with the architecture they are currently using (monster chip die size, and 16 memory chips scaling from 448-bit to 512-bit bandwidth, instead of 8 memory chips with 512-bit worth of bandwidth). That is why Nvidia must make the move to GDDR5 memory, or else re-design the memory architecture to a greater degree. Just my thoughts... I still have no idea what we'll be seeing in 2009!

More like uber mega retards, right? If they are so smart... why do they keep making such terrible, horrible, shitty drivers?

why?

I really, really, really want to buy a 4850, I really do. But I'm not going to do it. I'm going to go and buy the 9800GT. And I know it's just a rebranded 8800GT. And I know Nvidia is making shitty, explosive hardware (my 8600GT just died). And I know that GPU is slower, older, overclocked 65nm tech, and that Nvidia is pushing gimmicky tricks ("physics") and buying devs. But guess what? NVIDIA = good, clean drivers. New game? New drivers there, fast. Un-bloated drivers that work. Is that so hard, ATI? Really. Or maybe you guys just suck?

I'm going to pick Nvidia because of that. That's how much I fkn hate your bloated and retarded drivers, ATI. Install MS .NET Framework for a broken control panel? Stupid. And what's up with all those unnecessary services eating my memory and CPU cycles? ATI Hotkey Poller, ATI Smart, ati2evxx.exe, ati2evxx.exe, .NET 2.0? Always there, and the damn thing takes forever to load? Nvidia doesn't use any bloated crap, so why do you feel entitled to pollute my PC with your bloated drivers?

AGAIN, HORRIBLE DRIVERS, ATI! I DON'T WANT A SINGLE EXTRA SERVICE! I just built a PC for a friend. I chose the HD4670: beautiful card, really cool, fast, efficient. I love it. I want one for myself. But the drivers? Arg. I ended up installing just the display driver, and still the memory consumption was utterly retarded compared to my Nvidia card.

I've followed Anandtech for many years but never felt the need to respond to posts or reviews. I've always used anandtech as THE source of information for tech reviews and I just wanted to show my appreciation for this article.

Following the graphics industry is certainly a challenge; I think I've owned most of the major cards mentioned in this insightful article. But to learn some of the background of why AMD/ATI made some of the decisions they did is just AWESOME.

I've always been AMD for CPU (won an XP1800+ at the Philly zoo!!!) and a mix of the red and green for GPUs. But I'm glad to see AMD back on track in CPU and especially GPU (I actually have stock in them :/).

Thanks, Anand, for the best article I've read anywhere; it actually made me sign up to post this!

Thank you. I'm not too much into following hardware these days but this article was interesting, informative, and insightful. You all have my appreciation for what amounts to a unique, humanizing story that feels like a diamond in the rough (not to say AT is "the rough," but perhaps the sea of reviews, charts, benchmarking--things that are so temporal).

Is the ~$550 price point seen on ATi's current high-end part evidence of them making their GPUs for the masses? If this entire strategy is as exceptional as this article makes it out to be, and this was an effort to honestly give high-end performance to the masses, then why no lengthy discussion of how ATi currently offers, by a hefty margin, the most expensive graphics cards on the market? You even present the slide that demonstrates the key to obtaining the high end was scalability, yet you fail to discuss how their pricing structure is the same one nVidia was using; they simply chose to use two smaller GPUs in place of one monolithic part. Not saying there is anything wrong with their approach at all, but your implication that it was a choice made around a populist mindset is quite out of place, and by a wide margin. They have the fastest part out, and they are charging a hefty premium for it. Wrong in any way? Absolutely not. An overall approach that has the same impact that nV or 3dfx before them had on consumers? Absolutely. Nothing remotely populist about it.

From an engineering angle, it is very interesting how you gloss over the impact that 55nm had for ATi versus nVidia, and in turn how this current direction will hold up when they are not dealing with a build-process advantage. It was also interesting that quite a bit of time was given to the advantages ATi's approach had over nV's in terms of costs, yet ATi's margins remain well behind nVidia's (not included in the article). All of these factors could have easily been left out of the article altogether and you could have left it as an article about the development of the RV770 from a human-interest perspective.

This article could have been a lot better as a straight human-interest fluff piece. By half bringing in some elements that are favorable to the direction of the article while leaving out any objective analysis from an engineering or business perspective, this reads a lot more like a press release than journalism.

Nowhere in the article did it say anything about ATI turning socialistic. All it mentioned was that they designed a performance card instead of an enthusiast one. How they chose to finally reach the enthusiast bracket, and how much it is priced at, is completely irrelevant to the fact that they designed a performance card. This also allowed ATI to bring better graphics to lower-priced segments, because the relative scaling was much less than what nVidia -still- has to undertake.

The build process was mentioned. It is completely nVidia's prerogative to ignore a certain process until they create an architecture that works on one they already know; you are bringing up a coulda/woulda/shoulda situation around nVidia's strategy, when it means nothing to the current end user. The future, after all, is the future.

I'd respectfully disagree about the journalism statement, as I believe this to be a much higher form of journalism than a lot of what happens on the internet these days.

I'd also disagree with the people who say that AMD is any less secretive or anything. Looking at the article, there is no real information in it that could disadvantage them in any way; all this article revealed about AMD is a more human side to the inner workings.

Thank you AMD for making this article possible, hopefully others will follow suit.

This was a really cool and interesting article, thanks for writing it. :)

However there was one glaring flaw I noticed: "The Radeon 8500 wasn’t good at all; there was just no beating NVIDIA’s GeForce4, the Ti 4200 did well in the mainstream market and the Ti 4600 was king of the high end. "

That is a very misleading and flat-out false statement. The Radeon 8500 was launched in October 2001, and the Geforce 4 was launched in April 2002, six months later. I would certainly hope a card launched half a year later was faster.

The Radeon 8500 was up against the Geforce3 when it was launched. It was generally as fast/faster than the similarly priced Ti200, and only a bit slower than the more expensive Ti500. Hardly what I would call "not good at all". Admittedly it wasn't nearly as popular as the Geforce3, but popularity != performance.

Hello, I've been visiting your site for about a year now and just wanted to let you know I'm really impressed with all of the work you guys do. Thank you so much for this article, as I feel I really learned a whole lot from it. It was well written and kept me engaged. I had never heard of concepts like harvesting and repairability. I had no idea that three years went into designing this GPU. I love keeping up with hardware and really trust and admire your site. Thank you for taking the time to write this article.

Been reading this site for going on 8 years now and this article ranks up there with your best ever. As I've grown older and games have taken a back seat, I find articles like this much more interesting. When a new product comes out I find myself reading the forewords and architectural bits of the articles and skipping over all the graphs to the conclusions.

Anyways, just wish I was one of those brilliant programmers who was skilled enough to do massively parallelized programming.

While the RV770 engineers may not have had GDDR5 SDRAM to play with during its development, ATI could already use GDDR4 SDRAM, which already offered memory bandwidth approaching that of GDDR5, AND it was already used on Radeon X1950 (R580+) cards. If there was any bandwidth superiority over NVIDIA, it was because of NVIDIA's refusal to switch to GDDR4, not a lack of technology.

Absolutely brilliant. I've always read AnandTech instead of Tom's Hardware because of the objective reviews. I was reading an Intel review, and people were questioning the objectivity of AnandTech. While some might look at this as praising ATI/AMD, I would definitely say this was a very objective view of what happened. Seriously, one of the BEST articles I've read since the 4850/4870 review.

I've been reading AnandTech for years and there hasn't been an article that made me want to read every single word of it. Usually I read the foreword and then skip to the conclusion. But for this article, I really read every single word! Period! AnandTech rocks!

This is by far the best (and most insightful) article I have read here to date, Anand. It sounds like you put a ton of thought into it, and I have never flown through 7,500 words as quickly as that read. Congrats to the ATI guys for their successful gamble on the RV770. The last three years must have been an extremely interesting experience for them and their engineers.

I thoroughly enjoyed the article. Last time I was really involved with graphics cards was when the X1900's were in full swing and G80 was on everyone's mind. The history told in the article helped bring me up to pace as to what has transpired since I stopped gaming as much. I can remember how the Video card section used to be here on the forums with the trolls and constant flames. Two camps of people cheered on for one or the other competitors instead of realizing that they should be cheering for competition itself.

I am definitely a performance/mainstream kinda guy in this market. Definitely love the competition. I started my first build with a Geforce 4200Ti, moved up to the infamous 9700Pro, followed by X1900, and now 4850HD...

It is good to see that the AMD/ATI merger didn't damage ATI as a whole. Rock on, guys! Love those cheap kick-ass Crossfire cards! Go 4850HD x2.

I've been a long time reader of AnandTech but I especially liked this article. It was interesting to get a peek behind the curtain to see what challenges companies face when making these tough decisions. Hopefully more companies take a chance and share more of their stories with this site. Keep up the good work.

I think that phrase describes what the graphics field is: win one year, lose another. But from those situations, only a hardworking, tough guy can turn it all upside down, and the ATI team made it happen. Now I'm relieved I made the decision to buy a 4670; it's not a performance card, but it's still a big bang for the buck. And with Catalyst 8.12, I'm even more grateful that I bought this video card. I've downloaded it and am testing it now.

Would Anand write another article about the GP-GPU programming languages that make data-parallel computing possible?

Well, I'm considering building my Leo platform in H2 2009, when AM3 Deneb, SATA 3, and the RD890 are out.

Like many before me have said, this has to be one of the best articles I've ever read here at AT. It really puts things into perspective. We (the consumers) are always criticizing or praising everything that comes out and don't take into account the amount of hard work and time put into the release. I'm a 4850 owner, and I couldn't be happier with the performance I've received. I would like to personally thank ATI/AMD and the entire team that put RV770 into play. Absolutely brilliant.

I would also like to thank Anand for sharing this awesome experience with us.

I have to say this was a great article. Great idea to write about the story behind these guys and the RV770. Musta been a helluva relief when they realized how great the GPUs did in the market, especially after taking such huge risks. For these guys to pull through the way they did, with the whole GDDR5 issue and the die-shrink/physical limitations, is amazing. I thought I was stressed in college; I can't imagine what it's like to design something like this for 3 years without even being sure it'll work in the end. That's one hell of a resolve, and it makes me like ATI a bit more than I already do.

Keep writing great articles here, this is my favorite site to read reviews on, and this is another reason why.

I agree with everyone else that the article is very well written. I am not sure if these would even be the right guys to ask, but did you bring up with them any of the driver issues your other recent articles have mentioned? As you have mentioned before, it is probably not the best business plan to assume nVidia will screw up again, and they should probably get their Crossfire support in order for the good feelings about this strategy to continue.

The "sweet spot" strategy would have amounted to *nothing* without the efforts of many very talented engineers (and a little luck as Anand has noted). They made the 770 happen and deserve the lion's share of the credit.

I didn't think Anand would use this for anything other than background here-and-there in future articles. I fully expected him to politely cut me off at some point and say "about those future architectures..." which would have led to Eric, Mike, and Mark telling a different interesting story. Thanks to Anand et al for telling this part of the 770 story. Responding to a comment or two in the posts:

* Sorry to quench the speculation - the AMD purchase had no effect on the 770's execution. Dirk Meyer and the other AMD executives supported Rick, a guy that they really didn't know, during some pretty tough times at AMD. They did their jobs so that we could do ours.

* The price range for 770-based cards was determined back in 2005 - it was an essential factor limiting the GPU cost, one of the big gambles. We had no clue what nV's 2008 pricing would be, but we did know what the gamers wanted. At launch we were tempted oh so briefly to launch at a higher price given the competitor's product offerings. It took some will-power for the starving man (us) to pass up a banquet (profits). We had a sneaking suspicion there was a lot of unhappiness about the direction prices had gone, and didn't want to be a party to that for the sake of a few weeks better revenue. Greed never pays. Remembering your customers does.

P.S. We don't keep any dart-board pictures of Anand around the office. However I *do* recall seeing his picture somewhere and thinking at the time that it *would* make a good dart target. Just a thought... :-)

You guys got the sweet spot right as far as I'm concerned (I'm not sure if it's true for others - does it show up in the units sold?)

Before the ATI 3800 (RV670) and Nvidia 8800GT, it seemed like after shelling out a few hundred US dollars, you'd only get low/medium quality in current games. And cheaper cards were pathetic to unusable for new games.

So I stuck to playing old games on my old video card (Ti4200), which was decent in its time.

After the beginning of the new "sweet spot" era, this year I bought a 9800GT (and a new PC). While the 9800GT is not as good as AMD/ATI's offerings in hardware performance terms, I was concerned about ATI's drivers/software. A colleague tried an ATI card on his office PC, but in the end he had to switch to Nvidia to get his multiscreen set up on Linux working the way he wanted, and I had seen a fair number of complaints from others. So far Nvidia's drivers have been OK for me whether in Windows or Linux.

On the other hand I've seen too many Nvidia cards failing in hardware terms (bad caps, bad whatever). So pick your poison ;).

But if the cards aren't totally crap, it often takes less time to just replace a faulty card, than to keep tinkering with drivers and software configs (sometimes to no avail).

Anyway, many thanks for helping to make stuff affordable, even though I picked Nvidia again ;).

And I'd like to add a point which has not been raised yet, at least in this discussion: the "small and fast enough" strategy only works because GPUs hit the realm where they're power limited!

The point is, whenever you go multi-GPU you lose performance due to inefficiencies and communication delays, and some transistors are also lost to redundant logic. If you had the choice between one 100-million-transistor chip or two 50-million ones, the 100-million one would certainly be faster, assuming both could run at the same clock speed, which previously was determined by chip design (basically identical in this example) and process (identical).

But GT200 is too big; it cannot spread its clock speed wings because it's power limited. Imagine GT200 at 1.5-1.8 GHz shader clock: it would be much more in line with performance expectations. RV770, on the other hand, can be pushed quite a bit, and on the 4870 it chews up lots of power for such a small chip. But that's OK, because this power envelope has been accepted and the performance is there to justify it. And the 2-GPU versions are successful because the power envelope on such "freak" cards is larger.
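The power-limited argument above can be sketched with a toy model. All the numbers and the constant below are made-up illustrative values, not real GT200/RV770 figures; the only grounded idea is that dynamic power goes roughly as C * V^2 * f, and that voltage must rise roughly with frequency in this range, so power scales like transistor count times f cubed.

```python
# Toy model of "big dies must clock lower under a fixed power budget".
# Assumptions (not real data): capacitance proportional to transistor
# count, voltage roughly linear in frequency, so P ~ k * N * f^3.

def max_clock_ghz(transistors_millions, power_budget_w, k=2.0e-2):
    # Solve P = k * N * f^3 for f, the highest sustainable clock.
    return (power_budget_w / (k * transistors_millions)) ** (1.0 / 3.0)

# Hypothetical large die vs. smaller die in the same card power class.
big = max_clock_ghz(1400, 180)
small = max_clock_ghz(950, 160)

print(f"large die max clock: {big:.2f} GHz")
print(f"small die max clock: {small:.2f} GHz")
```

With the cubic relationship, even a modestly smaller die buys noticeably more clock headroom, which is the commenter's point about RV770 versus GT200.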

And another frequently overlooked aspect: not all of GT200s transistors contribute to game performance. The 30 shaders which are 64 bit capable must be large and don't help games at all (and probably won't for quite some time). This is a very forward looking feature for games and a feature of immediate benefit for GP-GPU.

Congratulations to Anandtech for one of the most interesting articles this year. Congratulations to ATI/AMD for putting out their best and most exciting product since R300/9700 Pro.

The industry really needed something like RV770. When the 9700 Pro came out in 2002, it was at the cutting edge of technology and performance, far ahead of the previous champion, the Ti4600, yet it launched at only $399. Nvidia launched the 8800 Ultra and GTX280 at $800 and $600 respectively, even though neither GPU introduced any significant new features, only moderately higher framerates.

I currently have a 4850 512MB which I bought in July and I love it... It runs all my favorite games at great framerates and with fantastic image quality at 1680x1050. Still, I wouldn't consider myself an "ATI fan". When it's time for me to upgrade again, I will buy the best card in the $200 range and won't care whether the sticker on the GPU fan is green or red.

I'm glad you guys were able to stick to the plan and launch at the amazing prices you hit. It really shook up the industry and helped bring higher performance to lower price points. Now we just need the same thing to happen with integrated graphics.

But seriously ... about those future architectures ... maybe you guys want to sit down and have another nice long chat? ;-)

Choice B: make the design price/power/performance efficient for the very profitable mass market. If users want almost double the performance, they can buy two identical cards, which saves on producing a low-volume BIG chip/card. (Though now we have two chips for the X2, that still simplifies things and is reasonably efficient compared to a card design cramming in a single BIG chip starved of bandwidth.)

I want to know what AMD was thinking for those quiet 3 years when they blew Prescott out of the water. Did they see Conroe coming? I think it's time for Anand to put on some black paint and go commando over there at AMD HQ.

Don't forget what RV770 did to the GTX 280: made it completely irrelevant. 2 x 4850 made it so no one would ever bother with nvidia's "monster". Now they just have to get over the hump on driver support. Down with NVIDIA!!

Best article I've read on your site in a long time. I crave all the performance benchmarks and reviews of new products but the back story behind the creation of the RV770 is amazing. I will be building myself a new rig very soon and I've been following hardware religiously in the last few months to help me make my decision. A new 4850 or 4870 will def end up in my new build.

This article is the best I've read on any tech site. Loved it! I hope Anandtech has more behind-the-scenes stories like this again, and I also hope that companies continue to give these types of interviews. It was a great journalistic piece that made the company all that much more human. Thanks!

I've been reading for 2-3 years and was too lazy to comment...but I found this article compelling enough to create an account just to say how much I liked it =)

For a student studying both compsci and business/mgmt, the dual focus on engineering and business challenges was very interesting. Though there was a very obvious potential for a "rah-rah ATI!" bias given the nature of the interview, especially when discussing R600.

There must be echoes in here, because I'm adding my words to the mix. In the roughly 10 years that I've been reading AnandTech (yes, I remember reading the Celeron "launch" article and the whole celery jokes that went with it), I must say that this is one of the best articles I've read here.

It's articles like this that keep me coming back to AT all these years. Everyone and their dog can benchmark and put up pretty graphs (no offense, Derek), but it's the meaty articles like this one that give AT that leg-up over the competition.

Thanks, Anand, for an awesome ten years, and here's to ten more!

On page 2, when discussing the Radeon 8500, you have to remember that the 8500's intended competition was the GeForce 3 series, against which it was fairly competitive (especially at the end of its life). ATI never really released a product to compete with the GeForce 4 cards.

Kudos to Anand for such a great article, extremely insightful. I may even go out and purchase AMD stock now :)

I love AMD even when it’s on the bottom, I own 780G + X2 + hd4850, in hopes that Deneb (or AM3 processors for that matter) will come in time to repeat the success of rv770 launch, at which point I will upgrade my obsolete X2 and have a sweet midrange machine.

My only concern is that Nvidia is looking at all this smirking and planning an onslaught with the 55nm refresh. There is a very “disturbing” article at Xbitlabs that Nvidia is stock-piling the 55nm GT200 parts; seems like that’s something they would do – start selling those soon and undercut 4800 series badly.
I’m just a concerned hd4850 owner and I don’t want to see my card obsolete within a couple of months. I don’t really see AMD’s answer to the 55nm GT200 coming in such a short period of time?!?!

I don't think you'll have to worry too badly about the 55nm G200s. NVIDIA won't drop prices much, if at all; they're already smarting from the price drops enacted after the RV770 launch. There's also the fact that the 4850 isn't in the same market space as any of the G200 cards, so they're not really competitive anyhow.

I always imagined designing GPUs would be very stressful given you're trying to guess things years in advance, but this inside look at how things are done was very informative.

On GDDR5, it's interesting to read that ATI was pushing so hard for this technology and they felt it was their only hope for the RV770. What about GDDR4? I thought ATI was a big supporter of it too and was the first to implement it. I'm pretty sure Samsung announced GDDR4 that could run at 3.2GBit/s in 2006 which isn't far from the 3.6GBit/s GDDR5 used in the 4870, and 4GBit/s GDDR4 was available in 2007. I guess there are still power savings to be had from GDDR5, but performance-wise I don't think it would have been a huge loss if GDDR5 had been delayed and ATI had to stick with GDDR4.
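The arithmetic behind this comparison is easy to check: peak memory bandwidth is just the bus width in bytes times the per-pin data rate. A minimal sketch, using the 4870's 256-bit bus and the 3.6 Gbit/s GDDR5 figure from the article, plus the commenter's 4 Gbit/s GDDR4 figure as a hypothetical on the same bus:

```python
# Peak memory bandwidth = (bus width / 8 bits per byte) * per-pin rate.
# The 4 Gbit/s GDDR4 case is the commenter's hypothetical, not a shipping card.

def bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    return bus_width_bits / 8 * data_rate_gbps_per_pin

gddr5_4870 = bandwidth_gbs(256, 3.6)
gddr4_hypo = bandwidth_gbs(256, 4.0)

print(f"GDDR5 @ 3.6 Gbit/s on a 256-bit bus: {gddr5_4870:.1f} GB/s")
print(f"GDDR4 @ 4.0 Gbit/s on a 256-bit bus: {gddr4_hypo:.1f} GB/s")
```

On raw numbers the two are indeed close (about 115 GB/s vs 128 GB/s), which supports the commenter's point that the bigger GDDR5 wins were power and signaling rather than peak bandwidth.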

And another interesting point in your article was definitely about the fate of the 4850. You report that ATI felt that the 4870 was perfectly specced and wasn't changed. I guess that meant they were always targeting the 750MHz core frequency that it launched with. Yet ATI was originally targeting the 4850 at 500MHz clock. With the 4870 being clocked 50% faster, I think it should be obvious to anyone just looking at the clock speed that there would be a huge performance gap between the 4850 and 4870. I believe the X1800XL and X1800XT had a similarly large performance gap. Thankfully Dave Baumann convinced them to clock the 4850 up to a more reasonable 625MHz core.

One thing that I feel was missing from the article was how the AMD acquisition affected the design of the RV770. Perhaps there wasn't much change, or the design was already set so AMD couldn't have changed things even if they wanted to, but they must have had an opinion. AMD was probably nervous that they bought ATI at its height, when the R580 was out and on top, but once acquired, the R600 came out and underperformed. It would be interesting to know what AMD's initial opinion of ATI's small-die, non-top-tier strategy was, although it now seems to be more consistent with AMD's CPU strategy, since they aren't targeting the high end there anymore either.

The best thing I've ever read on a tech site. This is why you're better than THG.

Only one typo! It was a "to" when it should have been a "too."

Chalk one up for the red team. This makes my appreciation for AMD rise even more. Anyone willing to disclose internal perspectives about the market like this is a team with less secrecy that I will support with my hard earned cash. So many companies could stand up and take a lesson here from this (i.e. Apple, MS).

I have been an avid reader of this site for close to 8 years. I used to read almost every CPU, GPU and novelty gadget article page to page. But over the years, my patience has grown much lower, and I realize I get just as much enjoyment and information from just reading the first page and last page and skimming a few benchmarks.

However, this is the first article in a while that I spent reading all of it and I thoroughly enjoyed it. These little back stories with a human element in one of the most interesting recent launches provides a refreshing change from boring benchmark-oriented articles.

I hope to find an article of a similar nature based on Nehalem and other Intel launches.

I wish you had gotten greedy! I want to know about RV870, and about nVidia's first DirectX 11 part too.

I had been thinking about building a new gaming rig in Q1 2009, but presently it looks like I'd be spending too much for too little improvement over my current box. I'm hoping that changes by late summer. :)

This is one of the best articles I have read here, and there have been so many over the years. In the more than 10 years I have been coming here I have always enjoyed this site, and Anand continues to produce great content. Here's to another 10 years!

Anand, you and all the hard working people at this website have just outdone yourselves. You raised the bar yet again.

Your readers are probably as amazed as you are that AMD/ATI came out with such personal and intimate information as to what goes on behind closed doors. Your conclusion is on point as well. Without competition, we know these other companies will run wild with their prices. Unfortunately for us, the fate of competition in both the GPU and CPU market falls on AMD which needs a little financial lovin' right about now.

My strongest desire is that the CPU team over at AMD pulls out all the stops with their next CPU to Bulldoze the competition (or at least their prices). We need to make $1000 CPUs a thing of the past. Maybe your site and others can put pressure on AMD to pull their CPU roadmap in by about a year so we see Bulldozer in 2010.

Yup. I have always voted Red with my wallet though! The only two green cards I had were the Riva TNT and the 7600GT. Got rid of that after a couple of months though: it was noisy and the drivers sucked.

What a great story. It really shows how long it can take to make a significant change toward a process that is right for the end user. All those external factors, e.g. no GDDR5 at the time and G80's prosperity, can really suck out an engineer's motivation. Good thing those few engineers stuck to the task.

The card itself looks so neat and well designed from a bird's-eye view. To see how the internals were patiently designed too is awesome!

The article showed how the RV770 came to life using just words. Time to bring my PC to life using this card, I say :-)

Great job on the article! Generally today's reviews consist of me quickly going to the benchmarks portion and seeing if a new game was used or if any screwy results came out. This article, however, was much different. You had my attention from the get-go, and I didn't take a break in my reading until the whole article was finished.

It is a real shame that so much of the work in reviews is overlooked in favor of simple graphs, but this article was different, and I thank you for that.

I really enjoyed the read, and it really gives me an appreciation for the card I just happened to get (HD 4850).

I may not speak for others, but these are the kind of articles I like to read, the kind that really explain in detail what's really happening. Anand, you did an excellent job of giving perspective to the article (be in ATI's shoes in 2007 when NVIDIA was doing this... etc.), drawing the line between the "so obvious" hindsight we have now and the "this is suicide!" view it must have seemed like to be there in 2005.

Now, for my own counter-perspective, I can understand why AMD, Intel, and NVIDIA may not do this very often. On the flip side of the coin, I'm not a mainstream user, and I don't exactly build 1000s of computers that ATI can sell. Bottom line: a story that's interesting to me doesn't bring them $$$$. And on top of that, this story also gives a lot of info to the competition, which can be at best a double-edged sword, and at worst too much information to be self-destructive.

The price fixing took place before AMD bought ATI. And it would be safe to assume that it stopped at that time at the latest, though it probably stopped well before that (the earliest evidence is an e-mail from 2002). AMD should know better than to point the finger at Intel while doing something equally wrong in another segment of their business.

Keep up the good work and never let the haters get you down! There are always people b!tching when they don't know how hard it is to write well (any moron can "write"). But it's good stories like this that have been the bread and butter of AnandTech.

The pressure of deadlines, writer's block, or not having enough to write about. I appreciate what you do and I know it's stressful at times. Others can sympathize, but I can empathize, having been an amateur journalist myself (in high school and at the university newspaper).

I have to admit this is one of the best articles I have read anywhere on the web in a long time. It is very insightful, interesting, and even compelling at times. Can you do a follow-up, only from an NVIDIA perspective?

I totally agree. This article is superbly written. One of the best tech articles I've read in a long long time, out of any source, magazine or online. I highly doubt nVidia will be as willing to expose their faults as easily as ATI was to expose their success; but I could be entirely mistaken on that.

In either case, well done Anand. And well done ATI! Snagged the HD4850 two days after release during the 25% off Visiontek blunder from Best Buy during release week. I've been happy with it since and can still kick around the 8800GT performance like yesterday's news.

I agree about the insight especially. Gave us a real look at the decision making behind the chips.

This got me excited about graphics again, and it leaves me eager to see what will happen in the coming years. This kind of article is what will draw readers back. Thank you Anandtech and the red team for this amazing back stage pass.

Like others have said, this is probably the best article I've read in recent memory. It was IMHO well written and interesting. Kudos to ATI as well for divulging the information.

I second the notion that similar articles from nVidia and Intel would also be interesting. Any chance of AMD's CPU division doing something similar? I always find the architectural articles interesting, but they gain more significance when you understand the reasoning behind the design.

This is easily one of my favorite articles on this website. It really puts a lot of aspects of the GPU design process into perspective, such as the sheer amount of time it takes to design one.

I also think this article really adds a great deal of humanity to GPU design. The designers of these marvels of technology are often forgotten (if ever known by most) and to hear the story of one of the most successful architectures to date, from the people that fought for this radical departure... It's amazing, to say the least.

I really envy you, Anand. You get to meet the geek world's superheroes.

I couldn't agree more! This could be the best article I've read here at AnandTech, period. The performance reviews are great, but once in a while you need something different or refreshing, and this is precisely that.

Couldn't agree more. Quite simply, this is the best article about hardware architecture I have ever had the distinct pleasure of reading. The human perspective adds an element of drama that cannot be overstated. Very nearly reads like a Hollywood script, a la Jeff Bridges in "Tucker".

"Carol recalled a story where Rick Bergman and others were at a table discussing RV770; Rick turned to Matt Skynner and asked him if he thought they could really do it, if they could make RV770 a smaller-than-NVIDIA GPU and still be successful, if it was possible to create a halo in the Performance segment."

I think, though, that it is sad that you missed the opportunity to get the best insight you could into future GPU trends and technology for the coming years. That would have been an even better article.