The 6600GT had exactly half the pixel pipelines and memory bus of the 6800GT/Ultra, and this makes me think the 7600GT will stand in the same relation to the 7800GTX. To add weight to this theory, a 128-bit card would be much cheaper to produce and, with a smaller die, more economical and cooler-running. Running cooler, smaller fans can be used, saving more money. Also, NVidia would probably want to keep the PCB smaller for mainstream parts (something that I would like myself).

So basically, my suggestion is that the 7600GT will be a 12-pipe, 128-bit card, probably with those 12 pipes matched to 8 ROPs (just as the 6600GT's 8 pipes were matched to 4 ROPs). Around 5 or 6 vertex pipelines would sound about right too. If the core ran at 450MHz or even 500MHz with 12 pipes, and was paired with 1100MHz memory, it would likely turn out somewhere in between a 6800 and a 6800GT in performance, but importantly, it would be cheaper to produce and cooler-running.
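For what it's worth, this speculation is easy to sanity-check with back-of-envelope numbers. The sketch below compares raw texel fillrate and memory bandwidth; the pipe counts and clocks for the shipping cards are quoted from memory, and the "7600GT" figures are pure guesswork from this post, not anything announced.

```python
# Back-of-envelope comparison: texel fillrate (pipes x core clock) and
# memory bandwidth (bus width x effective memory rate). All figures for
# the hypothetical 7600GT are speculation, not real specs.

def texel_fillrate_mtexels(pipes, core_mhz):
    # Assumes one texture unit per pixel pipeline.
    return pipes * core_mhz  # MTexels/s

def mem_bandwidth_gbs(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000  # GB/s

# name: (pipes, core MHz, bus bits, effective memory MHz)
cards = {
    "6800":              (12, 325, 256, 700),
    "speculated 7600GT": (12, 450, 128, 1100),
    "6800 GT":           (16, 350, 256, 1000),
}

for name, (pipes, core, bus, mem) in cards.items():
    print(f"{name:>18}: {texel_fillrate_mtexels(pipes, core):>5} MTexels/s, "
          f"{mem_bandwidth_gbs(bus, mem):.1f} GB/s")
```

The guessed part lands between the 6800 and 6800 GT on fillrate, but below both on memory bandwidth, which is exactly where a 128-bit bus would pinch.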

I think maybe some of you are taking the article a little too seriously. Most hardware articles nowadays are geared toward high-end tech for good reason: it's interesting technology and a lot of people want to read about it. It's useful information to a lot of people, and a lot of people are willing to pay for it. You want entry-level and mid-range video reviewed too? That's fine, but you'll have to wait like everyone else; AT can't force NVIDIA to push out their 7xxx entry-level/mid-range tech any faster. When it's ready, you'll probably see some type of review or roundup.

Well, there is still no AGP option for 939 AGP owners, and the performance difference between the Ultra and GT this year is a lot more significant than last year's. I would hate to spend 500 dollars on a "crippled" 7800 GTX. Not to mention ATI is still a bench warmer in this competition. It just seems like upgrading this year is not even worth it to a 939 AGP owner, no matter how much of a gamer you are. I'm disappointed in the selection this year. The performance is there, but the price/value and inconvenience are above and beyond. Last year was a great time to upgrade, while this year seems more like a money pit with no games to justify it.

I initially agreed with that statement until I thought about 90nm parts. Correct me if I am wrong, but Nvidia has no 90nm parts.

While Nvidia's current 7xxx and 6xxx lines provide a broad range of performance, I'm sure Nvidia could increase profit margins by producing 90nm parts.

Nvidia can simply take 90nm versions of the 6800 GT and Ultra chips and rebadge them as the 7600 vanilla and GT. Since this involves a simple process shrink and no tweaking, these new 90nm chips could possibly be clocked higher and draw less power while increasing profit margins, without the cost of designing new 7600 chips based on the G70 design. That would make everyone happy.

I would like G70 technology on 90nm ASAP. I have a feeling Nvidia didn't do a 90nm shift for NV40 for a reason: that core is still based on AGP technology, and Nvidia currently doesn't have a native PCI-E part for the 6800 line; they are all using HSI on the GPU substrate, from the NV45 design.

NV40 on 0.13 micron is 287mm2, as pointed out by a previous poster. A full optical node shrink from 0.13 micron to 0.09 micron, without any other changes, would bring NV40's 287mm2 die down to ~172mm2, as a full-node optical shrink generally yields a die around 60% of the original size. That die size may not be enough to maintain a 256-bit memory interface.

Hence Nvidia is rumored to do only a 0.11 micron process shrink (NV48) of the NV40, as that would bring the core down to about 230mm2, which is 80% of the size: still large enough to maintain the 256-bit memory interface with little problem.
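The shrink arithmetic here can be checked in a couple of lines. An ideal optical shrink scales die area by (new node / old node) squared; the ~60% and ~80% rules of thumb quoted above sit higher than the ideal, plausibly because pads and analog blocks don't scale. This is a minimal sketch using the 287mm2 figure from the thread:

```python
# Compare ideal optical-shrink area scaling against the rule-of-thumb
# percentages quoted in the post. The 287mm2 NV40 die size is taken
# from the thread; everything else is simple geometry.

NV40_MM2 = 287  # NV40 die size on 0.13 micron

def ideal_shrink(area_mm2, old_um, new_um):
    # Ideal scaling: linear dimensions shrink by new/old, so area
    # shrinks by (new/old)^2.
    return area_mm2 * (new_um / old_um) ** 2

print(f"ideal 0.13 -> 0.09 shrink: {ideal_shrink(NV40_MM2, 0.13, 0.09):.0f} mm2")
print(f"60% rule of thumb:         {NV40_MM2 * 0.60:.0f} mm2")
print(f"ideal 0.13 -> 0.11 shrink: {ideal_shrink(NV40_MM2, 0.13, 0.11):.0f} mm2")
print(f"80% rule of thumb:         {NV40_MM2 * 0.80:.0f} mm2")
```

The ideal numbers (~138mm2 and ~205mm2) come out smaller than the ~172mm2 and ~230mm2 rule-of-thumb estimates in the post, which is the expected direction for a real-world shrink.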

Making a 90nm G7x part directly for the mainstream segment would be very nice.

Let's say it has 16 pipelines and 8 ROPs to help save transistor space, plus the enhanced HDR buffers and transparency AA. It would be fairly close to the 170mm2 range, I believe. It would probably still be limited to a 128-bit memory interface, but the use of 1.6ns GDDR3 @ 600MHz could help alleviate the bandwidth problems some. Remember, large amounts of memory bandwidth combined with high fillrate are reserved for the higher segments; it's very hard to have your cake and eat it too in the mainstream.

Let's face it: for the time being, we're not going to be getting fully functional high-end cores at the 199US price point with a 256-bit memory interface. So far we have gotten things like the Radeon X800, Geforce 6800, 6800 LE, X800 SE, X800 GT, etc. It just doesn't seem profitable to do so.

From what we have seen, mainstream parts are usually based on tweaked technology: the RV410 (Radeon X700) and NV36 (Geforce FX 5700) are mainstream cores based on the third and second generations of R300 and NV30 technology.

The 6800 @ 199US, 6800 GT @ 299US, and 6800 U @ 399US lineup is a temporary measure, and production should slow on these cards as Nvidia ramps up the 90nm G7x-based parts.

"I would like G70 technology on 90nm ASAP. I have a feeling Nvidia didn't do a 90nm shift for NV40 for a reason: that core is still based on AGP technology, and Nvidia currently doesn't have a native PCI-E part for the 6800 line; they are all using HSI on the GPU substrate, from the NV45 design."

I believe Nvidia didn't want another 5800 fiasco. They probably determined a long time ago that 110nm was a safer bet and used the 6600 as a guinea pig. Having a successful launch of the 6600 gave them confidence that manufacturing a 110nm G70 would be a painless process.

Furthermore, the 7600 will be a mid-range card and will target a market segment that is more than likely dominated by AGP boards. So an NV40-based 7600 would make perfect sense, since the majority of the 7600s sold wouldn't require an HSI chip.

"Let's faice it for the time being, were not going to be getting fully fucntional high end cores at the 199US price point with 256Bit Memory Interface, so far we have gotten things like Radeon X800, Geforce 6800, 6800 LE, X800 SE, X800 GT. Etc etc. It just doesn't seem profitable to do so."

The X800 GT is a 256Bit Memory Interface Card and targets the 6600 GT segment.

"The X800 GT is a 256Bit Memory Interface Card and targets the 6600 GT segment."

I guess you missed reading the "fully functional" part; the X800 GT does not comply with that statement.

I guess I didn't get my meaning across. When I said G70 technology, I was talking about the mainstream cards going to 90nm, not the 7800 GTX/GT.

For a mid-range part, the risk of going to 90nm would be reduced, as the core is not quite as complex. Nvidia did make a safe bet going to 110nm for their high-end cards; I am asking for a G7x-technology-based performance (199US) card on 90nm. Not on the high end.

Targeting PCI-E now would be a good idea, as there are now boards on both sides that have had PCI-E support for a decent amount of time, and it's the more forward-thinking architecture. Not to mention the possibility that power consumption on the 7600 GT could be reduced enough to put it solely on the PCI-E bus with no bridge chip at all. There isn't much point in designing a native AGP chip now, unless you're talking about the value segment, where margins per card are extremely thin.

For the AGP users, I believe the 110nm NV48 can continue to serve, but I would like PCI-E users to benefit from a 90nm PCI-E-native 7600 GT, with possible bridging to AGP if demand calls for it. There isn't much point in calling the mainstream card a 7600 GT if it's not based on G7x technology. We don't want Nvidia to follow ATI's lead on that kind of front. :)

The FX5800 was Nvidia's attempt to introduce a new high-end architecture on a new process (130nm) it had never used before, just like ATI is doing now. The 6xxx line (130nm) is not new tech, so producing the mature NV40 architecture on 90nm or 110nm should go a lot smoother. Even at 130nm the NV40 is smaller than the G70 at 110nm (287mm² vs. 334mm²). Moving the NV40 to 90nm would reduce die size to ~200mm². Look at the 6600GT: 110nm, 150mm², 8 pipes(?) vs. a 90nm NV40 at ~200mm² with 16 pipes.

Unless they can make a 90nm "7600GT" part backwards compatible (via SLI) with the 6800GT, NVIDIA is in a "damned if you do, damned if you don't" position. As a 6800GT owner, I'd be rather sad to suddenly have the promise of upgrading to SLI yanked away.

I don't know. At least they aren't doing those ugly case reviews anymore, but they sure are still making me feel alienated.

That first page smacks of elitism. Why can't we average people with a 5900XT (or even a 5200) upgrade to, say, a 7600 that uses less power and thus is less noisy and easier to cool than a 6600 or 6800?

I wonder which of the 2 authors wrote that paragraph?

I guess this could be a symptom of Anand and his Apple usage, because Apple people are often very elitist. Or it could be that they want to be the upmarket tech website for people with lots of money and think Tom's Hardware is better suited to us unwashed (FX 5200-wielding) masses.

Actually, this whole new colour scheme smacks of cold, suave elitism! Not the warm, yellowish, homey feel of old...

As has been pointed out in this very comments section, a 7600 release would be redundant because there already is a 16-pipe, 350MHz part with 6 vertex pipelines: the 6800GT. There is no elitism; it's the raw fact that a 7600GT would be identical to a 6800GT in specifications and (most likely) performance, rendering it pointless to spend time fabricating one when the 6800GT serves just as well.

As for the article, I noticed that the 7800GT was outperformed by the 6800U in some SLI applications (like UT2004). Is that related to memory bandwidth, or is it a driver issue with the 77.77 beta drivers you tested with?

Yes, the 6800 GT will come down to $250 and likely even lower over the next few months. You can already buy a 6800 GT for $270 (from our realtime price engine).

The 6800 GT is not a noisy part. The HSF solution for the 7800 GT is strikingly similar. A lower performance G70 part may run cooler and draw less power, but, again, the 6800 GT is not a power hog.

There really is not a reason for us to want a lower-performing G70 part -- prices on 6 Series cards are falling and this is all we need. Even if NVIDIA came out with something like a "7200 that performs like a 6600", the 6600 would probably be cheaper, because people would think the 7 means more performance -- meaning the 6600 would be a better buy.

Good question. xbitlabs.com state in their review that, "The GeForce 7800 GT reference graphics card we had in our lab proved quite overclockable. We managed to increase its working frequencies from 400MHz for the chip and 1000MHz for the memory to 470MHz for the chip and 1200MHz for the memory, which should ensure a significant performance increase during our tests." Unfortunately they did not include overclocked results in their graphs.

Also, xbitlabs noted that the PCB is shorter than the GTX's, that the cooler is shorter, and commented on noise levels, given that, unlike the GTX, the GT is apparently unable to regulate its fan speed. It is a shame that anandtech missed out on such details.

I'm thinking this is a typo from when you re-made the charts for BF2 to include the NV 6800 Ultra. The ATI Radeon X800 XT is really the ATI Radeon X850 XT PE, right? Maybe I am wrong. Just wanted to point out the fluke so you guys can fix it. Good read so far!

"Until performance is increased beyond the 7800 GTX, it will be hard for us to see a reason for a new desktop 7 series part."

So you rich boys got your flashy new toys, and now you see no need for a new desktop 7 Series part because you're too busy thinking about how you're gonna be playing games on the go (i.e. on your new laptop GPU).

What about those of us stuck at the 6600 GT level? Don't we deserve an upgrade??? Like a 12-16 pipe, 256-bit 7600 GT? I guess we are just fine, since we're stuck with 1280x1024 monitors anyway, right? WRONG! We've been cheated out of a significant upgrade long enough. Until the 6600 GT there was only BS in the mainstream for about 2.5 years (R9500/9600/Pro/XT, GF5600/Ultra, ...) and we're not going back there, no sir!

Same sh!t as when people with broadband post large uncompressed images on the web and forget about all those on dial-up, even though they themselves left that sorry bunch not too long ago. The world is a wee bit bigger than your own back yard, and someone writing for a site as big as AT should really know that.

I will agree with you that the 6600 GT was the first real solid mainstream option in a while. It's a good card.

I'll argue that the next mainstream card you'll want to look at is the 6800 GT. There are 128MB parts and 256MB parts, all with 256-bit buses and 16 pixel pipes at good clock speeds.

As others have said, a G70 part released with the same specs as the 6800 GT will have the same performance as well.

We tried to explain in the article that the 6 Series comprises the rest of the lineup going forward. There are no performance gaps that the G70 needs to fill in. The only reason NVIDIA would want to release slower G70 parts would be to phase out the 6 Series part at that same speed grade.

It also doesn't make sense for NVIDIA to immediately release a slower G70 part. The lower transistor count and more mature process used on NV4x chips will likely make it easier for NVIDIA to sell the parts at a lower price and higher profit than equivalently performing G70 parts. The economics of this are very complicated and depend quite a bit on NVIDIA and TSMC and the cost per IC for NV4x and G70 chips.

It would almost make sense for NVIDIA to take less functional G70 chips and sell them as 6 Series parts. But maybe I'm wrong. Perhaps NVIDIA will think it can trick people into thinking a part with the same performance as a 6800 or 6800 GT and a 7 Series name is worth more money.

It really shouldn't matter to anyone whether 6 Series parts keep falling in price or other 7 Series parts come out. I stand behind my statement. We'll see lower performing G70 parts come out if and when it becomes more financially viable for NVIDIA to sell those parts than NV4x parts. There really isn't any other factor that matters.

Transparency AA may be an interesting thing, but moving towards the budget end of the spectrum will tend to make the performance impact of Transparency AA too high to matter. Other little tweaks and features have already been covered and aren't that compelling over the 6 Series, and seeing a G70 perform the same as an NV4x for the same price really wouldn't be that exciting no matter what my budget looks like.

It would be interesting to see if the 90nm-based 7600 part has a 256-bit memory interface, as it seems 256-bit cards usually need a minimum die size of around 200mm2, just out of range for a high-volume mainstream-level card.

I would certainly want a 7600 GT if it had a pipeline configuration similar to the 6800 or 6800 GT, because G7x does have some improvements I like, most notably the improved HDR performance and transparency AA. And what about WGF 1.0 support? It is usually better to base a new mainstream card on newer tech than to just shift older technology down to lower price points, as those cards weren't designed for those price points and are usually more expensive to produce.

Not all of us can even afford the current 399US price point of the 7800 GT.

The 9800 Pro had to be the best budget investment in all of video card history (as far as I can remember, anyway). The price of the 128MB version quickly dropped below $200, and it was pretty much the top-performing card on the market at that price. Now it's impossible to find a sub-$200 offering that will last more than 6 months.

I think what the two of you are missing is that the Geforce 6 Series is basically the same technology. This pushes the 6800GT a little closer to mainstream: it can currently be had for around $250-300, and it will probably drop to the $200 level in the next few months, which is about where it could be considered mainstream (and probably about what you paid for your 6600GT). Any new "mainstream" 7xxx releases would simply overlap current 6xxx parts. If the 7xxx series showed more of a technological difference, then this sort of thing would make more sense. If you go to Nvidia's home page, you'll see their "power of 3" marketing push, and the list of products that all share this same technology.

Yup.
The original request was for "Like a 12-16 pipe/ 256-bit 7600 GT"
Err, the 6800, 6800GT are 12 and 16 pipe, 256bit SM3 cards.
How would a 7600GT type part be ANY different than what we already have, except for being cooler/cheaper to make?
In terms of performance, it's just a case of the 6800 dropping down to cheaper prices.
And as I said in the forums:
"Lower-end redesigned parts to replace the 6xxx series would probably use 0.11 micron tech, which would lead to smaller dies and lower power consumption, reducing heat in the end and reducing the cost to make the die in the first place, making it a better deal for both consumers and nVidia, although transitioning the NV40 to 0.11 might not be worth the hassle."
i.e., there's already a current part filling the niche you want a brand new part for, and the only probable changes would be a transition to a smaller process and a drop in price.

While the 6800GT would be warmer than a hypothetical 16-pipe 7600GT, it would not be slower. On the contrary, the 6800GT would probably be faster. Why? Because it has a 256-bit memory-bus, and in all probability the 7600 series would have a 128-bit bus.

The 6800 and 7800 series are designed as high-end parts, and for that they need the 256-bit bus to deliver satisfactory performance at the extreme resolutions with AA that high-end users demand. The 6600 and a possible 7600 are designed as mid-range parts; they do not need to run at such high resolutions with AA, so they can make do with a 128-bit bus, as memory bandwidth is less important at medium resolutions. The 128-bit bus lowers the cost of manufacturing the core, and also of the card's circuit board, which is much simplified, and provides a clear distinction between the mid-range and high-end parts.
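A very rough illustration of why bus width matters so much more at high resolution with AA: frame-buffer traffic scales with pixels, AA samples, and frame rate. All the factors below (8 bytes per sample for colour plus Z, overdraw of about 3, one read plus one write) are assumptions for illustration only, and texture traffic is ignored entirely.

```python
# Crude frame-buffer bandwidth estimate:
#   pixels x AA samples x fps x bytes per sample x overdraw x read/write
# Every constant here is an illustrative assumption, not a measurement.

def framebuffer_gbs(width, height, fps, aa_samples=1,
                    bytes_per_sample=8, overdraw=3.0, rw_factor=2.0):
    samples = width * height * aa_samples
    return samples * fps * bytes_per_sample * overdraw * rw_factor / 1e9

mid_range = framebuffer_gbs(1280, 1024, 60)               # mid-range target, no AA
high_end = framebuffer_gbs(1600, 1200, 60, aa_samples=4)  # high-end target, 4xAA

print(f"1280x1024, no AA: {mid_range:.1f} GB/s")  # fits easily in a 128-bit bus
print(f"1600x1200, 4xAA:  {high_end:.1f} GB/s")   # wants the 256-bit bus
```

The mid-range case comes out under 4 GB/s while the high-end case is over 22 GB/s, which is the rough shape of the argument: a 128-bit bus is adequate at medium resolutions, but extreme resolutions with AA eat bandwidth far faster.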

Do you really think they'd release a 16-pipe 256-bit 7600GT, when the 7800GT is "only" a 20-pipe card? The performance gap would be too narrow. A 12-pipe 256-bit 6600GT would be a possibility, but it would make a lot less sense than a 16-pipe 128-bit 6600GT.

If you want a 16-pipe 256-bit card, just go out and buy a 6800GT. It's every bit as good as the 16-pipe 256-bit 7600GT you want but which will never be released.

Ooops the third paragraph should read "Do you really think they'd release a 16-pipe 256-bit 7600GT, when the 7800GT is "only" a 20-pipe card? The performance gap would be too narrow. A 12-pipe 256-bit 7600GT would be a possibility, but it would make a lot less sense than a 16-pipe 128-bit 7600GT."

It may be basically the same tech, but my 6600GT will never perform like a "7600GT" would. And that's basically what I'm interested in. I don't see why a 7600GT part couldn't be sold at the $200-$220 price point. I'd buy it.

Likewise, your 6600GT won't perform like a 6800GT... which would perform the same as a 7600GT... and the 6800GT will be selling at the $200-$220 price point within six months, probably sooner.

What I might find exciting... would be if ATI actually gets their new card out... and has new mainstream parts (available to buy) that drive down the prices of the X800/X850 and Geforce 6800GT even more, and maybe introduce some new technology, too.

Has anyone heard anything about naming of the new ATI cards? Are they finally ready to give up the Radeon name... or their new X naming? If they went with X900 for the new flagship, would they be able to release new mainstream parts, since there would be nothing to name them?!

"(1) I understand that taking new tech and reviewing it on launch day, etc., is important. (2) Then comes the mass production of the tech by different manufacturers, so there's a need for the readers to be informed on the differences between the different products. (3) Then there's the difference between the interim releases after the initial launch of the new tech that also need reviewing and explanation. From those three different times of a piece of new tech, I would typically expect 3 articles or so for each piece of said new tech. From my initial post, I have just been surprised that what seems to be happening are lots of reviews centered around the second phase of your review cycle, and so that's why I was asking whether this is really what readers want to see on AT all the time (i.e., $500 graphic cards to ogle and wish a relative would die so that we could afford it)."

"Can't tell you how weird I felt last night to read the new article about the $3000 desk. I guess it helps to have some off-the-wall review about such a nice piece of desk. But is that really what the readers want to see? More hardware that they can't afford? One poster above me here mentioned that you've lost touch with your readers, and sometimes, I wonder whether you're really just trying to fill a niche that no one else is really pursuing in an effort to either drive the industry in that direction or just cater to a crowd that may or may not even visit here. Who knows. I sure got confused with such an article. These 7800GTX articles have done the same for me."

"I don't know what to tell ya to do, because I'm not in your position. But I certainly don't feel as at home on this site as I used to. Am I getting too old to appreciate all this nice shiny new expensive hardware?? :)"

4 out of the last 5 articles on AT are all this high-end tech! Where's the sweet spot? The budget? ANYTHING ELSE BUT THE HIGH-END??

What else is there to review? I mean, it's not like Nvidia has released the 7600 Series yet, and the RV530 is nowhere to be found either. Typically a high-end piece of hardware is new, and remember, Anandtech did review the Athlon 64 X2 3800+. Though I would like to see a review of the recently announced Sempron 3400+. I would also like to see how the new Celeron D 351 stacks up as well.

I am not sure it's all that interesting to review the same video card over and over again, like a reference 6600 GT vs. a new one with a more advanced heatsink, then one with a better software bundle, etc...

I have my doubts as to whether a 7600-type card will even *BE* launched in the next six months. Think about it: why piss off all the owners of 6800GT cards by releasing a new card that isn't SLI compatible? From a customer-support standpoint, it's better to keep the older SLI-capable cards in production and simply move them to the mid-range and value segments. Which is exactly what NVIDIA did with the 6800 and 6800GT with this launch. Now if the 6800U would just drop to $350, everything would be about right.

The 7800GT is slightly slower than a 6800 Ultra SLI setup, and the GTX is on par or faster. The GT and GTX both cost less than the additional 6800 Ultra needed to upgrade to SLI, so SLI is rather useless. Why opt for an extra power-hungry 6800 Ultra when you can just swap in a lower-power 7800 GT, or a better-performing and lower-power GTX, for less money? This will happen with 7800 GTX SLI setups too. SLI should only be a consideration as an initial buy (for rich gamers who want the absolute best), not as an upgrade path for later. Gotta love nVIDIA "rendering" their own technology useless, lol!

I'd like to suggest maybe using 1920x1200 for high res tests. The popularity of widescreen gaming (where possible) is growing, and this provides a more commonly used "extreme resolution" than the 2048x1536, thus, imo a bit more relevant.

I second this motion for 1920x1200!! Why test at 2048x1536 when most people who could afford these monitors (albeit CRTs) would likely go for widescreen instead? Slightly fewer pixels but better visual impact... (nb: love watching other CS players not spotting an enemy on the periphery of my screen, presumably because their monitors are not widescreen!)

First off, no gamer plays videogames at resolutions above 1600x1200! Most of us stick to 1024x768 so that we can get high framerates, enable all the features, and play on the highest settings. In addition, you did not show how the GT and GTX stack up against the previous generation, such as the 6800 Ultra, GT, and the 5950 Ultra. And where is the AGP version? My computer is 2 years old and I am upgrading my graphics card soon. I guess I'll wait to see if ATI makes AGP cards for their next generation. And where the heck is the R520? ATI is really lagging this time around. Hopefully we will get some AGP love. AGP still has a good 2 years of life left in it.

Hey, that's good to know about the vsync... back when I played Doom III, I noticed some of that, but didn't know much about it. I just felt "robbed" because my Geforce 6800GT was giving me tearing... thought maybe it couldn't keep up with the game. But everywhere I went I saw people saying "Vsync off! Two legs good!"


Er, I have a Dell 2405 monitor running at 1920x1200 and I always run it at native resolution where possible (even with my 6600GT, many modern games are *playable*, including CS: Source and Far Cry), so this statement is complete balls. Obviously I would like a faster card to run games as smoothly as possible, so the tested resolutions are extremely pertinent to me.

The high resolutions are needed because at 1024x768 there will hardly be any difference between the 6800GT, 7800GT, X850XT, and 7800GTX; all of these cards handle that resolution easily and will give similar fps because they will all be CPU limited.

I believe the higher resolutions are used because at the lower ones there really isn't much differentiation between the various cards. The article title is "Rounding Out The High End" so hopefully there'll be another comparing the performance against mid-range cards (high-end from previous generation). AGP is missing, but is there really that much difference between the AGP and PCIe versions of the same card?

For those of us who would like to get a better idea of how the 7800GT compares to the 6xxx generation of cards, might you guys consider throwing 6800GT/Ultra (non SLI, preferably) numbers on the benchmarks for comparison?

If you're a 6800GT owner, *and* looking to upgrade that still not-so-cheap and capable beast, you're not so much worried about the money as most people are, but you'll be very interested in performance.

So, wouldn't you want to know if you should spend ~$450 for a more modest upgrade (compared to that 6800GT), or ~$550 for the most performance you could get without going SLI?? If the ~$100 difference is a very big deal to you, you wouldn't be looking to upgrade your 6800GT. (And yes, I know these prices fluctuate, and may each be more or less, but the point is still valid.)

Not including its direct sibling, the GTX, in the benchmarks: now that would have been a lack of common sense. Honestly.

The document posting engine isn't designed to be posted last to first. Normally, articles are uploaded and then activated. Sometimes a mistake is made where an article goes "live" before all the pages have been uploaded.