You know when the time to care was? When nVidia gave AMD the license to do this years ago by breaking the rebranding ground. There seems to be a stable truce in GPU manufacturing; either side is only willing to take baby steps ahead of the other, in terms of corporate scheming or hardware advancement. Step. Follow, follow, step. It's just like dancing!

> You know when the time to care was? When nVidia gave AMD the license to do this years ago by breaking the rebranding ground. There seems to be a stable truce in GPU manufacturing; either side is only willing to take baby steps ahead of the other, in terms of corporate scheming or hardware advancement. Step. Follow, follow, step. It's just like dancing!

When exactly did they make the decision to rename the 8800 GT to the 9800 GT, and isn't one of the GTS 240s a reworked 9800 GT? I'd say that's when: as soon as they took the first step in the direction that made it less shitty for ATI to do the same, and made it bad business if ATI didn't take the same cost-cutting measures as their direct rival. At least it would appear to be bad business to some of the shareholders, who are what really matter, which is why it's happening now on AMD's side.

My 5770 is still doing fine, and these cards are fairly low power consumers, so it makes sense to continue that line. Look what 6770s still sell for. I have no problem with some carry-over, especially when graphics cards are getting so far ahead of software. I just want to see power consumption drop!

> When exactly did they make the decision to rename the 8800 GT to the 9800 GT, and isn't one of the GTS 240s a reworked 9800 GT? I'd say that's when: as soon as they took the first step in the direction that made it less shitty for ATI to do the same, and made it bad business if ATI didn't take the same cost-cutting measures as their direct rival. At least it would appear to be bad business to some of the shareholders, who are what really matter, which is why it's happening now on AMD's side.

So ATI renaming the X550 to the X1050 didn't happen then? I guess renaming the X1300 XT to the X1650 Pro didn't either, and they definitely didn't rename the X1600 Pro to the X1650...

And mind you, this was back in the day when adding 50 to the number was significant enough to distinguish between generations. Back then, doing a die shrink or a similar tweak to aid performance meant you went from an X1900 to an X1950, whereas today that is enough to leap an entire generation in naming. Under today's naming system, the X1950 would have been called the X2900. Or, with the examples I used, the X1650 would have been called the X2650.

Oh, and then there was the X1900 GT, which they reworked after release to reduce performance, kind of like the 9600 GSO that everyone likes to harp on. They did the same thing with the HD 2900 Pro, quietly dropping it from a 512-bit memory bus to a 256-bit memory bus after the initial batch had shipped, severely crippling it. Likely it performed too closely to the HD 2900 XT, and in reality could be flashed to an HD 2900 XT, so crippling the memory bus killed two birds with one stone.

So are you still sure it was nVidia that started it all?

And before anyone gets in a huff, I'm not pointing fingers at ATI here either. I honestly don't know who was the first to rename a GPU to fill a gap in the next generation; my knowledge doesn't go back that far (and I'm too lazy to do the research). For all I know it could have been Voodoo. My point is that they both have been doing it. The G92 rebadges were not the first by a long shot; they were just the first (or maybe just the most recent) where people actually caught on and publicised it.

What sucks is that if they had die-shrunk the 5670, it'd have epic performance for a 65 W part. Maybe they were afraid of a $100 part having enough performance to max out any game? I imagine it's actually a serious concern for them that their high-end cards are becoming less and less relevant due to console stagnation.

> What sucks is that if they had die-shrunk the 5670, it'd have epic performance for a 65 W part. Maybe they were afraid of a $100 part having enough performance to max out any game? I imagine it's actually a serious concern for them that their high-end cards are becoming less and less relevant due to console stagnation.

I'm sure part of the problem is also the immaturity of 28 nm. TSMC isn't expected to be running 28 nm at full blast until at least the end of Q1 2012. But both sides (more so AMD, IMO) are trying to push "new" products out the door at record pace. The more "new" products you keep showing on paper, the better you look...

This is really more about btarunr's banter, which is why we're all off in left field disputing this. So they'll shrink or even rename some low-end chips; it's smart business! It's not like Nvidia is beating down the door to get at that market either; the lowest of the low end is basically gone, so why spend any engineering there?

AMD can keep the 6450 as the minimalist card; they don't need to shrink it, just give it a newer feature set and tinker with it to keep it fanless. Consider how the 8400 GS lives on today. That card would still probably better the GT 520's replacement, while holding a low entry price for those who can't live with an Nvidia IGP or GMA Craphics.

AMD will make APUs prevail and abandon onboard graphics, and all they need is two models that support Hybrid CrossFire. First, a 6550-like card that they don't shrink or change much architecturally. The other, a 6570, might shrink and get the rest of the architecture changes originally slated for Northern Islands that they passed on. Either of those can still be used as a discrete card, and the pricing is very competitive.

The 6670 successor will get elevated to 6770-level performance, while still doing without external power (no 6-pin). This card will get most of the mainstream improvements of Southern Islands while sustaining the VLIW4 architecture (with upgrades) and 128-bit GDDR5. The 7670 will be the stout player at the entry gamer level for anyone who wants to plug and play with their OEM box. I mean, what does Nvidia offer in this segment but the GT 440? Fermi never scaled well at the low end, and Kepler is probably holding the same course.

I see AMD not offering a 7750/7770; that's always been an oddity and it's not coming back (at least at first). When the 5xxx series came along, the 58xx went way up the ladder, and they needed to backfill a cost-effective model at the 4850's level, so the 57xx was born. I think they'll retire the x7xx designation, or, if competition and binned 78xx chips are profuse, they could bring it back later as a 7770, though 7830 sounds more proper.

It was 'Bump'gate: an issue with the solder bumps that caused chips to overheat and stop working. It's the hypothesised reason why Apple turned to ATI. It cost nVidia a lot of money too, as they had to set aside a huge amount for RMAs, etc.

The Nvidia rebranding was pretty systemic, but a lot of it stemmed from the huge success of the 8800 GTX chip (G80, on a 90 nm process?), which they pretty much regurgitated until Fermi.

> I'm sure part of the problem is also the immaturity of 28 nm. TSMC isn't expected to be running 28 nm at full blast until at least the end of Q1 2012. But both sides (more so AMD, IMO) are trying to push "new" products out the door at record pace. The more "new" products you keep showing on paper, the better you look...

...which I thought was the basis for them wanting to test it out on low-end chips first, as has been the reported plan for both manufacturers. This would suggest they're saving 28 nm for the 78xx and 79xx series chips and won't do a full-range update till the 8xxx series. That just seems bizarre. Unless we're going to see 7679s halfway in.

> What sucks is that if they had die-shrunk the 5670, it'd have epic performance for a 65 W part. Maybe they were afraid of a $100 part having enough performance to max out any game? I imagine it's actually a serious concern for them that their high-end cards are becoming less and less relevant due to console stagnation.

I personally think that is the big problem. Look at my system: with the resolution of my monitor, it can play anything I have tried to play. I know people running higher resolutions have more strain, but still... If the performance of graphics cards took two more flying leaps forward, a low-end card would be all ANYBODY needed, and that is not good business.

> What sucks is that if they had die-shrunk the 5670, it'd have epic performance for a 65 W part. Maybe they were afraid of a $100 part having enough performance to max out any game? I imagine it's actually a serious concern for them that their high-end cards are becoming less and less relevant due to console stagnation.

> ...which I thought was the basis for them wanting to test it out on low-end chips first, as has been the reported plan for both manufacturers. This would suggest they're saving 28 nm for the 78xx and 79xx series chips and won't do a full-range update till the 8xxx series. That just seems bizarre. Unless we're going to see 7679s halfway in.

Why bizarre? Considering the console-ification of gaming, and the resulting lowering of graphical standards practically rendering most GPU updates unnecessary for most gaming, they seem to be doing just as much as they need to do. Why spend more money on a completely new line, top to bottom, if it's not necessary?
It's the fault of the game designers not pushing the technological limits, always making sure their designs are console-friendly. The consoles have created an artificial economic and technological plateau. There's no need for Big Green and Big Red to throw too much money into development if they can rebrand.
Also, there have been no major DirectX updates that might call for a major change top to bottom.

Hmm... didn't we already know this? If I recall correctly, the 7300-7800 cards were supposed to be 28 nm VLIW4 GPUs from the start (a die shrink of current models) and the 7900s were going to be GCN. I think the news piece about the HD 7900 series using XDR2 RAM had this info.

If you are really against this practice, and any other practices you feel are suspect (for example, selling low-end cards with lots of memory), why complain about it here?

Tell AMD, or any company that does this, yourself. Maybe they don't know it's bad.

I'll even help you out.

Dear Company,

I feel that your selling of rebranded chips is very misleading to the average computer user. Please stop this practice immediately or I shall be forced to create a group that sits outside your offices until we get what we want. Occupy AMD!!

I don't really see any improvement in performance in 90% of the games I play between my 4830 and my 6950, and I don't expect that to change until the next gen of consoles... So I don't really care if they rebrand; it just means my 6950 will be relevant longer.

Well, as long as the performance increase is relative, not a single f*** will be given. I know this sounds stupid, but say the 6850 became lower mid-range: it's still better than the old low-to-mid-range card from the 6000 series, isn't it? There is still a performance increase.

IF they straight rebrand a card with the same performance and target market, then no, I can't agree with that.

That said, the ATI 9500 Pro of the early 2000s was slightly faster than the 9600 Pro that replaced it. The 9600 Pro was a cut-down 9800 Pro, which was a beast in its day. The 9500 Pro could unlock to a 9700 Pro. And yes, I do remember this from the day, because I read a lot of computer magazines back then, lol!

> And before anyone gets in a huff, I'm not pointing fingers at ATI here either. I honestly don't know who was the first to rename a GPU to fill a gap in the next generation; my knowledge doesn't go back that far (and I'm too lazy to do the research). For all I know it could have been Voodoo. My point is that they both have been doing it. The G92 rebadges were not the first by a long shot; they were just the first (or maybe just the most recent) where people actually caught on and publicised it.

3dfx didn't really rebrand; of course, they weren't around long enough to do that. But they did rebrand within a generation. If I remember right, the Banshee was a crippled V2/V3: a cheaper board, but it didn't sell well.

ATI, on the other hand, has been rebranding Radeons since the beginning. The first Radeon was later rebranded the 7200, and the Radeon 7000 itself was a 7200 without a T&L unit. The 8500 series was a new design, but they rebranded it for the 9000, 9100, and 9200; those later chips were roughly 8500s that had been tweaked/weakened. ATI often practiced this back then: they would release the better GPUs first, and then later release a cheaper-to-produce board based on them, cutting the costs but also the performance.
