> R9 280X = slower than the GTX 770 in most cases; costs $300-$350 on average depending on model
>
> GTX 770 = overclocks like a champ; costs $300-$400 on average depending on model and RAM
>
> Both are very close. I don't see where this "huge" price difference is. That argument is pointless; we can go all the way down to the 750 Ti, same story. Not saying one is better than the other, just that they are close in price/performance.

The cheapest 770 here costs 290 euros; the cheapest 280X costs 250. And of course the 770 is not faster "in most cases".

The 280 here costs on average 20 euros less than the 760.
There is nothing new from Nvidia to put next to the 270X.
The 270 costs a little more than the 750 Ti and is way faster.
Even the 265, which costs the same as the 750 Ti, is faster.
The 260X is faster than the 750.
The 260 is cheaper than the 750.
The 250X is faster than the Nvidia cards in the same price range.
The same is true for the 250
and the 240.
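The list above is really a series of performance-per-euro claims. As a minimal sketch of that kind of comparison, here is how one might rank cards by performance per euro; the prices and scores below are illustrative placeholders, not real benchmark data:

```python
# Hypothetical euro prices and relative performance scores.
# These numbers are placeholders for illustration, not actual figures.
cards = {
    "R9 270":     {"price_eur": 150, "perf": 100},
    "R9 265":     {"price_eur": 140, "perf": 90},
    "GTX 750 Ti": {"price_eur": 140, "perf": 80},
}

# Rank by performance per euro, best value first.
ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["perf"] / kv[1]["price_eur"],
                reverse=True)

for name, c in ranked:
    print(f"{name}: {c['perf'] / c['price_eur']:.3f} perf/euro")
```

With these placeholder numbers the 270 comes out on top, which is the shape of the argument being made: a slightly higher price buys disproportionately more performance.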

> Dude, you are aware he owned a Titan and is currently running a 780 Ti? You just made yourself look like a complete tool, and you and people like you are the reason this forum is getting worse.

We need to team up like Double Dragon, and bring Freedom too! But seriously, yeah, so many people are incapable of reading a post in a neutral manner. It's becoming a PITA, with a lot of people too ignorant or with their bias guns set all the way to 10.

> I love new tech, but until games catch up, or you are gaming at 4K, it's not like the old days. You used to have to upgrade every year just to stay on medium settings! I think tech will be in a slump for a long time until the market demands more. Seriously, an AMD 4200 X2 is more than enough for the average computer user, and a 570 is overkill for the casual player. It's really hard for me to get excited over this stuff anymore.
>
> Maybe I'm just getting older and jaded. For those of you who still get excited... I'm jelly.

I agree, there is no reason to upgrade to powerhouse GPUs until 4K becomes more affordable.

> I agree, there is no reason to upgrade to powerhouse GPUs until 4K becomes more affordable.

I think the time when "4K becomes more affordable" will be close to the end of next year; that is, 2016 will be the year of 4K.

Windows 9 (because Windows 7 and Windows 8 handle high-DPI software relatively poorly) + new GPUs (if we are lucky, on new manufacturing processes; that's what I am more interested in, rather than Maxwell on 28 nm, meh) + some new CPUs.

> R9 280X = slower than the GTX 770 in most cases; costs $300-$350 on average depending on model
>
> GTX 770 = overclocks like a champ; costs $300-$400 on average depending on model and RAM
>
> Both are very close. I don't see where this "huge" price difference is. That argument is pointless; we can go all the way down to the 750 Ti, same story. Not saying one is better than the other, just that they are close in price/performance.

Little point in using overclocking as a pro or con when 90%+ of graphics card users don't overclock, to be honest. If everyone did overclock, reference designs would cost us even more, as there would be little market for overclocked or special-edition models. Just a personal opinion, though.

> Silly, Nvidia is not alone, but their pricing has a very pronounced negative effect on the whole market, since they are part of it, thus emphasizing and enforcing even further stagnation and crisis in that market rather than having a positive influence on growth or whatever else you want to achieve.

I believe all the current prices stem from AMD not being able to compete mano a mano with Nvidia since possibly around the time of the X1950 series of cards. Sure, current AMD cards offer great performance for the money, but they tend to draw more power, overclock less, and produce more heat than their Nvidia counterparts (debatable on a generational basis). There is also the perceived (real or otherwise) greater reliability and stability of Nvidia drivers.

I'll leave this here.

> ...go back to the release of the previous generation of SKUs. AMD had a four-month head start on Nvidia by releasing the HD 7000 series. There is speculation that, in this time, Nvidia moved their mid-tier GK104 to a high-end SKU, removing the GK110 from the lineup. This is entirely possible, because a product based on the GK110 was announced as far back as May 2012, one month after the release of the GK104 SKUs. Fast-forward to February this year: Nvidia was starting to lose some competitiveness against the HD 7970 GHz Edition (and possibly felt other pressures), so they decided to release the GTX TITAN (and eventually the GTX 780) and cash in on the enthusiast market.
>
> So to sum up, AMD is only just beginning to be competitive with a 17-month-old GPU. Maybe AMD should be blamed for Nvidia's crazy prices? But having said that, some say AMD plays a different game: price/performance.

There is also the fact that each company's SKU tiers seem to conveniently slot in between each other's price points; again, none go head to head. Sure, they try to undercut each other, but it almost seems there is some sort of gentleman's agreement in place. Maybe I'm just paranoid...

> There is also the fact that each company's SKU tiers seem to conveniently slot in between each other's price points; again, none go head to head. Sure, they try to undercut each other, but it almost seems there is some sort of gentleman's agreement in place. Maybe I'm just paranoid...

That has been the case for a few years, at least since Nvidia and AMD got smacked for price collusion. There is the illusion of a price war from a few special-case SKUs (HD 5830 vs. GTX 460, for instance), but with very little actual impact on the overall product stack (and ASPs) of either company.

Are you on something?
I pointed out the obvious, and I welcome all new tech; even in 2015 I'm no OT fan of any of them.
Sorry if I was not excited enough for your liking, but I'm not impressed by random-ass silicon in that raw or mysterious a form.

I have mentioned similar tactics in AMD threads, i.e. "look at this, not them"; they all do it.
Many a puerile jab at me here, but the facts still are:
Here is a chip.
It's got plenty of memory.
It's at the ALPHA stage.
You bet it needs power.
And it still could be anything.
Especially given Nvidia's recent roadmaps. That's not me being biased; that's me saying the way it is.

In India (not everywhere in India, just a few places I checked) the 280X is around INR 3k cheaper than the 770. Cheaper, but not by enough IMHO, given the higher power consumption and the general lack of the sexy backplate found on the Asus 770 DCU II.

I have been seeing talk about the GTX 880 (GM104/204) having one 8-pin and two 6-pin power connectors, and talk of it being up to a possible 375 W TDP when you add it all up. Take a look at the TDPs of the GTX 260, GTX 460, and GTX 680 (which really should have been called a GTX 670). Nvidia had to invest time and money into redesigning Maxwell for the 28 nm process due to the situation at TSMC, so you can be sure they intend to release an entire series of 28 nm Maxwells over several months in 2015 to recoup their investment. Then we will have the 20 nm Maxwells, and those will be the performance powerhouses.
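For what it's worth, the 375 W figure is just the sum of the standard PCI Express power budgets for that rumored connector layout (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin); it's a ceiling on deliverable power, not a measured TDP:

```python
# Maximum deliverable board power per the PCI Express spec, in watts.
SLOT_POWER = 75        # through the PCIe x16 slot itself
SIX_PIN_POWER = 75     # per auxiliary 6-pin connector
EIGHT_PIN_POWER = 150  # per auxiliary 8-pin connector

# Rumored GTX 880 configuration: one 8-pin plus two 6-pin connectors.
max_board_power = SLOT_POWER + EIGHT_PIN_POWER + 2 * SIX_PIN_POWER
print(max_board_power)  # 375
```

Plenty of cards ship with more connector headroom than they actually draw, which is why "up to a possible 375 W" is very different from "will consume 375 W".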

Wow, compute-cored graphics is around the corner, nice. While the pic may be a hoax, it does all make perfect sense. IMHO those serial cores will either be a new, special DP shader sub-unit aimed at serial compute, or something much more interesting, like Jaguar cores perhaps. But they are not at all like an SMX unit, by the way (can't remember who said that); they will be separate and special. I think the bus/fabric that binds it all together will be very, very interesting too.

I disagree.
Some people have too much free time on a Friday night; my comments in this thread were only slightly negative, but I suppose you read into them what you will.
I'd hope you are all right, because IMHO we need 4-10x the GPU grunt we have, and progress is progress, but this pic is intriguing, not informative.
I obviously don't approve of the misquoted stuff, Smokey.

> I have been seeing talk about the GTX 880 (GM104/204) having one 8-pin and two 6-pin power connectors, and talk of it being up to a possible 375 W TDP when you add it all up. Take a look at the TDPs of the GTX 260, GTX 460, and GTX 680 (which really should have been called a GTX 670). Nvidia had to invest time and money into redesigning Maxwell for the 28 nm process due to the situation at TSMC, so you can be sure they intend to release an entire series of 28 nm Maxwells over several months in 2015 to recoup their investment. Then we will have the 20 nm Maxwells, and those will be the performance powerhouses.

It's Nvidia; they wouldn't put out a single-GPU card that uses 375 W. That would just be outright stupid for them. I'd believe it more from AMD, as they have been, and are, willing to use whatever amount of power to push their chips to be competitive with Nvidia (on the GPU side) and Intel (on the CPU side).

> There is also the perceived (real or otherwise) greater reliability and stability of Nvidia drivers.

That's because of the stupidity of end-users who otherwise pretend to be able to do many things.

I have been using ATi Catalyst for 8 years now and have never had any problems with it.

Everything seems to depend on the quality of the device behind the keyboard, and since I think we can assume AMD drivers are more sensitive to your PC environment (installed MS Visual C++ runtimes, .NET Framework, etc.), that is itself some kind of proof of the claim.