Stop That, It’s Silly: Nvidia’s New Titan X Graphics Card


Sooner than anyone expected, Nvidia has rolled out its latest uber graphics card. It’s the new Titan X. It’s undoubtedly the fastest and bestest PC graphics board ever and probably by some margin. And it will cost you $1,200 and probably a similar post-VAT sterling figure back in the old, disintegrating empire. Call me a desiccated old cynic, but this is getting silly…

Before I invite a comment-thread flaming with my philosophical observations, let’s get a sense of Nvidia’s new pixel-pumping machine courtesy of the speeds and feeds.

Where the mere £600 / $600 GTX 1080 has 2,560 pixel-prettifying shaders, the new Titan X has 3,584. Nvidia hasn’t dished the details on some of the other specifics, including things like texture units and pixel outputs. But we do know it has a large if conventional 384-bit memory bus hooked up to 12GB of 10Gbps GDDR5X memory.

For context, the old GTX Titan X from the now defunct 28nm Maxwell family of chips rocked 3,072 shaders and 12GB of memory, albeit much slower 7Gbps memory. Oh, and Nvidia has bumped the GPU speed versus the old Titan by nearly 50 per cent to a peak clockspeed of 1,531MHz.

Beautiful board, beastly price…

When you throw numbers around like that, it all gets a bit baffling. So, for a rough feel of the raw computational impact of this new chip, try this for size. It’s good for 11 TFLOPS of simple number crunching prowess. One can argue the toss over the relevance of that figure for rendering and indeed playing games. But it’s still bloody impressive and not far off twice the 6.6 TFLOPS figure attained by the old Titan X, which was not exactly a slouch.
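As a rough sanity check on those figures: peak FP32 throughput is conventionally quoted as shaders × 2 FLOPs per clock (one fused multiply-add) × clock speed. A minimal sketch in Python, where the old Titan X’s roughly 1,075MHz Maxwell boost clock is an assumption back-derived from the 6.6 TFLOPS figure above, not something the article states:

```python
# Peak FP32 throughput: shaders x 2 FLOPs per clock (FMA) x clock speed.
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical peak single-precision TFLOPS."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

new_titan_x = peak_tflops(3584, 1531)  # new Titan X at its 1,531MHz peak clock
old_titan_x = peak_tflops(3072, 1075)  # assumed Maxwell boost clock
print(round(new_titan_x, 1), round(old_titan_x, 1))  # 11.0 6.6
```

Theoretical peaks only, of course; real game workloads rarely sustain anything close.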

If we are to believe Nvidia, and at this stage I wouldn’t entirely take everything at face value, the GP100 chip in the Tesla P100 board is a 15-billion transistor hunk, where the new Titan X uses a hitherto unseen chip known as GP102 with 12 billion transistors. For the record, and at risk of getting swept away by a torrent of codenames, the GTX 1080 uses yet another chip called GP104 that clocks in around the seven billion mark.

The new Titan has 40 per cent more, er, rendery bits than the feeble GTX 1080

Anyway, the point is that at first glance the latest Titan looks like a pure graphics product without any of the compute-centric features of some earlier Titans. However, for the first time, Nvidia is touting this card’s INT8 performance, which is a measure of neural network or so-called deep learning performance and thus very much a non-graphics application. The messaging, then, is a little mixed – is this an out and out gaming card or something else?

Whatever it is, you’re getting 40 per cent more functional units than a GTX 1080 for 100 per cent more money. If that doesn’t sound like a great deal, the new Titan X also has a lower GPU clock than the 1080, so in at least some situations you won’t even get 40 per cent more performance.
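To put a number on that clock-speed caveat, scale the unit count by the clock. A rough sketch, assuming the GTX 1080’s published 1,733MHz boost clock against the Titan X’s 1,531MHz peak (boost clocks are nominal, so real-world behaviour will vary):

```python
# Relative theoretical throughput = (shaders x clock) ratio between two cards.
def relative_throughput(shaders_a: int, clock_a: float,
                        shaders_b: int, clock_b: float) -> float:
    return (shaders_a * clock_a) / (shaders_b * clock_b)

# Titan X vs GTX 1080: the 40% unit advantage shrinks once clocks are factored in.
print(round(relative_throughput(3584, 1531, 2560, 1733), 2))  # 1.24
```

So on paper it is nearer a 24 per cent advantage than 40, before memory bandwidth enters the picture.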

The Titan X does have 50 per cent more memory bandwidth than the 1080. But however you slice it, the value proposition looks laughable. I doubt, for instance, that even this new Titan X will prove a total single-card solution for 4K gaming.
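The bandwidth claim follows straight from the bus widths. A quick sketch, assuming the GTX 1080’s 256-bit bus (its published spec), with both cards on 10Gbps GDDR5X:

```python
# Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8 bits per byte.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

titan_x = bandwidth_gbs(384, 10)   # 480.0 GB/s
gtx1080 = bandwidth_gbs(256, 10)   # 320.0 GB/s
print(titan_x / gtx1080)  # 1.5
```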

Oh my god, it’s full of shaders

That’s even more true when you consider the likely manufacturing cost of the new GP102 chip. At 471mm2, it’s much smaller than the 601mm2 of the old Titan X. In fact it’s nearer in size to the GM204 chip in the old GeForce GTX 980, which comes in at 398mm2.

The point is that, generally, the bigger the chip, the more costly it is to produce. But in this instance, Nvidia has released a smaller chip and then ramped up the price. Actually, that’s exactly what it did with the GTX 1080, too. The chip in that measures just 314mm2. Admittedly, there will be variables with new processes, but the pricing of this new GPU family looks positively punitive to me.

At this point I was planning on penning a semi-serious dissertation about what I think is going wrong. It would involve near negative real-world interest rates, quantitative easing, epic inequality and how this thing strikes me as being the graphics card Donald Trump would sell you.

But somehow it was so much easier to rejoice in the sheer technical majesty of a GPU capable of smoothly rendering Far Cry’s stunning vistas at 1,600 by 1,200 pixels. That was way back in 2004. Adjusted for inflation, at most that was a £600 graphics card.

There’s actually a parallel to be drawn with my other occupational muse, the car market, in which well-heeled punters beat each other to a metaphorical pulp to sign on for the latest limited-edition Porsche, paying 200,000 euros for cars which are so oversubscribed that they’re worth three times that much on the open market the moment the first examples are delivered. It’s total madness, which is where all that cheap money I mentioned comes in.

But maybe I’m just a desiccated old hack. Maybe I should be celebrating the mere existence of master works like the Nvidia Titan X and Porsche 911R. On the other hand, surely there’s a point when the price gouging becomes so vulgar you just have to gag? I don’t know about you, but £1,200/$1,200 for a graphics card certainly sticks in my throat.

Or maybe cards like this just aren’t relevant to PC gamers and are best ignored? The problem with that notion is that the whole market is being dragged upwards. The new high end is £1,200, the new enthusiast is £600 and the new mainstream is nigh-on £300. I have a hands-on with the new GTX 1060 in the works and a jolly nice mid-range board it is, too, in many ways more impressive than the GTX 1080. But it’s 300 bleedin’ pounds for the version I have and even the cheapest 1060s are well over £200.

AMD is taking the ridiculous not-really-at-all-for-consumers card in a different direction: slapping an entire terabyte of solid-state storage on the card itself and charging ten thousand bucks for the privilege of owning one.

When a pair of three-year-old GPUs that cost $550 total can run circles around the supposed new ‘Prosumer Compute King’ (in compute workloads), it really makes you question the price tag. I mean, your $550 only buys you 14-18 TFLOPS of FP32 and 4.5-6 TFLOPS of double precision. Forget the price and performance, forget that the $1,200 prosumer price tag isn’t even enough to justify HBM, this new Titan is a real deal!

The price of the prosumer cards isn’t just about the raw hardware power; it’s the support they come with, specialized drivers and such. The places that buy these have a lot of money to spend (and they probably get deals on bulk buys), and good support and space/power efficiency are very valuable resources.

Really sorry about that comment, it wasn’t necessarily directed towards you, but I’m just getting used to commenting from the bottom of the page.

As a badly misguided buyer of the previous Titan X for my own ‘prosumer’ purposes, I can definitely say that drivers do make a difference, which is one of many reasons that I’m using such obsolete hardware in my own CAD / rendering setup. Sure, the Quadro line (and perhaps Titan, but to my knowledge, Titans have always used GeForce drivers) provides driver optimizations of questionable benefit for big-name programs like 3DS Max and the like (or perhaps the GeForce drivers are just intentionally gimped). But as has been mentioned, this is a prosumer device, and many prosumers like me are looking for more than just that high-end part number to justify a relatively huge $1,000 (now $1,200) purchase. The Titan X delivered that, with 12 gigabytes of VRAM that virtually eliminated those pesky ‘CUDA out of memory’ errors.

Back on track, though. As a ‘prosumer,’ implying a consumer who wants high end hardware, but doesn’t have the means to go drop $3000 on a Quadro card, I also don’t have the means to spend thousands on things like 3DS or Octane Render, meaning that I have to fall back to things like Blender and Luxrender for my 3D work. These programs are both free and not widely used by the people willing to pay NV the big bucks, so they are shunned, and receive no optimization in drivers. However, looking past optimization, and instead evaluating what NV cards are absolutely terrible at, the one thing that really stood out for me was gimped OpenCL. I can’t believe that NV would rather destroy a product line and attract a total of 3 CUDA developers than simply comply with a standard. This is what really shoved me out of the Applevidia ecosystem and into AMD.

After shelling out a hefty $1,100 for a card that was only useful for gaming (accounting for tax gets it to that $1,100), I needed a cheap card to replace this heap, and that’s when I stumbled onto the HD 7990. In short, it turns out that this antiquated, hot (meant literally, of course) mess of an engineering experiment ran circles around my Titan X, and virtually all of my ‘prosumer’ use cases saw a 2+ fold increase in performance, likely due to its unadulterated OpenCL support. I bought each of these cards for a fraction of the Titan’s price.

So, maybe we have different definitions of prosumer (that wasn’t meant to sound hostile), or maybe I’m just cheap, but I refuse to believe that these ‘specialized drivers’ justify the price to anyone who is considered a mainstream buyer, and somehow I doubt that NV makes most of their money off of $3000 cards from their professional compute lineup.

The fastest pro-sumer card on the market needs to carry a hefty price tag.

Nvidia has said that the Titan series is now primarily a compute card – aka not for us gamers – and since its Quadro series costs a serious stack of notes, obviously it can’t very well give these cards away cheap, as that would corrode its own business offerings further.

I am surprised they released it this early in the cycle, though, but I guess they want to blow AMD away completely.

I’ll hold off upgrading my 980Ti until the 1080Ti version with HBM2 RAM, though, so I’m not expecting to purchase new parts until next winter, just in time for the next Mass Effect game.

Until the price of HBM/HBM2 memory drops, it doesn’t make business sense to use it yet. There aren’t enough applications that call for its use just yet. You likely won’t see HBM until Nvidia releases the 1100 series or something.

Yeah, the HBM2 process isn’t ready yet. AMD actually paid Samsung to get ahead of everyone else and it ended up getting delayed by almost a year anyway. I think Nvidia ended up going with someone else.

Well, it’s supposed to make them more flexible so they can react to market changes, but they still end up locked into contracts (like AMD with Samsung and Nvidia with TSMC, and AMD with TSMC for their CPUs) and can’t really do anything when said contracts are held up. TSMC had been promising Nvidia a 20/16nm process every year for the last five years.

1200USD for 11TFLOPS of FP32 compute performance doesn’t seem too awful compared to the ‘over 5TFLOPS’ performance of the recently announced AMD Radeon Pro WX7100 that will cost ‘under 1000’ USD (assuming that means over 600USD), so maybe they really are marketing this card to the compute crowd.

I’m hoping that in a few months we end up with a 1080Ti card that does to this new Titan X what the 780Ti did to the original Titan; the 780Ti came out later, but was less expensive and better at gaming.

The WX7100 is an enterprise RX 480, that’s the only reason the price goes up. Different driver, and maybe it’ll have higher FP64 performance (which the older Titans also did, but which Nvidia dropped with the first Titan X).

Value is always going to be a personal thing. What one person thinks is good value will differ from what another person thinks is good value.

Personally, being a fairly practical sort, I’m likely to be calculating good value along the lines of “computing power per £” (being a Brit as well as a practical sort). Cards like this, I don’t think they’re going to appeal to the “computing power per £/$/whatever” crowd. There’s the additional factor of “status” or “nerd interest” to take into account. Being a bit of a nerd myself (as well as practical and a Brit), I can understand people paying a small premium to try out this cutting edge stuff, but the kind of nerd tax applied here is well beyond what I would be willing to pay. Also, I know a lot of nerds, and none of them would think this pricing makes any kind of sense.

The card and its price are all about marketing. It’s not supposed to represent any kind of value. This is Nvidia swinging its big green and black c*ck, saying “look what we can do – you know you want a piece of us”.

ps – drunk post warning. I reserve the right to look back on this post later and wonder wtf I was on about.

These cards aren’t meant for the mass market, and the market as well as nVidia knows this. Just like a car manufacturer such as Ford (and graphics cards have almost become ‘consumer’ products like cars) caters for the granny who just needs a 1.0-litre for pootling down to the shops on a Sunday for less than £10K, they also cater for the mental more-money-than-sense crowd with things like the Ford GT costing £400,000. Now, the car isn’t 100 times better like the price would imply, but they make far fewer of them and use more complex methods to make them, just like nVidia with their smaller but far more complex Pascal chips. How can you really think that simply because it’s smaller it would be cheaper? They have fit more transistors into a smaller die area whilst increasing performance over the last generation. They will have had R&D costs and costs for wasted silicon (ie chips that didn’t make the cut due to the more complex manufacturing process), and don’t forget the licensing income from Intel that they will be losing next year, too.

I agree. This is basically like the car manufacturers making a race car to show they have the tech and savvy.
And the really well-to-do people actually _want_ a snobby price tag to make the purchase stand out more (luxury factor, sociological status effects etc), so saying the most expensive show-off toy is expensive is basically just complaining about what defines it as what it is.

Even if just as a proof-of-concept, I think it’s fine that these things exist, because it helps to learn things about production, heat management, compute balancing etc.

And as has been said, the more realistic things we should be looking at and talking about are the 1060 for mainstream and the 1070 for 1440p and WQXGA gaming for example.
And they have already shown prices 50+ EUR lower than the initial offering prices, which I find encouraging.

“I was planning on penning a semi-serious dissertation about what I think is going wrong. It would involve near negative real-world interest rates, quantitative easing, epic inequality and how this thing strikes me as being the graphics card Donald Trump would sell you.”

I would have liked that, given that I can already get pricing, tech details and benchmarks everywhere else.

Quite frankly, the fact that negative interest rates and global central-bank QE are ruining “normal people” while helping the already rich-from-finance get richer with finance should be a topic everyone brings up everywhere, especially given that we seem far too focused on the whole “immigrants are the cause of all evil” distraction when the truth is the ivory towers are the ones rocking the devil’s fork.

On the plus side, having a 1060 GPU, with a TDP that used to be the domain of CPUs, that can handle some games even at 1440p for under 300 EUR is a pretty nice sign of progress to me.
And I do quietly hope that prices will come down 15-30% on all of these new guys over time, if only by AMD maybe, some day, whenever that might be, getting their shit together: not frying PCIe slots and mobos, and offering cards that can compete strongly on TDP, performance and price.

Overall, I am quite hopeful for GPUs. Just not so much for politics and equality.

Well, it wouldn’t have been terribly groundbreaking. It would just have argued that part of the context that makes this kind of product possible is the now quite large group of people (still a very small proportion of the overall population, obviously) up to their eyeballs in cheap money.

So they pay 3x list price for a brand new Porsche or millions on a rusty old one, dump god knows how much on a piece of art or a bottle of wine. I’m not arguing this card is precisely analogous to the asset bubbles currently prevailing in various alternative investments. But I do think that the cadre of totally price insensitive punters I alluded to fuels this kind of pricing just as much as a lack of competition from AMD.

People argue this is a compute card, but I’m not sold on that. It doesn’t have all the compute gubbins of something like GK110 and I would be fairly confident most of these cards will be bought by consumers for whom $1,200 is fairly meaningless, not professionals.

Banks make *more* money in a high interest rate environment. QE might not be accomplishing much, but it is not driving the purchase of Titan X cards. Conspicuous consumption is not new.

And that old chestnut about asset price bubbles is a good one, considering the only one anyone can ever point out is housing, which is usually being driven by very specific local factors. Since it’s a claim mainly originating from those self-same bankers, I’m liable to discount it. YMMV

I’m very close to proposing marriage to you. The only thing that is stopping me is the impression I got from your post that the technological advances excused the price. I might be wrong – as I said above, I’m a wee bit tipsy right now.

Look, technological advances are awesome. I love them. My current computer would have cost thousands of pounds even a decade ago – now, even a peasant like me can afford it. But technological progress is not what this card is really about.

Nvidia basically has the high-end card market sewn up. AMD aren’t competing there – their preferred battleground is the fields of Bang-for-Buck (that’s slightly to the west of Lincoln, for the geographically challenged). The only reason for Nvidia to release THIS card at THIS price is marketing – to get their name out there as THE manufacturer of damn-fast cards. Price doesn’t really come into the equation, as far as profit-per-card is concerned. It’s really just about generating marketing buzz with the actual price as an irrelevancy.

Having said that, there’s always going to be a group of people who will pay that price premium. There’s billions of people in the world, and a proportion of them will be what’s technically known as “stupidly rich”. Then, out of the stupidly-rich people, some will be nerds – those are the ones that will buy this card.

Technological advancement doesn’t have to be made available to the public, particularly when there’s no real competition (in the high/cutting edge gfx card market). Released to the public or not, the technological advancement is still there.

With that in mind – and this is the point where the discussion heads into politics rather than technology – you can argue that releasing a card that is, in financial terms, so limited in its market (who can actually afford it) is perpetuating the system that allows for such inequality in the first place. The target market exists because of how unequal society currently is.

Until a piece of technology becomes mainstream, the fact that it exists is basically irrelevant to most people. Maybe in the future you will be able to afford it – but until you can, then so what? What actual difference does its existence make to your life?

I have now run out of beer, so y’all will probably be pleased to know that means there’ll probably be no more long-ass posts from me setting the world to rights!

That *really* sucks for UK/EU people, but on this side of the pond the 1060 has already been available for $225 delivered to my door. That’s roughly 170 pounds, and a mighty fine price point for the level of performance that the 1060 offers.

I desperately need this card. I don’t play games. Some neural networks are too big to fit in 4/6GB graphics cards, and training NNs on those cards is too slow. Spending over $100K for a DGX-1 is out of the question, so the Nvidia Titan X is the best one I can afford.
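For a sense of why the 12GB matters here: training memory is dominated by the weights plus gradients, optimiser state and activations. A back-of-envelope sketch; the 3x overhead multiplier and the 500M-parameter network are illustrative assumptions, not measurements:

```python
# Crude VRAM estimate for training: weights x bytes per parameter x an
# overhead factor covering gradients, optimiser state and activations.
def training_vram_gb(params_millions: float, bytes_per_param: int = 4,
                     overhead: float = 3.0) -> float:
    return params_millions * 1e6 * bytes_per_param * overhead / 1e9

# A 500M-parameter FP32 network: ~6GB before batch-size effects, already
# uncomfortable on a 4-6GB card but roomy in the Titan X's 12GB.
print(round(training_vram_gb(500), 1))  # 6.0
```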

Underneath that absurd pile of black plastic moulding is a piece of common circuit board, externally indistinguishable from whatever BestBuy clearance-sale special you have in your computer right now.

The real fallacy here is marketing: PC people care about what is under the hood. It’s the Mac people who are more concerned with shiny, bubbly packaging. Try selling this abomination of a marketing department cocaine binge to them, instead.

I wouldn’t be able to distinguish a V6 from a W12 at first glance, but I’m not gonna start claiming they’re the same thing, either. Perhaps it’s just that looking at a PCB with the naked eye is an absolutely terrible way of judging its worth?

You think it’s overpriced because you don’t know much about the computation industry. Let’s compare two Titans: this one, and the Titan supercomputer that is currently ranked #3 (maybe #4 now) in the world.

The latter has 27 petaFLOPS peak performance and cost $97 million to build, which is about $3,593 per TFLOPS. The new Titan X costs $1,200. If you bundle it with a server-quality motherboard, CPU and RAM, maybe $5,000 total? So that’s $455 per TFLOPS. Do you still think it’s expensive?

BTW, this is why, when the Nvidia Titan first came out, every machine learning scientist I know went to buy a 4-way Titan rig for their research machine. And this is where and how the Titan is supposed to be used.
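Whatever you make of the comparison itself, the arithmetic in that comment does check out. A quick sketch, where the $5,000 full-system cost is the commenter’s own guess, not a quoted price:

```python
# Cost per TFLOPS for the two 'Titans' being compared.
def dollars_per_tflops(total_cost: float, tflops: float) -> float:
    return total_cost / tflops

supercomputer = dollars_per_tflops(97e6, 27_000)  # 27 PFLOPS = 27,000 TFLOPS
titan_x_box = dollars_per_tflops(5_000, 11)       # assumed full-system cost
print(round(supercomputer), round(titan_x_box))   # 3593 455
```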

Nope, I’m afraid the misapprehension is yours, on two counts most likely.

First, this Titan lacks the compute capabilities of the GPUs used in the Titan supercomputer. In that context, the peak FLOPS numbers you are trotting out are basically irrelevant. The new Titan X lacks the dedicated FP64 registers of the original GK110 that went into the first Titan and is used in the supercomputer you mention.

You will not find anyone making that kind of supercomputer out of the GPU in this new Titan X board; it’s not built for that kind of work.

Likewise, that supercomputer uses Tesla products, not graphics cards with video output ports etc. So the pricing and compute power of a super computer built out of GK110 GPUs is entirely irrelevant in terms of putting a value on this new graphics card.

I expected a price increase from this generation’s Titan after seeing them across the entire range, but I was hoping that it would be at least partially justified by… something new.

I’ll repeat my analysis from the EG Titan X (2016) thread last week.

1) It doesn’t seem to be a fully enabled GP102. A 50% wider GP104 (which this chip almost certainly is) should have 3840 shaders, not 3584. That raises the prospect of a faster GPU in this generation. The Titan label should be reserved for a fully enabled example of the biggest GPU in each generation IMO.

2) For the price, I think they should be using a reference water cooler by now. The unfortunate truth is that the reference “Founder’s Edition” blower just doesn’t keep these cool enough. Even the 1080, with an identical cooler, will hit its 83C thermal barrier, and this card has a 70W higher TDP.

3) the previous point is exacerbated by Pascal’s “temperature compensation” which backs off the clock speed as the GPU temperature increases (reference: Anandtech GTX 1080 review page 15). The value of this card will depend on how closely the GPU clock tracks the “Pascal standard boost” range of 1900-2000 MHz. I fear that with the reference cooler, it will end up much farther away from that range than it could be with a better cooler.

While quiet using the stock profile, this fan is also extremely loud if you modify the curve in search of lower temperatures.

4) the previous two points are exacerbated again by the fact that no board partner cards are allowed, as is always the case with Titans. Any water cooling will have to be done by hand and at additional cost.

5) Price. It’d be easier to swallow in combination with a significant update to the reference cooler, but this is still effectively the same cooler that NV have been using for 250W cards since 2013.
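The temperature compensation described in point 3 can be sketched as a toy model. Everything below is hypothetical: Nvidia doesn’t document the actual boost curve, so the backoff threshold and the MHz-per-degree slope are invented for illustration, not Pascal’s real behaviour.

```python
# Toy model of Pascal-style 'temperature compensation': the boost clock is
# held at maximum while cool, then backed off linearly as the GPU approaches
# its thermal limit. All parameter values here are invented for illustration.
def boosted_clock_mhz(temp_c: float, max_clock: float = 1900.0,
                      backoff_start_c: float = 60.0,
                      mhz_per_degree: float = 13.0,
                      thermal_limit_c: float = 83.0) -> float:
    if temp_c <= backoff_start_c:
        return max_clock
    over = min(temp_c, thermal_limit_c) - backoff_start_c
    return max_clock - over * mhz_per_degree

print(boosted_clock_mhz(55), boosted_clock_mhz(83))  # 1900.0 1601.0
```

Under this toy model, a card held cool by a water block never leaves its top boost bin, while one pinned at the 83C wall gives up hundreds of MHz, which is the substance of the cooler complaint above.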

As for the inevitable 1080 Ti, it will go one of two ways, I’d guess, depending entirely on what AMD do…

1) if Vega comes out soonish and gets closer to threatening their high end hegemony than they’d like, then NV will pull a “980 Ti” and release a very slightly crippled GP102 as the 1080 Ti at a much lower price (but still higher than the 980 Ti was at launch, no doubt) which will end up practically identical to this Titan in performance.

2) if AMD do nothing, then NV don’t need to do anything either and can pull a “780 Ti” in 2017 – this hypothetical 1080 Ti would be a fully enabled GP102 with 3840 shaders which would be both cheaper and faster than this Titan. If things work out this way then I wouldn’t expect to see the 1080 Ti until Q3/Q4 2017.

Yes, this card is more than enough for 4K gaming.
I have a GTX 980 hooked up to a 49-inch 4K LG TV at 60Hz and most of my games run at max settings, on a 3.5GHz i5 CPU with 24 gigs of DDR3 RAM and a 1TB boot drive, though I’d like an SSD for boot eventually.

Heck, even my Nvidia 670 did a fairly decent job at 4K on mid-range settings before I got the 980.

Well, I think if we’re going to talk about useless GPUs, we must talk about AMD GPUs. It’s a fact!

First, the AMD Fury X is the most useless and terrible GPU I’ve ever seen, and the first I wouldn’t even try.

I mean, we live in the 21st century and AMD offers you this earth destroyer: water-cooled, with over 430W load power!
Still it easily loses to Nvidia’s air-cooled GTX 980 Ti, which draws over 130W less power. And I say 130W is a lot!

I’ve said many times that for desktop GPUs, water cooling should be banned, against the law; it’s a horrible way to get a GPU cool and quiet.
It’s the same as Formula One going back to a 3.0-litre turbo engine, or not trying to build electric cars; AMD just goes the other way.
Think! Do you want to support that kind of GPU politics? I don’t!!

And now AMD releases the RX 400 series, and the RX 480 is the same terribly inefficient GPU!
Load power over 320W, much more than a reference GTX 1080, and still it loses all the way!

I think people who know something about IT hardware think just the same.
Those numbers should tell everyone how lousy AMD’s ‘new’ GPU technology is: terrible planning, testing and engineering.

But I think we all remember AMD’s huge and amazing hype that the RX 400 series would be a low-temperature, low-wattage, great-efficiency part and so on. Just a big lie.

Once more: I read over 320W load and over 85W idle! OMG!! But, I must say, that was not a reference model; it was a Sapphire Nitro+ OC, though I guess some people buy just that one.
Sapphire’s only average strength is slightly lower noise, but check the net: not at anywhere near high performance. Because it eats so much energy, it needs a lot of air.

So, if we compare it with an Nvidia GTX 1060, reference or not, the GTX 1060 is a much, much better and more genuine mainstream GPU.
It’s faster, lower-wattage and lower-temperature, with better-quality components, and the only so-called bad thing is the missing SLI connector.
But so what? Buy a GTX 1070 and you don’t need it, or a GTX 1080 and play 4K games, and again, lower power than the RX 480.

Short and sharp: the AMD RX 480 is a fail GPU, and that’s a fact.

So what about the Titan X?

The only complaint is that it’s expensive. Sure it is! If there are people who think they should be able to buy it for $200, wake up and stop dreaming and weeping. You can’t get what I’d call the world’s fastest GPU cheap,
and you shouldn’t be able to.

Nvidia is not a charity company, and neither is AMD. It takes a huge planning, building and testing session before they can offer it to the people who can buy it. Nvidia does it right.

The Titan X is not an everyman’s GPU; the GTX 1060 is for that. I think the best GPUs right now are the GTX 970 and the newest GTX 1070.

The Titan X, if we look closely, draws less power than the RX 480. We’ll see on August 2nd, and I mean at stock clocks, not overclocked.

Many people can pay $1,200 for it. Not me, but many thousands can, and you should be happy that you have the chance to get a real 4K GPU for all games.

Remember that without Nvidia GPUs we’d all have to play on some water-cooled AMD Fury X or AMD’s newest RX 480, which I’d call a nightmare.
I mean, a little respect for Nvidia.

So if you want the world’s fastest GPU without compromise, buy a Titan X or the coming GTX 1080 Ti; if that’s too expensive, buy a GTX 1060 and you will be happy, I promise.

Lastly:
I don’t work for Nvidia or have any contact with Nvidia, and I had an AMD GPU about 10 years ago, but I see Nvidia making better cards with better drivers.
I just can’t believe that people here hurray over some lousy RX 400 series and tell lies about Nvidia products.

If Nvidia stopped making GPUs, it would be far bigger damage for gamers than if AMD did, and if that happened AMD GPU prices would go sky-high. But for now
AMD can compete only on price; the RX 480 is a GPU I wouldn’t pay more than $40 or so for, and only if I really had to and nothing else was on the market.

The Titan X is an excellent, efficient, silent, super-fast GPU. Prepare to see it on August 2nd!

It’s a solid 1080p GPU, but you can’t play QHD games on it, and it’s nowhere near smooth 4K gaming.
It’s old and forgotten after 6 months.
You’d need to buy a new one after a year,
so is it cheap? No.
For the quality the AMD RX 480 offers you, I call it expensive.

Much, much better is Nvidia’s budget GPU, the GTX 1060: the same price, much better quality and faster for games.

Quality means running at low temperatures, drawing fewer watts from the wall, and running even some 4K games smoothly (not all, but some) while running all WQHD games smoothly.