Powenetics, In Depth

In brief, Powenetics utilizes Tinkerforge Master Bricks, to which Voltage/Current bricklets are attached. The bricklets are installed between the load and power supply, and they monitor consumption through each of the modified PSU’s auxiliary power connectors and through the PCIe slot by way of a PCIe riser. Custom software logs the readings, allowing us to dial in a sampling rate, pull that data into Excel, and very accurately chart everything from average power across a benchmark run to instantaneous spikes.
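To make concrete what that logging step yields, here is a minimal sketch, not the actual Powenetics software: the CSV column names are assumptions for illustration, but the arithmetic is exactly how per-rail voltage/current samples reduce to the average and peak power figures charted in this review.

```python
# Hypothetical sketch: column names (slot_v, slot_a, aux1_v, ...) are
# illustrative assumptions, not the real tool's log format. Each sample's
# power is the sum of volts * amps over every monitored rail: the PCIe
# slot plus the two eight-pin auxiliary connectors.
import csv

def summarize_power(path):
    """Return (average_watts, peak_watts) across all logged samples."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            watts = (float(row["slot_v"]) * float(row["slot_a"]) +
                     float(row["aux1_v"]) * float(row["aux1_a"]) +
                     float(row["aux2_v"]) * float(row["aux2_a"]))
            samples.append(watts)
    return sum(samples) / len(samples), max(samples)
```

The same per-sample series, plotted over time, produces the line charts shown in the gaming and FurMark sections below.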

The software is set up to log the power consumption of graphics cards, storage devices, and CPUs. However, we’re only using the bricklets relevant to graphics card testing. Gigabyte's Aorus GeForce RTX 2080 Ti Xtreme 11G gets all of its power from the PCIe slot and a pair of eight-pin PCIe connectors. Should higher-end 2080 Ti boards need three auxiliary power connectors, we can support them, too.

Idle

An average power reading of just under 18W is a little higher than what we measured from the GeForce RTX 2080 Ti Founders Edition. Then again, the Aorus GeForce RTX 2080 Ti Xtreme 11G does have an extra fan, plus a bunch of lighting that the FE board lacks.

It’s also worth noting that Gigabyte offers semi-passive functionality, which cuts power consumption slightly. But we prefer a bit of active cooling, even at idle load levels, so we test with the fans spinning.

Gaming

Running the Metro: Last Light benchmark at 2560 x 1440 with SSAA enabled pushes GeForce RTX 2080 Ti to the max, yielding an average power consumption measurement of 302W. Most of that power is delivered evenly through both eight-pin auxiliary connectors.

We pulled most of the lower-end cards out of our comparison chart since they use quite a bit less power than the GeForce RTX 2080 Ti Xtreme 11G. However, AMD’s reference Radeon RX Vega 64 remains. Although it’s much slower through our benchmark suite, we can see that AMD’s flagship tries to maintain a similar power target.

Recording current through three runs of the Metro: Last Light benchmark gives us a line chart that resembles power consumption, naturally. Still, breaking the results down this way tells us that the PCIe slot hovers around 4A—well under its 5.5A ceiling.
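That comparison against the ceiling is easy to automate. The sketch below is a hypothetical illustration (the function name and sample values are ours, not from the test software); it simply checks a logged series of slot-current samples against the 5.5A limit the PCI-SIG defines for the slot's 12V rail.

```python
# PCI-SIG ceiling for current drawn through the PCIe x16 slot's 12V rail,
# as cited in the review text.
PCIE_SLOT_12V_LIMIT_A = 5.5

def slot_current_report(samples_a, limit=PCIE_SLOT_12V_LIMIT_A):
    """Summarize logged slot-current samples (amps) against the spec ceiling."""
    avg = sum(samples_a) / len(samples_a)
    peak = max(samples_a)
    return {"avg_a": avg, "peak_a": peak, "within_spec": peak <= limit}
```

Fed the readings described here (hovering around 4A), the report would show a healthy margin below the ceiling.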

FurMark

Power consumption under FurMark isn’t much higher than our gaming workload. There is a slightly higher peak reading, and one of the eight-pin connectors shoulders more of the task than before. Still, an average of 303W shows that Gigabyte has its Aorus GeForce RTX 2080 Ti Xtreme 11G capped. Once that power ceiling is hit, voltage and frequency are scaled back.

Maximum utilization yields a nice, even line chart as we track ~10 minutes under FurMark.

Tracking power consumption over time in FurMark doesn’t look much different from what we saw under Metro: Last Light. The Aorus GeForce RTX 2080 Ti Xtreme 11G holds steady around 300W. AMD’s Radeon RX Vega 64 tries to maintain the same power level, but has to switch between throttle states after a few minutes. Even a GeForce GTX 1080 Founders Edition starts acting up after a while.

Current draw over the PCIe x16 slot is slightly higher than 4A, and well under the 5.5A ceiling defined by the PCI-SIG.

The prices stated for the Vega 56 and 64 are there to make the price of the 2080 Ti seem less shocking, but the real price of the Vega 56 is 370 euros, and the Vega 64 is 449 euros. I really hate it when Tom's Hardware uses overblown prices on AMD products just to make the competition look like a good buy.

I am not sure where they get their pricing, but they did the same thing in the RX 590 review with an overpriced 1060. I think it's a site algorithm that pulls prices, not a person, as any normal person can easily find better deals.

I would think Nvidia would have learned a lesson about overpricing cards with gimmicky tags... well, I guess not, and no wonder their stock tanked. I wouldn't think it's all that hard to understand: sell 20,000 cards @ $1,300+ or 250,000 @ $200? Over the last 15 years, what range of cards has sold the most, and which one gets released last in line? Slapping a big RTX label on a GPU and charging a premium when software is still 12 months behind was just stupid, and we can add it to the many other gimmicks Nvidia has come up with over the years that failed to draw sales: PhysX, the early stages of VR, HairWorks, and now RTX. These cards are indeed nice, but adding that extra premium just for "RTX" is just bad marketing.

About ready to retire this site for good at this point; first the "just buy it because there is a price to NOT be an early adopter" tripe, then a 4.5/5.0 for any Nvidia RTX product is just a slap in the face of the site's readers.

Clearly this is about the ad bucks, not sure how your reviewers can look themselves in the mirror anymore.

Anyone with a 1070 Ti or above is absolutely fine at 1440p resolution. My card scores about 60 fps in the latest Tomb Raider and 52 in the AC Odyssey benchmark, with in-game dips to just under 40. Coupled with a high-refresh G-Sync monitor, I don't even notice the low framerate.

I'll say pass to shitty RTX. I cannot believe how people could pay 1,300 for a gimmick like rays, a gimmick that you will find in 0.0001% of the time played in the two existing games. It's like paying for the gold frame. I would have preferred they invested in better, more advanced physics and marketed that.

Not to mention, the current gen of RTX is only optimized for 1080p!
RTX Off: too powerful for 1080p. Ideal for 1440 and 4K.
RTX On: works best at 1080p. But for 1440 and 4K? Nope.
The core feature of these (flagship) cards is useless to folks with higher-res monitors, while being too powerful without said feature for those on lower-res ones. This is why I'm passing on this gen's flagship. It's just a bad investment all around.

And the 2060 won't fix this either:
RTX Off: great for 1080p.
RTX On: not going to be able to run max settings like the 2070(?) and up, but hey, the overall build will be better balanced, at least?

If they can keep the RTX train running, I'll hop aboard when they can do RTX On at 1440p, 100+ Hz.

It's not that the hardware isn't powerful enough... it's that the APIs are a year behind being able to render it without massive performance issues. So yes, it makes these premium cards seem gimmicky and overinflated in price.

In a way, they are listening by releasing Turing cards without the RTX badge at a closer-to-normal price. The issue here is that the market is flooded, and Nvidia is going to rush out Turing with 2x the badge numbers they normally release. Glad I moved all my Nvidia stock to AMD when Ryzen launched.

Pretty sure 2080 Ti is the *only* RTX 20-series card that has received a 4.5/5.

We've been pretty consistently critical of the 2070 and 2080's value proposition vs 1080 and 1080 Ti (though that's not even really worth debating anymore, since those cards aren't available). Future promises about ray tracing and DLSS weren't a factor in this review. They weren't even mentioned. Rather, performance in *today's* games is what impressed us. Of course, if you're not gaming at 4K, then cards up in this range, which is where multiple generations of Titan models also lived, probably aren't a concern for you anyway.

I watched a review on YouTube a while back where, despite the reviewer stating that the video doesn't do the RGB justice and that it looks even better in person, it was still impressive. And I'm not really interested in RGB normally, and probably still wouldn't pay extra for it, but for people into it, it's definitely worth checking out.

"this is an extremely heavy card to hang from a PCIe slot, so Gigabyte bundles a stand meant to support the card by pushing up from your bottom-mounted PSU or the floor of your chassis."

This made me lol, what a time we are in.

No kidding! It's rather absurd to think anyone is going to transport their PC with that thing slotted in. At that point, IMHO it's far more sensible to just go water cooling with an AIO loop where the rad is securely mounted; it frees the card of that weight and, ergo, relieves the stress on the PCIe slot.

Note: that commentary is more applicable for LAN Party goers if anything.

I don't think price should be part of the score, mainly because price is all relative to the person. I agree the value is horrible. I don't plan to pay that much for a GPU, but there are those who don't see it as an issue. Performance and quality should be what garner the score, with value as a sidebar.

It's kind of sad when you buy a $1K card and it's considered a "deal." Still, you can't get no-compromise frame rates at 4K without it. The 1080 Ti is almost there but comes up a bit short, and that trend will continue in the near future.

It seems all you commenters on here are broke and jealous. I have the Aorus 2080 Ti Xtreme Waterforce, and this card blows my 1080 Ti out of the water. This card stays cool, and there are no weight issues since I mounted it vertically. I love this beast. So stop crying and talking about something you have no idea how well it works, since it is out of your grasp. All the whiners, ridiculous pansies.

Jealous isn't quite the word that comes to mind. I could buy it if I wanted, but I would rather build an entire rig.

Yeesh, just claim everyone criticizing the card is broke... I (at least) can afford the card, no problem. But having learned my lesson with early new-tech adoption [X299], I'm going to wait till the RTX 3080 Ti, or whatever, if they can keep the RTRT ball rolling.

2080 Ti:
- Up to ~30% performance improvement over the 1080 Ti, depending on the game
- Costs over 70% more than the latter did at launch
- Turing is an even poorer OC'er than Pascal, so you won't be able to squeeze much more out of the card
- RTX On can't really do more than 1080p, 60 fps
- RTX Off is too much for 1080p; it would bottleneck even some of the newer CPUs. Best at 1440p or 4K, but those resolutions are too much for the current RTX feature
So, regardless of whether you play at 1080p or 1440/4K, the consumer loses.

"Blows the 1080 Ti out of the water"... Umm, sure, mate. Perhaps I overlooked something, if you'd be willing to share?

Up until the 1080 Ti came out, gaming in 4K wasn't really a realistic option unless you turned your visual quality down a bit. The "Titan" hadn't been living in 4K territory until this most recent version of it. And your implication that this might be outside my market is amusing. Just because I can afford a $1,200 waste of cash doesn't mean I want to waste $1,200. I could buy a Corvette without harming my bank book as well; doesn't mean I will waste my cash on one.

Since when is VALUE not part of a review of a product? There is nothing of value in the RTX lineup. It's legit highway robbery of the consumer by Nvidia, and the longer reviewers like yourself cover for them, the more they'll do it. Just because other 2080 Tis got a 4.5 doesn't mean I approved of those any more than this one. Looking at it from outside, the message I seem to be seeing here recently from THG and the editorial crew is "Just Buy It."

Jealous is the most polite word I could think of. Your statement alone proves that people just say things to make themselves feel better. I bet you give excuses for every mistake you make instead of owning up to them.