Week in Tech: Flicker-free Screens, AMD Noise, Nvidia 780 Ti

Suffering from headaches, tired eyes and all-round gaming fatigue? Must be that flickering LCD monitor ripping up your retinas. No idea what I’m on about? BenQ would have you believe flickering LCD monitor backlights are the new evil and it has the solution: flicker-free backlight tech. I’ve tried it and can reveal whether it’s the next big thing after 120Hz-plus panels. It’s not. Next! Graphics. AMD and Nvidia are currently squelching about and looking grumpy following one of their traditional pissing contests. An unpleasant image, but it’s good news because it means things are very closely matched. Still, we need to tidy up a few details after all the new GPU launches and some last-minute changes, including AMD’s Radeon R9 290 and its dodgy cooling, and final specs on the Nvidia GeForce GTX 780 Ti.

The long version:
BenQ is punting a new PC monitor with alleged flicker-free LED backlight properties. Can’t say I’ve ever had an issue with flicker on an LCD monitor. CRT screens, yup. LCDs, nope. Then again, there was a time I’d have scoffed at the benefit of going beyond 60Hz refresh on an LCD panel.

I still can’t entirely compute that, what with 48fps being good enough for HFR movies. But I was simply wrong. 120Hz-plus is lovely and has become a source of some woe. I love my 30-inch panels. But I want 120Hz pretty badly, too.

Anyway, the issue here involves backlight modulation. Run a typical monitor at full brightness and the backlight is simply on. No flicker. No opportunity for flicker. However, crank it down a few notches and the problems, allegedly, appear.

Low-Hz flicker used to be a fundamental issue

That’s because lower LED backlight brightness settings are usually achieved by pulsing the backlight on and off, a technique known as pulse-width modulation. The dimmer the setting, the more time it spends off. Say hello to flicker.
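The brightness-to-off-time relationship is simple arithmetic. Here’s a minimal sketch in Python, assuming a plain linear duty-cycle model and a hypothetical 180Hz PWM frequency (real monitors vary):

```python
# Minimal sketch of PWM backlight dimming. The linear duty-cycle model
# and the 180Hz default frequency are illustrative assumptions only.

def pwm_timings(brightness_pct, freq_hz=180):
    """Return (on_ms, off_ms) per PWM cycle at the given brightness."""
    period_ms = 1000.0 / freq_hz
    on_ms = period_ms * brightness_pct / 100.0
    return on_ms, period_ms - on_ms

# At full brightness the backlight never switches off...
print(pwm_timings(100))
# ...but at 20% it's dark for most of every cycle. Hello, flicker.
print(pwm_timings(20))
```

Higher PWM frequencies shrink each dark interval, which is why some manufacturers simply crank the frequency up rather than ditching PWM entirely.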

So says BenQ, anyway. Frankly, I can’t tell the difference. This may be a subjective observation. Some people, for instance, are more sensitive to the rainbow effect from cheapo DLP projectors than others.

But whether it’s the rainbow effect, anti-glare sparkle, inverse ghosting, gradient banding, IPS glow – whatever – I tend to find myself towards the OCD end of the sensitivity spectrum when it comes to minor display technology flaws. And I really couldn’t sense the difference with flicker-free technology.

I suspect it’s also quite telling that BenQ’s bumpf tells you that the best way to spot the benefits of flicker-free technology is to put a large fan in front of both its screen and a conventional screen. At which point the flicker on the conventional screen presumably becomes apparent.

I am not making this up, it’s the best BenQ can come up with

Oh, BenQ also suggests you take a picture with a digital camera. You’ll see the flicker in the form of banding in still images. If these two examples really are the best BenQ can come up with, I’m not sure much more needs to be said on the matter.

Also, for the record the screen I looked at is the BenQ GW2265HM. It’s actually a damn fine 22-inch 1080p screen for a whisker over £100 thanks to a VA panel that’s claimed to be good for 3,000-to-one static contrast (the blacks are bloody brilliant). So it’s definitely worth a look, just forget the flicker-free nonsense.

That graphics stuff
As for graphics, we’ve covered the major points in recent posts. But here’s what you need to know from the very latest developments:

AMD
There’s something weird going on with the cooling on AMD’s new R9 290 boards:

1. The second-rung R9 290 looks fabulous on paper, cranks out awesome numbers for £300-ish
2. But AMD has upped the fan speed at the last minute
3. This makes the performance even better
4. But it also makes an already noisy card trend towards cacophonous

Where does that leave us?
We need a bit more time for things to play out. At a little over £300 the AMD Radeon R9 290 blows everything else away at the high end for bang-for-buck and would be the obvious choice. Putting the noise to one side, I reckon it will give you a gaming experience that’s largely indistinguishable from a £550 780 Ti.

But if it’s as noisy as some say, that’s a problem. I haven’t heard it running with the shouty new fan firmware, because I’ve been too busy driving this:

A game changer with great graphics but makes the odd surprising noise

Which is quite the thing about town and happens to have a pretty nice line in graphics rendering, plus a few noise issues of its own (the two-pot range extender is, er, interesting). But it isn’t going to help us get any nearer a final answer for this latest round of the GPU war.

My advice is to wait a bit for the dust to settle. AMD may make further revisions. Board makers will have their own cooling solutions, too, so any noise issues with the reference design from AMD may turn out to be moot a month from now.

Anyway, all that’s left to say for now is that only AMD could take what ought to be a winning position with the new Hawaii / R9 290 GPU and cast doubt on the whole enterprise courtesy of what is a pretty minor issue in the broad scheme of things, namely the cooler. If it was any other company, it would be a minor scandal. From AMD, it’s depressingly predictable.

52 Comments

I am in the market for spending some silly money on a top end rig, simply because I always wanted to (read: Have wanted something tippy top for the last 10 years but never had the cash) and can at last afford it. I can’t wait to see what comes out on top, though I want to wait until the manufacturers come out with some better cooling solutions. Hoping to go for either a 780Ti or 290x, a 2560 x 1440 monitor, backed up with a Rift. No idea about processors though. What would be a good match for those components?

(sorry supposed to reply to the guy above)
The 650Ti would be a completely pointless upgrade from your 460 seeing as how close they are in performance (even a 560Ti is better than that, and they’re dirt cheap). link to videocardbenchmark.net

Unless of course you are talking about the 650Ti Boost edition, which is still only a couple of steps above a 560Ti.

@sebmojo. In your case the 660Ti is the sweet spot right now, especially with the recent price drops.

Listen to nrvsNRG. The 650ti would be so barely noticeable I’d be rightly pissed off at myself if I upgraded from a 460. My rule of thumb for a noticeable upgrade is at least a 35% boost in performance, and preferably 50%. The 650ti does not deliver this; it is more like 0-20%. Yes, I said zero: the 460 matches the 650ti in some games :D. The 660ti is closer to that magical 50% boost that feels so good.
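For what it’s worth, that “noticeable upgrade” maths is just relative benchmark scores. A quick sketch with made-up index numbers (illustrative only, not real card measurements):

```python
# Upgrade arithmetic with hypothetical benchmark index scores --
# the numbers below are made up for illustration, not measurements.

def uplift_pct(old_score, new_score):
    """Percentage performance gain going from old_score to new_score."""
    return (new_score / old_score - 1.0) * 100.0

gtx460, gtx650ti, gtx660ti = 100.0, 115.0, 150.0  # hypothetical indices
print(round(uplift_pct(gtx460, gtx650ti)))  # 15 -- well short of the 35% bar
print(round(uplift_pct(gtx460, gtx660ti)))  # 50 -- a jump you can actually feel
```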

I would rather look at Jeremy’s earlier guide on CPUs. It is still valid as far as I know. You should also take an i5 rather than an i7 if you are going for gaming, because the chips are the same and the i7 only adds extra virtual cores, which are suboptimal for gaming: two virtual cores might be emulated on the same physical core, which would actually be slower than an i5 with only the real cores. I think Jeremy explains that in the CPU guide. He is probably also advising Intel because they are faster at running applications with few threads, while AMD is faster with a lot of threads; games usually only have a few at most, some even running everything on one thread. That’s why Intel.

Frostbite 3 and some other new game engines reverse this trend with full eight-core usage, making the AMD processors perform at a very interesting price/performance ratio (as in, much better than Intel). This change has only kicked in this year, though.

I would recommend the i5-3570K for the processor. It’s basically as fast as the 4670K, but cheaper and with better overclocking potential. You can spend a lot on an aftermarket CPU cooler, but I’ve had great results with a CoolerMaster Hyper 212 Evo. It all depends on how much of an overclock you want to push.

Also, if you really want to spend silly money, 3 monitors is awesome. I’ve got 3 23″ 1080p monitors on an Ergotech stand off a GTX 670, and enjoy the peripheral vision much more than a larger screen. Running a game on a single-screen now feels like I’ve got blinders on.

Neither of the 290s are what I would call “cacophonous” under normal operation, but I’m not going on record saying they’re nice and quiet either. They’re of a very similar loudness to the reference HD6950/6970 from a couple of years ago.

Yup. I’m not claiming they’re quiet, indeed the 780/780Ti reference cards are quieter, but having played on a system for quite a few hours in a case with good airflow a few feet away, it’s not dramatically more audible than other reference GPUs I’ve used.

Actually thinking about it, the sound level may not bother me as much as it may others; different people obviously have different tolerance levels and/or have problems with different sound frequencies.

IIRC end of this month. There have also been rumours that AMD was holding back manufacturers, to first see what the 780Ti performance would be like, before the release of custom coolers and factory OCed cards.

My lowly GTX 260 is bastard loud when the graphics are flowing, so I doubt the R9 290 would bother me in the slightest. Who hasn’t got a set of kick-ass speakers blaring at least enough to drown out fan noise anyway?

It’s during the quiet bits that I notice the noise from my GPU, especially as my CPU is watercooled and dead quiet.
I should really watercool my graphics card as well, but a water block is something like £80 that I could spend on getting a better graphics card instead. I fancy getting a new monitor as well; decisions, decisions.

The 290 blows even the Titan out of the water, let alone the 780, at a third of the price, and has plenty of room for overclocking to boot. No doubt the manufacturers will put out quieter, OC’d versions in the next month or so. Either way, RPS should be hailing this card as a victory for consumers rather than wallowing in the noise, as competition brings prices down across the board – and frankly nVidia have been taking the biscuit for a while now.

Yeah, I think it’s probably a tiny bit early to scream victory for the 290. As I said in the post, on paper it blows everything else away. And I’m really pleased it exists for a little over £300.

But there are concerns. And not just the noise. It runs very hot and it wouldn’t be the first AMD card that ran hot and failed in numbers. You may be too young to remember the X800 XTX, young padawan (OK, an ATI card). It may all be dandy, but like I said, a little caution to see how things play out is probably prudent. If it was my own money, I’d probably roll the dice on a 290. But recommending what others spend their hard earned means being a bit more careful.

Hi Jeremy. I don’t know if I misunderstood, but in the review I read of the 290 vs the 290x they said part of the reason the 290 looks so good against it is because of the throttling that kicks in on the 290x when it gets hot, losing it a massive chunk of its processing speed and actually dropping it below the 290 (1000MHz down to sub 940MHz I think it said). If companies like ASUS come along and bolt on some more capable coolers that don’t need the throttling, would this not see the 290x very clearly back on top?

Edit: Here is the page of the article that makes the reference, just below the final benchmarking table and bolded in red:

You misunderstood, they either meant on price/performance ratio or they’re just talking about the range in general.

Few places put the 290 above the X in performance because most of those hardware geeks (not an insult) have well cooled setups (great cases with multiple fans, tidy cable arrangement, great CPU cooling, etc) where the 290X isn’t throttled as much as in that one instance. But yes, a better third party cooling solution and a better overall environment should negate that too.

Young Padawan! Oh the insult Jeremy! :) My first PC was a hand built 386 DX25 that sat in an original 1981 IBM case, designed to survive the zombie apocalypse. A friend put it together for me, gave me the DOS disks and then said good luck and left me to it!

And let’s not forget those full length behemoths nVidia inflicted on the world that required a small reactor to power. So the 290 is in familiar territory, and as always, speed combined with price wins.

You know as well as any of us that custom coolers are more than capable of handling this card, and that the X800 is too old to be of any useful comparison (both in engineering and cooling terms).

It’s OK to like nVIDIA cards better. I do so myself. It’s not OK to like nVIDIA cards better and say something else, just because it looks better on paper to “officially” be “unbiased”. Per the performance tests, and the throttling, the card is a monster being reined in by the reference cooling. It could very well be a game changer. It would be pertinent to say as much, and not dismiss AMD/ATI outright, when they obviously aren’t to be dismissed at all (with this card). Nerds (in the Plottian/Stemkoskian sense) everywhere are rejoicing because of this card. Telling the opposite story, just because it’s an AMD/ATI card is dishonest (I’m referring to your last line).

PWM apparently only affects a small set of users, but BenQ is hardly doing something revolutionary here; many monitor makers are ditching PWM for alternatives, or cranking up the frequency high enough that it would not likely affect anyone at all.

VA panels, however, are pretty interesting. They are supposed to offer color accuracy and viewing angles somewhere between TN and IPS/PLS panels, but with shockingly good static contrast ratios. Pixel response times, however, are apparently quite poor on them, so I’m not sure how they would work out for gaming. And they are only available at 1080p, which for people like me who are looking into 27-inch 1440p monitors is a bit annoying.

Also, the R9 290 is quite amazingly priced, and with an open cooler design it might actually work, though I’d probably pick up a 780 if they dropped the price again to match. If G-Sync is as good as it is supposed to be, and comes to 27-inch 1440p monitors, I don’t want to be shut out waiting for a competing technology, or for Nvidia to stop being bastards and license it.

I do find the PWM ditching amusing. LCDs didn’t originally come with PWM backlights; they have only recently been introduced because they save some cost. The problem is that more than a few people hate the effect it creates on the screen (once you have seen it, you always see it). It’s kind of ironic that one of the companies that championed its introduction is now championing going back.

Most people will end up with non-reference cards using customised cooling solutions, often the same for both brands – the usual Windforce or whatever each third party calls their own – which will yield even better performance and OC potential. I’ve not bought a reference card from AMD or Nvidia in ages, if ever. Most of my GPUs in the past have been Nvidia chips, the last being a GTX 285 (my current is a 7970 OC), but it’s almost always been Gigabyte, MSI or others, whatever felt like the best choice at the time. The lacklustre AMD cooling solution is a practical non-issue some people echo just to downplay the reality of great GPUs that match or surpass Nvidia’s top cards at far lower price points, even post-price-cut. That means Nvidia’s only remaining advantage is their software, like Shadowplay and mainly PhysX-enabled games (not their Windows drivers – neither company has been very problematic, barring odd cases like those overheating Nvidia GPUs, and outside Crossfire, which again only concerns the few who buy multiple GPUs and is improving anyway). Meanwhile, AMD’s own Mantle is gaining bits of ground too, with the recent addition of supporting companies including Eidos Montreal (known for the last Tomb Raider, Deus Ex: Human Revolution and of course the next Thief game), Oxide (Stardock-fostered Civilization veterans, whose new engine will likely be used in many if not all of the Stardock family’s future games) and Cloud Imperium (Star Citizen, therefore possibly, eventually, Crytek/CryEngine in general), on top of EA/DICE.

The heat and noise will only become a real issue if those third-party solutions arrive and still fail to fix it. Though I must say, even the cards running that hot clearly don’t fry; they’re within working limits. As undesirable as that might still be, it would only become a real issue in an already hot rig that’s not maintained well, so that its fan eventually gets clogged with dust and it ends up frying for real. Not exactly your average enthusiast’s rig, and not exactly the primary market for these higher-end GPUs. Only one card has ever fried on me, and that was my fault: a custom X800 XL with passive cooling (yep, no fan on an already hot range of GPUs) that I chose to shove into my at-the-time barely cooled system.

Been running a BenQ VA panel for almost a year now, very happy. Better colors and contrast than TN panels, but still response times rather close to them. I find it’s a good middle ground, and quite affordable as well.
Bang for buck indeed.

As to the new AMD cards, I suspect the noise issue will largely be eliminated by Sapphire and the likes. I haven’t bought a stock ATI/AMD card in ages, the price difference is usually minuscule to nonexistent.
In any case, I certainly hope the two giants manage to match each other rather evenly. We can only profit from some healthy competition.

That doesn’t quite make sense. VA is traditionally the technology with the worst response time. TN has the best, IPS lagging somewhat behind, and VA being forced to use various “overdrive” tech and similar to just keep up.

Don’t get me wrong, I like VA panels myself, been using them for years for the black levels, contrast and colours, but they’re really not very fast at all.

Basically, monitor devs have realised that they can use strobing backlights to eliminate the motion blur inherent to LCDs, which comes from the way human eyes interact with their sample-and-hold nature. However, the technique causes the LCD to flicker just as a CRT at the same Hz would.

Can’t find in the specs whether that’s static or dynamic. VA panels have their own issues though, like the weird shadow-detail loss at angles just off straight-on, and generally inferior pixel response.

I’d definitely be up for a 120Hz IPS when one comes around, assuming it’s well reviewed for its 3D abilities. Pretty happy with my 120Hz Asus TN for now though, it’s quick, the 3D is great (245 hrs of Skyrim in the last few months, all in stereo 3D) and the viewing angles are about as good as you can get on TN.

Of course we’re talking about static contrast. Eizo advertises 5000:1, and I’ve seen reviews where after calibration they hit 4,800:1, which is amazing. The drawback, being Eizo, is the price, about 480 pounds in the UK.
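Contrast ratio translates directly into black level: black luminance is just white luminance divided by the ratio. A quick sketch, assuming a typical 300 cd/m² white point (that figure is an illustration, not from any review):

```python
# Black level follows from static contrast: black = white / contrast.
# The 300 cd/m^2 white luminance is an assumed, typical desktop setting.

def black_level(white_cd_m2, contrast_ratio):
    """Black luminance in cd/m^2 for a given white level and contrast."""
    return white_cd_m2 / contrast_ratio

print(black_level(300, 1000))  # 0.3 -- a typical IPS/TN panel
print(black_level(300, 4800))  # 0.0625 -- the calibrated Eizo VA figure
```

Same white point, nearly five times less light leaking through in the blacks, which is why VA shadows look so deep.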

That sounds awesome! I keep on hearing about super cool new monitors with high res/refresh/response time/great colors. Now I just want all of the manufacturers to adopt G-sync and everything will be more than awesome.

Yeah, I was wondering what the hell was going on. Nobody ever uses the shitty reference coolers in practice, yet all the R9s are using one right now? Seems fucking retarded when the custom ones always outperform stock ones in every way at virtually no extra cost.