Even just a year ago, having a hot-running graphics card such as AMD's R9 290X was par for the course. Admittedly, there have been hotter and cooler examples of the 'must-have' GPU over the years, but in general, if it's good value and performs well, I'm usually sold.

This is especially true with me as I usually rip the stock cooler off a new graphics card straight away and fit a waterblock, so heat has never really bothered me. The exceptions were excessively inefficient models such as Nvidia's GTX 480, which weren't that fast and could heat your average Olympic swimming pool. Equally, AMD's dual-GPU offerings have often generated too much heat and been overkill for my needs.

During my time at Cooler Master HQ, I sat down with various product managers, one of whom, Bram Rongen, handles the gaming division CM Storm. His responsibilities extend to all its peripherals and accessories, though some cases are also marketed under the brand.


Unfortunately, CM Storm has nothing new to announce at present. After all, it's had numerous launches in the past few months, including the Sirus-C, the Quick Fire Rapid-I, the Resonar and most recently the NovaTouch TKL. The latter is the division's most important product for at least the rest of the year, as it's the new flagship keyboard (it even has its own microsite). Bram is rightfully very proud of the NovaTouch TKL – it's a fantastically crafted piece of kit.

Of course, no company will do too well without a strong product catalogue, but our conversation got me thinking about how much more there now is beyond the catalogue itself for CM Storm and similar brands when it comes to success in the world of gaming peripherals.

Resolution, particularly pixel density, is the new frontier when it comes to gaming graphics. There's little doubt, certainly from my first-hand experience, that 4K offers huge advantages in sharpness, even on 24in and 27in monitors – not just on super-large screens.

Some may disagree here, but I'd welcome more pixels than my current 24in 1,920 x 1,200 main monitor offers. As I have two screens, I've also considered investing in a super-wide screen.
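To put some rough numbers on that sharpness difference, here's a quick back-of-the-envelope sketch (my own arithmetic, not manufacturer figures) comparing the pixel density of my 24in 1,920 x 1,200 monitor with a hypothetical 24in 4K panel:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: the diagonal pixel count divided by the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# My current 24in 1,920 x 1,200 monitor
print(round(ppi(1920, 1200, 24)))  # ~94 PPI

# A hypothetical 24in 4K (3,840 x 2,160) panel
print(round(ppi(3840, 2160, 24)))  # ~184 PPI
```

That's nearly double the pixel density on the same size of panel, which goes some way to explaining why the extra sharpness is visible even on a 24in screen.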

LG's 34UC97 is a curved 34in super-wide monitor that sports a resolution of 3,440 x 1,440

There are some fantastic-sounding options in the ultra-high-resolution department as well. LG and AOC have 34in 3,440 x 1,440 monitors, and Dell and LG have recently announced their own curved versions (WANT). The prospect here for immersive, high-resolution gaming is pretty compelling, but the extra screen real estate is useful for all manner of other tasks too. I've played with super-wide monitors before as well, as you can read about here, and despite older 30in models only sporting 1,080 vertical pixels, I didn't find this too restrictive when editing photos and the like.

AOC's u3477Pqu super-wide 3,440 x 1,440 monitor will retail for around £500 in October

However, there's one major issue stopping me splashing some cash on a new ultra HD monitor: the fact that I'd need to invest twice as much again in the graphics department to get playable frame rates in games. I've never been one to tone down graphics settings to achieve playable frame rates; that's partly the reason I find myself writing about PC hardware for a living, aside from the fact that I caught the upgrade bug two decades ago.

However, even if I was prepared to drop detail settings a little, this still wouldn't be enough to allow even a £400 single-GPU graphics card to handle all the latest games, never mind my ageing GTX 660 Ti. Even Nvidia's latest effort, the GTX 980, was a long way from achieving playable frame rates in Crysis 3 in our review; you'd need to opt for a monster such as AMD's R9 295X2 in order to get some headroom at 4K.

To be able to play all current games at 4K, you need to invest in multiple GPUs or AMD's R9 295X2

Something else that concerns me, though, is that not much is being done to address the underlying issue: higher resolutions are going mainstream. Windows 8.1 achieved a lot in terms of 4K scaling, though there are a few more issues to iron out, not least by software companies with their own program scaling.

However, we're nearly at the point where it makes absolute sense to aim for 4K in a mid-range to high-end system, rather than a super-high-end one as is the case at the moment. That's not to say those of us with limited wallet power won't consider splashing out £300-400 on a 4K-capable graphics card, but once 4K monitors fall further in price, mid-range and high-end enthusiasts will have a bit of a problem on their hands.

Nvidia's GTX 980 can play some games at 4K, but doesn't offer much headroom

They can afford a 4K monitor, but not the graphics card(s) to power it in games. We haven't had such a big reason to upgrade our graphics cards since Crysis landed, but AMD and Nvidia need to do more to make these ultra-high resolutions attainable outside of super-expensive systems. In the past, you've needed to invest heavily to game on triple screens, for example, and I think this needs to change.

4K is waiting to take off, be it in super-wide or standard aspect ratio monitors. In addition, true 4K gaming is also something the latest consoles lack. So this is also a huge opportunity for PC gaming to take a giant leap forwards and offer something tangible when it comes to a better gaming experience.

In short, what we need is a GTX 970-type graphics card that can handle the latest games at 4K – something in the region of £250-350 – not the £700-odd that you'd currently need for something like the R9 295X2. So come on AMD and Nvidia, rise to the challenge and give us more reasonably priced 4K-capable graphics cards.

The last few weeks have been a dismal time for many people involved in the games industry. A combination of vicious personal attacks on important female figures in the industry, torrents of accusations regarding journalistic ethics and rampant paranoia over the perceived destruction of gaming itself has all bundled together into one great snowball of malevolence, misinformation and outright misery. I'm not going into the nitty-gritty of recent occurrences in this article, but these pieces here and here give a pretty good summary of events.

Dearest readers of bit-tech! Come hither and listen to my whispered words, as I am a troubled soul. For a long time now I have lamented the lack of progress made in the AI sphere of game development. In the years surrounding the millennium, AI was bold and bright and exciting. Games like Unreal Tournament, Thief, Black & White and Halo were doing clever and innovative things with artificial intelligence, providing enemies that could use teamwork to outmanoeuvre us, guards that would hunt us, and a big daft monkey that could learn from us.

This continued until around 2005, with FEAR being the last game I can recall with truly memorable AI. Then something changed, and after that nothing changed. Stealth AI has patrolled the same pathways for years, shooter AI crouched behind a wall circa 2006 and decided to make a home there, and when was the last time you played a game that involved the AI learning anything?

I think it's fair to say that Intel's latest CPUs have been met with mixed emotions by enthusiasts. At the crux of the issue is the fact that Intel is likely looking at markets away from the PC as tablets and smartphones take a big slice of the PC sales pie, despite the fact that PC sales now seem, as predicted, to be stabilising.

That's not to say it's pulling out of the PC market - far from it. However, we haven't seen the kind of performance increases in new architectures or refreshes/ticks that we saw in the past. Even after AMD was placed firmly in catch-up mode following the release of the first Core architecture, we still saw significant improvements in performance - for example, in the move from LGA775's Penryn and Wolfdale to Clarkdale, and again from Clarkdale/Lynnfield/Nehalem to Sandy Bridge.


There was a huge leap in performance going from a dual-core Core 2 CPU to the dual-core Core i5-530, which even gave previous-generation quad-cores such as the Q6600 a run for their money. Only in specific tests do we see anywhere near this level of performance increase in the post-Sandy Bridge era, and even then the argument for upgrading, even factoring in the LGA1155 to LGA1150 socket change, is only strong if you own a Sandy Bridge system: Ivy Bridge and Haswell owners needn't bother, although the additional features provided by the Z97 chipset may well tempt you too.

However, there's another very good reason for upgrading to Devil's Canyon: overclocking. It seems that as well as providing smoother power delivery and a better thermal interface material (though I should add that people are still delidding these CPUs and seeing better cooling), Intel has been speed-binning CPUs.


In short, the widely varying overclocks we saw with Ivy Bridge and Haswell, which ranged from 4.3GHz to 5GHz, appear to be a thing of the past, and the vast majority of new CPUs, the Pentium G3258 Anniversary Edition included, can reach 4.8GHz with relative ease. Overclocking has always been a lottery, and retailers have often cherry-picked CPUs and sold them at higher prices with guaranteed overclocks. You'd need a good CPU cooler, and for some reason 4.8GHz appears to be the limit unless you drastically boost the CPU voltage, but even so, this is 300-400MHz faster than you'd expect from a typical Core i5-4670K retail sample.


To prove our point, as Intel annoyingly didn't ship a Core i5-4690K to us, we bought our own retail sample, and it performed exactly the same as our Core i7-4790K press sample. You only have to look at forum system-spec signatures to see just how many people have had to be content with a 4.3GHz or 4.4GHz CPU. In addition, our 4.8GHz test system only drew 20W more at load and the CPU was much cooler than a 4.6GHz Core i5-4670K-based system, so it's quite feasible to run your CPU at 4.7GHz or 4.8GHz 24/7.

This on its own is a very appealing feature - after all, who wouldn't want a speed-binned CPU? In the past, retailers have even charged more for such sought-after silicon (remember G0-stepping Q6600s?), yet Devil's Canyon CPUs didn't cost much more, if anything, than their predecessors. Yes, there's not much, if any, improvement in IPC, and most of the speed boosts at stock are down to increased CPU frequencies – the Core i7-4790K, for example, has a stock speed of 4GHz, a substantial 500MHz faster than its predecessor – but that's the kind of thing you can do with a cherry-picked CPU.
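For context, here's a rough sketch of the percentage uplifts those clock-speed figures work out to (my own arithmetic; the 3.5GHz predecessor figure refers to the Core i7-4770K's stock clock):

```python
def uplift_pct(new_ghz, old_ghz):
    """Percentage clock-speed gain of new_ghz over old_ghz."""
    return (new_ghz - old_ghz) / old_ghz * 100

# Core i7-4790K stock (4.0GHz) vs Core i7-4770K stock (3.5GHz)
print(round(uplift_pct(4.0, 3.5), 1))  # 14.3

# A typical 4.8GHz Devil's Canyon overclock vs a 4.4GHz Haswell one
print(round(uplift_pct(4.8, 4.4), 1))  # 9.1
```

A 14 per cent stock-speed jump with no IPC change is a fair chunk of free performance, and the extra 400MHz of overclocking headroom adds roughly another 9 per cent on top for those who push it.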

There is, of course, the argument that Intel has shunned the PC and the PC enthusiast by merely speed-binning Haswell cores, but if anything, this shows that it still has a commitment to enthusiasts, and to overclockers in particular.

Yes, I'd like to have seen more of a performance boost or a shrunken manufacturing process, although the latter is what we'll be looking at with Broadwell. But I'd rather have a Devil's Canyon CPU that's guaranteed to hit 4.8GHz and provide some overclocking fun than something that performs a few per cent faster clock for clock than the previous round of LGA1150 CPUs, runs very hot under the collar once overclocked, and leaves you lucky to get a stable overclock above 4.4GHz.


This won't be everyone's cup of tea, of course, but I'm willing to bet that most people reading this, especially potential buyers of the new Core i5 and Core i7, would be overclocking their CPUs too. It probably cost Intel less to tweak the power delivery, use better thermal interface material and speed-bin some CPUs than it would have done to make changes on the scale of the move from Sandy Bridge to Ivy Bridge, but I'm not really that bothered.

However, I do want to see some improvements with Broadwell. With Windows 9 due out at roughly the same time (although rumours are that K-series Broadwell CPUs may have been delayed yet again until next summer) and rumoured to be much more geared towards PC users, the pair could provide the perfect opportunity for pre-Haswell owners to reach for their wallets.

If you've been keeping up with our Computex 2014 coverage, you'll know that we saw a whole host of companies keen to show off their latest and greatest products, including numerous ones that are yet to be released. I thought it would be interesting to open a discussion about the products that stood out most to me – let me know if you agree or disagree with my picks.

Last year I interviewed Ken Silverman, creator of the Build engine (used in games like Duke Nukem 3D and Shadow Warrior) as part of a monthly article series I write in Custom PC about graphics engines. While preparing for the interview, I read through his timeline for the engine's development, which is published on his website. Amid all the technical jargon and details of publisher deals was the simple line "Finally added SLOPES!"

It stood out because whereas so much of the information was factual and to the point, this entry conveyed more emotion; a strong sense of both relief and achievement. I asked him what the big deal was, and he responded thus:

I've been in somewhat of a dilemma recently. My aging Saitek Eclipse membrane keyboard has been slowly giving up the ghost (falling to pieces is more accurate, but then it is over five years old) and I've been doing some research into what to get next. Being a tech journalist, having a good keyboard is akin to a carpenter owning a good set of tools - as such, money isn't really an object (you'd struggle to find a typical gaming keyboard that costs much more than £100 anyway).

The main issue, though, is that despite countless keyboards having come through our lab, I haven't really used one that I like - Matt can attest to this as I've been tapping on every keyboard that I could find and walking away disappointed. This is mainly because we, like most other tech review sites, are primarily focused on looking at mechanical switch keyboards - blue, black, red, brown etc. Cherry MX switches are certainly all the rage, but like quite a few other people I've come across, I haven't taken to the craze at all.

One of the variants of the Saitek Eclipse - a pretty good, if basic, membrane keyboard

This is mainly because of the noise they make, and I've tried all four main switch colours on various keyboards. I do have a thing about this, though - noise is one of my pet hates when it comes to PCs, and it's one reason my main PC has been fully water-cooled since about 2003. However, I've seen and heard of plenty of instances of office colleagues receiving complaints about their noisy keyboards too.

Out of all the Cherry switches, the black was my favourite, but even it was quite loud

Even recently, when I borrowed a black-switch keyboard for the weekend, my better half noticed instantly when I switched from membrane to Cherry switches, and spent the rest of the weekend in the garden. Admittedly, the weather outside was very pleasant, but she knew exactly when I was on the PC from the noise. The noise isn't just annoying to other people, though - I find it pretty intrusive to type on these keyboards too, however tactile and responsive they are.

This leaves people like me in a bit of a desperate situation. I love some of the features being added to the latest keyboards - USB hubs and backlighting especially - but finding a decent membrane keyboard with these features is extremely difficult. Recently, though, I found a possible solution: Cherry switch dampeners - specifically, rubber o-rings that you can fit under the key caps to reduce the noise made when the keys bottom out.

The o-rings sit under the key caps, cushioning the keys when they bottom out

I purchased a pack from OcUK and spent half an hour or so adding them to a Mionix Zibal 60 keyboard with black switches. The difference in noise was certainly noticeable - the bottoming-out tapping was nearly eliminated and the feel of the black switches wasn't altered too much either, although the keyboard was still much noisier than my old Saitek Eclipse.

It takes a while to fit the o-rings, but it's easy to do and does reduce noise without completely ruining the tactile feel

However, my ears then focused on the upward tapping noise - when you release a Cherry switch key, there's quite a thwack as the key bounces back to its rest position. Sadly, there's nothing I know of that you can do about this, and it seems that, for now, I'll have to give up my quest to mod a mechanical keyboard to suit my needs. Thankfully, the Saitek Eclipse is still available, albeit in a slightly revised form as the Cyborg V5, so I now have one sitting on my desk.

The Cyborg V5 is still readily available; its membrane feel has been tweaked compared to the older models, making it quieter and more tactile

It's even quieter than the original Eclipse - in fact, its keys are practically silent compared to my dampened Mionix Zibal 60. Have you struggled to get on with mechanical keyboards? Have you modded yours or found a good membrane alternative? Let me know in the comments.