We originally ran this article in May 2015, but with the release this week of Windows 10, we thought we'd revisit DirectX 12 on the launch version of the OS, using the latest drivers in order to update the benchmark data. We've also replaced the AMD A10 7800 benches with the same tests run on an FX 6300 - this is a more direct equivalent to the Core i3 4130. We also re-tested Call of Duty: Advanced Warfare and found that significant issues remain with AMD's DX11 performance on less capable processors on both Windows 8.1 and Windows 10.

There's a palpable air of excitement surrounding the arrival of Windows 10 and DirectX 12 - a sense that the PC will finally shrug off the shackles holding it back and that the cutting-edge components released by AMD, Nvidia and Intel will finally reach something approaching their full potential. We experimented with Windows 10 this week and came to a highly satisfying conclusion - DX12 offers huge advantages to virtually all PC owners, but it will be a boon to AMD in particular, perhaps going some way to restoring a degree of plurality to the PC hardware market.

In the here and now - in the era of DirectX 11 - life isn't particularly easy for AMD. Its problems in the CPU market are well documented. Its Bulldozer architecture bet the farm on numerous slower cores in a world where DX11-driven gaming benefits more from fewer, faster cores, giving Intel a virtually unassailable advantage. AMD still produces 32nm and 28nm processors, while Intel is now down to 14nm, giving it power efficiency advantages on top of its inherent performance improvements.

Among the most ambitious 'free to play' titles available today, World of Tanks gets a visual overhaul on Xbox One that few expected. It's unique for being the only game on Microsoft's box to have cross-platform play with its older Xbox 360 release - where owners of both consoles can play in the same 15v15 online battles. But Xbox One gets plenty of upgrades over and above this, from remade map assets, a new lighting model with high dynamic range (HDR) and improved physics and effects. It's a makeover that puts PC behind in some regards, but can Xbox One claim to offer the true, definitive version?

First up, the basics: Xbox One keeps several features from the existing, bespoke Xbox 360 release that are not available on PC. Developed on the Bigworld engine by Wargaming's Chicago and Baltimore studios, the two console versions offer weather effects such as rain, plus unique time of day settings for each map - features that are not officially available on the PC release without mods. The game design and mode selection are identical between the two consoles too, right down to the camera's field of view setting, and a new Proving Grounds option where players can practice against AI opponents. Microsoft's two machines essentially have parity in their core gameplay features, while PC takes an independent path.

The similarities on console stop there. Xbox One runs at a full native 1920x1080, storming ahead of the last-gen release's 1280x720 image. That drives image quality up hugely in direct comparisons, especially when picking out details to the far end of giant maps like Abbey. Unfortunately the use of what appears to be FXAA post-processing keeps absolute clarity at bay - though jagged edges are treated well enough. Compared to the PC's superior TSSAA option (as added via recent patch 9.9), Xbox One owners miss out on a super-sampling pass, while a temporal component helps PC to avoid pixel crawl during motion. As it stands, Microsoft's platform turns in only adequate results for image treatment, though running at 1080p is obviously a big plus.

Journey. Perhaps the perfect word to define the trajectory of this particularly elusive PlayStation 4 remaster. Having briefly spent time with the game just last year at Gamescom, performance was solid enough that it seemed likely that a full release would follow shortly thereafter. Instead, nothing. Outside of the odd blog post, this particular port remained shrouded in mystery throughout most of its development cycle. Now, three and a half years after its original last-gen release, Journey has finally emerged on PlayStation 4.

Its technological underpinnings are based on an advanced iteration of PhyreEngine - a modular, cross-platform, free to license graphics engine created by Sony Computer Entertainment. It's an engine which thatgamecompany, creator of Journey, previously used in each of its PlayStation 3 projects. With its complex sand rendering, dust effects, and fluid simulation, Journey made extensive use of the PS3's SPUs to bring its world to life. With a small development house known as Tricky Pixels tackling this PS4 port, we were genuinely curious to see how this ambitious project would translate to Sony's latest console platform.

At first glance, this PS4 iteration appears to fully retain the beauty of the original game. The simple, clean designs, attractive lighting, and lovely sand simulation return alongside a nice bump in performance and image quality. Even better, it's available for free to owners of the original game thanks to the Cross-Buy feature - if you bought the original and have since upgraded to PS4, the new remaster is sitting in your download list right now. We finished the remastered version of the game in one sitting and have to say that the experience was just as wonderful as we remembered. However, upon closer inspection, we began to notice some rather subtle changes, suggesting a conversion scenario that was less straightforward than you might imagine.

The deluge of HD remasters shows no sign of abating - fatigue is starting to kick in, but the allure of prettier, smoother, enhanced versions of genuine classics remains a pretty enticing proposition. God of War 3 Remastered stands apart from the crowd by delivering a full 1080p presentation in combination with performance that is to all intents and purposes locked at 60fps. From a visual perspective, nothing can quite top the experience found when locking resolution and frame-rate to the specs of your display, but even better than that, God of War 3 emphasises how gameplay can be improved via the remastering process too.

Often compared to Uncharted 2 as a technical showcase, God of War 3 remains a benchmark visual achievement for PlayStation 3, offering up levels of detail and graphical polish that firmly established the console's technological prowess back in 2010. The breathtaking opening scene alone, set upon the giant Titan Gaia, still impresses with its sense of scale and cinematic direction. Our initial look at the opening 20-30 minutes of the game last week (embedded at the foot of this article) revealed a near-locked 60fps throughout the entire sequence, with only a single two-frame drop manifesting at the beginning of one battle. The impact to gameplay was non-existent, and as a result we were looking at the most consistent 1080p60 remaster since 4A Games' superb Metro Redux collection.

Impressively, this level of performance continues throughout the rest of the game, with frame-rates rarely seeing any impact at all. And when dropped frames do manifest, interruptions to the gameplay experience remain almost non-existent, to the point where they are likely to go by completely unnoticed while your attention is focused on brutally dispatching the hordes of mythological creatures unleashed by Zeus and his cohorts. As such, gameplay is transformed over the PS3 release, feeling much more refined and significantly more responsive. This is the sort of result the remastering process should engender: with this release, there's the sense that the full potential of Sony Santa Monica's original design is finally being realised.

Unreal Engine 3 enjoyed a long, prosperous life on last-gen following its 2006 debut, but the new UE4 has yet to hit its stride in the new console gaming era. We're almost two years into the current generation, but by our reckoning, only five UE4 games are now available on PS4 and Xbox One. Without Epic itself setting the bar this time around, there's the sense that we haven't seen the engine at its best. However, with the release of The Vanishing of Ethan Carter, Polish development team The Astronauts might well be the first studio to deliver on the powerful middleware's latent potential.

On the face of it, the game is a PC port, but Ethan Carter's journey to the PlayStation 4 was longer and more involved than one might expect - after all, the original release was based on the last-gen Unreal Engine 3, an altogether different technology compared to its successor. While the visual quality pushed the boundaries of Epic's ageing tech, The Astronauts actually wanted to build the game using UE4 from the beginning - something that wasn't possible at the time as the engine itself simply wasn't ready. With the move to PlayStation 4, the studio seized the opportunity to move the entire project over to Epic's new platform. This transition would help make the leap to PlayStation 4 easier while simultaneously serving as a trial exercise in using the new tool-set. Now, ten months after its initial release, Ethan Carter has re-emerged on PlayStation 4 - with an upgraded PC version to follow, free to all existing owners of the game.

To give us some idea of the substantial work undertaken in the transition, The Astronauts' lead artist - Andrzej Poznanski - took the time to share his experiences with us, providing valuable insight into the development process. There hasn't been a lot of discussion surrounding UE4 projects on the current wave of consoles, so we were eager to learn more about the capabilities of the engine and the challenges and opportunities in working with it. The technology is clearly highly capable but on a constrained platform like PS4, the full extent of its entire feature-set can't be deployed at will.

While there are legitimate concerns that the sheer volume of remasters hitting the current-gen consoles is starting to verge on the ridiculous, we've still got a lot of time for Sony's continuing efforts in bringing PlayStation 3 glory days to its latest console platform. The Last of Us Remastered worked beautifully overall, God of War 3's 1080p60 presentation is excellent, and our first look - more of a glimpse really - of the Uncharted Nathan Drake Collection is also heavy with promise. Right now, from our perspective, all that's missing from the line-up is a Killzone 2/3 release.

We'll be running an in-depth look at God of War 3 Remastered in the next day or two, but in the meantime our focus is on the Nathan Drake Collection and that 1080p60 gameplay clip released last week. Sony sent us two assets to look at - the video clip, along with five screenshots. Bizarrely, the shots are rendered at a native 4K resolution - not representative of real-time gameplay, but perhaps useful in another way. Even at the ultra-HD pixel count, texture detail seems to hold up, though understandably geometry and lighting are very much last-gen in nature.

But it's the gameplay clip that is most exciting. It may be less than four minutes in duration overall, but it seems to confirm that at the very least, Bluepoint Studios are on track to deliver a remaster at least on par with Naughty Dog's excellent in-house PS4 rendition of The Last of Us. The scene is effectively cut into two sections, indicative of the typical Uncharted gameplay mix - we see an initial traversal section, where frame-rate barely shifts from the target 60fps, and that's followed up by a more action-orientated sequence, complete with intense gunplay and a signature Uncharted set-piece - a more challenging work-out for the engine.

It's an excellent example of how competition drives performance. When Nvidia's GTX 970 arrived, the high-end graphics card market was blown apart. Factory overclocked examples traded blows with Nvidia's prior flagship - the GTX 780 Ti - and performance could be pushed further, bringing it into line with the top-end GTX 980. AMD's Radeon R9 290 and 290X suddenly looked excessively expensive and rather mediocre. Almost a year on and the red team has responded well: it has its limitations, but the performance can't be denied - the new R9 390 8GB is indeed faster than the GTX 970.

AMD's formula for challenging Nvidia's top-end cards is simple enough. The existing Hawaii chips used in R9 290 and 290X are utilised here, receiving an overclock to the GPU core, while the GDDR5 memory modules are replaced with faster, more capable 6Gbps parts, adding 64GB/s of memory bandwidth across the 512-bit bus. It's a simple recipe for success that extracts just enough additional performance on the Radeon R9 390X to make it a contender in its price range, and the formula is arguably even more effective on the R9 390, delivering impressive results up against the GTX 970.
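Peak GDDR5 bandwidth is simply bus width multiplied by the per-pin data rate. As a quick sanity check on those numbers - using Hawaii's published 512-bit bus and the step from 5Gbps to 6Gbps modules - the arithmetic works out like this:

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# Published specs: 512-bit bus, 5Gbps (R9 290/290X) vs 6Gbps (R9 390/390X) modules.
r9_290 = gddr5_bandwidth_gbs(512, 5.0)  # 320 GB/s
r9_390 = gddr5_bandwidth_gbs(512, 6.0)  # 384 GB/s
print(f"R9 290: {r9_290:.0f}GB/s, R9 390: {r9_390:.0f}GB/s, "
      f"uplift: {r9_390 - r9_290:.0f}GB/s")
```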

In essence then, the performance is there but it comes at a cost. Firstly, the Hawaii chip now generates even more heat than it did previously, meaning quite remarkable levels of power consumption. Secondly, dissipating that heat requires a substantial cooling assembly. For our R9 390 8GB review, we're assessing the new Asus DirectCU 3 model - the top-end cooler the company produces, based on top-of-the-line components and boasting beautiful build quality. In fact, it's the exact same set-up used on the firm's R9 Fury card, the difference being that the less capable R9 390 actually produces more heat than AMD's slightly cut-down flagship.

It's the biggest update so far for The Witcher 3, with patch 1.07 demanding chunky 7.3GB of space on your PlayStation 4 or Xbox One. With that HDD footprint comes some suitably big features - most anticipated of all being a new stash system for your inventory, and an alternative animation system designed to make control of Geralt more responsive. But contrary to expectations, its touted benefits to performance aren't apparent on close examination. In fact, frame-rates in many areas are noticeably worse on both platforms once the patch is installed. Suffice to say, we weren't expecting that.

Let's start with Xbox One. Our analysis shows Microsoft's console retains its advantage in frame-rate over the PS4, but there's a downgrade in effect with the transition from patch 1.05 to 1.07. A horseback tour around Novigrad city remains largely stutter free on the latest update - as was the case before. However, charging through Crookback bog during heavy rainfall tells another story as we stress the engine to its fullest.

Essentially, gameplay on Xbox One now appears to rely on a similar double-buffer v-sync set-up to the PS4 game, locking its frame-rate to 20fps during these lulls in performance. This means that the reading on our graph is consistently lower on patch 1.07 as compared to 1.05, where it was free to waver between 20 and 30fps. On a matching route through the bog, frame-rate can be up to 8fps slower on the latest version of the game.
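That 20fps lock follows directly from how double-buffered v-sync behaves on a 60Hz display: a completed frame can only be shown on a refresh boundary, so effective frame-time rounds up to a whole number of refreshes, quantizing output to 60, 30, 20 or 15fps. A minimal sketch of the effect - the render times below are illustrative values, not measurements from the game:

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.67ms per display refresh

def vsync_fps(render_ms: float) -> float:
    """With double-buffered v-sync, a frame waits for the next refresh,
    so frame-time rounds UP to a whole number of refresh intervals."""
    refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
    return REFRESH_HZ / refreshes

# Anything between 16.7ms and 33.3ms presents at 30fps;
# between 33.3ms and 50ms, output locks to 20fps.
for ms in (15.0, 20.0, 35.0, 45.0):
    print(f"{ms:.0f}ms render -> {vsync_fps(ms):.0f}fps")
```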

We're currently working on our analysis for Sony Santa Monica's remastered edition of God of War 3 for PlayStation 4, but in the meantime we thought we'd republish our original tech interview with the team, produced in the wake of God of War: Ascension's release and covering the development of both of Kratos' PS3 outings. Originally, this piece was published with a frame-rate analysis video of the spectacular opening level of God of War 3 on PlayStation 3. We've swapped that out with a new performance test, featuring both PS3 and PS4 versions of the game, compared head-to-head. That 1080p60 remaster we discussed with SMM at the end of this article, almost two years ago? Well - there it is.

As we reach the end of the current-gen console era, it's safe to say that it is the difficult, flawed, but ambitious PlayStation 3 that has offered up the most technologically advanced console games of the age. The complex hardware set-up may have banjaxed even the best third-party developers in its early years, but PS3 owners have been spoiled by a range of state-of-the-art gaming epics from Sony's own in-house studios - foremost amongst them, God of War creator Sony Santa Monica.

God of War 3 was a watershed moment in the history of the PlayStation 3. At the time, few believed that Naughty Dog's Uncharted 2 could be matched or even bettered in terms of sheer technological accomplishment, but Kratos' PS3 debut raised the stakes still further. The third game's legendary titan boss set-pieces looked and played with an almost CG-like level of polish, astonishing many with its breathtaking per-pixel lighting, rich detailing and pristine motion blur effects. The sheer scale of ambition on display here was simply breathtaking and even today, God of War 3 ranks as one of the best platform exclusives on the market.

Without any kind of pre-launch announcement - or indeed, a single word of warning - remasters of Prototype and its 2012 sequel crept quietly onto the Xbox One store earlier this week - almost as if neither wanted to be caught in the act. On first glance you might wonder why. On the face of it, we're in standard remaster territory: both versions output at native 1080p, boosting image quality over their sub-720p setups on PS3 and Xbox 360. However, everything else remains as you remember it: each game maintains the exact same draw distance range as the older console releases, while texture and shadow quality is identical.

If what we're looking at is an HD remaster by the numbers, the situation swiftly goes downhill from there. The first Prototype remaster plays at 30fps with adaptive v-sync on Xbox One, dropping to the mid-20s with tearing once explosives are triggered. It's generally solid - and playable enough - but a touch under-ambitious given the plain, repetitive design of the world.

Even by latter-day PS3 and Xbox 360 standards the animations, effects and textures don't hold up well on the current generation of consoles, and it's hard to fathom why such a game couldn't operate at an absolutely locked 30fps. Indeed, given the vintage of the game, it's a real disappointment that the developers did not go the whole hog and target 60Hz gameplay. Otherwise, the original game is passable in its remastered guise, if a somewhat barebones conversion.

Despite the shaky start, it was worth the wait. Uncharted 4 concluded Sony's E3 show this year with style - an extravaganza of live gameplay that showed Naughty Dog's growing affinity for PlayStation 4 hardware. Compared to the meat and potatoes gameplay shown at the PlayStation Experience event in December, Nathan Drake's E3 showing pushes the game's technological credentials more forcefully, calling on the team's incredible talents in set-piece design. But having seen some accomplished current-gen titles over the past few months, is what we're seeing here the consummate next-gen game we've been waiting for?

Uncharted 4 goes much further than any PS4 or Xbox One game we've seen in its use of physics, and the layering of world detail with shaders. Opening at the gates of a bustling Madagascan marketplace, what unfolds is unlike any major title seen on console to date. Where the December demo focused on wide spaces, foliage systems, and its more flexible AI, the E3 demo goes to town with object physics applied across the world, as flaunted brilliantly with a jeep chase to 'Sam's tower' at the bottom of the city.

Indeed, vehicle control is at last planted fully in the player's hands this time - a choice that means we can carve out a unique route to the bottom of the hill. It's rare to see such rapid asset streaming go off without a pop-in hitch; for example, games like The Witcher 3 on PS4 operate at the equivalent of PC's medium to lower settings, producing some obvious pop-in for shadows and foliage. The console's substantial reserve of fast GDDR5 memory only goes so far in solving this; the rest comes down to smart control of LODs from the developers.

The challenge facing Codemasters is immense: with F1 2015's debut on PS4 and Xbox One, the game needs to provide a generational leap over its predecessors. Our early impressions revealed substantial improvements in terms of graphics, physics and handling, but the game also falls short in some areas - the developers clearly faced hard choices in balancing new features against a consistent level of performance. As we've previously discussed, frame-rates are an obvious sticking point on console, with tearing and judder demonstrating that the studio couldn't quite hit its 60fps target, though the impact on actual playability isn't particularly severe in most cases.

But if there's a sense that the console versions aren't quite as smooth and as fluid as they should have been, the introduction of the PC version adds an interesting variable to the mix. Not only is the horsepower available to hit a solid 60fps, but the potential is there to deliver more substantial graphical enhancements too. Basic resolution naturally comes down to user selection, but as always we choose 1080p to match the current-gen console standard, ramping up all settings to the max. The end result is additional refinement over console as opposed to any revelatory difference.

There are more similarities than differences - for example, anti-aliasing duties are carried out via the popular SMAA post-process technique, though Codemasters has added an additional TAA temporal component, where information from previously rendered frames is used to smooth off the current render. Native 1080p resolution is present on the PS4, while the Xbox One game presents at 1440x1080, with a light horizontal upscale in play. Anti-aliasing appears to share a similar SMAA profile to the PC release, including the additional temporal component.

With AMD's Radeon R9 Fury X, the red team promised us the fastest single-chip GPU on the market, but the final product didn't quite live up to the hype. An intriguing cooling solution, a small form-factor and state-of-the-art memory tech gave Fury X some unique properties, but Nvidia had outmanoeuvred AMD on overall performance with its GeForce GTX 980 Ti. However, the new cut-down, air-cooled R9 Fury (the non-X edition, if you will) is an interesting proposition: inevitably, it will be slower, but at the mooted $550 sticker price, Nvidia has no real alternative. AMD is marketing this as a GTX 980 beater - more expensive, but faster overall. Of course, the reality is a little more complex than that, but the strategy is sound: AMD wants to carve out and own a new niche for the Fury.

At its core, the air-cooled R9 Fury is indeed a cut-down version of the existing Fury X. At an architectural level, it's the same Fiji chip at the heart of the design, but there's a 12.5 per cent drop in shader count - 4096 stream processors become 3584 - while clock-speed drops from Fury X's 1050MHz to a round 1000MHz on the new card. Texture mapping units are reduced from 256 to 224, but otherwise this pared back Fiji Pro chip is much the same as the top-end Fiji XT found in Fury X.
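Taken together, those cuts leave the two cards closer than the raw shader count alone suggests. Peak single-precision throughput on GCN is shaders multiplied by two FMA operations per clock, multiplied by clock speed - a back-of-the-envelope sketch using the published figures above (theoretical peaks, not measured performance):

```python
def gcn_tflops(shaders: int, clock_mhz: int) -> float:
    """Peak FP32 throughput for a GCN GPU: shaders x 2 ops/clock (FMA) x clock."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

fury_x = gcn_tflops(4096, 1050)  # Fiji XT: ~8.6 TFLOPS
fury   = gcn_tflops(3584, 1000)  # Fiji Pro: ~7.2 TFLOPS
print(f"Fury X: {fury_x:.2f} TFLOPS, Fury: {fury:.2f} TFLOPS "
      f"({100 * (1 - fury / fury_x):.0f}% lower peak throughput)")
```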

However, in purely physical terms, the new Fury is an entirely different proposition. The short, 7.5-inch PCB of the Fury X is gone, replaced by a full-size board combined with a top-tier cooling solution. The Fiji processor is large, and while it's not the hottest, most power-hungry chip in AMD's line-up (that honour goes to the overclocked Hawaii found in the Radeon R9 390X), it's clear that the firm is taking no chances in making sure that Fiji is kept cool. The closed-loop water-cooler is gone, but the alternatives - premium-level heat-sinks and fans from Sapphire and Asus - are still effective in making sure that we don't find ourselves facing another overheating Hawaii situation. Throughout our testing, we never saw the Fury shift under load from its target 1000MHz boost clock, so there's no thermal throttling here, and Asus' DirectCU 3 cooler is beautifully quiet - even under full load (and there's no 'coil whine' either - though in point of fact, Asus tells us that this noise tends to come from the onboard chokes).

UPDATE 11/7/15 2:00pm: A closer look at F1 2015 suggests that the Xbox One version of F1 2015 is actually running at 1440x1080, not the 900p the developer has previously suggested. It's an extra eight per cent of resolution over 900p, but more importantly, it limits upscaling artefacts to the horizontal axis only, in part explaining why image quality looks better than expected. We've also revised our comments on texture filtering as the situation here is a little more complex than previously thought.
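The resolution figures are easy to verify: 1440x1080 carries eight per cent more pixels than 1600x900, and because the vertical resolution already matches the 1080p output, any upscaling blur is confined to the horizontal axis. A quick check of the arithmetic:

```python
def pixel_delta(w1: int, h1: int, w2: int, h2: int) -> float:
    """Percentage change in pixel count going from (w1 x h1) to (w2 x h2)."""
    return 100 * (w2 * h2 - w1 * h1) / (w1 * h1)

# 900p vs the revised 1440x1080 figure, and the remaining gap to full 1080p.
print(f"1600x900 -> 1440x1080: +{pixel_delta(1600, 900, 1440, 1080):.0f}% pixels")
print(f"1440x1080 -> 1920x1080: +{pixel_delta(1440, 1080, 1920, 1080):.1f}% pixels")
```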

Original story: F1 2015 is an enticing prospect for fans of Codemasters' racing series. Built using a reworked version of its existing Ego engine, this year's instalment is the first driving game from the studio that targets 60fps since the Colin McRae Rally titles on PS2 and the original Xbox. PS3 and Xbox One saw a prevalence of 30fps caps, but with the new wave of consoles there's a sense that things are starting to change - that there's more of a focus on 60fps gameplay in genres that particularly stand to benefit from smoother, more responsive action.

We'll be taking a closer look at F1 2015 in our upcoming Face-Off (we only received PC code yesterday), but an early look at the game on PS4 and Xbox One reveals some good intentions let down by clear technical shortcomings in the race to achieve the desired 60fps. First impressions start positively enough, as F1 2015 comfortably delivers a significant graphical upgrade over last-gen instalments in the series: trackside scenery is rich with minor detail, while the cars are meticulously modelled right down to the joints that hold the suspension rods in place. F1 2015 uses a mixture of different texture filtering techniques for varying elements in the presentation - sometimes we see a near flawless blend in the overall presentation, but other times the overall effect is not quite so impressive. It seems that some elements in the scene use 4x or 8x anisotropic filtering, while the road appears to use inexpensive trilinear filtering - which is fine as it's mostly a blur in motion.

AMD has lifted the lid on the cheaper, cut-down, air-cooled version of its top-end R9 Fury X - and we've had a day of hands-on time with the new product. Our initial thoughts? Some of the cutbacks may look significant and the small form-factor of the top-tier product is gone, but the new non-X Fury offers an accomplished level of performance at ultra-HD resolution. On top of that, initial overclocking results paint an interesting picture of Fury performance when compared to the full-fat product.

As far as we're aware, only two different Fury boards are available at launch, with key AMD partners Sapphire and Asus offering up designs based on their top-end tri-fan coolers. Our test board comes from the latter, using its new DirectCU 3 technology - and it's a beautiful piece of kit, more than up to the task of cooling the large Fiji processor. We noted max temperatures in the mid-70s Celsius, even when overclocked - though this was on an open air test bed.

In terms of the Fury itself, the inevitable comparisons with the top-end X model reveal a drop from 4096 to 3584 shaders (a loss of 12.5 per cent in shading resources) while texture units drop from 256 to 224. Base clock speeds drop from 1050MHz to 1000MHz. Otherwise, the spec remains static - the all-important next-gen HBM memory remains in effect at the full 4GB, still running at 500MHz.

AMD's 300 series graphics cards aren't getting a lot of love from the enthusiast community right now and on the face of it, it's not difficult to see why - each and every one of the products is a retooled version of existing processor designs, and the only actual new technology coming from AMD is the new Fury line of GPUs. But as the 'Rebrandeon' controversy continues, one thing is clear - regardless of whether the GPU itself is old or new, the performance is good and the fact that it now ships with 4GB of RAM is absolutely crucial.

Sitting in the £150-£180 price tier, this market sector is one of the most important for both Nvidia and AMD. While the halo products like Fury X and GTX 980 Ti command the headlines, it's cards like the R9 380 and the GTX 960 that actually shift volume. It's also an area where AMD can make hay: Nvidia has just one product in this sector - and right now, we stand by our assertion that the GTX 960 is a card that's "good, but not great".

With the Radeon R9 380, what we're dealing with is effectively a slightly retooled version of the R9 285 we reviewed last September. There's some evidence to suggest a performance improvement between the two cards in like-for-like testing scenarios, but right now it seems clear that the biggest leaps in performance we see in our testing come from a better driver, which is currently only functional on older cards via unofficial hacks - though a unified driver must surely be coming sooner rather than later [UPDATE 9/7/15 10:03am: AMD has just released Catalyst 15.7 which appears to bring all the 300 series optimisations to 200 series owners]. The key issue here is that AMD is selling the R9 380 at a premium over the older 285, which has been on sale for as low as £120 across the last few months. To all intents and purposes, it could be suggested that with the 2GB R9 380, AMD has hiked prices for no substantial improvement in hardware performance - and that rankles.

Almost two years into the PS4's lifecycle, Sony has updated the console with a substantial revision that is not only cheaper to produce, but is also quieter and considerably more power-efficient. The new CUH-1200 revision is currently available only in Japan so unfortunately we can't test it directly, but much of the analysis we'd like to carry out has already been done by the Pocket News blog, and the results are fascinating.

Let's talk about the physical make-up of the machine first. The changes extend way beyond the removal of the glossy plastics and a re-arrangement of the existing rear ports on the unit's exterior. On the inside, there's a new, smaller motherboard with a series of changes. Taking centre-stage is a reconfiguration of the GDDR5 memory set-up. Earlier incarnations of the PS4 used a considerable 16 memory modules to provide the 8GB complement - the new CUH-1200 makes use of double-density Samsung modules to halve that to just eight, which should reduce energy consumption significantly.
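The module arithmetic behind that change is straightforward - assuming the launch unit used standard 4Gbit (512MB) GDDR5 parts and the CUH-1200 moved to double-density 8Gbit (1GB) modules, the same 8GB total falls out of half the chip count. A quick sketch:

```python
def modules_needed(total_gb: int, density_gbit: int) -> int:
    """DRAM chips needed for a given total capacity, at a per-chip density in gigabits."""
    total_gbit = total_gb * 8  # convert gigabytes to gigabits
    return total_gbit // density_gbit

print(f"Launch PS4: {modules_needed(8, 4)} x 4Gbit modules")  # 16 chips
print(f"CUH-1200:   {modules_needed(8, 8)} x 8Gbit modules")  # 8 chips
```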

The main processor within PS4 has received a new designation, perhaps suggesting some kind of change to its design, but its physical dimensions remain the same, confirming that it is still a 28nm chip. There remains the possibility that Sony may have moved onto a more efficient iteration of the 28nm process, but our gut feeling is that it's still the same chip at its core.

With a show of work-in-progress gameplay footage - as captured from PlayStation 4 - Star Wars Battlefront's E3 demo was a genuine highlight of this year's event. Unlike its earlier April reveal (set to a woodland battle on Endor), it's also a far more realistic take on what to expect of PS4's multiplayer; a continuous run of on-foot and vehicular action around the snowy tundra of Hoth. But for all its flair, does it stack up to the game's impressive earlier reveal - and how is performance shaping up?

Before we go on, it's worth stressing that the "pre-alpha" PS4 gameplay we have here is likely to improve. There's still several months of development time left until Star Wars Battlefront's 20th November release in Europe - enough time to tweak its visuals and optimise performance further. Even so, our current analysis paints a fascinating picture of its progress so far, for the first time showing how DICE's series reboot actually plays on Sony's hardware.

First and foremost, the E3 Walker Assault footage is a genuine, practical look at console gameplay in motion. The PS4 capture is largely unedited, providing long stretches of action around the icy Hoth map, and letting us in on frame-rate and resolution details ahead of launch. Right away, an early pixel-count shows the game matches the basic framebuffer set-up of Battlefield 4 and Hardline, with the evidence suggesting this PS4 build runs at a native 1600x900 with a pass of post-process anti-aliasing.

Echoing the reveal of its predecessor three years ago, Microsoft kicked off its E3 media briefing this year with a first look at Halo 5: Guardians' single-player campaign. A new approach to mission design and a host of new characters combine with a fresh coat of paint to present something that feels both new and familiar. Last December's multiplayer beta already highlighted a number of changes to the core formula, but it's only with this new demo that we're able to fully appreciate this new direction and to get a real idea of the full scope of 343's vision for its first true 'next-gen' Halo.

Many of the key improvements seen in Halo 5 are the result of its redesigned technical underpinnings. This is only the second time in series history that Halo has made the jump to a new, more powerful platform, and 343's rebuilt engine sets its sights on ambitious targets. With a renewed focus on lighting, effects work and higher performance, the new tech promises a substantial evolution over Halo 4. A new materials system, making use of physically-based rendering, helps bring newfound realism to the world of Halo - and its key benefits are evident in this demo. Metal takes on a realistic sheen while more diffuse materials sit naturally within the world. While we spotted a fair number of low resolution textures, there's still a strong sense that 343 is moving in the right direction.

Of course, the real game-changer here is performance. For the first time, Halo 5 is built from the ground up with 60fps gameplay in mind. However, based on the E3 showing, whether 343 can hit that target with an acceptable level of consistency remains an open question. While last year's multiplayer beta turned in relatively solid frame-rates, the conference campaign footage raises some concerns. Based on the demo, it seems that the target is a 'perceptual' 60fps experience more in line with the Call of Duty titles: while the target remains 60fps, performance often dips below it during action sequences.

4th July 2015: What does it take to run Arkham Knight smoothly on PC?

Is it actually possible to run the current, hobbled PC version of Batman: Arkham Knight at 60fps at any resolution? Indeed, is it actually capable of matching the 30fps performance profile of the console versions without investing a small fortune in hardware? Rocksteady and PC port developer Iron Galaxy are currently working on substantially improving the lacklustre performance, and the game itself is currently withdrawn from sale. But for those of us lumbered with the existing code, what can be done to get a decent experience?

The community is doing its part to improve matters, of course. This particularly impressive deep dive into the .ini variables offers up some improvements, but we still measure dips below 40fps in problematic areas, producing a sub-optimal experience that doesn't match the sheer consistency of the console versions. For those that aren't so sensitive to the stutter we experienced, Kaldaien's tweaks could serve you well, but it's fair to assume at this point that if the game could be fixed by replacing a bunch of .ini files, the developer would probably have done so. After all, easy fixes - like restoring the rain effects and ambient occlusion missing in the original release - were deployed in a patch released on June 27th.

Arkham Knight has severe problems on PC, which appear to derive from a sub-optimal approach to memory management. On console, developers have 5GB of memory that can be used at will for game and graphics. On PC, memory is split - divided between system RAM and your GPU's VRAM. The evidence suggests that the game struggles to effectively stream fast enough from one pool of RAM to the other, and clearly struggles with graphics cards with 2GB of memory or lower (spectacularly so when it comes to certain AMD cards, as you'll see later). On top of that, there are other issues: the CPU requirement is quite high despite relatively low measured utilisation, and transparent textures - smoke, explosions etc - inflict a heavier impact on GPU resources than we would expect.

We're halfway through Capcom's 'Year of the Remaster' - Devil May Cry 4: Special Edition is the third of five revisitations of older titles due this year, with Resident Evil Zero and the Mega Man Legacy Collection still to come. As one of the earlier games created using Capcom's MT Framework engine, the original DMC4 was quite a looker back in early 2008. Now, more than seven years later, do those once cutting-edge visuals still make the grade on newer hardware and have any new issues appeared in the transition? Given Capcom's somewhat spotty remaster track record, we tackled this latest release with some trepidation.

Perhaps the most important element in any Devil May Cry title is its performance. Ask any hardcore fan of the series and they'll tell you how critical a fast frame-rate is for high level play - something Ninja Theory was blasted for in the last generation with its 30fps take on the franchise. High-level play demands fast input response and an unwavering frame-rate. Thankfully, in our testing, both PS4 and Xbox One turned in excellent performance overall, delivering a smooth, stable 60fps for the vast majority of the duration. In fact, frame-rate on the whole is actually more consistent here than it was in its original console incarnation.

Unfortunately, for those interested in ramping up the experience to the next level, there is a caveat - there are slight performance issues on Xbox One when playing Legendary Dark Knight mode. This challenging gameplay mode, previously introduced in the original PC version of the game, throws a much larger number of foes at the player. More than any other mode, this setting demands a perfect frame-rate - something that the Xbox One can't quite provide in all instances. Essentially, we're looking at minor dips into the mid-50s - just enough to interrupt fluidity. By comparison, these same sequences play without issue on PlayStation 4.

We originally published this article on the PlayStation 4 PlanetSide 2 beta on April 26th, and were all set to produce a new piece for the game's launch. However, once the EU servers came online, we found that our thoughts on the game - along with its overall performance profile - remain largely unchanged. That being the case, we're re-publishing our original piece for those that may have missed it originally.

Just how many players is too many for a console first-person shooter? After three years of sole residency on PC, PlanetSide 2 asks that very question as the PlayStation 4 beta rages on this month. It's an affair that has thousands of players vie for supremacy on a single server at once - an unprecedented figure for any console game. But given PS4's struggle to hold 60fps even in 64-player titles like Battlefield Hardline, how has this been made possible? And indeed, is everything intact in the transition from PC to console?

First up, the basics. From a technical perspective, developer Daybreak Game Company (previously Sony Online Entertainment) is targeting a native 1920x1080 framebuffer and a capped 30fps update for the final PS4 release. An initial look at the beta right now confirms this full HD presentation is in effect, giving us the sharpest base image we could hope for. However, it uses a similar post-process AA method to the PC version; an effect that blurs over its artwork to a degree, while sub-pixel coverage is at times spotty on fine details. Even PC struggles in this area, with no alternative methods offered directly via its display menu - meaning overall image quality is extremely close between the two in practice.

Despite the drama surrounding the PC release leading to its subsequent withdrawal, there's a sense of success in the console space as PlayStation 4 owners, and indeed those on Xbox One, get a superb rendition of Batman: Arkham Knight. It's fair to say Rocksteady sized up each console's strengths well ahead of producing its first current-gen title, and it's paid off in one of the best Batman games we've seen in years. But in playing the game this week, the evidence strongly suggests that Unreal Engine 3's impressive Samaritan tech demo in 2011 paved the way for many of the game's crowning technical achievements.

From the city's crisp neon reflections and bokeh-dotted backdrops to the colour-shaded rain and smoke, the Samaritan teaser didn't just bring a spec feature-list, but a pretty close match for what would become Arkham Knight's final aesthetic. It's an approach that likely struck a chord at Rocksteady at the time, a team that in the same year had just wrapped up development of Arkham City and looked towards its next venture - seemingly too early to catch Unreal Engine 4's wave. Looking at the demo and final game side-by-side, the end result is uncanny in its similarities, especially in the use of lighting effects, point-light reflections, and the integration of Nvidia's Apex tech for cloth simulation across its characters.

Of course, Rocksteady adds much more to the equation on its own. The scale of Gotham City is unlike anything we've seen on the engine, as well as the procedural method to enemy encounters - cut-scenes that dynamically weave into play as you traverse the city (often inviting you to a new side-quest). The seamless nature of the animation system, with its single camera swoops to and from the Batmobile cockpit, also deserves huge credit. An absence of loading screens also sets it apart from last-gen hardware, where RAM proved a limitation in streaming open-world environments - particularly as dense as this rendition of Gotham.

AMD is leading us into a new, exciting era of graphics technology - where ultra-fast memory is connected directly to the core, enabling higher performance, enhanced power efficiency and a new wave of small form-factor graphics cards. The Radeon R9 Fury X is the first GPU to arrive boasting this cutting-edge tech, with AMD telling us that it is the fastest single-chip GPU on the market, a title currently held by Nvidia's mammoth Titan X 12GB. Well, the reality is that the Fury X is a fascinating first-gen product with plenty of positives, but in terms of raw performance, both Nvidia's Titan X and its cut-down GTX 980 Ti are generally faster and more versatile for the high-end enthusiast market.

As always, performance is king, so AMD's inability to be comprehensively competitive with Nvidia's GM200 across the length and breadth of our benchmarks is a little disappointing - but certainly in terms of the physical package, it's great to see that the poor reference cooling design of the 200 series is now a thing of the past. The Fury X is built from quality materials that look good and even feel good, and the dinky, compact nature of the 7.5-inch board is quite remarkable - it's a marvel of integration. The work AMD carried out on the Radeon R9 295X2's reference water cooler is carried over and refined on Fury X, which also has its own closed-loop set-up that is significantly quieter than Nvidia's reference coolers, though it is accompanied by a continuous, consistent, high-pitched tone - presumably emanating from the pump. It was a little bothersome on the test bench, but will hopefully be less of an issue when the card is installed deep within a decent case.

The Fiji processor in the R9 Fury X is based on AMD's third generation GCN architecture, previously found in the R9 285/380 and codenamed Tonga. Doubling up on stream processors brings the shader count up to a gargantuan 4096, up from the 2816 found in the R9 290X/390X. This core is then combined with the ultra-fast, ultra-wide HBM RAM.

It's unusual to see PC multi-platform titles failing to match up to their console equivalents - Xbox One and PS4 are based on PC technology, after all - but in Batman: Arkham Knight we have a rare example. Having tested the PC game on an Intel Core i7 3770K machine with 16GB of memory and a GTX 780 Ti, a solid all-round experience should be within easy reach. In reality, performance levels are poor with this setup, and to rub salt into the wound, the PC's top tier settings miss out on visual effects found in the PlayStation 4 release.

Let's start with performance - the big bone of contention with the game at launch. Out of the box, the game is capped to 30fps - a number that hides lurches in performance we see on this machine, but compared to the unlocked frame-rate in the last two games it's a definite shortcoming. However, by navigating to Arkham Knight's config folder, it's easy enough to change this to 60 or beyond via the 'BmSystemSettings.ini' file. In our case, we chose 120 to see the state of the game's full performance profile - and it isn't pretty.
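For reference, the relevant entry looks something like the snippet below. The exact key name is drawn from community documentation of the shipped config files rather than anything official, so treat it as an assumption - and back up the file before editing:

```ini
; BmSystemSettings.ini - frame-rate cap (key name assumed from community reports)
MaxFPS=30.000000  ; raise to 60.000000, 120.000000 etc to lift the cap
```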

Even at low settings, holding 60fps on the 780 Ti is out of the question. With Nvidia's 353.30 driver (apparently optimised for Arkham Knight) installed, our setup suffers from hiccups and lurches to as low as 38fps as we glide across Gotham or drive in the Batmobile. Frame-times are highly variable, reaching a nadir of 410ms for a single frame as the game crosses a checkpoint. Amazingly, dropping to the worst quality textures, shadows and level of detail (while turning off anti-aliasing, v-sync and all Nvidia Gameworks features) still has us dropping to 40fps and lower while at 1080p.
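As an aside, this is roughly how captured frame-times boil down into the headline numbers we quote - the sample values below are illustrative, not our actual capture data:

```python
# Summarise a list of per-frame render times (milliseconds) into the two
# figures we care about: average frame-rate and the single worst frame.
def summarise(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    return {
        "avg_fps": round(sum(fps) / len(fps), 1),
        "worst_frame_ms": max(frame_times_ms),
    }

# A near-60fps run interrupted by a single 410ms checkpoint spike:
sample = [16.7] * 58 + [25.0, 410.0]
print(summarise(sample))
```

The key point the average hides is that one 410ms frame - nearly half a second where nothing updates on-screen - which is why we always report worst-case frame-times alongside averages.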

The launch of Batman: Arkham Knight may have been blighted by the arrival of the PC train-wreck, but let's be clear - Rocksteady's console game is a slickly presented finale to the saga that should not be overlooked. Based on early impressions, the game has already proven itself in the performance stakes, and that solid frame-rate and stability is backed up by a more ambitious approach to the open world gameplay pioneered in 2011's Arkham City.

Towering above all other changes is the scale of Gotham City. At an estimated five times the size of the last game, Arkham Knight demands a rendering range not seen on older platforms. PS4 capably delivers on this too, with only minor texture pop-in when zipping between buildings - using Batman's extended range grappling hook, or driving at pace in the Batmobile. Draw distances are convincingly broad, and it's even possible to see lamplights from across the city, giving the urban sprawl a real sense of consistency as you leap from roof to roof.

Comparisons with the PS3 edition of the last game are perhaps unfair; even next to the PC release of Arkham City, the world design here is a huge step ahead in overall complexity. Gotham City arches and bends in a more organic way, from the bright pagoda gates of the Chinatown district to the neon sleaze of its downtown area - right up to the art deco spires of the city's skyscrapers. Every building is distinct thanks to a boosted polygon count across the board, eschewing the rigid layout of Arkham City for a more chaotic sprawl. And crucially, on PS4 there are no loading screens whatsoever - it's all one seamless experience from start to finish.

Packing in all of the post-launch content added to the PC game over the past 12 months, Payday 2: Crimewave edition on PS4 and Xbox One has the potential to be the definitive version of the game. From a visual perspective, Overkill Software promise native 1080p resolution alongside upgraded graphics and higher frame-rates - the latter presumably in comparison to the existing 360 and PS3 ports. With that said, how does the Crimewave Edition of Payday 2 stack up against the PC game? Are we looking at any clear graphical improvements specific to these latest console ports, or simply a straight up conversion that delivers a nice upgrade over the existing console editions? More to the point, in an age where broken online games routinely hit the market, does Payday 2 present a robust, playable experience? Spoilers: Xbox One has severe problems right now.

However, in rendering terms at least, Overkill has delivered. Both console versions deliver a native 1080p framebuffer with all the benefits this brings over the game running on last-gen hardware. Image quality matches up exactly with the PC version, appearing fairly crisp, though quite unrefined - stair-stepping artefacts are clearly visible across long edges, and there are plenty of jaggies that frequently shimmer across the scene. A high level of post-processing smoothens over the presentation to a degree, with depth of field and chromatic aberration key in emphasising this effect. Initially, a closer look at edges on a pixel level suggests that some kind of rudimentary post-process anti-aliasing technique is in play. However, the PC game lacks anti-aliasing options in the video settings menu, and makes no mention of any edge smoothing modes in the game's render_settings.xml file, suggesting that any coverage we are seeing across all three platforms is perhaps just a side effect of the heavy post-processing in play.

Talk of graphical upgrades over the previous editions of Payday 2 sounds rather enticing, with higher resolution textures and improved frame-rates touted as main selling points of the new Crimewave Edition. However, as you've probably guessed, that's judged on last-gen console terms. As things stand, the PS4 and Xbox One releases look very similar indeed to the PC game running at its highest preset: the same core assets are deployed across all three formats, with texture quality, shadow resolution, reflections, and alpha effects all matching up nicely, though there are several differences in the way some of these elements are handled between platforms.

On paper, the task facing AMD's engineers must have looked formidable. Having jostled for supremacy with Nvidia's GTX 780 Ti and Titan, the firm was clearly competitive with its Radeon R9 290X - until the arrival of GTX 970 and GTX 980, both of which outperformed the best that the red team had to offer, and did so with bags of overclocking headroom to spare. In producing the Radeon 300 series, AMD had to match or beat Nvidia's excellent performer - and not only that, it had to do it using existing silicon. We had doubts that it would be possible, but as the benchmarks rolled in, the bottom line became clear: AMD has done it.

It's a remarkable achievement bearing in mind that all the evidence points towards the R9 390X being little more than an overclocked version of the outgoing 290X. AMD dubs the 300 series version of the chip 'Grenada', but really and truly, a straight head-to-head strongly suggests it is the same as the existing Hawaii, simply with a 50MHz overclock - upped still further to 100MHz in the MSI Gaming version of the card we're reviewing here. Now, there've been overclocked 290Xs before, but none of them have troubled the GTX 980 - something else must have changed to produce the impressive performance we're looking at here.

That would appear to come down to memory bandwidth. The R9 390X features 6000MHz GDDR5, a 20 per cent improvement in terms of throughput over the 290X (and overclocked still further on this MSI board to 6100MHz) - the result of AMD shifting to a much faster, more capable Hynix memory module. And as a bonus, the firm has decided to double the allocation of VRAM to a mammoth 8GB. Superfluous? Right now, yes. But in a market where many console ports now require 3GB to service 1080p, who knows what might happen in the months and years to come - especially when cards like the R9 390X are built to service memory-hungry resolutions much higher than standard full HD?

A month on from Project Cars' original release, update 1.04 brings a number of enhancements to both PlayStation 4 and Xbox One that may surprise. Weighing in at 500mb, the update addresses a few of the qualms we had on PS4 specifically - such as the temporal anti-aliasing technique that caused ghosting behind moving objects that wasn't present on Xbox One. But above and beyond that, it's a solid, all-round performance boost for both consoles too, introducing new visual features previously only seen on PC.

To rein in our expectations, races involving rain and 20 cars or more still give us a sub-60fps experience with tearing - but it is improved. Based on Slightly Mad Studios' patch notes, patch 1.04 sets out to optimise for each console's CPU, and comes equipped with a broadly reworked shader compiler. In effect, this claws back a maximum of 5fps during our Imola and Azure Coast stress-tests on both PS4 and Xbox One. That said, the gain isn't consistent - in some areas frame-rates can even dip below the earlier results - but it's a definite improvement on aggregate.

Comparing PS4 and Xbox One still puts Sony's platform ahead though. Even with upgrades to both platforms, the margin is still as wide as 12fps in favour of PS4 at points, and rarely does the twain meet in terms of the frame-rate read-outs between the two. If the PC version isn't an option for those eager to try Project Cars' ambitious racing sim, PS4 remains the best performer in the console space.

When Microsoft announced that backwards compatibility would be hitting the Xbox One this year our collective jaws hit the floor. Even factoring out the effort involved in mapping the Xbox 360's GPU functions to entirely different Xbox One hardware, the idea of translating the 360's tri-core 3.2GHz PowerPC CPU to six low-power x86 Jaguar cores seems like a herculean task - yet somehow the engineering team has delivered. It's not perfect in several cases, but the fact that the virtualisation works at all is a supremely impressive achievement.

Unlike the spotty backwards compatibility available on Xbox 360, which required a custom wrapper for each individual game, Microsoft has taken a more extensive approach through the use of a virtual machine that runs on the Xbox One as a game in and of itself. This virtual environment includes the Xbox 360 OS features, though they remain unavailable to the user, enabling the software to behave as if it is running on original hardware. The Xbox One then views this "Xbox 360" app as its own game allowing features such as screenshots and video sharing. The emulator supports both digital downloads and original DVDs, though discs simply act as a key, the core data downloading over the internet via Xbox Live.

As part of the preview program, backwards compatibility is still in development and currently only supports a small list of titles. Of the games available, there are reports that only North American and region free discs currently function with Xbox One, regardless of the origins of the console (we used US discs for this piece). Furthermore, Microsoft is still working on a solution to enable support for multi-disc games - something that isn't currently working. While the majority of the initial titles are relatively basic and not much of a work-out for the VM, there are a few more demanding titles on tap including Mass Effect and Perfect Dark Zero. We sat down and put some time into several of these games, the idea being to stress-test the virtual machine's capabilities and to ascertain its strengths and weaknesses.

It's a sad fact that this generation's big releases are often playing catch-up on their promises post-launch - a "release now, fix later" mentality that developer Rocksteady thankfully doesn't appear to subscribe to in its excellent Batman Arkham Knight. Having played the PlayStation 4 review code extensively, we're pleased to see the game is set to launch in a very refined, polished state. As the finale to the Arkham saga, it's a superb production, but crucially it also turns in a slick, stable playing experience with solid performance on day one.

A full analysis is under way, but in the meantime we can offer a quick, spoiler-free taster of in-game performance based on the PlayStation 4 version of the game (naturally, we'll cover all three versions next week). Full disclosure: footage here runs from a Sony debug unit, using code that Rocksteady is clearly comfortable handing to critics ahead of launch. As such, you'll likely spot a familiar Eurogamer name (amongst other production info) watermarked into our captures - Warner Bros was unable to provide clean code ahead of its embargo. Regardless, the analysis still paints a solid picture of what you can expect on release in a few days.

First off, in line with most open-world games this generation Arkham Knight is capped at 30fps, in this case backed by an adaptive v-sync. This works out nicely in practice: unlike the last-gen iterations of the series that used a similar setup, dips below 30fps on PS4 are relatively uncommon across the run of play. In effect, this means tearing only creeps in at exceptional points, such as heavy interplay with physics while driving the Batmobile. Thankfully screen-tear is also difficult to catch by eye even when it does flare up, owing to Gotham's darker colour palette.

AMD has revealed its new top-tier performance graphics line - Radeon Fury. Three different cards based on its new silicon, codenamed Fiji, have been revealed - led by the ultra high-end watercooled Fury X, due for release before the end of the month. Later on in the summer we can look forward to a slower, cheaper, air-cooled Fury along with Fury Nano - a small form factor iteration.

AMD's Fiji processor is effectively a much, much larger version of its existing 'Tonga' technology, as found in the Radeon R9 285 and the 5K iMac. It sports a mammoth 4,096 shader cores (a 45 per cent increase over its prior flagship, the Radeon R9 290X), along with 4GB of HBM - AMD's revolutionary, high bandwidth memory that sits alongside the GPU core, offering enormous improvements in latency as well as throughput.
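The quoted increase is easy to verify from the shader counts alone:

```python
# Verifying the quoted 45 per cent increase in shader cores over Hawaii.
fiji_shaders, hawaii_shaders = 4096, 2816
increase = round((fiji_shaders / hawaii_shaders - 1) * 100)
print(increase)  # 45
```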

Closely integrating memory with the graphics engine offers many advantages. In theory, memory bandwidth should be removed as a limitation (AMD tells us that there is no point attempting to overclock HBM for additional performance, indeed the option may be removed entirely) while the physical form factor of the board itself is significantly reduced, allowing for Fiji-based products like the upcoming Nano to fit into much smaller PCs.

It's an enticing proposition. Microsoft has remastered Gears of War for Xbox One and PC in a new Ultimate Edition featuring massively improved visuals and a smooth 60 frames per second - in multiplayer at least. Following Monday's E3 conference, beta codes for the Xbox One version began to roll out, and luckily enough, we were one of the first to receive them.

Perhaps unsurprisingly, the Ultimate Edition is utilising a mature version of Unreal Engine 3 rather than the new UE4 engine powering Gears 4. The groundbreaking original release was one of the very first UE3 titles on the market almost ten years ago, so in theory moving to the latest iteration of the engine brings a lot to the table (the extent of which should be more pronounced in the single-player campaign). In addition, this isn't a simple repurposing of assets either, but rather a full-on remake using an entirely new set of textures, models, and effects. While the transformation isn't as dramatic as Halo 2: Anniversary, the changes are definitely more significant than we expected.

The basics are all there as well - a full 1920x1080 frame-buffer with a smattering of FXAA for good measure. UE3 titles for Xbox 360 were known to use 2x MSAA fairly early in the rendering cycle with spotty results but, even if that is still the case on Xbox One, the end results are nearly imperceptible through the veil of post-process anti-aliasing. There is a fair amount of shimmering on objects along with typical post-AA blur, but overall image quality is attractive, and a huge boost over the original Xbox 360 game. That said, upon revisiting the original, we were surprised at how nicely the image quality manages to hold up, even at 720p.

After a rocky start on PC last year, The Elder Scrolls Online finally comes to PS4 and Xbox One, six months later than expected. Launching with a number of netcode-related issues that saw players unable to log in and stuck waiting in a server queue for up to several hours, it's fair to say the release hasn't gone as smoothly as hoped. Indeed, many bugs and gameplay issues are still being addressed, despite the emergence of a 15GB day one patch. In particular, the netcode causes regular interruptions in performance across all platforms, compromising overall stability during gameplay. On the plus side, the new versions of Elder Scrolls Online deliver a new menu system built around regular gamepad control, so this is clearly not a straight port of the PC version - some thought has gone into redesigning gameplay for the console audience.

Both PS4 and Xbox One hand in native 1080p presentations, with similar anti-aliasing techniques in place. The Elder Scrolls Online generally takes on a fairly clean look with smooth edges and little in the way of obtrusive jaggies; sub-pixel shimmer is kept under control, though smaller details still break up when viewed from a distance. The type of anti-aliasing isn't listed in the settings menu on the PC game, but it's clearly a post-process solution. It's not bad either - texture blur is minimal with just a light reduction in clarity compared to the crystal clear look you get from traditional multi-sampling.

On first impression, the console versions of Elder Scrolls Online appear to align nicely with the PC release: the overall look doesn't seem unduly compromised in terms of texture work and overall detail, while the lighting and post-effects pipeline is a close match across platforms. Texture filtering is a sticking point on consoles though, with the lack of anisotropic filtering impacting clarity on flat surfaces, particularly across the ground, where the amount of detail resolved quickly tails off within a few feet of the player. Filtering isn't perfect on the PC either, with the game sporting what looks like something closer to 4x AF instead of the usual 8-16x filtering you'd expect to find - but it's enough to visibly reduce the texture blurring we see on both console versions.

Only available as a bundled bonus with Final Fantasy Type-0 HD, the Final Fantasy 15 demo impressed us with the scale of its technological ambition, but fell short in terms of performance. But that's OK for now - the game is still deep in development, after all. Now, in an unprecedented move, Square-Enix has seen fit to update this demo based on real feedback from fans around the world. It's a fascinating new approach to game development from the Japanese giant - and with version 2.0 of Final Fantasy 15 Episode Duscae comes a whole host of improvements and changes.

Most of the work is focused on gameplay, but changes to the underlying technology have been made too, resulting in a more polished experience. While still rough around the edges, this Luminous Studio 1.5-powered experience is shaping up as one of the most ambitious titles we've seen so far this generation. Combining a massive open world with full global illumination, advanced animation capabilities, a realistic physically-based materials system, and GPU accelerated particles is no small task. However, with the poor performance observed previously, the question is to what extent the developer's vision is achievable while still handing in an acceptably smooth gameplay experience.

Initial impressions are mostly positive thanks to a much faster, more responsive camera system and controls. Battles play out much more smoothly thanks to an improved lock-on system, faster blocking abilities, and a dodge roll. Unlike the original release, the camera now properly tracks locked-on enemies while simultaneously zooming out, making it much easier to keep track of your targets while surveying the battle at large. These simple tweaks make a tremendous difference in terms of playability, even when the frame-rate falters.

On the face of it, the notion of bringing the entire Uncharted trilogy from PS3 to PS4 - with 1080p60 upgrades to boot - should be relatively simple. After all, PlayStation 4 represents a generational leap in system capabilities, particularly in terms of raw GPU power. However, Naughty Dog's recent GDC talk - "Parallelising the Naughty Dog engine using fibres" - reveals in stark detail how difficult it was to bring The Last of Us across to the new Sony console. Indeed, the initial porting work for the game resulted in a sub-optimal experience operating at less than 10fps.

In many ways, the scale of the challenge with the upcoming Nathan Drake Collection is even more daunting. Three games are in development, not just one, and the original developer itself isn't carrying out the conversion work - instead, Austin-based studio Bluepoint Games is taking the conn. Adding to the difficulty factor, Sony has dropped hints that the three remasters will actually see tangible improvements over the original versions in the form of "better lighting, textures and models", along with a photo mode, plus other enhancements suggested by the community. Just about the only concession is the somewhat disappointing news that the multiplayer components of Uncharted 2 and its sequel will be removed.

Regardless, The Nathan Drake Collection is a highly ambitious project - and the evidence suggests that the efforts Naughty Dog made in bringing The Last of Us to PlayStation 4 form the technological foundation on which the remasters are based. So how did the studio turn that initial 10fps port into the slick, 60fps release we enjoyed last year? Based on Naughty Dog's GDC presentation, it seems that the developer was more limited by the CPU than the GPU. The studio leveraged the PS3's Cell chip extensively, in particular the six available SPU satellite processors. The original engine targeted a 30fps update, based on a single processing thread consisting of game logic followed by command buffer set-up (basically generating the instructions for the GPU). Most of the engine systems were hived off to the SPUs, with the main processor - Cell's PPU - running the majority of the actual gameplay code.

The PC graphics space is one of the most exciting areas of gaming technology right now. The battle between Nvidia and AMD remains as intense as ever, with each manufacturer doing their best to push the envelope with tangible year-on-year improvements in features and performance. Of all the components inside your PC, it's the GPU that is upgraded most often - reflecting the impressive boost in processing power we see from one generation to the next, meaning that it's really important to keep our graphics card upgrade guide regularly updated as each new product arrives and as volatile market conditions shift.

We first published this guide at the beginning of February this year. Since then we've seen the arrival of a new breed of graphics hardware, spearheaded by Nvidia's Titan X and GTX 980 Ti. We've also carried on benchmarking behind the scenes, testing out multi-GPU set-ups, and applying our range of CPUs and graphics cards to the latest games, offering quality setting recommendations to get the best gameplay experience. Much of that testing has been rolled back into this guide.

However, it's safe to say that while this guide gets increasingly large and dense, there's just as strong an argument for distilling our thoughts down into a simple series of recommendations - the best card for any particular budget - and you'll find that data below. We also intend to update these simple recommendations more frequently as prices shift. For example, at the time of writing, there's a clear move by specialist PC component sellers to shift stock of the Radeon R9 290. It may have been eclipsed by the GTX 970 somewhat, but at the £180-£200 price-points we've seen recently, it actually occupies its own price tier with no competition, offering remarkable performance for the money.

It's been nearly seven months since the troubled launch of Halo: The Master Chief Collection. November 11th 2014 marks the day that Halo fans everywhere were left out in the cold when the game hobbled out of the gate with serious functionality problems. While a number of other high profile releases shipped with nasty bugs, The Master Chief Collection is perhaps the most infamous of all. This is Halo we're talking about, after all. It's the most important franchise in Microsoft's stable. As good as the campaigns are, it's the series' multiplayer action that has kept fans playing for more than a decade now. Unfortunately, while the single-player modes were generally excellent, The Master Chief Collection was simply not fit for purpose when it came to online multiplayer.

It took a long time for the issues to be resolved, and 343 Industries attempted to smooth things over by offering early adopters a free copy of the Halo 3: ODST remaster as soon as it was ready to launch. Six months after that announcement, the promised remaster is finally available for The Master Chief Collection, and with its release comes a new map and another round of changes and improvements.

Knowing how much has changed since launch, we felt it was high time to check back in on the game and see just how far the collection has come. This leaves us with two real questions, then - firstly, how is ODST on the Xbox One and, secondly, have the release's larger problems been addressed? Diving right in then, there's no better place to start than the moody, jazz-filled streets of New Mombasa with Halo 3: ODST.

UPDATE June 4th, 9:00am: After just one week on the market, the PS4 version of Ultra Street Fighter 4 has received its first patch, bringing with it a host of improvements and changes designed to address complaints in the original release. Straight away, we noted that the menu system has been bumped up to a proper 60fps, making a world of difference during navigation. The patch notes mention changes to anisotropic filtering to "decrease blur" but it would appear this effect is applied only to specific stages. Namely, the training stage is now much cleaner, with higher quality texture filtering that almost completely eliminates the issue. Unfortunately, this does not appear to be a global change, as most other stages that we tested appear identical to our tests prior to the patch. Anti-aliasing also remains missing in action, while the low resolution transparency effects also appear unchanged.

A number of players are already putting the game through its paces and reporting a decrease in input latency. Running a quick trial on our own setup, there does appear to be a tangible decrease in latency across the board. Using the same technique detailed below, this updated version appears to reduce latency by one frame, or roughly 16ms.
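The arithmetic behind that one-frame estimate is simple enough to sketch. A minimal, illustrative conversion between frames and milliseconds at a given refresh rate (the function name is ours, not from any measurement tool):

```python
# Convert an input-lag measurement counted in display frames to
# milliseconds at a given refresh rate. At 60Hz, one frame works out
# to roughly 16.7ms - hence the article's "one frame, or roughly 16ms".
def frames_to_ms(frames, refresh_hz=60):
    return frames * 1000.0 / refresh_hz

print(frames_to_ms(1))      # one frame at 60Hz, ~16.7ms
print(frames_to_ms(3, 30))  # three frames at 30Hz: 100.0ms
```

The same conversion underpins most camera-based latency tests: count the frames between input and on-screen response, then multiply by the frame interval.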

The patch also addresses other issues such as disappearing projectiles, various UI issues, and Decapre's teleport animations. We can't say for sure if all of these issues have been fully corrected but we didn't encounter any missing Sonic Booms during our session, which is a good sign. That said, Decapre's Air Scramble still comes out at a different angle compared to the other versions of the game. We also noted minor issues with the audio mix with certain sound effects seemingly playing louder than they should.

We knew it was coming, of course - it was just the speed of its arrival that caught us off-guard. Nvidia's astonishing Titan X graphics card, based on the 8bn transistor GM200 processor, redefined the boundaries of single-GPU performance when we reviewed it back in March. Imagine the firm's previous flagship - the GTX 980 - combined with a mainstream GTX 960 in a single package, backed by a ridiculous 12GB of 7gbps GDDR5, and you have a remarkable technological achievement. The only problem was its price - $999. The new GTX 980 Ti is a mildly cut-back version of the same product, offering around 98 per cent of the raw performance at 65 per cent of the price.

That's not a bad outcome bearing in mind the extent of the compromises Nvidia has made. The 3072 cores of the Titan X are cut down by around eight per cent, with two shader clusters disabled to give a final tally of 2816 stream processors. By extension, we lose some texture mapping units too - dropping from 192 in Titan X to 176 in the GTX 980 Ti. However, the biggest cutback is perhaps the most inconsequential of all, bearing in mind the current gaming landscape. The mammoth 12GB of GDDR5 found in the top-end flagship is pared back to a still lavish, but more reasonable, 6GB. However, memory speeds remain the same, core and boost clocks are identical, and unlike the GTX 970, Nvidia's cutbacks have not come at the expense of ROPs or bandwidth - both are identical to Titan X. To be clear, this time there are no split-memory shenanigans.

Graphics cards are parallel by nature, distributing work across however many cores are available. That being the case, it seems a reasonable assumption that losing around eight per cent of the computational power should result in a similar drop in overall performance. However, raw processing power is just one element of the equation and in a card which is seemingly already hitting bottlenecks elsewhere within the system (be it the driver, the DX11 API or the CPU), the reality is that the GTX 980 Ti is virtually interchangeable with its much more expensive sibling.
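That naive scaling assumption can be expressed as a back-of-the-envelope calculation - a sketch using the published core counts, assuming identical clocks on both cards (real-world results land higher than this estimate precisely because other bottlenecks dominate):

```python
# Naive scaling model: performance proportional to stream processor
# count, clocks held equal. The GTX 980 Ti beats this estimate in
# practice because driver/API/CPU bottlenecks, not shader throughput,
# often set the ceiling.
TITAN_X_CORES = 3072
GTX_980_TI_CORES = 2816

expected_ratio = GTX_980_TI_CORES / TITAN_X_CORES
print(f"Expected relative performance: {expected_ratio:.1%}")
```

The model predicts roughly 92 per cent of Titan X performance; benchmarks put the real figure closer to 98 per cent.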

UPDATE June 1, 6:00pm: We've had some requests to test performance in The Witcher 3 in a number of stress points, including the swamps. We tested Crookback Bog and can confirm that there are still serious issues here, especially on PlayStation 4 - though Xbox One remains affected to a lesser degree. You'll find that video at the end of the article.

Original story: CD Projekt Red's latest patch 1.03 is a must if you haven't already downloaded it, with The Witcher 3's frame-rates tweaked for the better on both PlayStation 4 and Xbox One over its day one state. Weighing in at 500MB, the update also crucially adds a 30fps cap to Xbox One, at last evening out its frame-pacing to give smoother results. We've seen miraculous improvements to games like Borderlands: The Handsome Collection thanks to patches late in the day, but is this the one to fix The Witcher 3's rockier stretches of performance?

A look at the patch's changelog shows a focus on various bug fixes, plus adjustments to minimise shadow pop-in. However, it's fair to say the core visual setup on console isn't much different. LODs are perceptibly the same as before, still borrowing from PC's medium and high settings for foliage and shadows, while post-process effects are left as-is in quality. Resolutions are unchanged too; PS4 makes the most of a native 1920x1080 frame-buffer, while Xbox One upscales from 1600x900 in most areas.

What does it take to run The Witcher 3 at 1080p60?

A massive critical and sales success, The Witcher 3 is a phenomenal piece of engineering - a technological accomplishment clearly built with the limitations of current-gen console in mind, but scaling beautifully on all manner of PC hardware. In producing our recent Face-Off, we saw that a relatively modest budget PC could match and in some cases exceed both PS4 and Xbox One performance at the same visual quality, but we wanted to go further - pushing those quality presets across a range of PC hardware.

On top of that, we wondered what it would take to run the game at 1080p resolution with a firm 60fps. 1920x1080 is by far the most popular PC gaming monitor resolution, and the vast majority of those displays update at 60Hz. Typically, when gameplay matches your display's resolution and refresh rate, you get the most visually pleasing experience. Effectively doubling console performance without trading away visual quality is something of a challenge and while it can be done, a completely locked 60fps may prove too much for the most popular, mainstream £150 graphics cards.
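To put numbers on that challenge: moving from a console-style 30fps to 1080p60 halves the time available to render each frame. A quick illustrative calculation:

```python
# Per-frame rendering budget at a given target frame-rate. Doubling
# the frame-rate from 30fps to 60fps cuts the time the GPU has to
# finish each frame from ~33.3ms to ~16.7ms.
def frame_budget_ms(fps):
    return 1000.0 / fps

print(frame_budget_ms(30))  # ~33.3ms per frame
print(frame_budget_ms(60))  # ~16.7ms per frame
```

It's that halved budget - every frame, with no exceptions - that makes a completely locked 60fps so much harder than a 60fps average.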

The official Witcher 3 system requirements are pretty steep: the minimum is a Core i5 2500K, 6GB of RAM and a GTX 660 or Radeon HD 7870, while the recommended spec sees a jump to an i7 3770, GTX 770 or Radeon R9 290 and 8GB of RAM. Our minimum spec is a Core i3 4130/FX 6300 with GTX 750 Ti - much, much lower, but you get a PS4-level experience using the settings in this guide. However, our recommended spec would be a Core i7 4790K with a GTX 970. This will get you to 1080p60 with some visual enhancements over console, or alternatively lets you really scale up the bling while keeping your frame-rate above 30fps. In all cases, we recommend 8GB of RAM.

What can we learn from The Witcher 3 "downgrade" fiasco?

Mere weeks away from E3, developers and publishers are working flat-out behind the scenes to make this the show of their lives. There will be new game announcements, surprise reveals and eagerly anticipated re-reveals. In many cases, it'll be our first opportunity to see some of the biggest titles of this year and the next. Doubtless, we shall see some amazing software running on console and PC - but at the same time, we strongly suspect that there'll be a vast array of marketing materials that end up bearing little resemblance to final software, or at the very least misrepresent the quality of the featured game. Even with the very best intentions, it's a situation that can backfire badly, as CD Projekt Red has discovered over the last couple of weeks.

To be frank, it's been difficult and frustrating for Digital Foundry to cover this story. Of course, we see all the forum posts and Imgur comparison images that you do, but at the same time without access to the game itself, it's very difficult to add anything of additional value to the debate - and we only received PC code on the day of release. However, knowing that our colleague Robert Purchese would actually be in the studio with CDPR, with first-hand access to sources that could get to the bottom of the story, we were at least in a position to get answers to our most pressing questions, and to put our point of view across to the developer.

Bertie's remarkable story provides eminently plausible, reasonable explanations for the differences with the 2013 reveal trailer, and reminds us that CDPR - in common with the vast majority of game developers - are fundamentally decent people at heart, doing their utmost to bring us their best quality work. It also emphasises that conspiracy theories - in this case an apparently deliberate campaign to make The Witcher 3 look better than it actually is - only flourish when there is a lack of communication between the studio and its fans.

The idea behind Borderlands: The Handsome Collection was enticing: billed as the definitive console editions of Borderlands 2 and The Pre-Sequel, the remaster promised full HD visuals and a smooth 60fps update on both PS4 and Xbox One. Unfortunately, the launch code felt distinctly uneven and lacking in polish, with a wildly variable frame-rate and intrusive screen-tear. Last week, Gearbox Software released patch 1.02 to tackle these performance issues in addition to squashing a number of bugs. Weighing in at around 9GB on both consoles (around 7GB for The Pre-Sequel and 2GB for Borderlands 2 on Xbox One), the update is remarkable, a genuine game-changer. The developer has finally handed in something closely resembling the experience we were hoping for - 1080p60 gameplay on both PS4 and Xbox One.

An initial look at the PS4 patch reveals substantial upgrades in performance, with demanding scenes operating at far higher frame-rates than the day one code. An early gunfight in a large control room initially suffered from regular bouts of screen-tear and heavy frame-rate drops, creating judder and highly variable controller response. By comparison, after the latest 1.02 patch is installed, frame-rates stick closely to the desired 60fps refresh with only a few minor dips in smoothness that manifest as occasional brief pauses, which are short and rare enough not to cause offence.

The same level of optimisation is found throughout the game: small fire-fights play out without a hitch, while locations sporting long draw distances mostly retain the target 60fps - something that definitely wasn't the case previously. Tearing remains present but is far, far less of an issue than it was, with only a few odd torn frames sneaking in when the engine can't quite hit its target 16ms frame-time.

The Witcher 3 is a game of many firsts. Above all for CD Projekt Red, it has the distinction of launching on three platforms at once, pushing for PC, Xbox One and also its first Sony format - PlayStation 4. Also breaking new ground is a more open-world design than we've seen before in the series, widening the scope of Geralt's adventure as we enter a sprawling third act. We've had a cursory glance at how console versions hold up in performance terms, but factoring in a PC release with plenty of visual bonuses, how do the consoles compare?

Purely in visual terms, PS4 and Xbox One miss out on PC's ultra-grade settings in several areas, but the game still looks complete on each. At its core, REDengine 3 drives a high level of foliage detail on console - perhaps the greatest density of plant-life since the original Crysis, rendering trees at a surprisingly long range. Factoring in time of day, weather systems and rolling clouds, The Witcher 3's physically-based lighting model also impresses, with shadows spread dynamically from each swaying branch, and light shafts flitting between each leaf. In the right light, the final result helps even the bleakest points in No Man's Land's marshes to achieve a great sense of atmosphere.

This isn't doable without a little help from outside tech. The Umbra 3 plugin in particular is crucial to performance, defining the parameters at which objects become visible, and culling detail intelligently (using a frustum) on PS4 and Xbox One to keep pop-in as discreet as possible. This works with SpeedTree, another middleware engine that handles foliage placement across the world, essentially randomising tuft positions to give the world its distinctive, flourishing look.

The Witcher 3 is a roaring success in the critical sphere, and despite some rough technical points it's one of the finest RPGs of this generation to date. However, PlayStation 4 and Xbox One each throw a curve-ball in their delivery of the game, particularly when it comes to the subject of performance. For those still mulling over which console version to go for, we can confirm both consoles offer world detail, shadows, lighting and alpha effects to a matching standard - but it's resolution and more importantly frame-rate that sets the two apart.

The first distinction is that PS4 caps its frame-rate at 30fps (with v-sync), while Xbox One runs with no cap at all to allow a variance between 30-40fps during play. This means Sony's machine is in theory capable of a smoother experience - with the cadence of frame delivery forced to a consistent rate. However, there are issues on the Sony side as well, and the net result is that neither feels truly smooth, though for very different reasons.

To start with cut-scenes, a firm 30fps line is held on PS4 during an early griffin encounter, and in practice this gives us smoother motion compared to the 35fps read-out on Xbox One (higher frame-rate does not automatically mean a better experience overall, something we've covered in-depth before). However, the problem here is that if the PS4's frame-rate drops below this number, it instantly locks to 20fps. It's an instant switch, much like the double-buffer method of v-sync seen in Metal Gear Solid 4 on PS3. In one later scene involving heavy ice effects, it's notable that Xbox One does glance this 20fps figure at a similar moment, but unlike PS4 it's able to waver up and down the scale more freely. Meanwhile, Sony's platform is stuck at this value for long stretches of a scene.
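The hard 30fps-to-20fps switch falls naturally out of double-buffered v-sync: a frame that misses the refresh deadline waits for the next screen update, so the effective frame-rate snaps to integer divisors of 60Hz (60, 30, 20, 15...). A simplified model of that behaviour - frame times here are illustrative, not measured:

```python
import math

# With double-buffered v-sync on a 60Hz display, a frame is held until
# the next refresh boundary, so the delivered frame-rate is the refresh
# rate divided by the number of ~16.7ms intervals each frame occupies.
def effective_fps(render_ms, refresh_hz=60):
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(render_ms / interval_ms)

print(effective_fps(30.0))  # fits in two intervals -> 30fps
print(effective_fps(35.0))  # misses 33.3ms, waits a third -> 20fps
```

This is why a frame taking just 2ms too long costs a full 10fps: there's no intermediate state between the divisors, exactly the behaviour seen on PS4 here and in Metal Gear Solid 4 before it.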

AMD has officially unveiled its next-generation graphics memory solution - HBM: high bandwidth memory. Sporting an enormous increase in throughput over existing GDDR5 technology, along with impressive power-efficiency and space-saving attributes, HBM is set for release on AMD's next flagship graphics cards, with the firm confirming that we should see these on sale within two months. Radeon R9 390X, anyone?

We attended a conference call presentation last week, given by Joe Macri, CTO of computing and graphics at AMD. He talked about the reasons behind the development of HBM - specifically that while GPUs were becoming increasingly powerful, the existing GDDR5 memory system wasn't scaling in line. He explained that 7gbps chips are available now, with 8gbps modules in the pipeline, but there's little future in the tech: the amount of power needed to increase memory bandwidth doesn't scale in a linear fashion - the faster GDDR5 becomes, the more power-hungry it is. GPUs tend to have hard TDP (thermal design power) limits, and going forward, it doesn't make sense to funnel large amounts of power through to the memory system when more is achieved by diverting it to the GPU core.

On a broader level, GPU performance is increasing at a rate that GDDR5 cannot match, potentially increasing the chance of memory bottlenecks. A new solution is required, and that's where HBM comes to the fore. As opposed to the GDDR5 system of individual modules soldered to the board and connected to the GPU's memory controller, HBM offers up a much more refined solution. Individual memory modules are stacked one on top of another, connected by 'through-silicon-vias' (TSVs) and separated by microbumps. A single GDDR5 chip on a 32-bit interface offers up 28GB/s of throughput. In contrast, an HBM stack is 1024 bits wide, with over 100GB/s of bandwidth (AMD partner Hynix has a more precise 128GB/s metric), achieved with a significant drop in voltage too. Efficiency is also increased by sitting both the GPU core and the HBM stacks on an 'interposer' which brings the two elements much closer together.
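The bandwidth figures quoted here follow from a simple formula: bus width in bits multiplied by the per-pin data rate, divided by eight to convert bits to bytes. A quick sketch - the 1Gbps HBM per-pin rate is inferred from the quoted 128GB/s figure, not stated in the presentation:

```python
# Peak memory bandwidth = bus width (bits) x per-pin rate (Gbit/s) / 8.
# HBM trades a high clock on a narrow bus for a modest clock on a very
# wide one - hence the lower voltage at higher total throughput.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(32, 7))    # one GDDR5 chip at 7gbps: 28 GB/s
print(bandwidth_gbs(1024, 1))  # one HBM stack: 128 GB/s
```

Four such stacks on an interposer would yield 512GB/s - which is why HBM can outrun even a wide GDDR5 setup while drawing less power.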

Should you install The Witcher 3's day one patch?

UPDATE 19/5/15 14:03: We've now had the opportunity to test the 1.01 patch on PlayStation 4, and can report that the FMV stuttering issue introduced in the Xbox One day one patch is much less of an issue on the Sony platform - video playback is much more consistent. Engine visual settings appear to be a match between the two consoles but curiously, the PS4 version does appear to run with a capped 30fps, giving a more consistent update than its Xbox One counterpart. We can also confirm native 1080p resolution throughout. We'll have head-to-head performance tests online as soon as possible.

Original Story: After a weekend of testing The Witcher 3 on Xbox One, it's fair to say installing its day one patch (version 1.01) is something of a double-edged sword. On the one hand, the 588MB file improves frame-rates slightly during play, while fixing minor bugs scattered across the game. In many ways it's a more polished experience with the patch - notably we have less geometry pop-in during cut-scenes, fewer instances of flickering shadows, and a great many more tweaks elsewhere.

But the downsides pack a punch too. It's apparent after switching between the game's default and patched states that these improvements come at a cost. Chief among these is the aggressive stuttering during pre-rendered cut-scenes. Essentially, encoded video files are used to portray the game's bigger plot points - such as the opening scene, re-caps after loading a save, and the dramatic end to the tutorial - while the game's engine is used for smaller beats in the story.

Not so much a remaster, but more of a full-on remake, Oddworld: New 'n' Tasty is a completely new take on the fondly remembered PS1 puzzle-based platformer, Abe's Oddysee. Built up from scratch with brand new artwork and a modern 3D graphics engine, developer Just Add Water faithfully replicates the gameplay of the original release, while adding new features to better utilise the more modern hardware. Scrolling environments add new twists to old puzzles, while the use of 3D visuals allows for sweeping camera angles and additional details that further flesh out the world over what was possible on the first-gen PlayStation hardware.

The game was originally released last year on PS4 to much acclaim, earning the Eurogamer Recommended seal of approval, but Just Add Water hasn't stood still since then. New 'n' Tasty followed on PC and Mac, with the Xbox One release hitting at the end of March, while PlayStation 3 sneaked out last month. The key to the game's platform proliferation is the utilisation of the highly regarded Unity engine - and this is the first title that we're aware of that gives us the opportunity to see how the tech stacks up across console and computer. For the record, Just Add Water tells us that the game uses a customised version of Unity 4.3, incorporating a few enhancements from the 4.5 and 4.6 iterations of the tech. It's not the most recent iteration of the engine, but it does allow for the utilisation of deferred rendering, massively increasing the amount of light sources available in any given scene.

As promised, native 1080p is deployed across both PS4 and Xbox One, with anti-aliasing provided by a customised variant of FXAA, working well in smoothing off the presentation without too many side effects. The lack of complex sub-pixel imagery ensures that shimmering and jaggies are kept under control, while the light texture blurring properties of FXAA are actually fairly well suited to the style of the game, favouring softer imagery in the style of pre-rendered CG.

Hot on the heels of last year's excellent PlayStation 3 and Vita remaster package, Final Fantasy X and X-2 land on PlayStation 4 with a string of upgrades - a remaster of a remaster, if you will. It's not been long since our last trek through Spira: precisely 13 months divide this latest release from the last two. In that time, Square Enix has continued the same remastering process by updating even more assets and effects. But having appeared on three Sony home consoles in a row, plus a handheld, does the new PS4 version bring the definitive editions of these classic titles?

The first impression is clear: this is fundamentally the same as the PS3 version in terms of most core details, though specific NPCs and creatures do benefit from being rebuilt from scratch. Curiously, some receive more attention than others. To start, the PS4 release keeps the updated character models of last-gen; the main cast is untouched here - a mixed blessing given how greatly Tidus and Yuna differ from their PS2 appearances. Meanwhile, texture map resolution is increased across the Al Bhed tribesmen, while fur detail on fiends now gets a noticeable bump in quality.

While short of the complexity of the main crew, certain idle villagers and fiends also sport fuller, more rounded geometric meshes. Quite how it's determined which characters in Spira qualify for the latest remastering treatment is unclear; the lucky ones now have fingers, while others still sport PS2-style block hands. Overall though, just about any assemblage of characters on-screen, from best to worst, now fit in better when placed side-by-side, and the divide in model quality is less stark when compared to the existing PS3 version. Square Enix's attention is not spread evenly across the game, but it's certainly a step forward.