performance – KitGuru
https://www.kitguru.net
Thu, 24 May 2018 16:41:36 +0000

PUBG’s first Xbox One performance patch has arrived
https://www.kitguru.net/gaming/matthew-wilson/pubgs-first-xbox-one-performance-patch-has-arrived/
Thu, 22 Mar 2018 09:26:06 +0000

Since its launch in December, the Xbox One version of PlayerUnknown’s Battlegrounds has struggled with performance issues. A short while ago, PUBG Corp outlined its plan to fix things up, and yesterday it took the first step: Patch 11 began rolling out for PUBG on Xbox One, which starts to address those performance issues.

With patch 11, the developers “have been focused on continued performance optimisation and stability improvements”. Some changes to inventory management have also been made so that players can change their loadouts a bit quicker.

Here are the patch notes:

Implemented texture changes to improve graphical performance

Optimized vehicle profiles to improve frame rate

Additional fixes to reduce crashes

The layout of the inventory has been improved, adding focal points to identify selected areas, a clearer button guide and tooltips

Improved selection of weapon slots and attachments – players can change the focus and selection by using the D-Pad

Quick scrolling has been added by using LT/RT buttons

There is still plenty of additional work to be done. Back on the 8th of March, the studio publicly said that it is “simply not satisfied with the game’s current console performance”. With that in mind, this will be a key focus for the next few months. Other upcoming changes include adjusting building materials and foliage, optimising game characters, particle effects and object collision, changing in-game assets and balancing workloads evenly across CPU cores in order to boost frame rates.

KitGuru Says: PUBG on the Xbox One is currently a bit of a struggle to play due to its frame rate, but that hasn’t stopped the game from being popular. Hopefully over time, the game will be able to hit and maintain its 30fps target throughout matches, which will in turn make the game more fun to play.

The Oculus Go will compete with Samsung phones for the VR performance crown
https://www.kitguru.net/tech-news/featured-tech-news/matthew-wilson/the-oculus-go-will-compete-with-samsung-phones-for-the-vr-performance-crown/
Thu, 01 Mar 2018 16:37:29 +0000

Last year during Oculus Connect, the company took to the stage to announce its next virtual reality headset. It’s called Oculus Go, costs $199 and aims to bring VR to the masses as an all-in-one solution, with no additional hardware required. We’ve known about this HMD for several months now, but little has been revealed in terms of performance. Fortunately, John Carmack seems more than willing to answer those questions.

The Oculus Go is essentially a competitor to other mobile VR headsets, like the GearVR. Under the hood, it is powered by a Qualcomm Snapdragon 821 processor, naturally making it more powerful than something like a Samsung Galaxy S7 plugged into a GearVR. However, the performance difference will be wider than expected, as the chip in the Oculus Go is solely dedicated to running VR apps, rather than an entire phone.

Tweeting about Oculus Go’s capabilities, Carmack stated that it will be “significantly better” than a Galaxy S7 paired with a GearVR. He also added that the chip inside the Go headset will use dynamic clock scaling set above the Galaxy S7’s minimal levels. With that in mind, the Oculus Go should perform closer to something like the Snapdragon 835-powered Galaxy S8 when running VR applications.

Obviously when the Galaxy S9 launches, that could change. However, for the time being, it looks like the Oculus Go is going to be plenty powerful enough to run some decent virtual reality apps at decent, locked frame rates.

KitGuru Says: The Oculus Go headset is still expected to launch in Q1 2018, so we may end up seeing more of the headset before the end of this month. Do you guys think an all-in-one, competitively priced solution like the Oculus Go could succeed in widening VR’s userbase?

]]>https://www.kitguru.net/tech-news/featured-tech-news/matthew-wilson/the-oculus-go-will-compete-with-samsung-phones-for-the-vr-performance-crown/feed/0Performance numbers leak out for AMD’s upcoming Ryzen 5 2600 Zen+ CPUhttps://www.kitguru.net/components/cpu/matthew-wilson/performance-numbers-leak-out-for-amds-upcoming-ryzen-5-2600-zen-cpu/
https://www.kitguru.net/components/cpu/matthew-wilson/performance-numbers-leak-out-for-amds-upcoming-ryzen-5-2600-zen-cpu/#respondWed, 21 Feb 2018 18:46:30 +0000https://www.kitguru.net/?p=364737Earlier this year, AMD debuted its CPU roadmap for 2018, kicking off with two new Ryzen 2000-series APUs, followed by new Ryzen CPUs based on the Zen+ architecture. We’ve seen the APUs already, so now it is only a matter of time before the new CPUs hit the market. That time could be quite soon too, as the Ryzen 5 2600 has appeared in benchmarks, with a notable improvement over the Ryzen 5 1600.

The Ryzen 5 2600 has appeared in the SiSoft Sandra database and Geekbench. The Ryzen 5 2600 is listed as a six-core, 12-thread processor with a 3.4GHz base clock speed and a 3.8GHz boost speed. It operates at a 65W TDP and contains 16MB of L3 cache and 3MB of L2 cache. We also know that Ryzen 2000-series CPUs will be made on the 12nm process with a refined version of the Zen architecture known as Zen+.

The processor was tested on a system with 16GB of DDR4 memory and Windows 10 Pro on the AMD Myrtle platform, which is an in-house platform AMD uses to test its new CPUs. On Geekbench, the Ryzen 5 2600 scored 4,269 points in the single-core test and 20,102 points in the multi-core test.

This is a greater than 10 percent boost on single-core performance compared to the Ryzen 5 1600. It is also a greater than 20 percent difference on multi-core performance in this test.
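The uplift figures above are just ratios of the Geekbench scores. Here is a minimal sketch of that arithmetic, using the 2600's leaked scores from the article and hypothetical Ryzen 5 1600 baseline scores (the 3,850 and 16,700 baselines are illustrative placeholders, not measured values):

```python
def percent_uplift(new_score: float, old_score: float) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score - old_score) / old_score * 100

# Leaked Geekbench scores for the Ryzen 5 2600 (from the article).
r5_2600 = {"single": 4269, "multi": 20102}
# Hypothetical Ryzen 5 1600 baselines, chosen only for illustration.
r5_1600 = {"single": 3850, "multi": 16700}

for test in ("single", "multi"):
    uplift = percent_uplift(r5_2600[test], r5_1600[test])
    print(f"{test}-core uplift: {uplift:.1f}%")
```

With those placeholder baselines, the script reports roughly 10.9 percent single-core and 20.4 percent multi-core, consistent with the "greater than 10/20 percent" framing above.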

KitGuru Says: Performance numbers are starting to make their way out, which would indicate that a launch isn’t too far off. Are many of you planning on upgrading your CPU this year? Are you holding out for the Ryzen 2000 series?

]]>https://www.kitguru.net/components/cpu/matthew-wilson/performance-numbers-leak-out-for-amds-upcoming-ryzen-5-2600-zen-cpu/feed/0Reader Review: ASRock Fatal1ty H270 Performance Motherboardhttps://www.kitguru.net/components/motherboard/matthew-wilson/reader-review-asrock-fatal1ty-h270-performance-motherboard/
https://www.kitguru.net/components/motherboard/matthew-wilson/reader-review-asrock-fatal1ty-h270-performance-motherboard/#respondWed, 07 Feb 2018 16:23:58 +0000https://www.kitguru.net/?p=363324Throughout December and early January, we ran a series of daily giveaways for the holiday season. These were all ‘reviewer’ giveaways, in hopes that readers would come back and share their thoughts on what they won with us and other KitGuru visitors. We’ve had some great submissions so far and today, I’m pleased to share the next reader review with you. This one comes from Ant Bird, who won an ASRock Fatal1ty H270 Performance motherboard from us.

There are no hard-and-fast rules for reader reviews: they can be written, in video form, or both. Ant chose to present his review in a lengthy 17-minute YouTube video, which you can see below:

For hardware, Ant paired the ASRock Fatal1ty H270 Performance motherboard with an Intel Core i5 6600K, 16GB of HyperX DDR4 RAM and a Cooler Master TX3i air cooler. In all, Ant says he really liked the board and, for the price, it seems like a good option considering the number of features it has. However, the inclusion of just one SATA cable in the box was a bit of a letdown.

KitGuru Says: Thanks to Ant for sharing the review with us!

Italy’s antitrust watchdog is investigating Apple and Samsung over ‘planned obsolescence’ allegations
https://www.kitguru.net/lifestyle/mobile/matthew-wilson/italys-antitrust-watchdog-is-investigating-apple-and-samsung-over-planned-obsolescence-allegations/
Fri, 19 Jan 2018 13:05:48 +0000

It looks like government bodies are starting to take notice of the ‘planned obsolescence’ allegations often levied at smartphone companies. This week, Italy’s antitrust body confirmed that it has opened an investigation into Apple and Samsung in an effort to determine whether or not these companies actively slow down older devices in order to push users into upgrading.

If the investigation concludes that Samsung and Apple sent out updates that may have a negative impact on performance without warning customers, then both would be in violation of four separate articles of Italy’s national consumers’ code. This would result in multi-million euro fines.

In a statement published by Business Insider, a spokesperson for Italy’s antitrust watchdog said that Samsung and Apple are suspected of having “a general commercial policy taking advantage of the lack of certain components to curb the performance times of their products and induce consumers to buy new versions”.

Apple in particular has come under fire recently over slowing down older iPhone models. In an effort to preserve the health of ageing lithium-ion batteries in the iPhone 6, performance was actively throttled without warning the user. Apple will be fixing this in the future by giving iOS users the option to see their ‘battery health’ and decide whether or not to reduce phone performance in order to preserve the battery. Alternatively, $29 battery replacements are also being offered.

KitGuru Says: The idea that smartphone companies actively slow down older phones in order to push customers into upgrading has been around for a while. However, we’ve never seen a government get involved. It will certainly be interesting to see how this turns out. Do any of you think that phone companies actively slow down older phones to push upgrades? Or is it just a conspiracy theory?

Intel releases Spectre/Meltdown firmware updates, publishes benchmark results for data centre systems
https://www.kitguru.net/components/cpu/matthew-wilson/intel-releases-spectre-meltdown-firmware-updates-publishes-benchmark-results-for-data-centre-systems/
Thu, 18 Jan 2018 10:30:57 +0000

Intel’s fight against the Spectre and Meltdown bugs continues, but with it, more issues are also beginning to arise. Last week, it was discovered that a recent firmware update was causing reboot issues with older Intel processors. Now Intel has confirmed that newer processors can also be affected, with Sandy Bridge, Ivy Bridge, Skylake and Kaby Lake systems all requiring more frequent reboots.

This is something that mainly affects data centres and servers, rather than your average DIY PC builder. The patch also comes with a performance impact, which is minimal in some situations but more significant in others.

In a blog post penned by Intel VP Navin Shenoy, Intel confirmed that data centre tests simulating online transactions and stock exchange interaction experienced a 4% performance impact with the latest Spectre/Meltdown patch. Meanwhile, different I/O storage benchmarks showed anywhere from a 2% decrease in throughput to an 18% decrease, depending on the system configuration and test setup.

Finally, Storage Performance Development Kit tests saw as much as a 25 percent impact when running on a single core, while using SPDK vHost, Intel observed no performance impact. So data centres and server providers will see performance hits that vary with the workload.

These are still early days for the patches, so things will change over time. Intel says it will be taking a transparent approach, publishing more updates as more tests are conducted.

KitGuru Says: The Spectre/Meltdown story isn’t over just yet. Over time, more patches will be released and more tests will be conducted, so we’ll be keeping an eye out for future updates.

PUBG seems to have a lot of trouble running on Xbox One
https://www.kitguru.net/gaming/matthew-wilson/pubg-seems-to-have-a-lot-of-trouble-running-on-xbox-one/
Wed, 13 Dec 2017 12:44:28 +0000

Yesterday PlayerUnknown’s Battlegrounds received its Xbox One ‘Game Preview’ launch just ahead of the PC version’s exit from Early Access. The console launch was set to be a big deal for Microsoft’s machine; after all, PUBG has been the biggest game on PC all year long and opening it up to the console market was set to make it even bigger. Unfortunately, technical issues have been left unresolved, meaning PUBG on Xbox One is a bit of a mess.

Battlegrounds doesn’t run particularly well on PC systems, but it is certainly playable, which has contributed to the game’s wide success. On Xbox One, the story is quite different. Digital Foundry put the game through its paces this week, showing frame rates dipping well below the 30fps target on both the base console and the Xbox One X.

Textures on the base Xbox One version are poor all around and the game will often dip below even 20 frames per second. The Xbox One X does a slightly better job and runs at full 4K resolution, but the frame rate still suffers. Digital Foundry reports that average frame rates for a 22-minute match sit at 25.6 frames per second on Xbox One and 27.6 frames per second on the Xbox One X.
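To put those averages in context, frame rate converts to per-frame render time as 1000 / fps milliseconds, so a 30fps target means each frame must be delivered within a 33.3ms budget. A quick sketch using Digital Foundry's averages quoted above:

```python
def frametime_ms(fps: float) -> float:
    """Average milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

TARGET_FPS = 30  # PUBG's console frame rate target

for console, avg_fps in [("Xbox One", 25.6), ("Xbox One X", 27.6)]:
    over_budget = frametime_ms(avg_fps) - frametime_ms(TARGET_FPS)
    print(f"{console}: {frametime_ms(avg_fps):.1f}ms per frame "
          f"({over_budget:.1f}ms over the 30fps budget)")
```

On these averages the base console spends about 39.1ms per frame (roughly 5.7ms over budget) and the Xbox One X about 36.2ms (roughly 2.9ms over).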

Those are the averages, but there are some pretty bad dips: moving across the map, entering buildings, looting, driving, dropping from the plane and even just sitting in the waiting lobby all see major frame rate drops. In all, the video paints the game as something that would be quite difficult to comfortably play.

The Xbox Game Preview program is Microsoft’s equivalent of ‘Early Access’ on Steam and things have only just begun, so we can expect to see improvements over time. Still, it is disappointing to see that neither console can quite hold the 30 frames per second target.

KitGuru Says: I have a few close friends who only play on Xbox One, so I was looking forward to picking this up and playing it with them. However, in its current state, it doesn’t seem like the game would be particularly fun to play. 30 frames per second is already a huge compromise, but hitting that target for long periods of time appears to be quite rare for the console version.

Updated: GeForce GTX 1070Ti spotted in benchmarks
https://www.kitguru.net/components/graphic-cards/matthew-wilson/geforce-gtx-1070ti-spotted-in-benchmarks/
Wed, 18 Oct 2017 17:08:45 +0000

Update (18/10/17): We got our first peek at the performance of the upcoming GTX 1070Ti graphics card earlier this week. Now, just two days later, even more benchmarks have leaked out. This time around we have some Fire Strike and Time Spy results to join the earlier Ashes of the Singularity benchmark.

Videocardz managed to unearth two GTX 1070Ti results on Fire Strike: the first offered a GPU score of 9,449 and the second a GPU score of 9,546. For comparison, a GTX 1070 should score around 8,900 points.

On the Time Spy front, there was one GTX 1070Ti result, with a GPU score of 6,777. This places it above the RX Vega 56 in Turbo mode, which scores 6,423 points, but below the RX Vega 64, which scores 7,046 points in Balanced mode.

There will likely be a few more leaks as we get closer to launch, which supposedly takes place next week.

Original Story (16/10/17): We have been hearing more and more about the alleged upcoming release of the GTX 1070Ti over the last few weeks but now it looks like we have some tangible performance numbers to look at as well. On top of that, we have rumoured specs to go with it.

Current reports are indicating that the GTX 1070Ti will release at the end of October but there will be no reference card version. Instead, board partners like Asus, MSI and Gigabyte will roll out their own custom-cooled variants. There was a report indicating that perhaps all of these cards will ship with the same out-of-the-box clock speeds, so performance without manual overclocking should be consistent across the board.

With that out of the way, let’s dive into the benchmark. A GTX 1070Ti result has been spotted on the Ashes of the Singularity database, and the result is still live at the time of writing. In the benchmark, the 1070Ti managed to score 6,200 points, averaging 65.5 frames per second across all tests at 1440p.

The system also featured an Intel Core i9-7900X, so no CPU bottlenecking would have come into play. If you browse the AotS database, you will find some GTX 1070 and GTX 1080 results in a similar score region, though you will also find some 1080s scoring higher than 8,000 points. With that in mind, it seems that the 1070Ti will compete with stock-clocked GTX 1080s and some of the more heavily overclocked 1070s.

As reported by Videocardz, the GTX 1070Ti is supposed to roll out on the 26th of October, which is in just ten days’ time. As for specifications, rumour has it that the GTX 1070Ti will feature 2432 CUDA cores, 152 TMUs, 64 ROPs and 7.8 TFLOPs of compute power. Base clock speeds should be 1607MHz with a boost to 1683MHz. This puts it closer to the 1080 in terms of raw performance, which is rated for 8.2 TFLOPs, with 2560 CUDA cores, 160 TMUs and 64 ROPs.
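Those TFLOPs figures line up with the usual peak FP32 estimate for GPUs: shader cores × 2 operations per clock (one fused multiply-add) × clock speed. A quick sketch checking the rumoured numbers against that formula, using the base clocks quoted above (the rated figures here appear to be computed at base rather than boost clock):

```python
def fp32_tflops(cuda_cores: int, clock_mhz: float) -> float:
    """Peak FP32 throughput: cores * 2 FLOPs per clock (FMA) * clock."""
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

# Rumoured GTX 1070Ti vs the known GTX 1080, both at a 1607MHz base clock.
print(f"GTX 1070Ti: {fp32_tflops(2432, 1607):.1f} TFLOPs")  # ~7.8
print(f"GTX 1080:   {fp32_tflops(2560, 1607):.1f} TFLOPs")  # ~8.2
```

Both results match the rated 7.8 and 8.2 TFLOPs figures, which suggests the rumoured core count and base clock are at least internally consistent.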

Pricing remains a mystery but given that this is a stopgap card, it will likely fall in the £400-£450 range between the GTX 1070 and GTX 1080.

KitGuru Says: At this point, it seems like the GTX 1070Ti is happening. Given how close we are to launch, we may see a few more leaks over the next week. Are any of you looking to upgrade your GPU soon? Does the 1070Ti interest you at all?

AMD RX Vega 56 gaming benchmarks and GTX 1070 comparison shows up online
https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-rx-vega-56-gaming-benchmarks-and-gtx-1070-comparison-shows-up-online/
Thu, 03 Aug 2017 13:02:08 +0000

AMD finally unveiled the RX Vega 64 and RX Vega 56 a few days ago but official reviews won’t be going up for a little while longer. While reviewers won’t be able to share their findings, a source this week claiming to have an RX Vega 56 in hand has revealed some gaming benchmark numbers up against a GTX 1070, showing performance in DOOM, Battlefield 1 and more.

A report on Tweaktown claims to have the scoop on RX Vega 56 benchmarks. The site says it has ‘an industry source’ with access to the RX Vega 56 already, who tested the GPU in DOOM, Battlefield 1, Civilization VI and Call of Duty Infinite Warfare, all at 2560×1440 up against a GTX 1070.

In Battlefield 1, the ‘Ultra’ preset was used, with the RX Vega 56 reportedly scoring just over 95 frames per second. The GTX 1070 on the other hand managed 72 frames per second. While the report doesn’t explicitly state that DX12 was used, this appears to be more in line with GTX 1070 DX12 scores, rather than DX11.

In DOOM, the RX Vega is listed as scoring 101.2 frames per second with the GTX 1070 at 84.6 frames per second. This game was run at Ultra with 8x TSAA but unfortunately, the report does not state whether the game was run in Vulkan, DX12 or DX11. Either way though, it’s another win for AMD’s new $399 Vega.

COD:IW was running on its High preset; in this game, the RX Vega 56 is listed as scoring 99.9 frames per second while the GTX 1070 managed 92.1 frames per second. Finally, in Civ VI, Vega got 85.1 frames per second while the GTX 1070 scored 72.2 frames per second.

Now it’s worth pointing out that we can’t verify these results just yet, so we won’t know how accurate these numbers are until review samples come in. However, if these scores turn out to be right on the money, then AMD’s RX Vega 56 could overthrow the GTX 1070.

KitGuru Says: If these RX Vega 56 scores are the real deal, then AMD will be in a very good position to compete with the GTX 1070. The RX Vega 56 is the cut down version too, meaning the RX Vega 64 should be even stronger and will hopefully prove to be a good competitor to the GTX 1080.

Sneak peek at Acer’s £9000 120Hz Curved IPS Superwide Predator 21x laptop
https://www.kitguru.net/components/graphic-cards/matthew-wilson/sneak-peek-at-acers-9000-predator-21x-laptop/
Fri, 07 Jul 2017 13:54:08 +0000

Last month, we had the chance to get a brief first look at the Acer Predator 21x, a one-of-a-kind laptop featuring a 21-inch ultra-wide display. Now, we actually have one of these beasts here at the KitGuru office, so before we put up our review, we thought we would give you all the exclusive UK sneak peek!

For those who don’t know, the Predator 21x is an absolute powerhouse. There is an Intel Core i7 Kaby Lake processor underneath alongside two GTX 1080s, all working together to get the most out of the 120Hz ultra-wide display. The display is IPS, though its resolution is only 2560×1080. However, when you consider the pixel density, the refresh rate and the fact that this is a laptop, 1080p ultra-wide is still very impressive.

You can check out LEO’s preview of the laptop in the video below:

KitGuru Says: While a laptop as expensive and chunky as this may not be a useful machine for most customers, it is cool to see what a company can do when they decide to just cram as much high-end hardware as possible into a single package. Stay tuned for our full review coming soon!

New benchmark leak puts RX Vega performance slightly ahead of GTX 1080
https://www.kitguru.net/components/graphic-cards/matthew-wilson/new-benchmark-leak-puts-rx-vega-performance-slightly-ahead-of-gtx-1080/
Wed, 05 Jul 2017 13:06:32 +0000

Earlier this week, AMD finally confirmed that it would be launching the RX Vega graphics card at the end of this month. The reveal is set to take place at SIGGRAPH, which kicks off on the 30th of July. However, we won’t have to wait that long to get a better idea of performance, as a fresh leak this week has shown AMD’s RX Vega GPU benchmarked and pitted against Nvidia’s high-end Pascal lineup.

Over on the 3DMark database, a new benchmark entry was discovered under device ID ‘687F:C1’, which we know to be one of the upcoming RX Vega GPUs. This particular model comes with a 1630MHz core clock with an 1890MHz clock speed on the HBM2. While running 3DMark 11, this particular RX Vega version scored 31,874 points, which is around 15 percent faster than a GTX 1080 but still some ways off the performance of a GTX 1080Ti.

Still, this is a big improvement over past benchmark leaks, which pinned the RX Vega closer to a GTX 1070 in performance. The extra tweaking over the last few months appears to have allowed AMD to squeeze a significant amount of extra performance out.

That said, it is important to note that this is still an early benchmark, so it may not end up fully representing the final product. This was also a synthetic benchmark, rather than a true in-game test. Still, with RX Vega now so close to launching, the leaks should start to be more accurate.

KitGuru Says: This launch has been a long time coming and there have been plenty of ups and downs along the way. Have many of you been waiting for RX Vega before upgrading? What do you think of the current performance projections?

Radeon Vega Frontier Edition already in hands of users, benchmarks posted online
https://www.kitguru.net/components/graphic-cards/matthew-wilson/radeon-vega-frontier-edition-already-in-hands-of-users-benchmarks-posted-online/
Thu, 29 Jun 2017 18:31:59 +0000

With AMD’s Radeon Vega Frontier Edition now available and shipping out to buyers, we are starting to see the first user benchmarks pop up. Interestingly enough, Vega Frontier Edition comes with two modes, a ‘Pro’ mode and a ‘Gaming’ mode, so it seems AMD has kept gaming in mind with this GPU after all.

A Disqus user by the name of ‘#define’ has already got their hands on Vega FE and threw it in a system containing an Intel Core i7 4790K, an ASUS Maximus VII Impact motherboard and 16GB of RAM. From their experience, the ‘Gaming Mode’ still needs some work on the driver side, as clock speeds were fluctuating a bit. However, three successful 3DMark FireStrike runs were achieved, as well as a TimeSpy run and a quick run around The Witcher 3.

In FireStrike (1080p) the Vega FE managed a graphics score of 22,963; in FireStrike Extreme (1440p) it managed a graphics score of 10,585; and finally in FireStrike Ultra (4K) it got 5,336 points. In 3DMark TimeSpy, the card got a graphics score of 7,126. While we didn’t get a proper benchmark of The Witcher 3, screenshots of the game running at Ultra/1080p were posted, with the card holding well above 100 frames per second, as you would expect from a high-end GPU running at 1080p.

This is just the first batch of results to show up. AMD hasn’t sent out review samples for Vega Frontier Edition (as far as we know), so real-world buyers are going to be giving us our first look at performance over the next week.

KitGuru Says: This is just the very first batch of results so far but as more buyers manage to get the card in their hands, we should see a wider range of tests and results. What do you guys think of Vega Frontier Edition so far? It seems like a good first step but for many, the RX Vega is going to be the star of the show when it arrives.

AMD’s Radeon Vega Frontier Edition launches today for $999
https://www.kitguru.net/components/graphic-cards/matthew-wilson/amds-radeon-vega-frontier-edition-launches-today-for-999/
Tue, 27 Jun 2017 15:58:33 +0000

Following on from yesterday’s performance preview, today AMD officially launched the Radeon Vega Frontier Edition, delivering AMD’s latest GPU architecture to consumers for the first time. AMD is pitching this as the world’s fastest graphics card for machine learning development and advanced visualization workloads. Pricing is also a bit different from what we saw on early pre-order listings.

The first of the two Frontier Edition Vega cards to launch is the air-cooled version, which will begin shipping as soon as retailers receive stock. The more expensive liquid-cooled edition will be coming at a later date but will still be within Q3. There are no reviews out just yet but judging from yesterday’s preview, this card stands up well against the Titan Xp in non-gaming workloads, with some benchmarks putting Vega in the lead by as much as 50 percent.

Just as a quick reminder, here are the Vega FE specifications:

16GB HBM2 memory

Pixel fill rate: ~90 Gpixels/sec

Memory capacity: 16GB of High Bandwidth Cache

Memory bandwidth: ~480GB/sec

4096 stream processors

64 compute units

13.1 TFLOPS FP32 performance

26.2 TFLOPS FP16 performance

3x DisplayPort 1.3, 1x HDMI 2.0

300W TDP (air-cooled), 375W TDP (liquid-cooled)

Speaking about Vega’s entry into the Radeon Pro series, AMD’s Ogi Brkic said that the company is “dedicating Radeon Vega Frontier Edition to all the visionaries and trailblazers who embrace new technologies to propel their industries forward”. Brkic also noted that this card brings “the full weight” of the Vega architecture to deliver the best performance in an AMD GPU to date.

As far as pricing and availability goes, the Radeon Vega FE will be available from some e-tailers in select regions starting from today. The air-cooled edition holds an MSRP of $999 while the liquid-cooled edition will be $1,499 when it launches later this quarter. Those prices could fluctuate depending on the retailer; the MSRP is just AMD’s recommended price for these cards.

KitGuru Says: Consumer Vega is finally here, though the Frontier Edition is just the beginning. We are still waiting to see what AMD has in store for gamers at the end of July with the RX Vega.

Radeon Vega Frontier Edition vs Titan Xp performance detailed
https://www.kitguru.net/components/graphic-cards/matthew-wilson/radeon-vega-frontier-edition-vs-titan-xp-performance-detailed/
Mon, 26 Jun 2017 15:43:05 +0000

While the enthusiast gaming audience will need to wait a little longer to get their hands on the RX Vega, we will be getting our first taste of AMD’s latest GPU architecture quite soon. The Radeon Vega Frontier Edition is supposed to start shipping next week and while it isn’t necessarily a gaming card, it can run games. This week, AMD lifted the curtain a bit, allowing the world’s first Vega Frontier Edition performance details to be published.

The folks over at PCWorld managed to get their hands on an exclusive preview of the Vega Frontier Edition, putting it directly against an Nvidia Titan Xp. The preview mainly focusses on synthetic benchmarks like Cinebench. However, some gaming performance details were also discussed.

In Cinebench, Vega FE was around 14 percent faster than the Titan Xp, while it was 28 percent faster in Catia and Creo. The biggest lead came in the SolidWorks benchmark, in which the Vega FE was 50 percent faster than the Titan Xp. Unfortunately, we didn’t get results for more widely known benchmarks like Fire Strike or Heaven.

AMD did not want to reveal specific numbers when it came to gaming performance but according to the report, Vega FE was tested against a Titan Xp in Doom (Vulkan), Prey (DX11) and Sniper Elite 4 (DX12). All of the games ran at the highest settings on an Acer 3440×1440 monitor. Apparently, Vega FE should sit somewhere between the GTX 1080 and GTX 1080Ti performance-wise in gaming.

No hard numbers were released from the gaming tests but it is interesting to see the first impressions of someone who has actively used the card. With the first shipments set to go out next week, it shouldn’t take long for proper game benchmarks to start showing up.

KitGuru Says: The Vega FE might not be the gaming-oriented GPU in AMD’s new stack but it should give us a decent enough impression of what to expect when the RX Vega does eventually arrive. Are any of you still waiting for the RX Vega to launch? What level of performance are you expecting from it?

Prey is fairly good at launch but there are some issues
Fri, 05 May 2017 09:37:40 +0000

Prey released today and while our performance analysis is still on the way, there are some important early details for those looking to buy on day one, before benchmarks hit. For starters, the game already seems to be holding up a lot better than Dishonored 2 did. Ultra-wide 21:9 support is present, as is SLI/CrossFire support. There are some problems I have encountered, though.

During the first hour of Prey, performance wasn’t really an issue at 1440p with a GTX 1080, which is in stark contrast to my experience with Dishonored 2. However, mouse sensitivity is still on the high side. It came set to 10 (out of 100) by default but to get the best experience, I found myself lowering it to 5.

For those worried about the low field of view, there will be a slider added to the options menu at some point. However, if you can’t wait, then you can adjust FOV through the game’s config file. Typically, this can be found on the C: drive in the saved games folder. Once you’ve found game.cfg, you need to look for the line “cl_hfov=85.5656” and change the number to whatever you would prefer. It does max out at 120, as you would expect.
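For reference, the config edit described above can also be scripted. This is a minimal sketch under a few assumptions: the `cl_hfov` key and its default value are quoted from the article, the game’s 120 cap is applied defensively, but the function name is ours and the path is a placeholder you would point at your own game.cfg in the saved games folder.

```python
import re
from pathlib import Path

def set_hfov(cfg_path, hfov):
    """Rewrite the cl_hfov line in Prey's game.cfg (the game caps FOV at 120)."""
    cfg = Path(cfg_path)
    text = cfg.read_text()
    # Replace whatever numeric value currently follows cl_hfov= with the new one
    new_text = re.sub(r"cl_hfov=[\d.]+", f"cl_hfov={min(hfov, 120)}", text)
    cfg.write_text(new_text)

# Example (placeholder path - substitute your own saved games location):
# set_hfov(r"C:\...\Saved Games\...\game.cfg", 110)
```

Back up the file first; as noted below, Bethesda has warned that higher FOV values can expose visual issues.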

Bethesda has warned that there are ‘some issues’ that are more noticeable with a higher FOV, so you may encounter some problems using this method.

KitGuru Says: So far, Prey seems like an improvement over Dishonored 2 but there are still some problems, though I’ve not come across anything game-breaking. Have any of you tried out Prey yet? Will you be picking it up over the weekend?

AMD moving to dual-fan design for reference GPUs, RX 500 presentation leaks
Thu, 13 Apr 2017 14:03:50 +0000

It seems that all of the recent RX 500 series leaks are finally coming to a head, as this week AMD’s RX 500 series announcement presentation leaked onto the web, giving us a look at the all-new reference design and some performance numbers.

For starters, AMD is moving away from the usual ‘blower style’ reference cooler and is replacing it with a much more efficient dual-fan design. The new cooler is more akin to what you would expect to see from an add-in board partner, so it’s nice to see AMD stepping things up in this regard. As we can see from the image above, AMD is preparing four cards for release soon, the RX 580, 570, 560 and 550.

As with most new GPU presentations, some performance numbers are given. These benchmarks were run by AMD, so it will be best to wait for third-party reviews for a wider spread of comparisons. Still, the results are interesting: AMD is pitching the RX 580 as more of an upgrade for R9 380/380X owners, cutting the RX 400 series (and any RX 480 comparison) out of its slides entirely. The leak comes from Chinese site Jisakutech (via Videocardz) and the images aren’t the best quality, so some of the graphs are harder to make out than others.

The presentation also shows that AMD is targeting the RX 580 at those looking to break into 1440p gaming without spending a fortune.

KitGuru Says: It seems that the RX 500 series announcement will be happening shortly. Are any of you currently holding off on upgrading? It might be worth waiting for new cards, especially if you are looking at buying an RX 400-series GPU.

CES 2017: AMD Vega has potential to beat GTX 1080Ti
Fri, 06 Jan 2017 11:24:47 +0000

AMD’s CES ultra-tease of its Vega hardware suggests that it could be as powerful as, if not more powerful than, some of Nvidia’s most expensive graphics cards. With advancements in the geometry pipeline, its new pixel engine and High Bandwidth Memory 2 (HBM2), Vega could end up being nearly 50 per cent faster than a Fury X.

AMD has been teasing information about its upcoming Vega GPU line for a while now and a couple of leaks gave us hope that it would indeed be able to take Nvidia head on in performance. AMD has now given us a lot more official information on Vega and if true, it should come in faster than a GTX 1080 and possibly even a Titan X and GTX 1080Ti.

Scott Wasson of AMD’s Radeon Technologies Group describes Vega as the biggest advance in AMD GPU architecture in half a decade, one that should be highly efficient as well as incredibly powerful. PCGamer’s breakdown attributes the big jump to major changes in the way Graphics Core Next works.

Vega cards will have HBM2, but AMD is calling its system a high-bandwidth cache, which we’re told lets the GPU have a virtual address space far larger than the amount of memory it actually has. This may end up being a feature that is more useful for developers than gamers, but it shows AMD is really changing up how it handles memory usage on its GPUs and that could have a strong trickle down effect with game developers too.

AnandTech’s breakdown is the most comprehensive, looking at the compute cores and how AMD has managed to get Vega to output 11 polygons per clock with four geometry engines: a massive increase over hardware it has produced in the past, and 2.6 times the rate of its Fury X cards.

Other changes to the pixel engine allow AMD to forgo rendering pixels that won’t be visible in the final scene, thereby improving the efficiency of the GPU. Vega is also said to be even more efficient with the DirectX 12 and Vulkan APIs, which are seeing increased usage across PC gaming. Considering AMD already tends to have an advantage with those APIs, and compares more favourably with Nvidia at higher resolutions, 2017 looks rather rosy for Vega.

KitGuru Says: With this reveal painting Vega in a solid light, 2017 could be a real turnaround for AMD, provided it can maintain strong production and supply of the new GPUs when they are released later this year. Especially considering Zen looked so strong in its recent unveiling.

AMD’s Zen CPUs said to compete with Intel on performance and price
Thu, 19 May 2016 19:46:23 +0000

AMD’s Zen début is rapidly approaching, with some rumours pointing towards an October launch for the first eight-core chip. However, we still don’t know a ton about performance, outside of the fact that it met ‘all internal expectations’. Fortunately, it seems that someone over at AMD has been doing a little bit of talking, with some reports saying that Zen will compete with Intel’s Skylake in performance as well as price.

Speaking to Australian journalists recently, AMD’s corporate VP of worldwide marketing, John Taylor said: “Zen will compete with Intel on performance, power and specifications – not just price”.

As ITWire reports, Taylor also had some good words to say about the company’s new CEO, Lisa Su, who has shifted AMD’s focus to three main areas: Gaming, Immersive Platforms (VR) and Datacenters.

It is worth noting that AMD has said little about Zen publicly of late, even in its most recent quarterly earnings reports to investors. However, given that the first Zen desktop CPU is due out before the end of this year, it would make sense for the company to start talking about it a bit more over the next few months.

KitGuru Says: Plenty of enthusiasts have been waiting for AMD’s Zen processors for years. We haven’t seen much officially but hopefully the company can live up to expectations. Are any of you thinking about picking up a Zen processor when they come out?

DOOM PC Game Analysis
Mon, 16 May 2016 13:17:03 +0000

DOOM is finally here, complete with a fantastic campaign, the SnapMap system and plenty of multiplayer modes to jump into, but is the PC version up to snuff?

When the DOOM open beta came around, the advanced graphics options were locked out so we didn’t really know what to expect ahead of launch. However, as you can see, id Software has implemented a long list of options so you can tweak fidelity to your liking. There is nothing really missing from the list, so in all, we are off to a good start.

The game itself also looks gorgeous. The id Tech 6 engine is very capable when it comes to visuals, though a lot of surfaces do appear to be quite glossy, rather than finely detailed.

The game isn’t just filled with dimly lit rooms covered in blood; there are some outdoor areas where you see a lot more colour, and later on in the campaign you will end up in hell, which looks exactly how you would expect it to.

For our testing today, we will be running DOOM on a system using an Intel Core i7 6700K, 16GB of RAM, an Asus Maximus VIII Hero motherboard and a 1TB Samsung Evo SSD. For graphics cards, we will be benching a reference GTX 980Ti, a GTX 970, an R9 Fury X, an R9 390X and an R9 290.

For Nvidia, we are using driver version 365.19, which is the latest Game Ready driver release at the time of writing. For AMD we are using driver version 16.5.2, which was released with optimizations for Doom. None of our cards are overclocked for this test.

As much as I’d hoped it might, DOOM doesn’t have its own in-built performance testing tool, so I got creative and made a benchmark run of my own. I needed something easily repeatable across multiple GPUs so I decided to bench the first Hell Spawner room in the campaign as it contains plenty of enemies and explosives, resulting in a fairly demanding gameplay scenario.
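A manual run like this is typically judged from a log of per-frame times captured during the scene. As a minimal sketch of how such a log is turned into the usual average/minimum figures, assuming frametimes in milliseconds (the function name and the inclusion of a "1% low" metric are our own choices, not something the article specifies):

```python
def fps_summary(frametimes_ms):
    """Summarise a manual benchmark run from per-frame times in milliseconds."""
    fps = [1000.0 / t for t in frametimes_ms]
    n = len(frametimes_ms)
    # Average FPS over the whole run: total frames over total elapsed time
    avg = 1000.0 * n / sum(frametimes_ms)
    # "1% low": mean of the slowest 1% of frames, a common smoothness metric
    worst = sorted(fps)[:max(1, n // 100)]
    low_1pct = sum(worst) / len(worst)
    return {"avg": avg, "min": min(fps), "1% low": low_1pct}
```

Averaging over elapsed time (rather than averaging the per-frame FPS values) keeps one long frame from being drowned out by many short ones.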

id Software built a brand-new engine for this game, id Tech 6. It produces some excellent graphics and, on Nvidia GPUs in particular, performance was great. However, the R9 290 and R9 390X really struggled, barely managing to hold 30 frames per second even at 1080p. Normally we would see these two cards trading blows with the GTX 970, but the gap was significant here. I would also have expected the Fury X to be a little closer to the GTX 980Ti in the 1080p test.

This suggests to me that the game could use some additional tweaking for AMD graphics card users. This isn’t the first time id Software has released a game that struggles on AMD hardware: Rage, released back in 2011 on the id Tech 5 engine, also struggled to effectively leverage AMD graphics cards.

For now, it looks like AMD users may want to turn a few graphics settings down a bit to achieve smooth 60 frames per second gameplay.

While things aren’t super promising on the AMD side, the game did launch with 21:9 ultra-wide support, which is something a lot of our readers look out for. Multi-GPU support is also said to be on the way.

Now let’s talk a bit about gameplay. You probably noticed my positive introduction to this article and there is a reason for that: I love this game. I have barely been able to pull myself away from it all weekend; the campaign is action-packed and fast-paced.

I’m normally not one to brag but it also doesn’t hurt that I can hold my own in the multiplayer.

Speaking of which, the multiplayer for this game has proven quite divisive; not everyone is on board with it. While I don’t think it does anything particularly innovative to really push the boundaries, it is addictive, causing me to put many hours into it over the weekend.

The game also comes with the SnapMap system, which allows players to easily create their own maps/levels for DOOM, focused on either co-op or solo play. I didn’t play around with it too much but there are a few really good levels already available and if it takes off, it could end up being as big as Forge was in Halo 3.

DOOM is truly a testament to old-school shooters and it is clear that id Software has not lost its touch at all. Whether you get on with the multiplayer or not, the campaign is well worth playing, particularly if you enjoyed Wolfenstein: The New Order back in 2014.

KitGuru Says: DOOM is an excellent game but, like most shooters, it is best played at high frame rates, which may be an issue for those on the AMD R9 200 and R9 300 series. I don’t think it is a hardware issue either, as the R9 290 and R9 390X tend to trade blows with the GTX 970 in most other titles. Hopefully the situation will improve a bit after a patch or two. That said, if you are using a Fury or a GTX 980Ti, then you will be able to run this game at buttery smooth frame rates even with the details all cranked up.

Battleborn PC Game Analysis
Fri, 13 May 2016 14:31:23 +0000

Battleborn came out earlier this month and while its launch may have been overshadowed by the Overwatch open beta, this might not be a game you want to let fall by the wayside. Gearbox’s ‘hero shooter’ is quite different from the likes of Overwatch and Team Fortress in a number of ways, with some unique game modes, vibrant characters and a campaign mode, but how does it hold up on the PC specifically? Let’s find out!

Click images to enlarge.

Battleborn comes with your basic set of graphics options, with all of the appropriate resolution, window mode and V-sync settings. In keeping with the Borderlands games, Gearbox has once again implemented PhysX with this title and the overall graphical style is similar. The only thing I found a tad disappointing is the lack of anti-aliasing options; I would prefer to have MSAA over FXAA, though on the plus side, FXAA doesn’t hit performance as hard.

Battleborn isn’t the most mind-blowing game graphically but it does have a rarely used comic-book art style, similar to what you will find in the Borderlands series. Textures aren’t perfect but I noticed very little aliasing despite being limited to just FXAA, and everything is bright and colourful, which is a nice change of pace from the usual grim scenery we see in games.

For our testing today, we will be running Battleborn on a system using an Intel Core i7 6700K, 16GB of RAM, an Asus Maximus VIII Hero motherboard and a 1TB Samsung Evo SSD. For graphics cards, we will be benching a reference GTX 980Ti, a GTX 970, an R9 Fury X, an R9 390X and an R9 290.

For Nvidia, we are using driver version 365.19, which is the latest Game Ready driver release at the time of writing. For AMD we are using driver version 16.5.2, which was released with optimizations for Battleborn and Doom. None of our cards are overclocked for this test.

Since Battleborn doesn’t have its own in-built performance testing tool, I chose to benchmark an early section from the mission titled ‘The Algorithm’ as it is an early mission and easy to repeat to ensure a fair test across cards. In my run, I set off plenty of explosions and tried to get as many enemies on the screen as possible to create a demanding scenario.

Battleborn is designed to be an accessible game that can run well on a wide range of hardware and, as you can see from our results, Gearbox has definitely achieved that. Every card on the list is able to hold its own, even at 1440p and 4K, which is always great to see, though not too surprising given that we are dealing with the ageing Unreal Engine 3.

This game does make use of PhysX, which can’t be turned off in the graphics options menu. However, I did go back and re-test the AMD GPUs to check whether setting PhysX to low would make any difference. Long story short, it didn’t, so feel free to whack all of the options as high as they will go.

Battleborn runs great but its missions do get slightly repetitive. A lot of the campaign boils down to shooting hordes of enemies and taking down a huge bullet-spongy boss every so often. However, the huge range of 25 hero characters to choose from helps keep things fun and changes up the gameplay a bit.

The multiplayer is where this game really sets itself apart with its own unique game modes. There are three modes available at launch, the first of which is ‘Point Capture’, where you capture enemy points on the map while protecting your own from invasion.

The second mode is called ‘Incursion’, which plays out similarly to a tower-defence game, as you try to stop constantly spawning enemy sentries and minions from pushing through and taking over your base. You can build turrets and shield generators to slow them down but you also need to help your own sentries break into the enemy team’s base.

The third mode is called ‘Meltdown’, which is a race against time as two teams compete to sacrifice the most minions to ‘Magnus’, a god in the Battleborn world who wants to destroy whoever loses. This mode also has points where you can build turrets and shield generators to give your team a boost and bolster defences.

Similarly to games like Dota 2, you start each multiplayer match with your hero at level 1; you then need to kill minions and other players (depending on the game mode) to rank up and select new skills.

The game itself is fun if you still enjoy the Borderlands humour and art-style. The campaign missions are entertaining but multiplayer matches can take a while to find/join and it does feel like the type of game that would be better played with a friend or two, rather than solo. One final thing that may concern people is that even in private solo missions, you do need to be connected to the internet as the saves are all kept server-side.

KitGuru Says: Battleborn has been overshadowed a bit by Overwatch this month despite the fact that the two games are so very different. Gearbox’s first-person shooter has some unique elements to it and it should run perfectly fine on most systems. If you are in need of a new game to play with friends, then it is worth giving a shot.

AMD Radeon Pro Duo benchmark results leak ahead of launch
Mon, 25 Apr 2016 20:05:31 +0000

Right now, rumour is pointing towards AMD finally launching its dual-Fury graphics card, the Radeon Pro Duo, later this week and that looks even more likely now that some benchmark results for the card have leaked onto the web this evening. As we reported last week, at least one retailer has listed the Radeon Pro Duo for release tomorrow and now we have an early idea of what performance to expect against a GTX 980Ti.

Performance numbers for the Radeon Pro Duo were released by Expreview. The graphics card was tested across several modern games including The Division, Rise of the Tomb Raider, The Witcher 3 and Grand Theft Auto V at both 1080p and 4K resolutions.

Being a dual GPU card, performance will often depend on games having decent CrossFire support. That said, the Radeon Pro Duo seems to perform fairly well across these games and by the looks of it, anti-aliasing was left on in each game, which would cause an unnecessary hit to performance at 4K.

KitGuru Says: AMD’s dual-Fury graphics card was unveiled alongside the Fury X way back at E3 last year, which was around ten months ago. It has taken a while but it looks like the card is finally going to hit the market. Do any of you happen to run a dual-GPU rig? What do you think of these performance results for the Radeon Pro Duo?

AMD VR Performance featuring Sapphire
Thu, 31 Mar 2016 08:21:01 +0000

It has been a long time since anyone asked, “can it run Crysis?”, but there is no doubt that the recommended specifications for high-end Oculus Rift and HTC Vive virtual reality are expensive. An i5-4590 and an R9 290 or GTX 970 are priced at a level that only a small percentage of all Steam users can match, so for many people, VR will mean an upgrade as well as shelling out hundreds on a headset.

So with that in mind, Sapphire got in touch to offer us the chance to test all of their high-end graphics solutions. We wanted to see how they stacked up when it comes to rendering the dual images virtual reality delivers right to your eyeballs. They sent us:

Sapphire R9 Fury X
Sapphire Nitro R9 Fury
Sapphire R9 Nano
Sapphire Nitro R9 390X
Sapphire Nitro R9 390

Watch via our Vimeo channel or over on YouTube at 1080p60.

To see how these fared, we made use of Valve’s VR performance test, which runs a scene from the Aperture Robot Repair demo, as if you were using a VR headset.

The graphics drivers installed at the time of testing were the AMD 15.12 Crimson, with the 16.3 hotfix applied.

[Per-card result charts: Sapphire Nitro R9 390, Sapphire Nitro R9 390X, Sapphire Nitro R9 Fury, Sapphire R9 Nano, Sapphire R9 Fury X]

Methodology

Although we stated that we used Valve’s SteamVR Performance Test to find out how each of these GPUs fares under VR conditions, it is worth mentioning that this benchmark works differently from most of the ones we use. Unlike 3DMark or similar tools, the SteamVR Performance Test uses adaptive quality settings. That means it consistently changes the quality of the visuals based on how well the graphics card is dealing with the scene.

While that might seem odd to benchmark score hunters, there is a very good reason for it in virtual reality: dropping below 90 frames per second is bad news, as it can cause a feeling of nausea. A lot of VR games employ a similar method of keeping frame rates high.
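This adaptive-quality behaviour can be sketched as a simple controller loop. To be clear, this is a hypothetical illustration, not Valve’s actual implementation: the 90fps target comes from the VR guidance discussed here, while the headroom margin and the 0-10 quality scale are assumptions of ours. A 90fps target implies a per-frame budget of 1000/90 ≈ 11.1ms, and the controller sacrifices fidelity the moment that budget is missed.

```python
def adjust_quality(quality, fps, target=90, headroom=1.15, qmin=0, qmax=10):
    """One step of a hypothetical adaptive-quality loop: trade fidelity for frame rate."""
    if fps < target and quality > qmin:
        return quality - 1   # missed the ~11.1ms budget: drop fidelity immediately
    if fps > target * headroom and quality < qmax:
        return quality + 1   # comfortable headroom above target: claw fidelity back
    return quality
```

The asymmetry (drop at the target, raise only with headroom) is a deliberate hysteresis choice so quality does not oscillate every frame.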

That said, the benchmark still puts out a score at the end, based on the level of fidelity maintained throughout the demo, along with a note to let you know how often the system dropped below 90 frames per second and how often the CPU held things back. There is also a sliding scale of how “VR Ready” your system is, from not ready, through capable, to “ready” at the green end of the spectrum.

While all of the cards we tested here should have no problem handling the demonstration, they will give us an idea of how VR capable each card is and what sort of detail levels buyers looking to get into early VR adoption can expect.

Standard Results

[Result charts: Fury X, Fury and Nano; 390X and 390]

As you would expect, the Fury X walks away with top marks in this benchmark, while the less powerful Hawaii-based 390X and 390 cards fall a little behind. They all register as VR Ready, which is a relief considering the latter cards share their architecture with the 290 that Oculus and HTC recommend.

There were some instances of sub-90 FPS with the 390, though they were very few and far between.

Overall, all cards performed well, but what is interesting to note is the difference in overall graphical fidelity. Despite being able to deliver a near-comparable smoothness to the Fury X, all of the other cards forced a drop in graphical fidelity in order to achieve that.

That means that while an R9 390 will get you to a state of being able to play virtual reality games, there is something to be said for running a more powerful graphics card under the hood. Not only will it mean that – like more traditional titles – you will be able to run the experience at a higher visual fidelity, and therefore with a greater sense of realism, but you also have more of a buffer against lowered frame rates.

FRAPS data

As much as the above information is useful though, it is not quite as detailed as we would like to see from our benchmarks. It does not give us any indication of how often these cards are capable of outputting more than 90FPS, which in itself can act as a buffer to prevent any visually hefty sections from cannibalising fidelity to maintain that 90FPS baseline.

So to help indicate that a bit better, we also ran FRAPS during each benchmark run to get a better insight into how each of the GPUs fared. Plotting out the frame rates into a graph shows us what those fidelity sacrifices made possible in terms of frame rate.
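The conversion behind that graph is straightforward; here is a minimal sketch, assuming the capture log is a list of cumulative frame timestamps in milliseconds (the usual way a FRAPS frametimes log is read – the function name is ours):

```python
def fps_series(timestamps_ms):
    """Convert cumulative frame timestamps (ms) into an instantaneous FPS series."""
    # Per-frame durations are the deltas between consecutive timestamps
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    # Each frame's instantaneous FPS is 1000ms divided by its duration
    return [1000.0 / d for d in deltas if d > 0]
```

Plotting the resulting series against frame index gives exactly the kind of frame-rate-over-time graph discussed here, including how much of the run sat above the 90FPS line.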

The data from this graph is a little more interesting and further shows what the SteamVR Performance test does in terms of automatically adjusting fidelity to maintain frame rates above 90fps. Although we do see the 390 fall just below that recommended minimum of 90 FPS, all the other cards take it in their stride.

There is some seemingly anomalous data, in the form of the Fury beating the Fury X on maximum FPS and the 390X matching the Nano, but these instances can be put down to automatic adjustments in fidelity. Because the 390X and Fury are operating at lower graphical quality than the more powerful cards, they are able to achieve similar frame rates.

This wouldn’t be something you would see in a traditional benchmark, but here it further emphasises that frame rate is more important than visuals in VR.

Closing Thoughts

The big take home from this testing session should be that all of the cards we looked at were capable of delivering solid VR performance, albeit at slightly different visual fidelities.

Although we did not gather any really useful benchmarking data from them, we also tested each graphics card with a variety of games, including Windlands, Hover Junkers and Space Pirate Trainer. Each card was able to play the games very comfortably, with very few instances of dipping below 90FPS, and even then only by a frame or two. The Fury and the Fury X delivered the most consistent VR gaming experience at this early stage.

There is another obvious advantage to having a faster card. While the latest generation of consumer-grade VR headsets do not sport the 4K+ screens we would need for a near-lifelike virtual reality experience, they are detailed enough that higher graphical settings in a game can make a difference to how immersive it feels.

The Fury X was able to deliver the best looking, smoothest experience in our testing. If you have the funds available, the Fury X is the best AMD card you can buy today for VR gaming.

Special thanks goes to Sapphire for sending us these cards to benchmark. We hope to follow up soon with a similar article featuring all Nvidia hardware.

KitGuru Says: All of the hardware on test today delivers a good VR experience, however the Fury X is the card to buy, if you have the money in the bank. The R9 390 is really the minimum you will want to accept, especially as more intensive VR titles are released later in the year. You risk a substandard VR experience and that is arguably worse than no VR at all. Nobody wants to feel nauseous when experiencing VR, so making sure you have the hardware to achieve 90fps+ at all times is critical.

Hitman PC game analysis
Mon, 28 Mar 2016 12:46:57 +0000

Hitman has been one of Square Enix’s staple franchises over the years but this time around, they are handling Agent 47 a little differently.

Rather than launching a full game, we are getting ‘episodes’ which are due to be released on a monthly basis.

However, this is also one of the first triple-A titles to launch with DirectX 12 support which is quite exciting, so let’s dive in and see just how well it runs.

Click images to enlarge.

Our first port of call is the graphics options menu, which isn’t quite as comprehensive as the one you will find in The Division, but most of the bases are covered, with options for FXAA or SMAA, texture quality, SSAO and shadow resolution, and you can even switch between the DirectX 11 and DirectX 12 APIs.

The game itself is fairly impressive as far as visuals go. There are some muddy textures here and there but that is to be expected of a third person game since everything is designed for you to look at from a distance.

Where this game really shines is in level design, even the tutorial levels are well thought out with plenty of crowds and assassination options. These crowds will have an effect on frame rate though so that is worth remembering.

One area where the game is let down is hair quality; it just doesn’t look up to par with the rest of the game.

I have managed to play through all of the levels available in the first episode of Hitman and I haven’t encountered any crashes or graphical glitches like shadow flicker, but I have run into issues with cut-scenes freezing or not playing at all.

Today, I will be benchmarking Hitman on a system featuring an Intel Core i7 6700K, 16GB of G.Skill DDR4 RAM, a 1TB Samsung EVO SSD and an Asus Maximus VIII Hero motherboard. For graphics cards, I will be using a reference GTX 980Ti, an MSI GTX 970 4G, a Sapphire R9 290 Vapor-X, an XFX R9 390X Ghost Edition and finally, the newest addition to the collection, an R9 Fury X. None of the cards are overclocked in our tests.

For the release of this game, both AMD and Nvidia launched updated drivers. On the Nvidia side, we are running driver version 364.51, and for AMD we are using Radeon Software Crimson Edition 16.3. Our results were collected using the in-game benchmarking tool, but I will also be discussing the real-world gameplay experience in the text below.

DirectX 11 performance is fairly solid across the board. While the GTX 980Ti remains king in our 1080p tests, the Fury X wasn’t too far behind and closed the gap at 1440p. The R9 390X and R9 290 also pull in excellent numbers, pushing the GTX 970 right down to the bottom of our charts.

I would disregard the minimum frame rates reported by the benchmark tool. As you can see, all of the cards were brought down to 12 or 13 frames per second, with some falling into single digits. It definitely seems like a bug with the benchmark itself, and it is not the only one I came across.
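
This is exactly why many reviewers prefer a “1% low” figure over a raw minimum: a single glitched frame completely determines the minimum, while a percentile average barely moves. A minimal sketch of that calculation, assuming you have captured per-frame times in milliseconds with a tool such as FRAPS (the function name and data here are illustrative, not from the benchmark itself):

```python
def one_percent_low(frame_times_ms):
    """Average fps over the slowest 1% of frames.

    A single dropped frame barely moves this figure, whereas it
    completely determines a raw 'minimum fps' readout.
    """
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # slowest 1%, at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                         # ms per frame -> fps

# 1,000 frames at a steady 16.7 ms (~60 fps) with a single 80 ms hitch:
times = [16.7] * 999 + [80.0]
print(round(min(1000.0 / t for t in times), 1))   # raw minimum: 12.5 fps
print(round(one_percent_low(times), 1))           # 1% low: ~43 fps
```

One bad frame drags the reported minimum to 12.5fps even though the run was a near-constant 60fps, which is the same pattern the Hitman benchmark appears to produce.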

You may have noticed at this point that there is no graph for DirectX 12 performance. Believe me, I’m just as disappointed as you are, but unfortunately I ran into complications with the DirectX 12 version of the game. For starters, it was capped at 60 frames per second on some of our GPUs and uncapped on others, despite no settings being changed.

On top of that, I faced some freezing and crashing issues while attempting to benchmark DirectX 12 with Nvidia GPUs. However, the experience was a bit smoother on the AMD side of things.

While the benchmark’s minimum frame rates may be exaggerated, I can tell you that this game does suffer from random frame rate dips into the 40s and 30s at times, in both DirectX 11 and 12. This tends to happen in more crowded zones and it seems to affect both Nvidia and AMD GPUs, though I will admit that AMD’s cards seemed to handle crowded areas better overall.

This could be down to the fact that this is an AMD ‘Gaming Evolved’ title, but nonetheless it was an interesting observation.

Performance issues aside, Hitman is a fairly enjoyable experience. While these slight performance problems are present, they don’t last long enough to truly ruin the gameplay, which remains fun throughout the entire first episode.

The gameplay itself is as fun as always. Levels are designed to be like a sandbox, filled with multiple ways for you to get your target and take them out. There are also challenge modes to see how many ways you can complete a mission, which adds some replay value.

That said, if you are in it for the story then the whole experience is quite short. I was done with the first episode, which contains the tutorials and the first proper level of the game, in around three hours. By the time you start getting into the really good stuff, the game ends, which is a shame.

There will be more content released on a monthly basis throughout the year, so there is plenty more to come. What you can play right now is really fun, but I am not entirely convinced that an episodic format really works for a game like Hitman.

KitGuru Says: While Hitman does have its share of performance issues, the gameplay is solid – it is just a shame that there isn’t more of it for the time being.

Tom Clancy’s The Division PC game analysis
https://www.kitguru.net/gaming/matthew-wilson/tom-clancys-the-division-pc-game-analysis/
Sat, 12 Mar 2016

Tom Clancy’s The Division is finally out after a few years of hype building at events like E3 and even a few beta tests. The ambitious open-world game has some high system requirements on PC, but does the game live up to them? Today we will be analysing the PC version of The Division to see just how well it runs.

First, let’s take a look at the graphics options menu:

The Division was developed by Ubisoft Massive and, as you can see from the screenshots above, a lot of detail has gone into the PC version. There are three full pages of sliders, effects and toggles for you to play around with, covering everything from distance scaling, view distance, world detail and lighting to standard options like textures and anti-aliasing.

Honestly, I am very impressed with the sheer number of options available in the PC version. You will find more to mess around with here than in many other open-world games on PC. However, it is worth mentioning that this is an Nvidia GameWorks title, so some options likely won’t work as well on AMD graphics cards. For our benchmarks today, GameWorks effects will be switched off.

The graphics look good too, though the depth of field effect can come on a little too strong, making parts of the world look blurry at times. This is likely intentional, as textures can suffer when viewed up close in third-person games – they are designed to be seen with the camera at a distance. You can counter it by moving the focus/soften slider from its 70% default to 100%, though this will impact frame rate.

The New York setting is fantastic as well, with a ton of attention to detail placed throughout the map, which is apparently built at a 1:1 scale – this really helps sell the sense of place when you are walking around the world. At launch you get access to Manhattan island, but the map will apparently be expanded in future paid expansions.

That said, the launch map is not small by any means; there is a lot to explore here. I have put around five hours into the game myself and I’ve barely begun to scratch the surface.

The game looks great but none of that matters if it doesn’t run well. Today, I will be benchmarking Tom Clancy’s The Division on a system featuring an Intel Core i7 6700K, 16GB of G.Skill DDR4 RAM, a 1TB Samsung EVO SSD and an Asus Maximus VIII Hero motherboard. For graphics cards, I will be using a reference GTX 980Ti, an MSI GTX 970 4G, a Sapphire R9 290 Vapor-X and an XFX R9 390X Ghost Edition. None of the cards are overclocked in our tests.

For this game’s launch, Nvidia did release a Game Ready driver, which was later pulled due to widespread issues. At the time of writing, its replacement is beta driver 364.51, which we will be using for our tests today. This is a non-WHQL-certified driver and needs to be obtained directly from the GeForce website. On the AMD side, we will be using Radeon Software version 16.3.

Performance isn’t great at Ultra settings. While the benchmark shows fairly high average frame rates, I found that during regular gameplay performance would often dip below 60 frames per second while walking around the streets of New York at 1080p – on all cards except the GTX 980Ti, which was able to hold above 60 more often than not, even at 1440p.

Given that Ultra performance isn’t all it’s cracked up to be, I tried turning things down a notch to the High preset and re-benching the GTX 970, R9 390X and the R9 290:

As you can see, settling for the High preset rather than Ultra brings a huge boost in performance. Honestly, the game doesn’t look all that different on this preset either, so I would definitely recommend it for a much smoother experience at 1080p.

The game had a couple of issues on launch day, like distracting texture pop-in and poor anti-aliasing in some areas, but a 600MB patch was pushed out overnight and seems to have fixed a lot of these problems. Performance hasn’t improved, but you will encounter fewer graphical issues in the game. That said, these problems do seem to persist in the game’s built-in benchmarking tool for whatever reason, so Ubisoft would do well to take a look at that.

Now let’s talk about the actual gameplay a little bit. During the multi-year build-up to this game’s launch, we didn’t really know what to expect, but now it is pretty clear. The Division is a third-person action RPG, almost like Diablo. You will go through missions with friends, fight a lot of enemies, level up and grind for rare gear that will give you higher DPS. You will even get to see your damage numbers climb as you go, which is always satisfying.

You could also compare The Division to Destiny, and that wouldn’t be an incorrect comparison to make. I have sunk 300 hours into Destiny on the Xbox One since its launch and there are a lot of similarities between it and The Division, but there are some key differences as well. Ubisoft’s game features what I think is a much better open world, there is an actual story to be told here, and the base-upgrading and crafting mechanics really help The Division stand out.

It also doesn’t hurt that The Division is actually available on PC and can run higher than 30 frames per second, neither of which apply to Destiny.

If you don’t like the repetitive gameplay of titles like Diablo, Destiny or even many MMOs, then The Division might not be for you. However, if you love leveling up, collecting new gear, taking on challenging missions and watching your damage numbers climb higher, then this game is well worth a shot. It might not be perfect but it is undoubtedly fun.

KitGuru Says: The Division is an excellent game in my eyes and one that I can see myself putting a lot of hours into. It is quite demanding on PC, which seems to be a running trend with Ubisoft titles as of late, but 60 frames per second is easily achievable even on mid-range GPUs, making for a smooth, fun experience.

Importing a Samsung Galaxy S7 gets you a faster phone
https://www.kitguru.net/components/graphic-cards/jon-martindale/importing-a-samsung-galaxy-s7-gets-you-a-faster-phone/
Tue, 08 Mar 2016

Where you buy your smartphone from doesn’t usually matter, but when it comes to the Samsung Galaxy S7, it does. That’s because the UK version will come fitted with an Exynos 8890 processor, while U.S. buyers will have the same handset fitted with a Qualcomm Snapdragon 820. While that wouldn’t matter if they performed the same, the Qualcomm chip actually takes a noticeable lead in most benchmarks.

There isn’t a huge difference between the two chips in terms of design – both are built on a 14nm process – but the gap is there, and it shows that the UK is not the best place to buy an S7. If you care about having your device be as cutting-edge as possible – and when you’re spending upwards of £500 on a phone, you probably do – figuring out an import might make more sense.

As the AnTuTu benchmark results show, the Exynos 8890-equipped version of the Galaxy S7 is slower than Apple’s A9 chip by a few thousand points in the CPU test. However, the Qualcomm Snapdragon 820 is faster than the Apple chip, which is a big selling point for Samsung as a major competitor to Apple in smartphone sales.

Far more important, though, are the GPU results, which show not just a small gap between the Exynos and Qualcomm chips but a near 50 per cent difference. If you’re a smartphone gamer who likes impressive visuals, that is enormous, and it suggests that buying an S7 in the UK – if you are part of that audience – would be a poor choice.

This isn’t the first time that Samsung has mixed processors depending on the region – indeed, it has been more uncommon when the company didn’t – but that hardly makes this an easier pill to swallow.

KitGuru Says: Would any of you consider importing an S7 to get access to the faster and more graphically capable processor?

New AMD Zen details have begun to surface
https://www.kitguru.net/components/cpu/matthew-wilson/new-amd-zen-details-have-begun-to-surface/
Wed, 20 Jan 2016

With AMD set to release its Zen architecture processors towards the end of this year, the company has begun giving out more details. This week, AMD reported its quarterly earnings and in that call, CEO Lisa Su revealed a few more details on the Zen architecture, along with what we can expect and when to expect it.

Right now, AMD is in talks with PC makers to begin building systems using its codenamed Summit Ridge processors, which are based on the Zen architecture and may well be released under the ‘FX’ brand in late 2016. These Summit Ridge CPUs will be the first high-performance desktop chips from AMD to use Zen, while server chips will appear in early 2017. Nothing was said about Zen APUs, though we did hear some rumours recently – and there is no word on Zen for laptops just yet.

KitGuru Says: AMD is gearing up for a lot of launches in 2016, with the debut of Socket AM4, DDR4 support and the new Zen architecture. Are any of you looking forward to seeing how AMD’s new CPUs perform?

How does Fallout 4 run on PC?
https://www.kitguru.net/gaming/matthew-wilson/how-does-fallout-4-run-on-pc/
Tue, 10 Nov 2015

One of the most highly anticipated games of the last couple of years has been Fallout 4, ever since the famous Kotaku leak of 2013. Now, the game has finally arrived, bringing us back to the wasteland, but how does the PC version hold up? I managed to get access to the game yesterday afternoon, so I’ve spent quite a few hours with it to bring you my performance impressions.


Fallout 4 runs on a tweaked version of the Creation Engine, first used for Skyrim back in 2011. As a result, most of the graphical features will feel familiar, though an updated lighting system has been implemented, along with better forms of anti-aliasing. Also worth noting: while Skyrim was capped at 60 frames per second, Bethesda has done away with that cap this time around, which is great news for those running high refresh rate monitors.

Unfortunately, it seems that the initial release doesn’t support the 21:9 aspect ratio, which matters to those running ultra-wide monitors. There is some fiddling you can do in the INI file to force an ultra-wide resolution, though some in-game elements like the HUD don’t scale properly. This may be fixed later down the line, so keep an eye out.
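
For reference, the community workaround circulating at launch involved editing the preferences INI (commonly Fallout4Prefs.ini under Documents\My Games\Fallout4) to run the game as a borderless window at the panel’s native resolution. The sketch below assumes a 3440×1440 display; treat the file name and values as an illustrative, community-reported example rather than an official fix:

```ini
; Community-reported tweak – back up the file and verify the
; [Display] section names against your own Fallout4Prefs.ini.
[Display]
bBorderless=1
bFull Screen=0
iSize W=3440
iSize H=1440
```

Swap in your own monitor’s width and height; as noted above, the HUD may still not scale correctly.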

If you have read any of my PC port pieces in the past, like the one for Arkham Knight, Mad Max or Metal Gear Solid V, you will know that I am running an Intel Core i7 5820K, 16GB of RAM and a GTX 980Ti in my personal rig. I’m not on the hardware review team, so I don’t have access to the GPUs you will find in some of our other reviews. However, if a system as high-spec as mine struggles to run a game, then we will know something has gone quite wrong with optimization.

Let’s get this out of the way early: despite the engine upgrades, Fallout 4 is not breaking any graphical boundaries. I ran the game with everything turned up as high as it would go at 2560×1440 and, while the bright, colourful opening scenes are easy on the eye, that quickly goes away once you hit the wasteland.

A lot of textures are quite muddy and lack the higher levels of detail we have grown accustomed to thanks to games like The Witcher 3. This is particularly noticeable when we come across plant life in the world, or when you take a look at character eyes or clothing.

However, while the textures aren’t great, Bethesda has made some excellent changes to the environment. Areas with lots of fog look great and the new lighting system is gorgeous during the day, so there is some give and take. No doubt modders will be quick to sort out any graphical disappointments anyway.

Now let’s talk a bit about performance. In preparation for this piece, I installed Nvidia’s Game Ready driver for Fallout 4 to ensure I was getting the best ‘day one’ experience possible. If there were one word I could use to describe Fallout 4’s frame rate, it would be ‘varied’. In the opening scenes, you start off in your character’s pre-war house. In this small area, my frame rate mostly sat at around the 105 frames per second mark, though at times it would spike up to 130 or dip as low as 80.

Eventually, you get out into the wasteland, which doesn’t help matters much, with the frame rate sitting at around 90 frames per second for the most part but varying from the high 130s down to the mid 50s on rare occasions. I found this to be most noticeable during the day, so it appears to be tied to the new lighting system.

Things settle down considerably during the night, with my frame rate sitting steady at around the 88 to 90 frames per second mark, with very few sudden dips or rises. Hopefully Bethesda can make a few optimization adjustments over the next few weeks to stabilize the frame rate a bit more.

Just to be clear, it’s not that the game runs poorly – my frame rates are high. Unfortunately, consistency isn’t quite up to standard just yet; sudden, massive frame rate swings have a noticeable effect on smoothness and can hurt the overall experience.

Another thing to keep in mind is that this test was only run on one graphics card, so it doesn’t paint the full picture – your system may perform very differently. However, if you are having issues, there is a handy forum post available with a ton of tweaks and potential fixes for the game ahead of the first patch.

There are a couple of bugs to watch out for. I haven’t encountered any graphical glitches so far, but I have come across two major issues. Firstly, there seems to be a problem with the terminal system, as I have found my character completely stuck multiple times after using one. I became so frustrated with this bug that I stopped using terminals to gather additional lore and backstory.

I have also encountered a bug where the Power Armour animation gets stuck; you can see both bugs in the two images above. I consider both of these issues game-breaking, as I couldn’t move or save the game, forcing me to quit and reload. This has happened around five or six times so far and I have lost significant progress each time.

That all sounded a bit negative, but I must say I am really enjoying my time with Fallout 4. I’m about six hours in and I already feel invested in the story. On top of that, the new shooting mechanics are excellent – so much so that I rarely resort to the VATS system at all. Crafting is a major new addition, and the system is deep, so it is well worth collecting resources and making the most of it.

If you are a fan of Bethesda-style open world RPGs, then Fallout 4 is worth playing and I am confident you will enjoy it. Just keep in mind that there are some issues right now that can hurt the overall experience.

KitGuru Says: Fallout 4 is a fun game that has made quite a few improvements on the tried and true Bethesda formula. Performance could use some stabilizing and graphics aren’t mind-blowing, but the gameplay and story are both engaging and the world is big enough to supply you with hours upon hours of exploration.

How does Metal Gear Solid V hold up on the PC?
https://www.kitguru.net/gaming/matthew-wilson/how-does-metal-gear-solid-v-hold-up-on-the-pc/
Wed, 02 Sep 2015

Konami gave us plenty of cause for concern in the run-up to the launch of Metal Gear Solid V: The Phantom Pain. The unexpected drama between Hideo Kojima and Konami in recent months cast some doubt over the final product, particularly where the PC port is concerned. So how does the final game hold up? We decided to take the PC version of Metal Gear Solid V for a spin to see exactly how it performs.

Only a handful of Metal Gear Solid games have appeared on PC throughout the series’ history, with Metal Gear Solid V obviously being the most recent and most ambitious. We saw the FOX Engine in action briefly thanks to last year’s Metal Gear Solid: Ground Zeroes, which performed very well on PC – but did that level of care carry over into The Phantom Pain?

Before we get into specific performance numbers, let’s take a good look at the graphics options menu. Unfortunately, the in-game menus lack mouse support, so you are going to have to use the keyboard. I have never been a big fan of this implementation, but it is a pretty minor issue.

The graphics menu is decent enough, with plenty of options, though for some reason anti-aliasing is missing as a separate option. It is likely bundled in with post-processing, so if you feel extra AA is needed beyond the ‘extra high’ preset, you will need to force it through your GPU driver.

Your basic shadow, texture, lighting and effects options all go up to ‘extra high’. For our tests today, I cranked everything up to the max, with the exception of motion blur, which I like to leave off, particularly since it doesn’t tend to look good in video form.

Corsair Hydro H80i GT Liquid CPU Cooler Review
https://www.kitguru.net/components/cooling/henry-butt/corsair-hydro-h80i-gt-liquid-cpu-cooler-review/
Mon, 06 Apr 2015

Today we are going to take a look at another of the latest all-in-one water cooling units from Corsair, the Hydro H80i GT. This model has been designed to improve on the existing H80i, which has found favour amongst a number of enthusiasts.

Corsair are very highly regarded in the all-in-one liquid cooling market, and offer a comprehensive range of products which encompass the needs of a wide spectrum of users. The Hydro H80i GT fits in the middle of their range, and is designed for those users looking for decent performance, and are restricted to a single 120mm radiator for cooling purposes.

We are very excited to see what the Hydro H80i GT has to offer in our tests, and how it compares to existing models in terms of raw performance.

Specification

Radiator dimensions: 154mm x 123mm x 49mm

Fan dimensions: 120mm x 120mm x 25mm

Fan speed: 2435 +/- 10% RPM

Fan airflow: 70.69 CFM

Fan static pressure: 4.65 mmH2O

Fan noise level: 37.7 dB(A)

Compatibility (Intel): LGA 1150, 1155, 1156, 1366, 2011, 2011-3

Compatibility (AMD): Sockets FM1, FM2, AM2, AM3

Corsair Hydro H100i GTX Liquid CPU Cooler Review
https://www.kitguru.net/components/cooling/henry-butt/corsair-hydro-h100i-gtx-liquid-cpu-cooler-review/
Wed, 01 Apr 2015

Today we are going to take a look at one of the latest all-in-one water cooling units from Corsair, the Hydro H100i GTX. This model has been designed to build on the success of the existing Corsair H100i, which has proved a massive hit with gamers and enthusiasts worldwide.

Corsair are very well regarded in the all-in-one liquid cooling market, and offer a comprehensive range of products which encompass the needs of a wide spectrum of users. The Hydro H100i GTX fits in their range near the top, just below the flagship H110i GT.

We are very interested to see what the Hydro H100i GTX has to offer in our tests, and how it compares to existing models in terms of raw performance.