Today we’re benchmarking Just Cause 4 with a boatload of different GPUs to help you determine if your graphics card will handle this brand new title, and if need be, work out a suitable upgrade option.

Just Cause 4 makes use of Avalanche's Apex game engine, though it's a newer, updated version featuring diverse and extreme weather effects, including blizzards, sandstorms, tornadoes and more. The engine also brings improved physics-based rendering, a new animation system and refined AI, so NPCs are smarter in combat and more threatening.

This all sounds great, but unfortunately the game has received mixed to average reviews in the few days it's been out. Cited reasons include a terrible UI, unimpressive graphics and, worst of all, horrible performance. In the short time we've had to play it, while a little clunky at times, the game seems like a lot of fun and we're interested to see more. That said, to enjoy it we're using an RTX 2080 Ti at 1440p, as that's what's required to maintain above 60 fps at all times. Yeah, it's that bad.

If you’ve got a GTX 1060, RX 580, or worse, this game is going to leave you annoyed. For testing we're loading in at the start point, jumping in a car and driving it off the bridge and into the jungle. The test lasts 60 seconds and it paints an accurate picture of the kind of performance you can expect.

We played the game up to the point where you call in reinforcements, as the battling NPCs add more load on the system. Included are 1080p, 1440p and 4K results for a massive range of graphics cards. We’ve also got some additional testing at 1080p with new and old graphics cards using mid-range quality settings.
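For readers curious how the figures in benchmark charts like these are derived, here's a minimal sketch of how average fps and 1% low fps can be computed from a frame-time capture. The frame-time values below are made up for illustration, and the exact method tools use to derive 1% lows varies; this version averages the slowest 1% of frames.

```python
# Hypothetical 60-second capture excerpt: milliseconds per frame,
# as logged by a frame-time capture tool. Values are invented.
frame_times_ms = [16.7, 15.2, 18.1, 33.4, 16.0, 17.5, 21.0, 16.9, 45.2, 16.4]

def average_fps(frame_times_ms):
    # Average fps = total frames rendered / total capture time in seconds.
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def one_percent_low(frame_times_ms):
    # 1% low fps: take the slowest 1% of frames (at least one frame,
    # i.e. the largest frame times) and convert their average to fps.
    worst_first = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst_first) // 100)
    slowest = worst_first[:n]
    return 1000.0 * n / sum(slowest)

print(round(average_fps(frame_times_ms), 1))    # → 46.2
print(round(one_percent_low(frame_times_ms), 1))  # → 22.1
```

Note how a single 45 ms stutter drags the 1% low far below the average, which is why 1% lows are a better indicator of perceived smoothness than averages alone.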

The benchmark test bench consisted of a Core i7-8700K clocked at 5 GHz with 16GB of DDR4-3400 memory. For the GeForce GPUs the 417.22 driver was used and for AMD, the Radeon Adrenalin 18.12.1.1 driver. Let’s get into the results.

Benchmarks

With all the graphics settings maxed out the game doesn’t look amazing, and yet with a GTX 980 you won’t see 60 fps on average at 1080p; the same is true for the Fury X. Those with older mid-range options such as the GTX 970 and R9 390 can enjoy frame dips into the mid 30s. If you have a GTX 960 or R9 380, well, good luck.

Looking at the current generation GPUs we have a few more options that were able to keep frame rates above 60 fps. The GTX 1060 6GB and RX 580 8GB failed to average 60 fps, though the Radeon GPU got mighty close.

Even higher-end models such as Vega 64 and the GTX 1080 were unable to keep frame rates above 60 fps in our test. Ideally you’ll want an RTX 2070 or better for playing at 1080p; it’s as if DXR were enabled (funny, but not funny).

1440p is basically out of the question for previous generation GPUs. The Fury X and GTX 980 Ti struggled with average frame rates in the mid 40s. Those playing at 1440p will require an RTX 2080 Ti to maintain over 60 fps at all times, while the 2080 and 1080 Ti are good for around 55 fps for the 1% low. Lower-end models such as the GTX 1060 and RX 580 are basically a write-off here, so we can’t wait to see the 4K results.

Honestly, we couldn't expect anything different after the 1080p and 1440p results. The RTX 2080 Ti does OK here, as in not great but playable. It's pretty crazy that we are seeing Assassin’s Creed Odyssey-like performance in a game that looks nowhere near as good, and Odyssey was bashed for its poor optimization.

So what if you drop all settings down to medium, disable SSAO, and change the anti-aliasing method to FXAA?

Managing settings and reducing visual effects to medium boosts performance by around 30%, but that’s not enough for most of these older GPUs. The much loved GTX 750 Ti averaged just 27 fps at 1080p using heavily dialed down quality settings. Even the current generation GTX 1050 was pretty horrible, with regular dips below 30 fps.

The RX 570, R9 390 and GTX 970 couldn’t keep frame rates above 60 fps at all times with the lower settings either. Finally, the GeForce GTX 580 (circa 2010) had to be dropped from the testing as it suffered massive graphical artifacts in this title.

Closing Thoughts

We can understand why gamers are upset. In many ways Just Cause 4 seems like a downgrade from the 3-year-old Just Cause 3. Character models and animation detail appear to be much the same, while the environments in some areas look a little better, many others don’t, and some are flat-out worse.

The water effects, or lack thereof, are remarkably bad in Just Cause 4. Surely this has to be a mistake; it really is laughable how bad the water looks. Just Cause 3 was amazing in comparison: boats created wakes, there were waves, ripples, and so on. The explosions don’t look as good either in our opinion, lacking the detail of the previous title. The only impressive aspect is the weather system, where some of the effects are quite good.

To make matters worse, performance shows the game is not properly optimized. We can’t directly compare performance between the 3rd and 4th installments in the Just Cause series, but on average we'd say you’re looking at 40-50% better performance using identical settings with Just Cause 3.

Compared to modern titles, the performance is similar to what we observed in Assassin’s Creed Odyssey, a title we bashed for being unoptimized, and that game looks way better. Technically speaking, Just Cause 4 seems like a mess at launch. The developer is already trying to address some of the major concerns as seen in this post, but with less than a week since release, it's clear the game could have used more time in QA before being delivered to gamers. It's a real shame since the game itself seems decent and quite a bit of fun.

The poor performance isn’t down to heavy CPU or memory usage either. Basically you’ll get the same results with an older quad-core like the Core i7-7700K, and from what we can tell the Ryzen 7 2700X is on par with the Core i9-9900K. The game engine's age is starting to show, as just one or two threads are heavily loaded while the rest do little work. Still, we appear to be GPU limited, as CPU overclocking sees no real performance gain over stock.

For the most part 8GB of system memory will do, and the demand on VRAM isn’t that high either: about 4.5 GB at 4K, 3.5 GB at 1440p and 3.3 GB at 1080p. Unless you're a big fan of the series, we’d avoid this title for the time being.