So, during this past Christmas break, my sons and I played quite a bit of Fortnite Battle Royale (them mostly on Xbox One, and me on my iMac, dual-booting High Sierra and Boot Camp Windows 10). For a few days we were playing it so frequently in Win10 that I just left Windows running without rebooting back into macOS. My iMac's specs are as follows: late 2012, 21.5" 1080p screen, 2.9 GHz i5, 8 GB RAM, 1 TB HD, Nvidia GT 650M 512 MB GPU. I'm running Nvidia's Web Drivers (up to date as of this morning).

We have Google Fiber gigabit internet, which is hard-wired straight into my Gigabit-equipped iMac.

The reason I tried out the Windows version is that my initial experience running the game in macOS was so negative it had me concerned that maybe my Mac was too long in the tooth to play it well. What I found out instead was a little different.

Here are my observations after playing Fortnite on both OSes:

1. The loading screens and transitions are much smoother and more reliable in Windows. In High Sierra, I experience random errors (mostly "failed to join game" pop-ups). Sometimes I would have to quit out of Fortnite and then out of the Epic Launcher, both of which are laborious and slow to wait for. My son, at one point, while waiting to join a game in High Sierra, asked me if my Mac had frozen up.

2. After playing around with the settings to achieve the best results, I learned that the game will only play on the lowest settings at 640x480 in High Sierra, with a frame rate that varies wildly between 5 and 60 FPS, even on loading screens. On Windows, I can run at 720p with all settings at their lowest except for draw distance, which I max out for obvious reasons. These settings give me a frame rate of between 40 and 60 FPS. Even with the in-game frame cap turned up to 120 FPS, we noticed no difference in measured frame rate.

3. Once in-game, the differences between the two OSes mostly melted away, giving an enjoyable playing experience. The only hitch I encountered was a slight stutter when scoping in with the game's sniper rifle, an action which was smooth in Windows.

4. Regarding menu performance under High Sierra: one thing I quickly learned not to do is bring up the menu to change graphics settings once in-game. Doing so introduced the same transition slowness I noted in #1 above. Sometimes the game would freeze up and I was forced to command-tab out of it and restart my Mac.

5. Please note that I am not shocked that a Mac of this vintage experiences problems playing the game. Also, please note that I am fully aware that the game is in Early Access, which means it's not fully optimized. What I hope to do is present my sons' and my experiences playing the game. The differences in performance between Windows 10 and macOS High Sierra are disappointing, and I can only hope that further Metal development brings performance more on par with Windows 10.

COMING SOON – Finding the Ark of the Covenant by Brian Roberts, in the iBook Store on iTunes, a new investigation into the Hebrew’s Most Sacred Relic!

I'm focused on keeping Metal support in UE4 moving forward with the rest of UE4 (it's a big team, so it keeps changing a lot!) so optimisation of the games is handled by others. As such there may well be more going on, but below I've summarised the obvious things that come to mind. I'll also note that we pay a penalty of 10-20% just for running on macOS/Metal rather than Windows/D3D11 which is often the difference between one resolution or quality level and another.

Irishman, on 16 January 2018 - 08:11 AM, said:

So, during this past Christmas break, my sons and I played quite a bit of Fortnite Battle Royale (them mostly on Xbox One, and me on my iMac, dual-booting High Sierra and Boot Camp Windows 10). For a few days we were playing it so frequently in Win10 that I just left Windows running without rebooting back into macOS. My iMac's specs are as follows: late 2012, 21.5" 1080p screen, 2.9 GHz i5, 8 GB RAM, 1 TB HD, Nvidia GT 650M 512 MB GPU. I'm running Nvidia's Web Drivers (up to date as of this morning).

5. Please note that I am not shocked that a Mac of this vintage experiences problems playing the game. Also, please note that I am fully aware that the game is in Early Access, which means it's not fully optimized. What I hope to do is present my sons' and my experiences playing the game. The differences in performance between Windows 10 and macOS High Sierra are disappointing, and I can only hope that further Metal development brings performance more on par with Windows 10.

The lack of VRAM on that GPU is really going to hurt on macOS: we use more VRAM on macOS, so we'll be paging a lot more between CPU & GPU, which is bad. That's a necessary evil to avoid other inefficiencies caused by trying to map a D3D11-oriented engine onto the Metal API.

Irishman, on 16 January 2018 - 08:11 AM, said:

1. The loading screens and transitions are much smoother and more reliable in Windows. In High Sierra, I experience random errors (mostly "failed to join game" pop-ups). Sometimes I would have to quit out of Fortnite and then out of the Epic Launcher, both of which are laborious and slow to wait for. My son, at one point, while waiting to join a game in High Sierra, asked me if my Mac had frozen up.

Behind the loading screen on macOS it will be doing a great deal more shader compilation than it has to under Windows, and unfortunately that is currently a highly serial, single-threaded process.
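To make the cost of that concrete, here is a hypothetical sketch in Python (not UE4 code, and `compile_shader` is a stand-in, not a real compiler): a loading screen that compiles shaders one at a time waits on the sum of every compile, whereas spreading the same work over a thread pool lets compiles overlap.

```python
import concurrent.futures
import hashlib

def compile_shader(source: str) -> bytes:
    # Stand-in for an expensive shader compile: just hash the source.
    return hashlib.sha256(source.encode()).digest()

sources = [f"shader_{i}" for i in range(100)]

# Serial: the loading screen blocks on every shader in turn.
serial = [compile_shader(s) for s in sources]

# Threaded: the same work split across a pool of workers.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    threaded = list(pool.map(compile_shader, sources))

assert serial == threaded  # identical results, produced concurrently
```

The results are identical either way; only the wall-clock time behind the loading screen changes, which is why making compilation less serial helps load times without affecting correctness.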

Irishman, on 16 January 2018 - 08:11 AM, said:

2. After playing around with the settings to achieve the best results, I learned that the game will only play on the lowest settings at 640x480 in High Sierra, with a frame rate that varies wildly between 5 and 60 FPS, even on loading screens. On Windows, I can run at 720p with all settings at their lowest except for draw distance, which I max out for obvious reasons. These settings give me a frame rate of between 40 and 60 FPS. Even with the in-game frame cap turned up to 120 FPS, we noticed no difference in measured frame rate.

The varying frame rate will stabilise somewhat if you play on the same build for long enough, as the local shader cache builds up entries. The price is longer load times, of course. Metal (like both D3D12 & Vulkan) exposes the developer to the reality of how shaders are actually compiled for GPUs and expects developers to optimise around this. Unfortunately, most engines are built on the abstraction provided by D3D11, which inherited the D3D9 model of separate shaders with near-zero runtime compilation cost, achieved by the driver vendors investing the man-hour equivalent of tens of millions of dollars to aggressively optimise their runtime shader compilers and make their D3D driver fully asynchronous (so games don't block when calling D3D) via substantial multi-threading. Often they optimised their shader compilers to generate code that is *trivial* to patch when render-state (like render-target or texture formats) changes, or even outright replaced the game's shaders with their own "specially optimised" versions.

The new APIs force/encourage driver vendors to optimise their shader compilers to generate "perfect" GPU shader code, even if it makes shader compilation much slower, as that is now the game developer's problem and not directly the vendor's. That is not to say the vendors don't care - merely that the APIs send you down a particular implementation route.
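The cache behaviour described above can be sketched in a few lines of Python (hypothetical, not the actual UE4 cache; `expensive_compile` and the key shape are made up for illustration): the first request for a given shader/render-state combination pays the full compile cost and can hitch a frame, while every later request is a cheap dictionary lookup.

```python
pipeline_cache = {}  # (shader, render-state) key -> compiled pipeline

def expensive_compile(key):
    # Stand-in for driver shader/pipeline compilation.
    return f"binary:{key}"

def get_pipeline(key):
    """First use compiles (a frame hitch); repeats hit the cache."""
    if key not in pipeline_cache:
        pipeline_cache[key] = expensive_compile(key)
    return pipeline_cache[key]

first = get_pipeline(("sniper_scope", "BGRA8"))   # slow path: compile
second = get_pipeline(("sniper_scope", "BGRA8"))  # fast path: lookup
assert first is second
```

This is why frame rates smooth out the longer you play on one build: the cache fills, the slow path stops being taken, and only a new build (which invalidates the entries) starts the hitching over again.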

Another cause of fluctuating frame rates is that D3D11's GPU resource management is also heavily abstracted away from the developer, with the vendor able to do a lot of under-the-hood optimisations as they control the implementation. Metal puts that all on the game developer, a bit more like Vulkan (though the API & semantics are quite different), and right now we aren't as efficient at allocating resources as D3D, which can cause hitches on the CPU.
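A common way engines reduce that allocation cost (sketched below as a Python toy; a real engine would work at the Metal heap/buffer level, and this `BufferPool` is purely illustrative) is to recycle freed allocations through a pool so steady-state frames never touch the allocator at all.

```python
class BufferPool:
    """Toy allocator: recycle freed buffers by size instead of
    requesting a fresh allocation every frame."""
    def __init__(self):
        self.free_lists = {}  # size -> list of reusable buffers

    def acquire(self, size):
        bucket = self.free_lists.get(size)
        if bucket:
            return bucket.pop()   # reuse: no allocation hitch
        return bytearray(size)    # slow path: fresh allocation

    def release(self, buf):
        self.free_lists.setdefault(len(buf), []).append(buf)

pool = BufferPool()
a = pool.acquire(1024)   # first frame: fresh allocation
pool.release(a)
b = pool.acquire(1024)   # later frame: recycled, not reallocated
assert a is b
```

Only the first frame that needs a given size pays the allocation cost; after that, buffers cycle through the free lists, which is the kind of efficiency the reply says the Metal path hasn't fully reached yet.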

Irishman, on 16 January 2018 - 08:11 AM, said:

3. Once in-game, the differences between the two OSes mostly melted away, giving an enjoyable playing experience. The only hitch I encountered was a slight stutter when scoping in with the game's sniper rifle, an action which was smooth in Windows.

That'll be shader compilation or resource allocation, which is more expensive on Metal than D3D. If the game plays well in-game on macOS, then really that's the important part.

Irishman, on 16 January 2018 - 08:11 AM, said:

4. Regarding menu performance under High Sierra: one thing I quickly learned not to do is bring up the menu to change graphics settings once in-game. Doing so introduced the same transition slowness I noted in #1 above. Sometimes the game would freeze up and I was forced to command-tab out of it and restart my Mac.

Well, yep, that'll be the engine recompiling all the shaders & shader pipelines.

Interesting. Do you think that when/if UE4 is optimised for DX12, it should benefit Metal as well?

It is a bit more complicated than that. You can optimise D3D12 by changing D3D12-specific code, which won't benefit Metal, or by changing the common, higher-level code, which might. If we change high-level code to benefit D3D12 or Vulkan it could help Metal but it very much depends on what we are changing and how. What I will say is that performance will continue to improve incrementally in future releases but I can't promise how that will translate to individual games nor will I promise any radical improvements.

Thanks, Mark, for putting our experience into some context other than "Macs suck for gaming! What did you expect?"

Interesting read, thanks for sharing those insights!

I've noticed with my MBP's Iris Plus 650 that Fortnite generally achieves pretty decent frame rates (mostly between 40 and 70 FPS) under macOS, except for severe frame drops every few minutes, which others also seem to have experienced on Macs with Intel GPUs.

Is there something specific to Intel's GPU architecture that might cause this, or could it be a driver issue?