
Best Mac Computer for Gaming 2015
This article will focus on providing readers with the best Mac (both desktop and laptop) for gaming at various price points. The contents of the article are as follows:

Gaming on a Mac? Is This a Joke? - I will discuss why the Mac is a valid gaming platform and give links to some places you can find Mac games.

Understanding Computer Components in Relation to Gaming - I will explain the 3 main components (CPU, graphics card and RAM) and how they relate to gaming.

Best Mac Laptop for Gaming - I will go through the best pricepoints for Mac laptops in relation to gaming performance.

Best Mac Desktop for Gaming - I will go through the best pricepoints for Mac desktops in relation to gaming performance.

Read on to learn more!

Gaming on a Mac? Is This a Joke?
No. Macs are very capable of running many great gaming titles these days, and Mac gaming has never been better than it is right now. There have never been so many current, big-name titles on the Mac as there are today. That being said, the Mac is not a platform that Apple has designed with gaming in mind, and as such you'll pay a lot more for a decently powerful Mac than you would for a dedicated gaming PC running Windows. Also, many big-name titles still haven't made it over to the Mac.

Mac gaming is a small but growing community that is starting to gain serious traction in the computer gaming world. Here are a few places to find games for your Mac:

Graphics Processing Unit (GPU): The Most Important Factor
The primary factor in any gaming computer is the strength of its video card. No amount of RAM or CPU power can compensate for a weak video card (also known as a GPU, or graphics processing unit). This means that even a fast i7 chip will struggle with games if it is not paired with at least a semi-capable GPU. Macs come equipped with GPUs from three different vendors:

Intel Graphics: Not to be confused with Intel CPUs, Intel offers a wide range of integrated GPUs that come coupled with its CPUs. Intel's graphics units are power efficient and capable of light gaming, but they fall far behind a discrete GPU from Nvidia or AMD in performance. Intel's integrated graphics are featured in all Mac computers, with the exception of the Mac Pro.

Nvidia Graphics: Nvidia is the largest GPU manufacturer on the planet, with GPUs ranging from tiny Android devices all the way up to massive workstation computing units. Nvidia currently supplies the GPUs in the mid-range iMac models in the form of the 750M and 755M.

AMD Graphics: AMD manufactures a wide range of products, including GPUs, and is the second-largest GPU manufacturer in the world. Apple has selected AMD as the current provider of high-end graphics options for the Mac. The iMac 5K, 15" Retina Macbook Pro, and Mac Pro all use AMD graphics as their best configurable graphics option.

When comparing Intel, Nvidia, and AMD graphics, a general rule of thumb is that AMD and Nvidia will always be superior to Intel in terms of graphics (and therefore gaming) performance. Apple currently does not offer any high-end options from Nvidia, so AMD graphics are the most powerful option available in Mac computers.

Central Processing Unit (CPU): Important Up to a Certain Level
The CPU is responsible for a wide range of computing tasks, including many parts of a game such as the AI, physics, and game logic. It is important that a CPU achieves a base level of performance so that it does not bottleneck the GPU when it comes to a game's total performance. To keep things simple, I am going to say that any i5 or i7 CPU faster than 2.5 GHz is "good enough" for gaming on a Mac. Faster speeds will help, but once the CPU is at the 2.5 GHz level or above, it is better to spend money on a faster GPU instead of a faster CPU.

RAM: Just Make Sure You Have Enough
RAM (Random Access Memory) is where your computer loads game files for quick access while you are playing. 8 GB is the recommended amount of RAM for most modern games, and many games are even playable on 4 GB. Your computer needs a certain amount of RAM to load the game files into, and once it has enough it does not (and cannot) use any more. 16 GB of RAM will give you zero performance benefit over 8 GB in 99% of gaming situations.

Cost Has Nothing to Do With Power
A common misconception among computer buyers is that more expensive equals better performance. A more expensive machine can be more powerful, but it's extremely important to look at the components that make up a computer and decide whether it makes sense for you. For example, the new Macbook that Apple released (the super-thin one) costs around $1500 and is TERRIBLE at gaming, and that's okay: it is not a gaming machine and should not be used as one. The $700 Mac Mini (see below) destroys it in terms of power.

Best Mac Gaming Laptop for the Budget (Macbook, Macbook Air, Macbook Pro)
This section will focus on the best price points for a Mac gaming laptop. If gaming is your primary focus, I don't recommend spending in between these price points, as there is no real gaming performance gain between them.

Best Laptop for $1300: 13" Retina Macbook Pro with 2.7 GHz i5 CPU and Intel Iris Graphics 6100: Apple Store Link
This Mac is really the minimum of what it takes to run modern games at somewhat acceptable settings. The Intel Iris 6100 graphics are considerably faster than the previous-generation Iris graphics (found in the Mac Mini).

Expect to play modern games at low-medium settings at 1080p resolution.

Best Laptop for $2500: 15" Retina Macbook Pro with 2.5 GHz i7 CPU and AMD Radeon R9 M370X: Apple Store Link
This is the only Mac laptop with a dedicated GPU. It will perform much faster than the Intel Iris Graphics 6100 and is the most powerful Mac laptop you can buy. Compared with the desktop Macs below, it will perform roughly identically to the $1800 iMac.

Expect to play modern games at medium-high settings at 1080p resolution.

Best Mac Gaming Desktop for the Budget (Mac Mini, iMac, iMac 5K, Mac Pro)
This section will focus on the best price points for a Mac gaming desktop. If gaming is your primary focus, I don't recommend spending in between these price points, as there is no real gaming performance gain between them.

Best Desktop for $700: Mac Mini with 2.6 GHz i5 CPU and Intel Iris Graphics: Apple Store Link
This is the cheapest price point with a capable CPU and somewhat capable graphics. The 2.6 GHz CPU coupled with the Intel Iris graphics will offer playable performance in most modern games (the most demanding titles will struggle to run, even at low settings). Note: you will have to purchase a monitor, keyboard, and mouse in order to use the Mac Mini.

Expect to play games at low settings at 1080p resolution.

Best Desktop for $1500: iMac with 2.9 GHz i5 CPU and Nvidia GeForce GT 750M: Apple Store Link
This is the first Mac desktop that contains a dedicated graphics card, and it will perform much faster than both the Mac Mini and the 13" Macbook Pro.

Expect to play games at medium-high settings at 1080p resolution.

Best Desktop for $2000: iMac 5K with 3.3 GHz i5 CPU and AMD Radeon R9 M290: Apple Store Link
This iMac will perform considerably faster than the $1500 version and surpasses the 15" Retina Macbook Pro in performance. The M290 will be able to power nearly all modern games at decent graphics settings.

Expect to play games at high settings at 1080p resolution.

Best Desktop for $2550: iMac 5K with 3.5 GHz i5 CPU and AMD Radeon R9 M295X: Apple Store Link
This iMac is the most powerful gaming iMac, with roughly 15% more graphics power than the $2000 iMac. Paying extra for the faster CPU is not recommended, as it will be of no benefit in virtually all games. Note: the M295X graphics card must be specifically selected during the configuration process.

Expect to play games at high-ultra settings at 1080p resolution.

Best Desktop for $3000+: None (Honorable Mention: Mac Pro with 3.3 GHz Xeon CPU and Dual AMD FirePro D700)
The step up from the iMac line is the Mac Pro. The problem with the Mac Pro for gaming is that it is meant as a productivity machine, not a gaming machine. The CPU and GPU in the Mac Pro are certainly powerful, but they are both extremely expensive parts that aren't really designed for gaming. The $4000 Mac Pro with 3.3 GHz Xeon CPU and D700 graphics (the fastest available) is only slightly faster than the top-end iMac in gaming, while costing $1500 more and lacking the iMac's beautiful 27" display.

If the Mac user is running Boot Camp, the dual GPUs can be combined to nearly double graphics performance. But if the user is willing to use Windows for gaming, then why spend $4000 on a Mac Pro? Get a $1000 gaming PC with equal gaming performance instead.

The Mac Pro is similar to a powerful Clydesdale horse, while a gaming computer is like a racehorse. Both types of horse are fast and powerful in their own way, but they have completely different strengths and uses.

'Twas the night after Christmas, when all through the site
Not a poster was typing, not even a Snake
The Battle Cat was stuffed by the missus with hair
In hopes that his hairballs would soon bring gasps for air
The posters were nestled snug in their chairs
While dreams of Mac Games danced 'neath their hairs
And DaveyJJ with his maple leaf and Frost with his star,
Wished each other Merry Birthmas, knowing more fun was not far
When on Frost's keyboard there arose such a clatter
After he sprung from his bed to see what was the matter.
Away to the Moderator's Window he flew like a flash
Clicked Warn and Flag Spammer and a Russian's hopes were soon dashed
The monitor on his half-rested face did glow,
Brightened the sneer over another spammer laid low.
When, what to his startled eyes should appear
But an army of WoW Gold sellers, who DDoSed him and cheered.
What hairball flew by that was so fuzzy and fat
They knew in a moment it must be the Battle Cat.
More rapid than Frigidman and Tuncer he came,
And he whistled, and shouted, and called the Moderators by name:
"Now! Whaleman, now! Brixton, now! Tesseract and Eric,
On! Davey, on! Frost, on! macdude and Janich!"
To the top of the page! To the top left bar!
Now ban away! Ban away! Ban away all!
As dead sprites before the RAM buffer vanish
The copper of the army's fat pipes did tarnish
So back 'cross the ocean the spammers flew
Each conspicuously named for a country – even the Vatican too.
And then in a twinkling, Tuncer appeared with a spark
To open the Mac Game Store with sales before dark!
As the earliest posters began to arise
Cheap Mac Games tempted them and danced 'fore their eyes!
With Aspyr, and Feral, and Virtual Programming too
Soon hard drives were filling with gaming so true!
He laughed as he sold them in spite of Steam
But his servers were taxed and lo they did scream.
With a wink of an eye and a twist of his head
Frigidman let Tuncer know there was nothing to dread.
He spoke not a word, but went straight to his work,
Then kicked IPB with a neat little smirk.
And laying a finger on his enter key
He deployed a quick fix for all to see.
With the servers humming and the forum safe
The Battle Cat swallowed the hairballs he had used to strafe.
He sprung to his cat tree, to his team gave a whistle,
And away they all flew, like the down of a thistle!
But I heard Frost exclaim, ere he turned out his lights –
Happy Christmas to all, and to all a good night!

I'm focused on keeping Metal support in UE4 moving forward with the rest of UE4 (it's a big team, so it keeps changing a lot!) so optimisation of the games is handled by others. As such there may well be more going on, but below I've summarised the obvious things that come to mind. I'll also note that we pay a penalty of 10-20% just for running on macOS/Metal rather than Windows/D3D11 which is often the difference between one resolution or quality level and another.

Irishman, on 16 January 2018 - 08:11 AM, said:

So, during this past Christmas break, my sons and I played quite a bit of Fortnite Battle Royale (them mostly on Xbox One, and me on my iMac, dual booting High Sierra and Boot Camp Windows 10). For a few days, we were playing it so frequently in Win10 that I just left Windows running without rebooting back into macOS. My iMac's specs are as follows: late 2012, 21.5" 1080p screen, 2.9 GHz i5, 8GB RAM, 1 TB HD, Nvidia GT 650M 512MB GPU. I'm running Nvidia's Web Drivers (up-to-date as of this morning).

5. Please notice that I am not shocked that a Mac of my vintage experiences problems playing the game. Also, please note that I am fully aware that the game is in Early Access, which means it's not fully optimized. What I hope to do is present my sons' and my experiences playing the game. The differences between performance in Windows 10 and macOS High Sierra are disappointing, and I can only hope that further Metal development brings performance more on par with Windows 10.

The lack of VRAM on that GPU is really going to hurt on macOS, we use more VRAM on macOS so we'll be paging a lot more between CPU & GPU which is bad. That's a necessary evil to avoid other inefficiencies caused by trying to map a D3D11-oriented engine on to the Metal API.

Irishman, on 16 January 2018 - 08:11 AM, said:

1. The loading screens and transitions are much smoother and more reliable in Windows. In High Sierra, I experience random errors (mostly "failure to join game" pop-ups). Sometimes I would have to quit out of Fortnite, then quit out of the Epic Launcher, both of which are laborious and slow. My son, at one point, while waiting to join a game in High Sierra, asked me if my Mac was frozen up.

Behind the loading screen on macOS, the engine is doing a great deal more shader compilation than it has to under Windows, and unfortunately that is currently a highly serial, single-threaded process.
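The cost of that serial compile step can be illustrated with a toy comparison (a hypothetical Python sketch, not the engine's actual code; the shader names and timings are made up): compiling shaders one after another costs the sum of all compile times, while overlapping them in a thread pool brings wall time closer to the cost of a single compile.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def compile_shader(name, cost_s=0.05):
    """Stand-in for a real shader compile: just burns a fixed amount of time."""
    time.sleep(cost_s)
    return f"{name}.bin"

shaders = [f"shader_{i}" for i in range(8)]

# Serial: total time is the sum of all compile times (8 x 0.05 s = 0.4 s).
t0 = time.perf_counter()
serial = [compile_shader(s) for s in shaders]
serial_time = time.perf_counter() - t0

# Parallel: compiles overlap, so wall time approaches a single compile.
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(compile_shader, shaders))
parallel_time = time.perf_counter() - t0

print(serial == parallel, parallel_time < serial_time)
```

Same results either way; only the wall-clock time changes, which is why a single-threaded compiler shows up so visibly on loading screens.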

Irishman, on 16 January 2018 - 08:11 AM, said:

2. After playing around with the settings to achieve the best results, I learned that it will only play on lowest settings at 640x480 in High Sierra, with a frame rate that varies wildly between 5 FPS and 60 FPS, even on loading screens. On Windows, I can run at 720p with all settings on lowest (except for draw distance, which I max out, for obvious reasons). These settings give me a frame rate between 40 and 60 FPS. Even with the in-game frame cap turned up to 120 FPS, we noticed no difference in measured frame rate.

The varying frame rate will stabilise somewhat if you play on the same build for long enough, as the local shader cache builds up entries. The price is longer load times, of course. Metal (like both D3D12 and Vulkan) exposes the developer to the reality of how shaders are actually compiled for GPUs and expects developers to optimise around this. Unfortunately, most engines are built on the abstraction provided by D3D11, which inherited the D3D9 model of separate shaders with near-zero runtime compilation cost. That model was achieved by the driver vendors investing the man-hour equivalent of tens of millions of dollars to aggressively optimise their runtime shader compilers and make their D3D drivers fully asynchronous (so games don't block when calling D3D) via substantial multi-threading. Often they optimised their shader compilers to generate code that is *trivial* to patch when render state (like render-target or texture formats) changes, or even outright replaced a game's shaders with their own "specially optimised" versions. The new APIs force/encourage driver vendors to optimise their shader compilers to generate "perfect" GPU shader code, even if it makes shader compilation much slower, as that is now the game developer's problem and not directly the vendor's. That is not to say the vendors don't care - merely that the APIs send you down a particular implementation route.
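The shader-cache warm-up described above amounts to memoization: the first request for a pipeline pays the full compile cost, and repeat requests are near-free. A minimal sketch (hypothetical Python, not how any real driver stores pipelines):

```python
import time

class ShaderCache:
    """Toy model of a local pipeline cache: compile once, reuse on later frames."""

    def __init__(self, compile_cost_s=0.01):
        self._cache = {}
        self._compile_cost_s = compile_cost_s
        self.compiles = 0  # how many real (slow) compiles we paid for

    def get_pipeline(self, shader_key):
        # Cache hit: the expensive work was already done on an earlier frame.
        if shader_key in self._cache:
            return self._cache[shader_key]
        # Cache miss: simulate the expensive compile step, then remember it.
        time.sleep(self._compile_cost_s)
        self.compiles += 1
        pipeline = f"pipeline<{shader_key}>"
        self._cache[shader_key] = pipeline
        return pipeline

cache = ShaderCache()
frame1 = [cache.get_pipeline(k) for k in ("terrain", "water", "ui")]  # three slow compiles
frame2 = [cache.get_pipeline(k) for k in ("terrain", "water", "ui")]  # all cache hits
print(cache.compiles)
```

This is why the first session after a game update stutters and later sessions smooth out: the misses all land up front, and the cache persists between runs.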

Another cause of fluctuating frame rates is that D3D11's GPU resource management is also heavily abstracted away from the developer, with the vendor able to do a lot of under-the-hood optimisations since they control the implementation. Metal puts that all on the game developer, a bit like Vulkan (though the API and semantics are quite different), and right now we aren't as efficient at allocating resources as D3D, which can cause hitches on the CPU.

Irishman, on 16 January 2018 - 08:11 AM, said:

3. Once in-game, the differences between the two OSes mostly melted away, giving an enjoyable playing experience. The only hitch I encountered was a slight stutter when scoping in with the game's sniper rifle, an action which was smooth in Windows.

That'll be shader compilation or resource allocation, both of which are more expensive on Metal than on D3D. If the game plays well in-game on macOS, then really that's the important part.

Irishman, on 16 January 2018 - 08:11 AM, said:

4. Regarding menu performance under High Sierra: one thing I quickly learned not to do is bring up the menu to change graphics settings once in-game. Doing so introduced the same transition slowness I noticed in #1 above. Sometimes the game would freeze up and I was forced to command-tab out of it and restart my Mac.

Well, yep, that'll be the engine recompiling all the shaders and shader pipelines.

Dude, you call yourself a father? You call yourself a role model to your child??? You should be playing Marathon with her in your lap. It's never too soon to introduce children to the venerable classics. I want her first words to be "Frog blast the vent core!" Now go, you know what to do.

I know, I know. But I also have to teach her to focus on her backlog, and I was playing The Witcher when she was born! It is quite evident from this forum that backlogs weren't a problem when our parents were raising us, so we never learned about them. But now that I've finished that game, I can teach her how to game properly!

I am annoyed that Apple refuses to make a tricked out desktop gaming rig with a cool case that has flaming apples on the sides and a special version of OS X where NOTHING is flat. I'd like this system to be a small tower but big enough to accommodate a full sized GPU card. The system should feature easy access for upgrades and repairs. The system specs should feature top end components selected with gaming in mind. It should come with 3 years of AppleCare standard. I would prefer it to be priced just slightly over the Mac Mini line. Apple has more money than they know what to do with. It is time that they gave back to the community, the neglected gaming community in particular.

DiRT Rally for Mac works great, but several video settings are missing (Advanced Ambient Occlusion, Smoke Shadows & Advanced Blending).

Smoke Shadows & Advanced Blending are special Windows specific features written by Intel that are not exposed on Mac OS X.

I don't recall exactly why Advanced Ambient Occlusion isn't available right now, but either it's also an Intel feature like the above, or it was causing GPU restarts / kernel panics and was removed for stability reasons.

jeannot, on 17 November 2017 - 12:30 PM, said:

Apparently, the Mac version of the game can't be set to windowed mode. The PC version can.
While the game runs fine (and it should, as I basically have the fastest Mac), performance under boot camp is 30-50% better.

Our application takes advantage of running in "detached mode" so that the window compositor is bypassed which should result in a smoother experience. A requirement on running in detached mode is that the application is fullscreen. As soon as you enable Windowed mode you can't use detached mode.

With regards to performance, I would be interested in which GPU you are testing on; for AMD GPUs, it is very unusual for us to see a performance disparity above 20% on any of our Metal games. Depending on the settings, if the framerate is above 120fps (which is quite possible in DiRT Rally with its engine), we are capped at 120 by the OS, irrespective of the Vsync interval or cap set by the application, which would massively skew comparisons to Windows at lower settings.

With recommended mode we run a stable and consistent capped framerate in all scenarios, and we have done this for every single supported Mac in the range, which we see as far more important than maximum framerate when running on a 60Hz screen. By making the frame rate stable and not rendering frames you'll never see on a 60Hz screen, we can keep thermal temperatures down, which in turn means you'll get consistent performance over extended gameplay. This means a cooler Mac, and your battery will last longer too if you're on a laptop.
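The capped-framerate approach can be sketched as a simple frame limiter (a conceptual Python sketch under assumed numbers, not Feral's actual implementation): each frame sleeps away whatever is left of its 1/60 s budget, so fast frames idle instead of racing ahead and generating heat.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_frames(n_frames, simulate_work_s=0.002):
    """Render n frames, sleeping off the unused portion of each frame's budget."""
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        time.sleep(simulate_work_s)  # stand-in for actual rendering work
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_BUDGET:
            # Idle out the rest of the budget instead of starting the next frame early.
            time.sleep(FRAME_BUDGET - elapsed)
    return time.perf_counter() - start

total = run_frames(30)
print(f"30 frames took {total:.2f}s (~{30 / total:.0f} fps)")
```

Because every frame takes at least the full budget, the output rate never exceeds the target, which is the trade described above: a flat, predictable frame time rather than a higher but spiky maximum.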

If you think you have a bug with DiRT Rally then please do contact our support with a support report attached and we'll investigate your issue.

2016 is over halfway through, and the Macintosh line of computers seems to have been all but abandoned by Apple. The Macbook Pro last saw an update in May of 2015, which is eons ago in the tech world. Despite the current 16-month wait for an update, there seems to be nothing imminent on the Mac rumor websites. Apple seems to be all in on the iPad Pro and emojis this year, leaving the company too stretched to do anything at all with their computer line.

The MacRumors Mac Buyer's Guide as of August 7th, 2016

Old and Outdated Hardware is the Common Theme

It would make for too lengthy an article to go through every product line and explain why it is outdated, so I will focus on the Macbook Pro. The Macbook Pro, the signature Apple notebook experience, does not feature the latest and greatest hardware in nearly any way. Apple's failure to update to Intel's Skylake architecture means slower compute power, slower graphics, worse battery life, and no Thunderbolt 3.

CPU: The MBP's CPU is based on the Broadwell architecture, which was replaced by Skylake back in late 2015, and Intel's "Kaby Lake" architecture is due out this fall. In a few months Broadwell will be two generations behind.

Graphics: The MBP's graphics chip (Intel Iris 6100) has been replaced by the Iris 540 in the Skylake generation (with options for even faster integrated graphics such as the Iris 550 and 580). The Iris 540 is approximately 15% faster than the Iris 6100. Graphics speed is something that the Retina screens desperately need.

RAM: The Intel Skylake platform also brings DDR4 RAM, which has substantially higher speeds than DDR3. RAM speed is particularly important when using integrated graphics (which the Macbook Pro uses in nearly all configurations).

Body/Design: The Retina Macbook Pro was released in 2012 and has not received any body or design updates since. I don't think the machine looks bad or dated by any means, but four years on the same platform is a little too long, and Apple could incorporate at least some minor design updates to improve the machine, such as smaller screen margins, a thinner body, a bigger battery, a redesigned cooling system, a slot for an M.2 drive, etc.

A simple socket update to Skylake would not have been a challenging proposition. All of the major PC OEMs, such as Dell, HP, and Lenovo, have been shipping Skylake since as early as fall 2015. Apple saw fit to update the new single-port Macbook to Skylake but didn't want to spend the time to upgrade the Pro and Air. Apple is the only major PC manufacturer not to have updated its product line to Skylake.

Last Updated 1.5 Years Ago...

The entire Mac lineup (with the exception of the Macbook) has a "Don't Buy" rating from MacRumors, meaning that updates are supposed to be imminent. Here is when each Mac line was last updated:

Macbook: 111 days (relatively recently - the only Mac with this status)

iMac: 300 days

Macbook Pro: 447 days (this marks the longest update gap in recent history for the Macbook Pro. The previous longest gap was 294 days)

Macbook Air: 518 days (also the longest gap in recent history for the Macbook Air. The previous longest gap was 350 days since the Macbook Air was re-designed in 2010)

Mac Mini: 662 days (Surprisingly, this isn't the longest gap - which was 723 days. The Mac Mini has often been a disowned device by Apple)

Mac Pro: 963 days (December 2013 was the last update. Creative professionals who are willing to drop $5000+ on a desktop are thrilled to be purchasing 3 year old hardware I'm sure)

Average number of days since the last update across the Mac line: 500 days (nearly 1.5 years).
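The 500-day average follows directly from the six figures above:

```python
# Days since each Mac line was last updated, as listed above.
days_since_update = {
    "Macbook": 111,
    "iMac": 300,
    "Macbook Pro": 447,
    "Macbook Air": 518,
    "Mac Mini": 662,
    "Mac Pro": 963,
}

average = sum(days_since_update.values()) / len(days_since_update)
print(round(average))  # 500
```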

The iPad Pro is Receiving the Marketing Push from Apple

The iPad Pro is the product that is receiving the most marketing push from Apple when it comes to the professional space. Take a look at Apple’s recent video ad for the iPad Pro:

One has to wonder if Apple is intentionally holding off on updating its Mac product line so as to give the iPad Pro more time in the limelight. The iPad Pro is great for content consumption and very basic content creation. Anything beyond that requires a real computer.

Apple’s Approach to the Mac Makes it Seem Like They Lack the Resources

One of the most prominent excuses I hear for Apple not upgrading the Mac product line is that the iPhone makes them so much more money. It is true that the iPhone trumps the Mac in total sales (the iPhone represented 65% of Apple's revenue versus 10.1% for the Mac in Q2 2016. Source), but the Mac still makes Apple billions of dollars each year (and billions more than the iPad, which represents 8.5% of their revenue). The problem with this argument is that it assumes Apple does not have the resources to target multiple platforms simultaneously.

Apple is one of the most valuable companies in the world, with vast resources for any number of projects. Apple's revenue dwarfs other technology companies such as Dell, Lenovo, Google, and HP, and yet it seems like those are the companies coming out with interesting new designs and innovation every year. What has Apple done to excite the technology market in 2016? Some new emojis for iOS?

Instead of innovation, we see many of their product lines stagnating while they push updates toward the lines they think we should care the most about. The new one-port Macbook received an architecture update to Skylake this spring. Also in the spring we saw the release of the 9.7" iPad Pro. This fall we will surely see the release of the iPhone 7. The Macbook Pro, Macbook Air, iMac, Mac Mini, and Mac Pro are all sitting on outdated hardware in at least one way. Apple, it is time to care about those who use Macs for productivity.

Tim, in the extremely slim chance that you see this, please speed up the Macbook Pro update. I need a new work laptop, and the Dell XPS 13 is looking mighty tempting in the meantime.

The Settlers 7 will apparently just stop working this coming October. So nice of them to have such a horrible DRM system that they can just give up on a game and take it away from everyone who bought it and still enjoys playing it.

I say, if there's sufficient cash for it, go for #3. If not, kill the IMG site and kill the MGS forums and merge the IMG forum with the MGS.

An active, friendly community forum for a dead news site and a dead forum for an active retail store site, all run by the same entity, makes extremely little sense both from a business and a workload perspective. I would like to see IMG survive, but if worse comes to worst, taking the active site and pairing it with the active forum seems the best option.

EDIT: And if you DO do that... for the love of all that is holy, don't just tell everyone here to move. This place's long history of discussion and problems solved makes it a cornucopia for searches. That's a big asset.

I wonder when Nvidia is going to come out of the closet with some proper macOS support again.

I mean, they're sort of stuck in a gray area.

For not actually having any business with Apple, they seem to be pretty decent. You can drop any reference Pascal card into a Mac Pro and have it basically just work.

I do wish Apple would just get over it and start using Nvidia chips again. I get it, they're a terrible partner, but for the sake of choice and variety on the platform (as well as just having a better product overall), I don't think the current stance of "never Nvidia" is really helpful.

John Carmack of Id Software fame recently posted his experiences with Steve Jobs on his Facebook page.

It makes for a very interesting read:

Quote

Steve Jobs

My wife once asked me “Why do you drop what you are doing when Steve Jobs asks you to do something? You don’t do that for anyone else.”

It is worth thinking about.

As a teenage Apple computer fan, Jobs and Wozniak were revered figures for me, and wanting an Apple 2 was a defining characteristic of several years of my childhood. Later on, seeing NeXT at a computer show just as I was selling my first commercial software felt like a vision into the future. (But $10k+, yikes!)

As Id Software grew successful through Commander Keen and Wolfenstein 3D, the first major personal purchase I made wasn’t a car, but rather a NeXT computer. It turned out to be genuinely valuable for our software development, and we moved the entire company onto NeXT hardware.

We loved our NeXTs, and we wanted to launch Doom with an explicit “Developed on NeXT computers” logo during the startup process, but when we asked, the request was denied.

Some time after launch, when Doom had begun to make its cultural mark, we heard that Steve had changed his mind and would be happy to have NeXT branding on it, but that ship had sailed. I did think it was cool to trade a few emails with Steve Jobs.

Several things over the years made me conclude that, at his core, Steve didn’t think very highly of games, and always wished they weren’t as important to his platforms as they turned out to be. I never took it personally.

When NeXT managed to sort of reverse-acquire Apple and Steve was back in charge, I was excited by the possibilities of a resurgent Apple with the virtues of NeXT in a mainstream platform.

I was brought in to talk about the needs of games in general, but I made it my mission to get Apple to adopt OpenGL as their 3D graphics API. I had a lot of arguments with Steve.

Part of his method, at least with me, was to deride contemporary options and dare me to tell him differently. They might be pragmatic, but couldn’t actually be good. “I have Pixar. We will make something [an API] that is actually good.”

It was often frustrating, because he could talk, with complete confidence, about things he was just plain wrong about, like the price of memory for video cards and the amount of system bandwidth exploitable by the AltiVec extensions.

But when I knew what I was talking about, I would stand my ground against anyone.

When Steve did make up his mind, he was decisive about it. Dictates were made, companies were acquired, keynotes were scheduled, and the reality distortion field kicked in, making everything else that was previously considered into obviously terrible ideas.

I consider this one of the biggest indirect impacts on the industry that I have had. OpenGL never seriously threatened D3D on PC, but it was critical at Apple, and that meant that it remained enough of a going concern to be the clear choice when mobile devices started getting GPUs. While long in the tooth now, it was so much better than what we would have gotten if half a dozen SoC vendors rolled their own API back at the dawn of the mobile age.

I wound up doing several keynotes with Steve, and it was always a crazy fire drill with not enough time to do things right, and generally requiring heroic effort from many people to make it happen at all. I tend to think this was also a calculated part of his method.

My first impression of “Keynote Steve” was him berating the poor stage hands over “This Home Depot popsnizzle” that was rolling out the display stand with the new Mac, very much not to his satisfaction. His complaints had a valid point, and he improved the quality of the presentation by caring about details, but I wouldn’t have wanted to work for him in that capacity.

One time, my wife, then fiancée, and I were meeting with Steve at Apple, and he wanted me to do a keynote that happened to be scheduled on the same day as our wedding. With a big smile and full of charm, he suggested that we postpone it. We declined, but he kept pressing. Eventually my wife countered with a suggestion that if he really wanted “her” John so much, he should loan John Lasseter to her media company for a day of consulting. Steve went from full charm to ice cold really damn quick. I didn’t do that keynote.

When I was preparing an early technology demo of Doom 3 for a keynote in Japan, I was having a hard time with some of the managers involved, who were insisting that I change the demo because “Steve doesn’t like blood.” I knew that Doom 3 wasn’t to his taste, but that wasn’t the point of doing the demo.

I brought it to Steve, with all the relevant people on the thread. He replied to everyone with:

“I trust you John, do whatever you think is great.”

That goes a long way, and nobody said a thing after that.

When my wife and I later started building games for feature phones (DoomRPG! Orcs&Elves!), I advocated repeatedly to Steve that an Apple phone could be really great. Every time there was a rumor that Apple might be working on a phone, I would refine the pitch to him. Once he called me at home on a Sunday (How did he even get my number?) to ask a question, and I enthused at length about the possibilities.

I never got brought into the fold, but I was excited when the iPhone actually did see the light of day. A giant (for the time) true color display with a GPU! We could do some amazing things with this!

Steve first talked about application development for iPhone at the same keynote I was demonstrating the new id Tech 5 rendering engine on Mac, so I was in the front row. When he started going on about “Web Apps”, I was (reasonably quietly) going “Booo!!!”.

After the public cleared out and the rest of us were gathered in front of the stage, I started urgently going on about how web apps are terrible, and wouldn’t show the true potential of the device. We could do so much more with real native access!

Steve responded with a line he had used before: “Bad apps could bring down cell phone towers.” I hated that line. He could have just said “We aren’t ready”, and that would have been fine.

I was making some guesses, but I argued that the iPhone hardware and OS provided sufficient protection for native apps. I pointed at a nearby engineer and said “Don’t you have an MMU and process isolation on the iPhone now?” He had a wide eyed look of don’t-bring-me-into-this, but I eventually got a “yes” out of him.

I said that OS X was surely being used for things that were more security critical than a phone, and if Apple couldn’t provide enough security there, they had bigger problems. He came back with a snide “You’re a smart guy John, why don’t you write a new OS?” At the time, my thought was, “frak you, Steve.”

People were backing away from us. If Steve was mad, Apple employees didn’t want him to associate the sight of them with the experience. Afterwards, one of the execs assured me that “Steve appreciates vigorous conversation”.

Still deeply disappointed about it, I made some comments that got picked up by the press. Steve didn’t appreciate that.

The Steve Jobs “hero / shithead” rollercoaster was real, and after riding high for a long time, I was now on the down side. Someone told me that Steve explicitly instructed them not to give me access to the early iPhone SDK when it finally was ready.

I wound up writing several successful iPhone apps on the side (all of which are now gone due to dropping 32 bit support, which saddens me), and I had many strong allies inside Apple, but I was on the outs with Steve.

The last iOS product I worked on was Rage for iOS, which I thought set a new bar for visual richness on mobile, and also supported some brand new features like TV out. I heard that it was well received inside Apple.

I was debriefing the team after the launch when I got a call. I was busy, so I declined it. A few minutes later someone came in and said that Steve was going to call me. Oops.

Everyone had a chuckle about me “hanging up on Steve Jobs”, but that turned out to be my last interaction with him.

As the public story of his failing health progressed, I started several emails to try to say something meaningful and positive to part on, but I never got through them, and I regret it.

I corroborate many of the negative character traits that he was infamous for, but elements of the path that led to where I am today were contingent on the dents he left in the universe.

Hi IMG, hope you don't mind me creating a new thread for this, but a lot of people have been asking for an update, and now that we have one I want to make sure it reaches the people who asked.

We know you are hotly anticipating Deus Ex: Mankind Divided on macOS. So we are pleased to confirm that not only are we now in the final stages of development, but also that Deus Ex: Mankind Divided will be enhanced by Apple’s Metal 2 graphics technology.

Please be aware that despite our best efforts, we couldn’t get Mankind Divided to run well enough on Intel and Nvidia graphics cards. Therefore, the initial release will support AMD graphics cards only.

The game is coming to the Feral Store and Steam. Full system requirements and a release date will be revealed closer to launch.