I’d like you to be right, but recent iOS devices are holding out pretty well with their Secure Enclaves, and as far as I know the PS4 is still secure against homebrew. DRM is extremely difficult, but it does seem to be possible, at least for vendors with full control of both the hardware and software stacks.

It’s not clear from the publicly available information whether these hacks actually allow the iOS devices to load and run third-party hacked versions of iOS (which I would call a complete break of their DRM) or just allow the hackers to retrieve a full copy of the device’s contents. I’d certainly grant that the latter counts as a major breach of the DRM even if it falls short of total pwnage of the system security in the vein of the PS3 hacks.

It seems strange that Khronos should focus on mGPU, since basically nobody cares about it in DX12.

You should rephrase that. No one really cares about DX12 in general, especially if you consider non-graphics applications.

Multi-GPU access is very useful for GPGPU tasks, I'd say way more than for gaming, since using multiple GPUs for gaming is often major overkill.

I think the big advantage for gaming would be if any gains could be seen on laptops that have discrete (but not top-of-the-line) GPUs. Being able to gain a few extra FPS by employing both the integrated and discrete GPUs simultaneously would be pretty awesome. Not sure if that would bear out in reality, but it's something to daydream about.

IIRC one of the use cases proposed was moving post-processing effects over to the integrated GPU. This was demonstrated in 2015 but I'm not sure if it has been implemented in actual games at all.
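For anyone curious what the first step of that kind of split looks like in code, here's a minimal, hypothetical Vulkan sketch (my own illustration, not from the 2015 demo) that just enumerates the physical devices and picks out an integrated and a discrete GPU; an engine would then create a device on each and push its post-processing passes to the integrated one:

    // Hypothetical sketch: find an integrated and a discrete GPU with Vulkan.
    // Error handling is minimal; assumes a Vulkan 1.1 loader is installed.
    #include <vulkan/vulkan.h>
    #include <cstdio>
    #include <vector>

    int main() {
        VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
        app.apiVersion = VK_API_VERSION_1_1;
        VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
        ici.pApplicationInfo = &app;
        VkInstance instance = VK_NULL_HANDLE;
        if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, nullptr);
        std::vector<VkPhysicalDevice> gpus(count);
        vkEnumeratePhysicalDevices(instance, &count, gpus.data());

        VkPhysicalDevice integrated = VK_NULL_HANDLE, discrete = VK_NULL_HANDLE;
        for (VkPhysicalDevice gpu : gpus) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(gpu, &props);
            printf("found: %s\n", props.deviceName);
            if (props.deviceType == VK_PHYSICAL_DEVICE_TYPE_INTEGRATED_GPU) integrated = gpu;
            if (props.deviceType == VK_PHYSICAL_DEVICE_TYPE_DISCRETE_GPU)   discrete  = gpu;
        }
        // With both handles, an engine could create one VkDevice per GPU and
        // submit its post-processing work to the integrated one.
        vkDestroyInstance(instance, nullptr);
        return 0;
    }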

This extra capability is interesting because, though multiple discrete GPUs are still relatively uncommon and are usually found only in the most expensive gaming systems, it's very common for systems to have one discrete GPU and one integrated GPU.

If coding this into games is pretty much seamless, it's going to make the Ryzen APUs even more attractive.

DirectX 12 supports it, but I haven't seen it in the wild yet. Vulkan still requires the GPUs to be similar for mGPU.
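To make the "similar GPUs" restriction concrete: in Vulkan 1.1, explicit mGPU goes through device groups, and the driver only reports GPUs it considers compatible (typically identical cards in an SLI/Crossfire-style link) as one group, so a discrete + integrated pair normally shows up as two single-member groups. A minimal, hypothetical sketch of the enumeration:

    // Hypothetical sketch: list Vulkan 1.1 device groups.
    #include <vulkan/vulkan.h>
    #include <cstdio>
    #include <vector>

    int main() {
        VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
        app.apiVersion = VK_API_VERSION_1_1;
        VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
        ici.pApplicationInfo = &app;
        VkInstance instance = VK_NULL_HANDLE;
        if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

        uint32_t groupCount = 0;
        vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
        std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
        for (auto& g : groups) g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

        for (uint32_t i = 0; i < groupCount; ++i)
            printf("group %u: %u physical device(s)\n", i, groups[i].physicalDeviceCount);
        // A single VkDevice spanning a whole group is created by chaining
        // VkDeviceGroupDeviceCreateInfo into VkDeviceCreateInfo::pNext.
        vkDestroyInstance(instance, nullptr);
        return 0;
    }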

And now it's facing a worthy opponent. The lock-in approach will eventually die out, the same as happened with the Web and the multiple lock-ins of the past (ActiveX, Silverlight, Flash, you name it). But obviously MS, Apple and Sony will try to resist.

It's only facing a worthy opponent if Vulkan is adopted, and right now Vulkan adoption is not good. Basically the conundrum is that game developers are increasingly using off-the-shelf engines, and at that point the backend doesn't really matter much.

I think there's only a handful of games that even use it? The Talos Principle is one, if I remember correctly. The level of DX lock-in in the industry is significant, and probably isn't going away anytime soon, at least for AAA titles. Indies are more likely to support Linux in general.

While it's not exactly common, there are some AAA titles too. Check out Doom; the results are pretty great.

And now it's facing a worthy opponent. The lock-in approach will eventually die out, the same as happened with the Web and the multiple lock-ins of the past (ActiveX, Silverlight, Flash, you name it). But obviously MS, Apple and Sony will try to resist.

I believe you're naive. Apple, the poster child of the lock-in approach you deride, is extremely successful. Lock-in is here to stay. Even the web is pseudo-locked-in by the monoculture of the WebKit engine. CUDA trumps OpenCL, DirectX is thriving across Xbox/Windows, Metal on iOS/macOS, G-Sync on displays, etc.

And now it's facing a worthy opponent. The lock-in approach will eventually die out, the same as happened with the Web and the multiple lock-ins of the past (ActiveX, Silverlight, Flash, you name it). But obviously MS, Apple and Sony will try to resist.

I believe you're naive. Apple, the poster child of the lock-in approach you deride, is extremely successful. Lock-in is here to stay. Even the web is pseudo-locked-in by the monoculture of the WebKit engine. CUDA trumps OpenCL, DirectX is thriving across Xbox/Windows, Metal on iOS/macOS, G-Sync on displays, etc.

I am really, really hoping that HDMI 2.1 VRR finally kills off G-Sync in favour of an open standard for dynamic refresh rate, so that I no longer have to worry about display choice constraining GPU choice and vice versa. I’m holding off on a new TV until there are OLEDs with HDMI 2.1 VRR support for that reason. I’m hoping that while Nvidia has massive sway in the computer monitor market, they won’t be able to force the TV market to bow to the proprietary nature of G-Sync.

DRM is broken by design and cannot be fixed with any computer or API. This is just more broken "magic" sprinkled on top to fool CEOs and PHBs into thinking it is supported.

An extreme example, but Newtonian physics is "broken" - we are in a relativistic world and you can never get Newtonian physics models to match the world perfectly. But it works well enough that even a dog can catch a ball.

It's not that DRM has to work perfectly. It has to make it hard enough for the normal non-technical user to pirate that it's easier to just pay money. Whether it hurts the user in other ways (I too remember when Sony shipped a rootkit) is kinda immaterial.

It seems strange that Khronos should focus on mGPU, since basically nobody cares about it in DX12.

You should rephrase that. No one really cares about DX12 in general, especially if you consider non-graphics applications.

Multi-GPU access is very useful for GPGPU tasks, I'd say way more than for gaming, since using multiple GPUs for gaming is often major overkill.

I think the big advantage for gaming would be if any gains could be seen on laptops that have discrete (but not top-of-the-line) GPUs. Being able to gain a few extra FPS by employing both the integrated and discrete GPUs simultaneously would be pretty awesome. Not sure if that would bear out in reality, but it's something to daydream about.

It doesn't matter what we think it's good for. The same developers that for years couldn't be bothered to produce OpenGL/DX-conformant code, forcing both Nvidia and AMD to provide workarounds in application profiles, are now uninterested in doing the legwork to implement mGPU (surprise, surprise). This was something that used to be done by drivers behind the curtain (Crossfire/SLI), and now it needs to be done on a per-title, or at least per-engine, basis.

DRM is broken by design and cannot be fixed with any computer or API. This is just more broken "magic" sprinkled on top to fool CEOs and PHBs into thinking it is supported.

An extreme example, but Newtonian physics is "broken" - we are in a relativistic world and you can never get Newtonian physics models to match the world perfectly. But it works well enough that even a dog can catch a ball.

It's not that DRM has to work perfectly. It has to make it hard enough for the normal non-technical user to pirate that it's easier to just pay money. Whether it hurts the user in other ways (I too remember when Sony shipped a rootkit) is kinda immaterial.

As a more closely related example, encryption is broken by design*). Sure, brute-forcing may take so much time and/or so many resources that it's not worth the effort, but any encryption can be broken. The same goes for DRM - if it just deters enough users from trying to break it, then it serves its intended function. This is not to say I'm a fan of DRM by any measure, but "broken by design" isn't really correct.

*) In before the OTP lunatics chime in - yeah, in theory OTPs are secure. But then we're left with the problem of key exchange.
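To put rough numbers on the brute-forcing point above (an illustrative back-of-envelope only, assuming a hypothetical attacker testing 10^18 keys per second against a 128-bit key):

    \frac{2^{128}\ \text{keys}}{10^{18}\ \text{keys/s}} \approx 3.4\times10^{20}\ \text{s} \approx 1.1\times10^{13}\ \text{years}

So "breakable in principle" and "worth breaking" are very different things, which is exactly the bar DRM aims for too.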

Apple, the poster child of the lock-in approach you deride, is extremely successful.

Their success doesn't translate into interest in their own platforms. macOS has been stagnating for years (according to macOS users themselves; I'm not using it myself). And mediocre hardware doesn't really translate into a good gaming experience.

To put it simply, Apple doesn't care about gaming; that's why they don't pay attention to the lock-in actually hurting them here. When they care, they drop the lock-in. Look at the recent example of Apple joining the Alliance for Open Media to back AV1, despite their recalcitrant insistence on closed codecs until recently.

Why such a sudden change? Because they figured out that they care about video, and that lock-in would hurt them in a big way.

Can I delegate 100% of the work to one GPU and only the video output to another? I ask because I have a nice HD CRT monitor I would like to plug back in, and that seems like a good way to get around the lack of analog output on newer cards.

The dream is to be free of DirectX. That is the only thing that keeps PC gaming locked in orbit around Windows.

Well, that and the fact that Linux is a rounding error in the consumer space. Most regular people aren’t interested in an OS that looks like a third-rate Windows or Mac OS clone, or the “check the forums” support model.

The dream is to be free of DirectX. That is the only thing that keeps PC gaming locked in orbit around Windows.

Well, that and the fact that Linux is a rounding error in the consumer space. Most regular people aren’t interested in an OS that looks like a third-rate Windows or Mac OS clone, or the “check the forums” support model.

Yeah. And I'd add that PC gamers, even though they are a smart group of folks, don't want to get a Ph.D. in computer science to run Linux, nor do they wish to spend twice the money for half the power for gaming on a Mac.

Yeah. And I'd add that PC gamers, even though they are a smart group of folks, don't want to get a Ph.D. in computer science to run Linux, nor do they wish to spend twice the money for half the power for gaming on a Mac.

Your assessment is quite late to the party. It's not the '90s or '00s anymore, when Linux gaming was rather rough. Today gaming on Linux is pretty easy. Sure, it's a smaller market, but a growing one, and PC gamers use it increasingly.

nor do they wish to spend twice the money for half the power for gaming on a Mac.

Mac hardware is nowhere near what you can get for Linux, which is basically any high-end hardware that demanding games would want. So macOS isn't even in the same league. Plus, forget about using Wine on it. Lots of macOS users switch to Linux because they can already use Wine to run DX11 games, but tough luck doing that with the outdated OpenGL 4.1 that macOS is stuck with.

Gaming will come to Mac, right after Apple starts making hardware that isn't laughably underpowered.

I'd say that traditionally, the MacBook family has had better graphics hardware than /most/ budget laptops. But these days, you can get 1050 and 1050 Ti cards in $700-800 laptops and 1060 cards in $900-1200* laptops.

But Apple still suffers because of their insistence on Metal (which makes sense for iOS/macOS synergy) instead of more common APIs like OpenGL and Vulkan. So it's a lot of work for developers to support macOS, for a small subset of the market (macOS users make up only 1.33% of Steam according to the hardware survey, with Linux even further behind at 0.28%).

Gaming will come to Mac, right after Apple starts making hardware that isn't laughably underpowered.

I'd say that traditionally, the MacBook family has had better graphics hardware than /most/ budget laptops. But these days, you can get 1050 and 1050 Ti cards in $700-800 laptops and 1060 cards in $900-1200* laptops.

But Apple still suffers because of their insistence on Metal (which makes sense for iOS/macOS synergy) instead of more common APIs like OpenGL and Vulkan. So it's a lot of work for developers to support macOS, for a small subset of the market (macOS users make up only 1.33% of Steam according to the hardware survey, with Linux even further behind at 0.28%).

Macs are not designed for gaming, nor are they marketed to that demographic. A person who is in the market for a cheap gaming laptop will never look at a MacBook, nor would someone who is looking for a premium productivity laptop look at a gaming laptop. Buying a Mac with the intention of running any semi-modern game at a decent resolution is like buying a Ferrari to tow trailers.

And now it's facing a worthy opponent. The lock-in approach will eventually die out, the same as happened with the Web and the multiple lock-ins of the past (ActiveX, Silverlight, Flash, you name it). But obviously MS, Apple and Sony will try to resist.

One can hope; macOS's paltry support for OpenGL and non-existent support for Vulkan is an endless source of frustration for me.

Why? I can understand OpenGL, though anything relevant that needs OpenGL access (like CG apps, etc.) works fine. As for Vulkan, nothing actually uses it, so it shouldn't even affect you as a Mac user anyway.

And now it's facing a worthy opponent. The lock-in approach will eventually die out, the same as happened with the Web and the multiple lock-ins of the past (ActiveX, Silverlight, Flash, you name it). But obviously MS, Apple and Sony will try to resist.

One can hope; macOS's paltry support for OpenGL and non-existent support for Vulkan is an endless source of frustration for me.

Why? I can understand OpenGL, though anything relevant that needs OpenGL access (like CG apps, etc.) works fine. As for Vulkan, nothing actually uses it, so it shouldn't even affect you as a Mac user anyway.

Who says they're a user? As a software developer, I find Apple's lack of support for OpenGL a source of frustration (since software that works on every other platform can't work on Apple without jumping through their hoops for 3D rendering). And I haven't owned an Apple product besides an iPod.

I'd say that traditionally, the MacBook family has had better graphics hardware than /most/ budget laptops.

You really need to stretch that argument for it to be valid.

Apple hasn't shipped dGPUs as a 'standard' for a long time; you've needed a mid-range 15'' machine or above for a dGPU to even be an option.

Otherwise you're limited to Intel's standard iGPU that's found in most laptops, or an Iris iGPU (which remains inferior to even the weakest dGPU).

I've read that Apple pushed Intel to beef up their iGPUs, leading to the Iris and Iris Pro parts, and they also tended to avoid the lower-EU SKUs in their lineup.

However, I did say "traditionally". I don't think they have had any GPU advantage over mainstream Windows laptops for a while now, especially given their decision to snub Nvidia and go exclusively Intel/AMD on GPUs.

Now, the iPhones and their A10/A11 chips still have pretty good GPUs, but ARM, Qualcomm, Samsung, etc. are not standing still when it comes to GPUs, either.

As for Linux, Nvidia, AMD, and Intel have same-day drivers. There are even same-day patches for the (unofficial) RADV Vulkan driver for AMD GPUs.

On Linux there are also way more games with Vulkan, and some are even Vulkan-only.

Game developers get a bonus: all those drivers except Nvidia's are open source, so debugging into and through the driver is easy. Going deeper through the OS, all the way down to the hardware, is easy too, so devs can know exactly what's happening.

On Windows, only the biggest teams get access to driver source code. macOS is totally hostile.

Yet. Games need good QA, and Vulkan as a second API on a given platform doubles that cost, so that's the main showstopper. If it can be the single API, it will be a huge win for game development.

I saw that news, but this is honestly not an apples-to-apples comparison. OpenGL, especially Apple's implementation of it, is obviously going to be slower. A better comparison is between the Vulkan wrapper and a game written directly on Metal. If the performance difference is small, that would give devs the peace of mind to target only Vulkan instead of needing to port to Metal as well. Otherwise, you will still be incentivized to port to Metal to extract those last bits of performance.

All great news... now we just need developers, especially the ones making popular engines, to fully adopt Vulkan and DX12 and leave OpenGL, DX11, and prior versions behind.

We see a lot of games with DX12 "features", but they are just a few function calls running on top of a DX11 engine. We have seen what happens when developers truly use these new APIs to full effect, and it's time to move on.

I understand why they are still holding back and supporting the older APIs. Not everyone has a GPU that supports the full feature set needed by Vulkan and DX12. Nvidia especially has poor DX12 driver support, but that could change very quickly.

Once developers make the shift, people with $200 GPUs will be seeing performance you only get from $600 GPUs right now... and no, I'm not factoring current GPU price inflation into that; I'm pretending the mining craze never happened.

Imagine being able to play a modern high-end game at 4K60 maxed out on a $200 GPU, or full-fat VR at a proper 90 Hz with supersampling enabled so the image is free of jaggies and pristine.

These APIs can give us that. I remember when I had an AMD 290X and I tested AMD's Mantle API on Thief. At 1440p maxed out, I went from about 60 fps under DX11 to well over 140 when Mantle was enabled, and Mantle was just a test to show how well a new low-level API could work. Vulkan and DX12 are far beyond Mantle's capabilities.

Let's make this happen, devs. Lower the barrier for high-end gaming and you will sell more games. Hardware vendors will benefit as well, because PC gaming will become more affordable, so more people will join in. Things like VR will take off at a faster pace because the cost of the PC needed will be cut in half, and thanks to higher sales volume you will be able to lower the prices of HMDs and the controllers that go with them.

Changing over to something as simple as a new performance-enhancing API could lead to a new golden age in PC gaming, all because of the walls it tears down.

This extra capability is interesting because, though multiple discrete GPUs are still relatively uncommon and are usually found only in the most expensive gaming systems, it's very common for systems to have one discrete GPU and one integrated GPU.

If coding this into games is pretty much seamless, it's going to make the Ryzen APUs even more attractive.

DirectX 12 supports it, but I haven't seen it in the wild yet. Vulkan still requires the GPUs to be similar for mGPU.

Gears of War 4 supports mGPU, but I'm not sure which of the three modes. Well... it supports the mode that is basically the same as SLI/Crossfire, but not mixing of different discrete GPUs.

There is nothing preventing developers from using any of the modes, and the most commonly cited reason for not supporting them is that "it's hard".

DX11 and OpenGL (combined with the drivers) essentially handled all the hard work for developers, but since the newer APIs are more "to the metal", the developers are responsible for it.

Since they could have each GPU render part of the scene and keep different things in VRAM, it could provide a massive boost in performance. Unfortunately, because it's difficult to do, nobody is actually doing it. That's a shame, because for the first time in the history of mGPU support it's actually worth having more than one GPU, and developers have decided not to take advantage.
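For a sense of the legwork now on the developer's side, here's a minimal, hypothetical D3D12 sketch (my own illustration, not from any of the games mentioned) of the linked-node path: query how many GPU nodes the device exposes and create one direct command queue per node, after which splitting the frame and the VRAM contents between GPUs is entirely up to the application:

    // Hypothetical sketch: one D3D12 command queue per linked GPU node.
    // Build on Windows and link against d3d12.lib.
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            return 1;

        UINT nodeCount = device->GetNodeCount();  // >1 only on linked (SLI/Crossfire-style) setups
        printf("GPU nodes: %u\n", nodeCount);

        for (UINT node = 0; node < nodeCount; ++node) {
            D3D12_COMMAND_QUEUE_DESC desc = {};
            desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
            desc.NodeMask = 1u << node;           // address a single GPU in the link
            ComPtr<ID3D12CommandQueue> queue;
            if (SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue))))
                printf("created a direct queue on node %u\n", node);
            // Command lists and resources also take node masks, which is how
            // each GPU can end up rendering its own part of the scene with
            // its own VRAM contents.
        }
        return 0;
    }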

Perhaps when the next round of major engines like UE5 is available we will finally see support, but who knows. Every major engine in use today, with the exception of id Tech 6, was designed for DX11, not DX12 or Vulkan. If they have DX12 support, it's almost always a DX11 engine with some DX12 function calls. Games like Hitman, Tomb Raider, and Quantum Break are good examples. Even Doom and Wolfenstein 2 weren't designed from day one for Vulkan, but they did manage to use some of its more powerful features to improve performance significantly.

Just imagine if they were designed exclusively for Vulkan and there were zero remnants of older OpenGL code in there. Same goes for Tomb Raider... or Hitman with all its AI on screen at once. Assassin's Creed would benefit massively, as would R6 Siege with all its physics calculations, or Ghost Recon Wildlands, or any game with a massive open world. I had really hoped the new Far Cry would be Vulkan- or DX12-only.

The sooner we ditch DX11 and other older APIs, the better off everyone playing on PC is. As long as your GPU is four years old or newer you should be just fine; in fact, you should see a massive performance boost. On my old 290X I went from roughly a 70 fps average in Doom at 1440p maxed to a 137 fps average.

Yeah. And I'd add that PC gamers, even though they are a smart group of folks, don't want to get a Ph.D. in computer science to run Linux, nor do they wish to spend twice the money for half the power for gaming on a Mac.

Your assessment is quite late to the party. It's not the '90s or '00s anymore, when Linux gaming was rather rough. Today gaming on Linux is pretty easy. Sure, it's a smaller market, but a growing one, and PC gamers use it increasingly.

nor do they wish to spend twice the money for half the power for gaming on a Mac.

Mac hardware is nowhere near what you can get for Linux, which is basically any high-end hardware that demanding games would want. So macOS isn't even in the same league. Plus, forget about using Wine on it. Lots of macOS users switch to Linux because they can already use Wine to run DX11 games, but tough luck doing that with the outdated OpenGL 4.1 that macOS is stuck with.

Yeah, installing Windows from scratch is a huge pain in the ass. It's gotten less bad in recent years, but some people seem to think that it's some magical, user-friendly wonderland because an OEM takes care of the hard part for you.

For the past 10+ years, installing most Linux distributions on all but the most exotic hardware has been a cakewalk.

Yeah. And I'd add that PC gamers, even though they are a smart group of folks, don't want to get a Ph.D. in computer science to run Linux, nor do they wish to spend twice the money for half the power for gaming on a Mac.

Your assessment is quite late to the party. It's not the '90s or '00s anymore, when Linux gaming was rather rough. Today gaming on Linux is pretty easy. Sure, it's a smaller market, but a growing one, and PC gamers use it increasingly.

nor do they wish to spend twice the money for half the power for gaming on a Mac.

Mac hardware is nowhere near what you can get for Linux, which is basically any high-end hardware that demanding games would want. So macOS isn't even in the same league. Plus, forget about using Wine on it. Lots of macOS users switch to Linux because they can already use Wine to run DX11 games, but tough luck doing that with the outdated OpenGL 4.1 that macOS is stuck with.

Yeah, installing Windows from scratch is a huge pain in the ass. It's gotten less bad in recent years, but some people seem to think that it's some magical, user-friendly wonderland because an OEM takes care of the hard part for you.

For the past 10+ years, installing most Linux distributions on all but the most exotic hardware has been a cakewalk.

I second that. As I work in smaller software shops, installing an OS is a common event.

Linux? Install from USB, run the automatic update. Done. Windows? Install from USB. Make sure AV is enabled. Update the AV. Enable updates for all MS products. Run the automatic update. Install GPU drivers (because MS won't ship OpenGL...), check what other drivers need to be updated on the OEM website. Check if the network printer is detected, and configure it if needed.