NVIDIA and AMD Fight over NVIDIA GameWorks Program: Devil in the Details

The AMD Argument

Earlier this week, a story was posted on a Forbes.com blog that dove into NVIDIA GameWorks and the idea that it was doing a disservice not just to the latest Ubisoft title Watch_Dogs but to PC gamers in general. Using quotes from AMD directly, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:

Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.

The example cited in the Forbes story is the recently released Watch_Dogs title, which appears to show favoritism towards NVIDIA GPUs, with the performance of the GTX 770 ($369) coming close to the performance of a Radeon R9 290X ($549).

It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.

Watch_Dogs is the latest GameWorks title released this week.

I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise, AMD was just as forward with me as they appeared to be in the Forbes story originally.

The AMD Stance

Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA-built engine functions into libraries that can be utilized and accessed by game developers to build advanced features into games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” and includes tutorials and tools to help quickly generate content with the software set. Included in the GameWorks suite are tools like VisualFX, which offers rendering solutions such as HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks and more. Physics tools include the obvious, like PhysX, while also adding clothing, destruction, particles and more.

However, there are some major differences between GameWorks and previous vendor-supplied code and software examples. First, AMD and several game developers claim that GameWorks is a “black box,” with only API calls used to access the GameWorks functionality. The term “black box” indicates that little is known about what is going on inside the GameWorks libraries themselves, because GameWorks is provided as a set of libraries, not as a collection of example code (though NVIDIA disputes this later in our story). Because of this black box status, game developers are unable to diagnose buggy or slow game code when using GameWorks, and that can lead to issues on different hardware.
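As a loose analogy (hypothetical: GameWorks itself is C++ middleware, and the function name below is invented), the difference between published sample code and a binary-only library is the difference between a routine whose source you can read and one you can only call. Python's `inspect` module makes the contrast concrete:

```python
# A rough analogy for the "black box" distinction. A vendor *code sample*
# is like a plain Python function: its source can be read, profiled, and
# re-tuned for different hardware. A *binary library* is like a
# C-implemented builtin: callable, but opaque.
import inspect

def sample_ambient_occlusion(radius: float) -> str:
    """Stand-in for published example code a developer could modify."""
    return f"AO pass, radius={radius}"

# Source of the "sample" is fully visible to whoever integrates it:
print("def sample_ambient_occlusion" in inspect.getsource(sample_ambient_occlusion))

# A compiled routine exposes only its call interface; asking for its
# source fails, which is roughly the position AMD describes being in:
try:
    inspect.getsource(len)
except TypeError:
    print("opaque: only the API call is visible")
```

The first check prints `True`; the second prints the fallback message, since `inspect.getsource` raises `TypeError` for built-in (compiled) routines.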

A section of AMD's developer website with example code for download.

You might already be wondering how this is different from something like PhysX. Looking at GPU-accelerated PhysX only, that particular plugin ONLY runs on NVIDIA hardware; adding it or changing its implementation does not negatively affect the performance of the AMD or non-PhysX code path. Many of the GameWorks toolsets (basically everything except PhysX and TXAA), though, do in fact run on both AMD and NVIDIA hardware if they are implemented by the developer. That means visual effects and code built directly by NVIDIA are being used on AMD GPUs in GameWorks-enabled titles like Watch_Dogs. You can see immediately why this could raise some eyebrows inside AMD and amongst the most suspicious gamers.

This is different from what has been the norm for many years. In the past, both AMD and NVIDIA have posted code examples on their websites to demonstrate new ways of coding shadows, ambient occlusion and other rendering techniques. These could be viewed, edited and lifted by any and all game developers and implemented into their game engines or into middleware applications. GameWorks takes this quite a bit further by essentially building out a middleware application of its own and licensing it to developers.

The obvious concern is that by distributing GameWorks in this “black box” style, NVIDIA could take the opportunity to artificially deflate the performance of AMD graphics cards in favor of GeForce options. That would be bad for AMD, bad for AMD users and bad for the community as a whole; I think we can all agree on that. AMD points to Watch_Dogs, and previous GameWorks titles Batman: Arkham Origins, Call of Duty: Ghosts and Assassin’s Creed IV: Black Flag, as evidence. More interesting, though, AMD was able to cite a specific comparison between its own TressFX hair library and HairWorks, part of the GameWorks library set. When using TressFX, both AMD and NVIDIA hardware perform nearly identically, even though the code was built and developed by AMD. Using HairWorks, according to numbers that AMD provided, AMD hardware performs 6-7x slower than comparable NVIDIA hardware. AMD says that because TressFX was publicly posted and could be optimized by developers and by NVIDIA, its solution provides a better result for the gaming community as a whole. Keep in mind that this testing was done internally by AMD, not in a real-world game.

NVIDIA's updated Developer site.

AMD has made other claims to support its theory on the negative impact of GameWorks. The claim that NVIDIA’s previously available DirectX 11 code samples were removed from NVIDIA’s developer site would be damning, as it would indicate NVIDIA’s desire to move away from them to GameWorks exclusively. But, as we show on the following page, these code samples were not removed but simply relocated to a different section of NVIDIA’s developer site.

With all of this information it would be easy to see why stories like the one at Forbes found their way across the Internet. However, as I found out after talking with NVIDIA as well, there is quite a bit more to the story.

Ummm... you mean... $159? Yep, that'll buy you a brand new Gold-rated 1300 watt EVGA SuperNova power supply on Newegg RIGHT NOW after a small $35 rebate! Even a 1500 watt brand-name Silver-rated PSU can be had for just $299... I suppose you COULD spend $449 if you wanted to, but why?

Also note it's nearly two times the FirePro S10000, which sits at 1.48 TFLOPS DP with 6GB and also costs $3000. Again, it takes two of those to catch the Titan Z, which also draws far fewer watts. The Titan Z is for DP people who happen to game on the side, not for gamers who have no idea what DP is for.

The fanboyism is off the charts.
If you understood how Adaptive-Sync (what FreeSync has become) and G-Sync work, you'd realise how wrong you are. The very fact that VESA has adopted Adaptive-Sync as part of the DisplayPort spec shows how important that little FreeSync demo was. AMD came along with the tongue-in-cheek name FreeSync because it is technology that already existed and can be implemented independent of the GPU, while NVidia was touting its version as completely revolutionary (when it already existed in notebooks as a power-saving feature) and requiring proprietary hardware on particular graphics cards.

Not a fanboy, just someone who knows how to objectively analyse what's happening, unlike some people.

I believe what people want to see most is how FreeSync runs real games. I'm not taking sides here. When NVIDIA came up with G-Sync, they showed real demos running real games; that's how they convinced people the tech was real. For AMD's part, they only showed that windmill video. Yes, no external monitor exists yet that can run FreeSync, but why can't AMD show real games under FreeSync using the same demo unit they showed us? We want to know how AMD's implementation will deal with screen tearing and input latency at the same time.

Except you should realize VESA is not just another open-standards group. It's the body that writes ALL display standards. Exclude all VESA standards and none of your displays, from PC to phone, will work. That's why FreeSync getting in is a huge thing. It's optional in DP 1.2a, which probably means it will be mandatory in DP 1.3.

Let me know when AMD gets around the SCALER problem that NV ran into, which caused them to make this nasty, terribly expensive FIX called G-Sync. They stated they made it because scalers couldn't do the job. That's so terrible of them, isn't it? I mean, what kind of company goes around the jerks holding up the train? Oh, the good kind. Did you complain when AMD came up with AMD64 when Intel wouldn't until forced? We'd probably be on the Itanic right now if they hadn't.

FreeSync will cost scaler makers R&D, and then it becomes, ummm, ChargeSync? NOT-so-FREESync? Whatever, and that's if they can get them to do it at all. Not to mention we have to see it work, whereas G-Sync is pretty proven tech and everyone liked it. They won't do the R&D for free and will pass whatever costs they incur directly to end users, whether AMD likes that or not. Only fools work for free while trying to run a business. I'll be shocked if they don't put a premium on it while they can, too. No different from blowing up Radeon pricing over mining. Nobody can be forced to use the new VESA sticker on their monitors. They can just ignore it and use G-Sync or nothing.

The fact that AOC, Philips, ViewSonic, BenQ, Asus and Acer are all coming out with G-Sync models means something too. They realize FreeSync is likely a year away or more, and only if scalers up their game; it isn't a known commodity versus what they can sell RIGHT NOW with G-Sync for the next year. Only Samsung/LG are really left of the big names. I call that WIDE support from everyone. Did you read the PCPer article? I'm pretty sure he and everyone else pointed out the scaler issue NV ran into and that AMD will have to convince the scaler makers, while NV apparently couldn't. Maybe AMD will have more luck now that they see G-Sync can steal their scaler sales ;) But again, that should mean we thank NV, not chastise them.

That was certainly true in the past, but since the end of 2013 their drivers have been really great and frequent: multi-GPU scaling, optimisation, frame-pacing fixes, and now multi-resolution support for Eyefinity. That was the one reason I didn't get Eyefinity: I had the monitors and couldn't use them, and getting two others was money spent for no reason. Now that I've given away the old monitors, the driver solves the issue xD.
So I know that many people are in the same position and would benefit a lot from this Eyefinity update.

The fact is AMD drivers were crap up to Q3 of 2014; the other fact is AMD drivers have been much better for the last 8 months or so.
As I said above: more frequent releases, better features, better scaling, faster fixes and optimisations.
So I don't think they are still small and underfunded. Something probably happened, and it's probably linked to the success of Mantle.

I don't think it has much to do with Mantle; in fact, Mantle can't yet be called a "success." It's not used by enough developers and it hasn't had enough time to really mature. With that said, Mantle has certainly changed the API paradigm for the better, and we're hearing rumblings out of Redmond, WA: AMD managed to wake the lumbering giant for a (more than likely) expedited release of DX12. I think it has more to do with the fact that AMD is really at the top of their game right now, realizing that if they don't push the limits and really challenge their competitors, they won't survive. I'm really happy with how far AMD has come in the past year; it's been very exciting to watch AMD deliver drivers that are arguably better than Nvidia's in many areas. Personally, I find "GeForce Experience" to be a terrible application for overclocking. I much prefer AMD's overclocking implementation.

You know nothing about Nvidia's drivers, obviously. GeForce Experience does no overclocking whatsoever. It is a really great application that automatically updates the graphics driver when a new one comes out, which is at least once a month. It also recommends each game's graphics settings based on your computer hardware so that you get the best experience possible, and it controls ShadowPlay and the lighting effects of the card itself. You had no idea what GeForce Experience was, did you?

I don't really think that's true, actually... AMD simply needed a fire lit under their asses to get into gear (re: frame rating), and they've shown just how talented they truly are by going from inoperable frame pacing to excellent frame pacing in an incredibly small amount of time, just a few months. Quite an accomplishment, really.

You are kidding, right? FreeSync was good enough that it was accepted into the VESA DisplayPort standard. Yeah, that means an AMD technology will be in every DisplayPort starting with the new revision. Basically, G-Sync is pretty much dead, unless you think paying $150 more for a proprietary chip built into a monitor is a good thing. I love Nvidia, in fact I own two 780s, but their proprietary code and technologies are crippling the PC gaming industry. Hell, their deal with Microsoft to keep pushing DirectX is one of the reasons our GPUs can only do 25% of what they are capable of.

First, let it be known that I have no preference for either Nvidia or AMD; I simply buy whatever suits my needs best at the time.

New and powerful tech? Like? You clearly have no idea how much AMD has contributed to the x86, GPU, and API industries, forcing their competitors to advance the industry rather than let it stagnate. For example, the 295X2 made such an impact that Nvidia was FORCED to POSTPONE the release of TitanZ BECAUSE of AMD right? lol What are you, like 13 years old?

Without AMD, Nvidia would have no competition and therefore almost no incentive or initiative to produce all of the innovations you probably think are so wonderful. You obviously have no clue or understanding of economics or business let alone the dynamics involved in the GPU and graphics industry. When making comments like the one above you just sound like an ignorant fanboy, which is really unflattering and embarrassing for yourself and everyone that has to read it in this forum. Maybe you should take your misplaced enthusiasm for Nvidia and disdain for AMD somewhere else where informed and educated PCper readers don't have to read your ridiculous rhetoric that in no way contributes to a constructive dialogue.

You are apparently unaware that DX12 was already being worked on for a few years, and that Mantle didn't cause this to happen; it was the natural course of action. Also, OpenGL already has the ability to cut draw-call overhead, etc. So Mantle wasn't needed, as the other two APIs were already well on the way before Mantle.

Since the Titan Z isn't aimed at gamers, I fail to see your point; we have no idea what caused the delay. You work for NV? It seems clear to me that if you're not aiming at gamers, the 295X2 wouldn't even be in your thoughts. They didn't even change the clocks after the delay; everything stayed the same. If they had raised them 100MHz or 200, maybe we could assume it was due to AMD, but they did nothing but push it off, it seems.

What did they change that shows they were FORCED to delay? ROFL at you blasting the other guy while doing the same thing you accuse him of (glorious AMD and their contributions to mankind... LOL). HardOCP just tested BF4 again (in a 770 article posted a day or two ago), and it was basically break-even. Is that all Mantle gets you? NV made the game's use of Mantle moot with a simple DX11 update. I can see why they say they don't fear it. If people don't have time to read his/her rhetoric, surely they have none to read your personal attacks on the guy either. He/she is 13 years old, you're basically calling him/her stupid (not one of the "informed and educated" people), clueless, an ignorant fanboy... ROFL.

Oh, I would love to see if you are still singing Nvidia's praises when AMD goes under and suddenly Nvidia is charging a thousand dollars for the then-equivalent of a 750 Ti

(followed by stagnation, as Nvidia, without anyone to compete against, suddenly realises they can now get away with small performance increases each generation if they tie them into some alleged new technology that all games suddenly start using)

It doesn't; the issue is still here, and Ryan doesn't seem to grasp and understand the problem.
The problem with GameWorks: even if the developer pays for the licence and has access to the code, the developer is under NDA and cannot show AMD the code so they can fix it. So in the end, NVIDIA will fix the code for the developer on AMD hardware, without AMD being able to optimise it the way they want, only the way NVIDIA or the devs want.
And if you guys don't think this is a real issue, then lol. Or should AMD pay for the licence for every effect in every game to be able to optimise it?
The implementation of GameWorks is broken.

Time well spent. Well articulated, concise, and full of pertinent information on both sides of the issue.

An EXCELLENT read. This article further supports my belief that PCPer is the premier computer/technology site online today, making strides in everything from editorials like this one to truly unbiased articles. And we can never forget you and your team's contribution to the ground-breaking introduction of a new GPU benchmarking methodology that actually led to a paradigm shift in the way GPUs are now tested. It's hard to believe that in the recent past the FRAPS metric had the final say in most GPU benchmarking.

This does sound like AMD doing what they can to create controversy to make it look like they are being unfairly treated. But they are no victim in any of this, as they have done the same thing. The TressFX thing, last I used it, wasn't even even across Nvidia/AMD cards; AMD cards had a pretty massive advantage: a 50-55% versus 90-100% load difference with it off/on. Kinda makes me wonder if AMD is so focused on Mantle that they don't want to bother with DX optimizations anymore.

1. It's not true; TressFX taxes Nvidia and AMD equally.
2. You can disable it.
That's not the case with the GameWorks library. Depending on how it's implemented and how many features are used: if you think one TressFX could be annoying, imagine 10 techs running simultaneously, each running 2% better; you end up with a 20% boost.
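For what it's worth, the arithmetic in that hypothetical can be sketched out. Whether per-effect differences add or compound depends on how the effects share frame time, so 20% is only the simple additive figure:

```python
# Checking the "10 techs, each 2% better" hypothetical from the comment
# above. If the per-effect differences simply add, the total gap is 20%;
# if each effect scales the whole frame time, the gap compounds a bit higher.
additive_gap = 10 * 0.02          # 0.20  -> 20%
compounded_gap = 1.02 ** 10 - 1   # ~0.219 -> ~21.9%
print(f"additive: {additive_gap:.1%}, compounded: {compounded_gap:.1%}")
```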

That's assuming all 10 techs (even if they were running simultaneously) are running at 100% load. I think you'd have to take a mean/average approach to reach a more accurate number, as a 20% boost would be an absolute best case for the hypothetical scenario you've come up with. Although, despite this, I believe your point has merit.

It took less than a week for Nvidia to update its drivers to run TressFX as well as it was running on AMD hardware. Many Nvidia users/fanboys were laughing after that, saying that TressFX ran better on their cards than on AMD cards.

So, if you want to tell a big fat lie, at least say something believable, or try another site, maybe the Cosmopolitan site, where people have no clue about hardware and will believe whatever you say to them.

Quote"Kinda wonder if AMD is so focus on mantle they don't want to bother with DX optimizations anymore?"

Reply: They can't do that due to the limited number of titles that will actually use Mantle. Maybe on those titles they'll be less bothered about how DX runs the game, but until then they need to iron out the kinks, so that shouldn't be any time soon.

Well Ryan, overall a good article, but it still has weird issues here and there.
Like when you said Mantle takes time from DX development, while you said in another portion that it's the job of the devs to make sure the game works correctly. It's as if AMD were asking devs to release an unfinished DX game; it's not AMD's job, it's the studio's job to manage resources and time and decide whether they can afford adding Mantle on top of DX or not.
And AMD is in competition with Nvidia; how could anyone remotely ask AMD to trust their competition to run an optimised library for them, and not use it to keep performance in check and boost it whenever they need a wonder-driver PR moment?
Personally, I think games using GameWorks should be coded for regular vendors first and then coded exclusively for Nvidia. This way users can choose to enable or disable GameWorks rather than being forced to run it, because in all honesty it's fishy.
Or even better, a developer that wants GameWorks should add Mantle too and ship 2 versions. Would be epic xD

And ALSO, I don't know if GameWorks is what made Watch Dogs so UGLY, but you need to see a video comparison of the E3 2012 build and the released one; the release is really uglier.

Quote: "And ALSO, I don't know if GameWorks is what made Watch Dogs so UGLY, but you need to see a video comparison of the E3 2012 build and the released one; the release is really uglier."

It's nothing to do with GameWorks. What happened with the promotional videos is a fairly common practice in the industry: the early peek stuff often looks a lot better than the end product, because they're using all their available resources to develop a small example of the end product for the promotional video. Those resources won't then be enough to produce the same effects across the full open-world game.

Yeah, well, the thing about Watch Dogs was its graphics; you had the feeling it might be the next-gen game, then with the release it turned out to be another game with washed-out colors and very common graphics. I was disappointed.
And I was expecting an immersive game with a good scenario, but since they delayed it, they added some weird crap, like challenges and a weird online mode that stripped the game of its immersion. What the hell is a robot spider doing there... come on.
Overall this game was a disappointment for me. Now I fear the same for Tom Clancy's The Division; it's another Ubisoft game that looked stunning and promising at E3, was also delayed, and will probably end up a crappy game at release.
I really think it's Ubisoft that forces the studios to do stupid stuff in their games, and for that I don't like them, and with GameWorks I like them even less.

Here is a video comparison between E3 2012 and the release version, PC Ultra settings, 1080p.
Look at the weather, the lighting, the colors, the fog, the reflections, the details. Damn, even look at the physics: in the E3 version the jacket moves well with the character's movement and the wind; in the release it's almost still. And the cherry on the cake: look at the stutter at second 0:41.

I mean, nothing in the release is like the footage; the game has been degraded dramatically. Seriously, can a player sue Ubisoft for false marketing over this? Because I would love to see a class-action suit against Ubisoft for misleading and scamming players over and over again.

Yeah, AMD is definitely behaving immaturely, but it is somewhat fishy (not saying anything was done purposely) that Watch Dogs runs so poorly on AMD hardware. They obviously optimized it the most for Nvidia and probably completely ignored AMD. Can't blame Nvidia so much for that, though. Still sucks to see PC gaming going this way.

As for the gameworks thing it's obviously nvidia going for a cash-grab. Don't know how much it costs or whether nvidia would just outright deny AMD a license to the source code. In the end, it's likely bad for small budget developers. Sad to see stuff like this pop up just as game engines are suddenly becoming affordable.

Also is the big money still in console game sales? Also sad to hear...

If it's like the whole PhysX thing: AMD refused to license it, and then came out and did this. Not long after that, AMD whined about not being able to use PhysX on their cards, but never mentioned that they could have licensed it and chose not to.

You are forgetting something about PhysX. With PhysX, Nvidia wants the right to have an opinion about what hardware you are using: if there is anything in your PC case that doesn't say "Nvidia", PhysX support will be disabled.
In the case of PhysX, Nvidia is punishing its own customers if they are not loyal enough.

We are talking about a company that will not hesitate to be unethical if this means getting an (unfair) advantage over the competition.

Um, unethical? Seriously? Creating proprietary APIs is unethical? Buying the rights to a technology (PhysX) and wanting to license it is unethical? Well geez, guys, how about every company just close up shop, because apparently breathing is unethical now. Give me a break. It's not an unfair advantage, it's called being competitive. Besides, why is PhysX still being brought up? How many games actually ended up using it in any meaningful manner? Not many. If I had to guess, I would say that consumers didn't buy into it enough for developers to justify implementing it.

Normally I don't fan the flames on posts like this, but the unethical word being thrown around just rubs me the wrong way. Do yourself, and us, a favor, and learn what unethical business practices are. Hint: Enron is a good place to start looking. Lehman Brothers too.

So, if you have something to say about what I wrote, stop trolling and answer based on what I wrote, don't put your words into my mouth. It is easier to put words in someone else's mouth, but it is also meaningless.

You know what's unethical? If you have both an AMD and an NVidia card in your box, PhysX is still disabled, because it can sense you have an AMD card also installed, regardless of the fact that you have paid for and installed NVidia hardware as well.

Good read, very unbiased. I think it is great amd is doing this now before it gets out of hand. If nvidia is doing what amd is accusing them of, then nvidia is heading down a real slippery slope. As a gamer, I see that as really bad. All we need is amd to do the same thing they are accusing nvidia of. I don't want to have two systems to run different games because they don't run properly on both gpu brands.

Years ago I liked NVidia, but their piggy proprietary selfishness turned me off. Still does. I like the AMD shaders much better, though the stuttering can be irritating. I'd like to try a dual titan against my Ares 2 but the absurdity of the price makes it impossible. I'm told that Nvidia is "smoother" than AMD, but lacking a side by side comparison of my own I cannot say. I'd like to see AMD prosper given they tend to be more open with their creative process, and I'd like them to invest more in the driver team.

When was the last time you used your Ares II?
Have you tried your Ares II with the 14.xx Catalyst drivers? Because the stutter on CrossFire has been gone for half a year now; frame pacing in the drivers solved frame times for AMD, and many games now have more stutter on Nvidia than on AMD cards.
Why do people still live with 2012 problems? Keep up with the change.

While I don't like a hardware company making middleware that may benefit its own hardware, I'm not sure this is as bad as AMD makes it seem. The fact that all the current-generation consoles use AMD GPUs means they aren't going to get left behind.

One thing that bothers me about the extra licensing option, where developers can get and update the source code, is that at no point does Nvidia take improvements back. This means that every developer that tries to use this (it should be a time saver) has to buy the source code if they want it to run well on AMD, and also optimise it themselves. This model, while able to get AMD cards performing well, still treats AMD badly.

But on the other hand, if AMD wants to, it can take its code examples, build a middleware of its own and open source the lot, producing a competitor that AMD, Nvidia and all the game publishers can contribute to. That way performance can be optimised, every game and vendor benefits as much as possible from the collaborative effort, and developer time to get these effects working is reduced. Seems like a simple enough fix. A free competitor like this would eat Nvidia's lunch and force GameWorks out.

It is time for someone with big GigaBucks to step in and help one of the mobile SoC GPU makers re-enter the desktop GPU market, under the condition that all drivers/middleware/APIs be developed as an open-source project, or use the current open-standard Khronos APIs/drivers/middleware. The hardware would be sold into the discrete GPU and server markets with no limits other than those stated above. In other words, the big GigaBucks outfit that loans the money could take these discrete GPUs and utilize them as open GPGPU accelerators for its open server-based web, computing, and VR gaming SKUs. This would free the hardware from proprietary software/API/middleware manipulation, and the manufacturer would have to compete on the ability of the hardware alone. The manufacturer (helped with the loan) would also have the right to sell these GPUs on the open market, but the drivers/APIs/middleware would have to be open sourced. No open-sourced drivers/APIs/middleware, no low-interest/no-interest development loan.

Even as an AMD investor, I don't see any logical sense from AMD on this.

Do you understand the "building block" approach? When people choose a building-block approach, do the building-block providers hand over all the details of the building blocks, beyond specs, usage instructions, and the detailed interface?

Do you understand IP? When people choose to use IP, do the IP providers hand over all the details of the IP, beyond specs, usage instructions, and the detailed interface?

Why not split the game-console business with nVidia?

The illogical employee needs to be held accountable for damaging AMD's image.

Stop wasting time on nonsense and go back to work to produce meaningful earnings, not $0.02 a quarter!!!

"It's always a question of compromise about the effect, how it looks, and the performance it takes from the system. On PC, usually you don't really care about the performance, because the idea is that if it's not [running] fast enough, you buy a bigger GPU. Once you get on console, you can't have this approach." (hinting you should buy the 1000 dollar Nvidia GTX Titan, the only video card at the time that did run the game maxed out 60fps @ 1080P)

-Watch Dogs: No official statements yet (I believe?). But reviewers didn't receive the PC version early, unlike the PS4 version. TotalBiscuit from YouTube stated that he got the PC version 10 hours before launch (instead of a week early). Of course Ubisoft knew how crap the PC version's performance (and stuttering) was at launch and apparently didn't want any bad reviews at launch. A lot can be said about Watch Dogs, but it still runs like total crap, even on a $5000 gaming PC.

PC Perspective: First of all, thanks for this great podcast and article! Please get in contact with Ubisoft and ask them what their problem is.

This would go a long way in Nvidia's favour if they hadn't played dirty by offering source and non-source versions, and had just offered a source-code version. No one, especially the tech industry, likes seeing free-market concepts abused like this. Does AMD also do this with Mantle?

AMD blows. They always have driver issues and quality-control problems. Plus their new stuff runs hot as hell. Nvidia runs cool and quiet WITHOUT water cooling, and their driver teams are superior.
Crossfire was a disaster for AMD and took years to fix.
AMD should stop whining. They were crowing about Mantle for months while not lifting a finger to make it run on Nvidia hardware.
Hypocrites much?

I feel like I don't even want to analyse anything; based purely on the past performance of these two companies, AMD (really ATI) and VESA (the Video Electronics Standards Association), I have zero trust in their capabilities.

Who cares? Watch Dogs is an unoptimised turd. Even with Nvidia's hand in the game, it needs SLI Titans just to pump out 60fps at Ultra @ 1080p. It is pathetic; the game looks like crap compared to other AAA titles. Get your shit together, Nvidia.

Ignoring everything about this, and purely judging on past performance, why should I trust something solely on the fact that it was adopted by the VESA as a standard?

In my opinion that group is incompetent and slow, and has been a failure. So just purely from that standpoint, I would want to try something different... now, as for the reality of which of these standards is better or worse, that's another story...