John of DSOGaming writes: "The PC version of Batman: Arkham Knight is simply a big clusterF. And from the looks of it, the game reverts back to DX10 when the NVIDIA Gameworks effects are activated. And no, that’s not a joke. This actually happens, and show how big of a mess the PC version of Batman: Arkham Knight actually is."

Except Arkham Knight on PC is missing ambient occlusion, the DOF is used far less, and the rain textures on Batman and the Batmobile seem to be missing. All of those things are present in the console versions.

Holy hell, this just keeps getting worse and worse. Forgive the language, but how could you f*ck it up so bad?

The PC version has inferior rain shaders even though the Nvidia trailer brags about the enhanced rain effects. It lacks proper ambient occlusion. It might have a texture streaming problem. It most definitely lacks the bokeh DOF effects. On top of that, it's locked at 30 fps, and it even drops below that value on most mid-range cards. Even if you have a GTX 970 or GTX 980 you've got to be very lucky, as it seems the game is quite picky about the specific hardware it's running on.

That, my friends, is a total catastrophe. This, I'm afraid to say, is a disgrace of a port that should never have existed, and a ruthless mockery of the PC gaming community. It might well be the only time that a game advertised under the "Nvidia Gameworks" flag has fallen behind its console counterpart.

I have a GTX 980 coupled with an i7 4790k and I had to buy the PS4 version to fully appreciate the game's visuals. Bloody unbelievable.

Keep in mind that most games don't have nearly as many programmers on the job as there are artists, modellers, and sometimes designers.

In this particular instance, porting the game should rely mostly on programmers, as the main task is re-purposing an existing code-base and assets to work on PC (there should be abstractions in place, especially as Arkham Knight is built on a modified UE3), so it shouldn't really require a large team. Even when adding new features by way of Gameworks, it should be more than feasible.

The questions to ask, if any, are "how long were they given to do this?" and "what are their collective capabilities?"; not "why are there only '12 people thoe'?".

'Gears of War', if I recall correctly (Tim Sweeney presentation from years ago - should be on Google), only had 10 programmers on it. I believe 'God Of War' had near that number too (7 in fact, though I may be wrong - another presentation). Of course team sizes have grown in recent years, but that's more on the art side than code. Still, isn't 'Hellblade' on PS4 being done with a team size of 15, including designers, artists, etc.? Of course, we can expect most devs to have tech in place (existing engine, tools, framework, libraries, etc) that improves their efficiency and lowers team requirements, but we're not talking about creating a full game here - it's a port.

Also, I'm not saying there are always only a handful of programmers making up a team, just that folks shouldn't automatically assume every 'big' game has to have been made with 100, 50, or even 20 of them.

That aside, and on a personal level, treating PC releases as second-class citizens has always seemed a fairly deplorable practice to me. PC is a financially viable place to be, so devs (or rather, publishers?) should support it and its users well!

Did you even read any of the Wikipedia page? Nowhere in the entire article does it say that only "12 people" work there. I'd sure like to know where people keep getting this so-called info.

But yes, I definitely agree that it's hard to grasp how they could mess up the port this badly.

LOL. Looks like Nvidia isn't the savior people make them out to be. Those terrible cards with garbage fans and overpricing couldn't hold a candle to AMD if AMD had decent drivers. Looks like it's Nvidia's turn to get trash-talked after all the years of Nvidia fanboys doing it to AMD. At least when the Tomb Raider reboot came out on PC, TressFX and AMD optimization didn't give Nvidia the middle finger.

I don't need to slow my roll. The hardware and tech behind AMD's cards are better than Nvidia's. They don't produce cheaply made cards with fans that die in 1.5 to 2 years like Nvidia. I wouldn't be surprised if Intel has been doing Nvidia's drivers for them ever since the days they got the rights to 3dfx's Voodoo tech.

First off, Nvidia cards outperform AMD cards and have for quite some time now. Secondly, I have owned Nvidia cards all my life, since the original GeForce. And I personally have never had a fan go out on me. And last but not least, who cares who writes Nvidia's drivers, as long as they're stable and improve performance?

AMD has the cards with terribly loud fans that run hot, with the sh*tty driver support. I think you're confused. I never had the first bit of trouble with any of my Nvidia cards, but my last two AMDs both ran stupidly hot.

Seriously? I have an R9 295x2, but it's REALLY only an R9 290x, because xfire doesn't work right in basically every new game... Why did I spend $1000 on a card that I can't fully utilize because the driver support is fucking terrible? I'll tell you one thing: I never will again.

Understand your hardware and download the latest drivers; you've got a very powerful card anyway. Your card does work, and it has come out on top in many benchmarks even with the R300 series out now. If you have any problems with the drivers, make sure it's not your card or even an overclock; sometimes it can be a simple tweak. Again, understand your hardware. Blaming the drivers is easy. Hell, I had no end of problems with my GTX 560, and I had four of those cards. Do you see me bitching? No, because having a problem with one chipset doesn't mean Nvidia are crap... As for the disagrees, what can I say? Not many who disagree come out and explain why like you did, so I give you kudos for that, sir.

No love for AMD on this site. I always update my drivers and look for workarounds. The community is great at finding ways to get games running better. But still, I'm referring to the driver issues with MKX and TW3 in xfire. The flickering is awful.

Also, so many games don't support it at all, like The Evil Within. My point was more that I'll never buy another dual-GPU card or run xfire, because it never seems to run right. I'd be crushing some TW3 at 1440p if it did... :-/

Aww cool, I hear you; the same can be said for SLI too. More devs need to work with both companies to support xfire or SLI, but it's easier for many to turn a blind eye. While it's true a single GPU takes away that worry, a twin GPU is top-tier badass, tbh; just wait, it will get better for selected games. If you're running at 1080p then a single high-end card does the job, but for a long-term build your R9 295x2 keeps you safe; you never need to worry about MSAA draining your card like a single card can. I can't speak for The Witcher, but I know MKX wasn't optimised as well as The Witcher, plus it had a certain new DRM that hindered the game.