Crysis 2 Will Go Up To 11

After a couple of weeks of web-wide worrying and shouting and bickering and excellently satirical editorials, you may be glad to hear that Crysis 2 is to receive its in-doubt DirectX 11 patch after all. This comes via the official forums, wherein it was officially said on an official forum, despite being officially said on other official forums that it mightn’t happen. That seals it: all is well in PC gaming tech land. For now.
There’s no release date, which is heartening – it means it’s more likely to be something substantial rather than token. “We are working to get the best out of DX11, so we’ll wait to announce the features until a little closer to release,” is all we have.

I have a DirectX 11 card at present, but I haven’t yet run anything on it that made me think/exclaim “ooh! DirectX 11 is the nuts!” – perhaps this will. Jolly good, then. I do hear good things about DX11’s theoretical capabilities, but as I understand it there remains a sad dearth of content. If we have proof of its excellency, perhaps we can then get fully behind it.

It can’t – from what I’ve seen in the PDF from Crytek, the max is already the max in DX9 (same as with Crysis 1). I’d like to be wrong, but it won’t matter to me since I’ve done the SP, and there isn’t any properly working MP crack (XD) for 1.2 (and it won’t help the framerate either)

I understand that people expected Crysis 2 to make their rig bleed, but I for one really enjoyed the campaign, and thought it looked bloody brilliant – especially for a DX9 game which ran like a dream on my PC.
DX11 may add some pretties, but at what performance cost?
The only thing I do not enjoy currently is the multiplayer – too much faffing with going invis, thermal imaging, and armour switching – I just wanna blow shit up :p

To all the uninformed gamers saying Battlefield 3 looks better because it’s DX11 from the ground up.
YOU ARE WRONG.
BF3 looks great because it’s using ENLIGHTEN tech from Geomerics, which has NOTHING to do with DX 8, 9, 10 OR 11 cards :P Go visit their site

@jimjam: Enlighten scales its use of real time radiosity with the power of your CPU and GPU, it’ll only show its “full capabilities” on current hardware.
Here’s a nice demo video of what it can do btw. and I agree it is one amazing piece of tech: link to youtube.com

Lighting is one big part of what can make a game look amazing (see for example the differences between Mass Effect 1 and 2) and can even make games like Counter-Strike 1.6 look a lot better. Luckily Enlighten has been integrated into a number of engines already, like Frostbite 2 and Unreal Engine 3

C’mon, why would anyone use Win XP these days?
It’s OK if you have an old PC, but otherwise, especially if you are a gamer, you should move to Win 7 – faster, better, more effective, more stable, no problems with drivers, DX11, support for more than 3GB of RAM, and so on, and so on. WinXP is like IE6 these days.

Yep, sadly it won’t fix physics, missing interactions, dumb AI, and maps being tight corridors :( But well… depending on how much they’ll try – we may end up with a game performing better and having more details in it. Something to hide half-size textures. Time will tell…

Crysis 2 looked pretty good to me. Why are people so fussy about the DX11 stuff? Surely a pretty game is a pretty game? Genuine confusion here, is there something that DX11 does that drastically changes things?

Not that DX11 does, but many things that DX11 can do. Essentially it allows shaders to be a lot more flexible and therefore more powerful. Whether this implementation of DX11 will take advantage of that remains to be seen.

1) Hardware tessellation. This is an INCREDIBLE new tech that I’m really looking forward to seeing companies take advantage of, although I have yet to see a game do so. The best example I’ve seen of this in action is the Unigine Heaven benchmark. The ropes, cobblestones, and dragon are dramatically improved by turning this on. I would be thrilled to see Crytek use this, especially since it would provide a major improvement over the visuals on the 360 and PS3, and it never hurts to have another “neener neener look what I can do” game to brag with when someone says a console can make graphics look just as good as a PC.
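Conceptually, what tessellation buys you is geometry amplification: the GPU splits a coarse patch into many small triangles and displaces the new vertices (e.g. by a height map), which is how flat cobblestones become actual bumps. A toy CPU-side sketch of that idea in Python – purely illustrative, not actual hull/domain shader code:

```python
# Toy illustration of what the tessellation stage does conceptually:
# amplify a coarse edge into many small segments, then displace the new
# vertices (e.g. by a height map) to add real geometry. This is NOT
# D3D11 API code - real tessellation runs on the GPU in the hull/domain
# shader stages.

def tessellate_edge(v0, v1, factor):
    """Split the edge v0->v1 into `factor` segments (factor >= 1)."""
    return [
        (v0[0] + (v1[0] - v0[0]) * i / factor,
         v0[1] + (v1[1] - v0[1]) * i / factor)
        for i in range(factor + 1)
    ]

def displace(points, height_fn):
    """Push each tessellated vertex along y by a 'height map' lookup."""
    return [(x, y + height_fn(x)) for x, y in points]

# A flat cobblestone edge with 1 segment vs. 8 displaced segments:
coarse = tessellate_edge((0.0, 0.0), (1.0, 0.0), 1)
fine = displace(tessellate_edge((0.0, 0.0), (1.0, 0.0), 8),
                lambda x: 0.1 * (x * (1.0 - x)))  # fake bump profile
```

On real hardware the subdivision factor can also vary with distance, so far-away objects stay cheap while close-up ones gain detail.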

2) Multi-threaded rendering. Currently all rendering for an application must ultimately go through a single thread of the application, a nasty bottleneck in an era of hexacore processors. DX 11 allows multiple threads to directly call the rendering device, making multi-core games even easier to do. This should help boost performance for everyone running a multi-core processor if it’s implemented, although multi-threading is so incredibly hard to do well they may not bother.
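The pattern DX11 enables here is roughly: worker threads each record their own command list on a deferred context, and a single thread submits them all to the device. A rough Python simulation of that recording/submission split – illustrative only, the real API lives in ID3D11DeviceContext:

```python
# Rough simulation of D3D11-style deferred contexts: several worker
# threads record their own command lists in parallel, and only the
# "immediate context" thread actually submits them to the device.
# Illustrative Python, not the real Direct3D API.
import threading

class DeferredContext:
    def __init__(self):
        self.commands = []
    def draw(self, mesh):
        self.commands.append(("draw", mesh))
    def finish_command_list(self):
        cmds, self.commands = self.commands, []
        return cmds

def record(ctx, meshes):
    for m in meshes:
        ctx.draw(m)

# Each worker records into its own context - no locking while recording.
contexts = [DeferredContext() for _ in range(4)]
work = [["tree", "rock"], ["soldier"],
        ["building", "crate", "barrel"], ["skybox"]]
threads = [threading.Thread(target=record, args=(c, w))
           for c, w in zip(contexts, work)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Single submission point, like ExecuteCommandList on the immediate context.
submitted = []
for ctx in contexts:
    submitted.extend(ctx.finish_command_list())
```

The point is that recording needs no locking because each thread owns its own context; only the final submission is serialized.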

3) Compute shaders. A DirectX version of OpenCL, CUDA, etc. As the biggest name backing any of the various APIs that provide GPU processing, and the biggest one NOT locked to a particular company’s hardware, this will allow for GPU-processed physics and AI for the masses. Not useful at all unless you have multiple video cards AND are already hitting 60 FPS at your current settings, since otherwise you’re making your already-fully-occupied video card do even more work. Because of that, I don’t expect this will be implemented.
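For the unfamiliar: a compute shader is essentially a data-parallel kernel dispatched over thread groups, with each thread indexed so it knows which element of a buffer to process. A conceptual Python stand-in for that execution model – the kernel and buffer names are made up, and real DirectCompute runs this on the GPU in HLSL:

```python
# Conceptual model of a compute shader dispatch: a kernel runs once per
# thread, indexed by (group id, thread-in-group id), reading and writing
# flat buffers. Python stand-in for HLSL, not real DirectCompute.

GROUP_SIZE = 4  # like [numthreads(4, 1, 1)] in HLSL

def dispatch(kernel, num_groups, in_buf, out_buf):
    for group_id in range(num_groups):
        for thread_id in range(GROUP_SIZE):
            gid = group_id * GROUP_SIZE + thread_id  # global thread index
            kernel(gid, in_buf, out_buf)

def integrate_particles(gid, positions, new_positions):
    """Toy 'physics' kernel: move every particle 0.5 units along x."""
    if gid < len(positions):
        x, y = positions[gid]
        new_positions[gid] = (x + 0.5, y)

positions = [(float(i), 0.0) for i in range(8)]
new_positions = [None] * len(positions)
dispatch(integrate_particles, num_groups=2,
         in_buf=positions, out_buf=new_positions)
```

On a GPU the threads within a group actually run concurrently, which is exactly why it competes with the rendering workload for the same silicon.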

Maktaka, no offence, but I think this will be just support for DX11 – that means it will run under DX11 but not use any of those features, since they kinda need to be in by design. If they do have it in, then it always was planned that way.

If they’re not going to use the new features of DX11, then they wouldn’t be doing this patch. Simply slapping DX11 on things does nothing – the game would continue to use the same old DX9 API calls. DX11 isn’t some magic thing you can put on your game that makes everything awesome unless you actually design the game to use its features. I suppose it would appease people like you who just want to see the term “DirectX 11” slapped on everything, even though it doesn’t do anything. So yes, I expect to see at least one of those three items above, hopefully both tessellation and multi-threaded rendering.

@pepper “No offence but I think this will be just support for DX11, that means that it will run under DX11” – No offense, but the game ALREADY runs with DX11. What they want to do is implement new DX11 features.

Because 11 is fundamentally different to previous versions of DirectX. You couldn’t just bolt the stuff 11 can do onto 10.

EDIT: And lol @ the “it goes up to 11” gags. I understand they are born out of understandable cynicism due to the lack of easily accessible evidence, but they are also born out of understandable ignorance.

It’s the sustain you get with DX11 that’s important. You could go away, have a bite to eat, and it would still be rendering when you came back. Anyways, I have to go and smell the glove, because someone is delivering a little Stonehenge to my house later and I don’t want to get trapped in a pod.

Derek: You know, we’ve grown musically. I mean, listen to some of the rubbish we did early on, it was stupid…
Marty: Yeah.
Derek: …you know. Now, I mean a song like “Sex Farm”, we’re taking a sophisticated view of the idea of sex, you know, and music…
Marty: …and putting it on a farm?
Derek: Yeah.

Reading through the comments, I’m beginning to think no one here knows what DX11 is.
It’s an API to talk to the graphics card. Which means it’s a set of classes, functions, structs, or whatever that the programmer can call upon instead of doing it themselves. But they very well could write this code themselves if they wanted to. It’s just something to make things easier to develop, not an end-all-be-all… whatever.

I hear a lot of people thinking you can’t just slap DX11 features onto DX10. I program in OpenGL and that’s EXACTLY what they do with the API. In fact it’s built in a way where I can use a much older API if I want and just include the new graphics card shaders manually if I wanted to use them. “DX10” features in Windows XP? Hey, it’s possible if you use OpenGL. The DX10 “features” are on the graphics card or a mathematical function, not API dependent. Microsoft just disallows them in XP and DX9. Hell, they are possible in DirectX on XP if you wanted to code the functions yourself.
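For anyone curious, the OpenGL extension mechanism being described works roughly like this: you ask the driver at runtime for a named entry point, and if the hardware supports the feature you get a callable function back, old API or not. A toy Python model of that lookup – the real mechanism is glGetProcAddress/wglGetProcAddress returning C function pointers, and the stub feature here is invented for illustration:

```python
# Toy model of the OpenGL extension mechanism: the API core stays old,
# but the driver exposes new hardware features as named entry points you
# look up at runtime. Illustrative Python; the real mechanism is
# glGetProcAddress/wglGetProcAddress returning C function pointers.

class FakeDriver:
    """Stands in for a vendor driver: features live here, not in the API."""
    def __init__(self, extensions):
        self._extensions = extensions
    def get_proc_address(self, name):
        return self._extensions.get(name)  # None if hardware lacks it

# A hypothetical newer-hardware feature exposed as an extension:
def gl_patch_parameter_stub(value):
    return ("tessellation-configured", value)

driver = FakeDriver({"glPatchParameteri": gl_patch_parameter_stub})

glPatchParameteri = driver.get_proc_address("glPatchParameteri")
if glPatchParameteri is not None:
    result = glPatchParameteri(3)      # feature present: use it
else:
    result = ("fallback", None)        # feature absent: older code path
```

This is why the commenter’s point holds: the feature lives in the card and driver, and the API is just the doorway to it.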

I can’t believe how abstraction layers for programming have become so common that in fact people think the abstraction layer or API is somehow creating technology. Microsoft has always known what they are doing in advertising and it’s proven here where graphics card vendors advertise their cards as DX? compatible. What? It’s the other way around buddy. YOU make the hardware, DX finds a way to communicate with said hardware. If it were the other way around, why doesn’t DirectX just create a new technology to travel back in time with DX12. Then say although you can time travel with DX12 there are currently no DX12 compatible graphics cards! The entire process is absurd.

Anyone remember a game called Doom 3? A lot of the fancy tech and graphics in the game didn’t exist prior to it coming out. So instead of waiting around for someone else to do it in DX?, Carmack just programmed the stuff himself. The API used was also OpenGL, so it doesn’t really matter what API you use if you know what you are doing. You CAN make your own stuff up when you program a game you know. Games used to do that all the time, now they just wait for the new shader and plop it on a texture exclaiming “Hey look what WE made!”

Please everyone, stop focusing on marketing hyperbole like “Web 2.0” or “DX11” and realize the game is only as good as the effort the programmers put into it.

No one is talking about writing stuff in assembly. Writing AAA titles in assembly is not realistic because it would require tailoring the code for every graphics card.

Let’s be realistic. We have been comparing DirectX 9 and DirectX 11, which are what developers commonly use, for good reasons. And it is a simple fact that there are many things you can do with DirectX 11 that you cannot do with DirectX 9.

Gorgol has a point. You don’t tap into a card’s newest technologies with an old API (say, you don’t tap into tessellation with DX9) without necessarily having to use a low-level language like assembly, possibly helped by some shader language or the card manufacturer’s own APIs if they have them. So, I’m not sure what you are talking about. You are correct in saying that what creates the actual technology is the chips. But the APIs are the windows to that technology. Without them, you have to go off the normal programming circuit. And this is impractical-slash-impossible for the vast majority of gaming studios.

Take tessellation, for instance. This is undoubtedly the one feature introduced in the latest cards that could turn Crysis 2 into a mass murderer of innocent gamers as they suffer cardiac arrest due to sheer beauty. DX11 supports it. But for it to actually be used in the game, not only would the graphics engine (the actual graphics code and calls inside the game) need to change, but also all the game assets where this was to be implemented. And if they were to actually implement it with DX9, they would need to do all that and also create a new API just for tessellation and somehow bind it to DX9. And this new extension API, I should remind you, would need to be compatible across different card brands and models.

You don’t need to program new graphics card drivers – that’s another level of abstraction entirely. APIs like OpenGL allow you to include hardware options from the card that aren’t included in the API.

id Tech 5 achieves an effect that looks the same as tessellation in an entirely different way than DX11 does, using OpenGL and DX9.

Crysis 1 created a way to handle large forests of destructible trees. They had to make that code up themselves. Doesn’t it stand to reason that they could create other things as well?

I’m not saying you need to get into low-level machine code to do these things. Plenty of times the graphics card’s GPU has been used to perform things other than what the API allows. Just in C/C++, not peekin’ and pokin’ memory locations :).

My main point was that having a certain API is not enough to make a game better. Part of the cool factor for Crysis 1 was the huge destructible forests. Switching Crysis 1 to calling DX11 wouldn’t change that one bit.

To be fair, Metro 2033 looks even more incredible with all the DX11 features enabled. Unfortunately, it seems to run like a dog even on my Radeon 6950. Whether Crysis 2 can maintain its reportedly excellent performance with this graphics boost remains to be seen.

What is nice about this news is that Crytek are clearly not just pooping out a PC version of the game and forgetting about it.

Personally I don’t care about DX11. Before I go ahead and buy it I’m just waiting for them to fix the multiplayer, and give me a freaking options menu.

I’m well aware that there’s a “fan” patch that opens up the options. But really, such a stupidly obtuse move on their part isn’t something I’m really keen on supporting. The last thing I’m intent on is devs getting the idea that they can simply neglect even the most basic features (and let’s be honest here, that’s about as basic as it gets) because “they’ll all buy it anyway, so it doesn’t matter if we leave even rudimentary features out”.

I realise that’s thoroughly nit-picky, but right now I’m at no shortage of other games I can be spending my money on otherwise, so I can afford to be picky. I’ll happily buy when they do eventually stop being so deliberately obtuse and patch it in, which at a guess will probably be around the same time as the DX11 patch.

True, it’s a great game, but I’d rather spend my 60 bucks somewhere else. It’s just like a demo – you constantly get the feeling that something is not finished. And the single-player, while great, is very short; it’s definitely not a bang for the buck.

“It took me 10 hours to finish the campaign, that’s way longer than most modern FPS games.”
Took 6 hours for me. And that’s without any rush. On Veteran difficulty. I even made a few approaches to some moments, trying things out. I don’t know how you took 10 hours with it. Are there some secret levels, or what?! ;)

“You could also factor in the replayability, as every situation can be approached in a few ways.”
There isn’t any. I tried to re-play 3 of my favorite levels but it was so dumb and pointless that I gave up after that. The game plays EXACTLY the same each time. If you want to talk about tactics – there are only 2, actually: Shoot and Hide. In general it’s best to use each one at the right moment, but even if you try to shoot everything one by one it’s still boring, because there’s no tactics in this game – you just move forward.

For me maps in this game could be made as big long straight lines. It’d give identical result.

I know a lot of people complained about the lack of DX11 in the initial release, and it’s nice to hear that Crytek will be bringing in some graphical pizzazz for PC gamers.

However, I have to say it’s been refreshing how much PC gaming seems to be moving away from the tech focus in recent years. When articles pop up on RPS talking about hardware purchases, like the recent one on SSDs, it’s a little jarring. I love how incredibly game-focused this site is, and I almost never think about what kind of systems the writers are playing the games on.

I built my computer three years ago for $450, and I feel no real need to upgrade it. It won’t run Crysis in DX11 mode, but it will run it, and with no real impact on my game playing. I find myself much more focused on gameplay and mechanics than I used to, without worrying about system requirements. It’s one of the very few silver linings I can find in the current age of console dominance.

So bully to those with DX11 cards who can play Crysis 2 at highest settings. I’m happy that I don’t have to.

I bought it, despite kinda not intending to. I just wanted a simple manshoot. I think SP-wise it’s been decent entertainment so far, and I got it cheap off GamersGate. There are a fair few scripting bugs occurring for me though, like people talking when they were dead and whatnot. The “simple” graphics settings are a joke too.

I tried the MP though, and I’ve read about the hacks. I couldn’t tell if people had them on, but some dudes were seriously beefed up. Hard to tell if it was just unlocks or cheats though. What should I look for?

Say what you want about everything else, but the whole .cfg hack thing should never have happened in the first place. It is ridiculously stupid that they decided to make use of client side config files in order to set the suit settings. That’s just a clear lack of any sort of thought for even the most basic security, that’s the only way I can put it.

Grief, this is not a problem. It should NOT be a problem; it is basic, and something that we stopped doing back when multiplayer FPSes were first becoming popular. In Quake 1. Fifteen years ago.

I seriously can’t understand how they managed to go all the way through to release thinking this was perfectly fine. Most direct console ports I’ve seen don’t allow for such a thing.
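For the record, the standard fix is server-authoritative validation: the server treats anything coming from a client-side .cfg as untrusted input and clamps it to its own limits. A minimal Python sketch of the idea – the field names and limits are hypothetical, not Crysis 2’s actual variables:

```python
# Minimal sketch of server-authoritative validation: client-supplied
# values (e.g. from a .cfg file) are treated as untrusted input and
# clamped to server-defined limits. Field names are hypothetical.

SERVER_LIMITS = {
    "suit_energy": (0.0, 100.0),
    "move_speed": (0.0, 7.5),
    "cloak_drain_rate": (1.0, 10.0),  # drain below 1.0 would be a cheat
}

def sanitize(client_settings):
    """Return only known fields, clamped into the allowed range."""
    safe = {}
    for key, (lo, hi) in SERVER_LIMITS.items():
        value = float(client_settings.get(key, lo))
        safe[key] = min(max(value, lo), hi)
    return safe

# A hacked cfg claiming infinite energy and near-zero cloak drain:
hacked = {"suit_energy": 99999, "cloak_drain_rate": 0.01, "wallhack": 1}
clean = sanitize(hacked)
```

Anything the client claims outside the allowed ranges simply never takes effect, and unknown keys are dropped on the floor – which is exactly what trusting a client-side cfg fails to do.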

Even though the engine might be better now, the game got pulled down to the same tech level as the consoles.

Battlefield 3 will be DX10+ only on the PC and it shows when you look at the trailers. Even though CryEngine 3 might be on the same level or even better than the next Frostbite, Crytek didn’t exploit that power for their game.

So, do we know that there will be no Battlefield 3 on Windows XP then? I would be alright with that, but I just can’t imagine them giving up on all the XP users. DX9 is still kind of a necessity for PC games.

edit: Yeah, Google seems to agree, no XP. That’s an interesting decision. Probably the right one at this point, I guess.

Eh, the API used has little to do with the game’s issues (most of which aren’t graphics-related anyway). This changes pretty much nothing, except for maybe, if it’s really well done, a slight fps boost? You already have all the eye candy they have in the game; it’s not like they’re adding tessellation and whatnot. RPS seems really out of touch.

I had no idea RPS was filled with so many cynical Luddites. Personally, I’d like to see video games on the PC continue to push the limits of hardware and software, giving us new and better graphical experiences. If you’re content with 2008 console graphics and aping worn-out Spinal Tap jokes, more power to you, I guess.

Well for one thing, simply adding DX11 support doesn’t do much in itself unless the game is specifically tailored to make use of it beyond the most cursory automated changes. Which at this stage is looking a little unlikely.

But to address the other point, pushing the limits of hardware and software for no good reason was a big problem in the lead up to the current generation, and I’m not sorry that that’s stalled, at least for the moment. Devs went crazy pouring in production values on even the most niche titles thinking that “VISUALS = SALES!”, and then wondered why their limited appeal RTS that had just priced itself out of reach of most of its playerbase failed to sell very well.

It’s not about being a Luddite. But sometimes graphics genuinely are “good enough” and pushing production values and hardware requirements to extremes for diminishing returns is not a smart move.

We’ve currently reached a point in the current PC hardware cycle where devs may actually be able to push visuals further without it requiring expensive hardware, and we’re currently seeing some of that (Battlefield 3, Witcher 2 spring to mind). Things have been stagnant for a while and hardware costs have dropped enough that an upgrade cycle at this stage wouldn’t be unexpected or require a massive hardware shift for most of your core audience. I guess time will tell on that one.

I just don’t expect that adding DX11 to Crysis 2 is going to have much to show for it whatsoever. At which point it’s just going to become the butt of yet another stream of “OLOL PC ELITISTS!” jokes.

This… it’s almost like browsing an Amish website or something, with lots of people going
“Cars are the invention of the devil, we’re all fine with our square-wheel carriages pulled by a pack of mammoths” or something…

Improved tech year-in year-out and the openness of the PC platform (with stuff like sliders for graphics, modding tools and all that) are the two biggest advantages of the PC. If everyone is fine with graphics from 2005 and level design catering to Microsoft, because they wanted to save some money and put only 512MB of shared memory into that plastic box of theirs, why not just get a console instead…
Tessellation alone would merit the use of DX11: link to youtube.com, link to youtube.com, link to youtube.com

Heck, I love me my indie games and old-school pixel/hand-drawn art/platformers too, but there are those titles and titles that are supposed to push the envelope… at the moment there’s a single title springing to mind, which is Battlefield 3 (which will be built on DX11 and thus backwards compatible only with DX10).
If 3D graphics are ever to reach the fidelity of actual concept art or hand-drawn backgrounds instead of box-like levels with flat surfaces everywhere, better lighting, better physics etc., shit has gotta move…

@ D3xter – I agree with you, but the reason people don’t value it yet is because they’re not really being exposed to what it can do. The thing about graphics is that you accept whatever is the most up to date you personally have seen as being the ‘best’. You will feel the shock of this if you go back and play a very old game that you remember as having pretty good graphics; suddenly you see the gun is a few grey polygons, and the men’s helmets float in space above their heads. Compared to the most up-to-date game you just played, it looks terrible – but as long as you have never seen anything better, and got used to that better level, you accept the current ‘best’ as ‘actually best’.

Once the next gen is more available, and we see more and more footage, ads and gameplay of titles using things like tessellation and so on, those users saying they are happy with their current experience will begin to see, quite against their will, how jaded their lower-specced titles are beginning to look. Once you get used to the ‘better’, it becomes jarringly obvious when you are looking at ‘worse’. True, it’s still possible to go back and play those old games with the worse graphics, and they can still be fun, but it does leak into the satisfaction level if, for example, you feel like the gun you are using is a plastic toy rather than a real, powerful weapon.

It will have an impact, but only once people start being exposed more to the next gen. At the moment videos like the ones you have there are few and far between, with few games making use of it all, and even the Unigine Heaven videos only give the tiniest taste of what’s possible once creative minds start using the tools (e.g. slightly more sticky-out bricks is hardly a revelation at this time, but who’s to say what artists and devs will do with that tool?). Once more titles start to use it, that’s when the hunger for it will be created. By choosing not to create titles that use these technologies, the hunger is not created and the older technologies won’t be shown to be out of date – but you probably know that only too well.

You say satirical analysis, but unfortunately, much of the rest of the internet took it entirely seriously.

However, it’s not really going to help, adding DX11 to Crysis 2 – not with the majority of the sort of people who are whinging about it, who will mostly not be satisfied until PCs are the only systems with good games, by law, and everything else gets shovelware, and if a PC game is mentioned while talking on PSN or XBL, the console will explode like a goddamn bomb and kill everyone in the room.
Failing that, they might accept the compromise of a time machine which is used to go back in time, and remove consoles from the time-stream.

What’s the point? I need to register to the forums to read the link posted, so I haven’t. But is Crysis going to support hardware tessellation? Is it going to support threaded rendering? Does it need GPGPU?

The answer to all of this is no. So supporting DX11 is simply there to shut up all the louts who scream for DX11 as if it would suddenly make their game somehow better. Not much different from the “fiasco” (not Crytek’s – see below) that was Crysis’s DX10 and DX9 modes, which were only really noticeable in Photoshop.

In order for DX11 to be meaningful, people must understand that developers need to actually use it. Crysis can’t have tessellation (not at least to any noticeable level) or the game’s performance would suffer greatly – and goodbye to the announced machine requirements. It can’t have threaded rendering to any performance-meaningful level because this actually requires rewriting the game engine. And games don’t want GPGPU – especially graphics-intensive games.

Meanwhile, DX10 was essentially a flop. It brought very little new, and games like Crysis or BioShock could draw very little from it – to the point you needed a clinical eye to notice any difference in screenshots… and a genius to spot them during gameplay. It’s no wonder that developers largely ignored DX10. So DX9 is still the default choice if one doesn’t want to support the new technologies. It can produce complex games like Crysis 2, and DX11 is only required if the new technologies are to be used.

I wish people would stop whining about DX11 as if simply adding it would change their game for the better. At least make an effort to try and understand a technology before crying like a baby if your favorite game doesn’t implement it.

cmon. why is everyone making such a giant story out of this? i mean, just look at the game. did you actually play it? did you play the demo? did you look at it?

this game looks amazing. it looks better than any other shooter ive known. i personally find it looks even better than crysis1, just because of the fact that there is so much glowing. yea, the oval flares just got me.

uhhhh, and we are talking about directx9 here. directx9! thats like a few generations behind and pretty old now i guess. but wtf have crytek achieved with it? pure awesomeness.

and it gets even better. here on my midrange pc (q9400 @2.67 ghz, 4gb ram, radeon 4870) it runs in constant 60fps. what else do you want folks?

“we want directx11 to see some minor graphical changes and want to suffer a 30fps breakdown”

i mean, tessellation looks really nice in detail but how often do you see an object near enough to actually notice the tessellation? tessellation also needs a lot of performance.

coming back to the 60fps and the options. right, my pc might not be the worst, but it sure isnt one of the best anymore. but i have my 60fps. why would i want to have more options? it already runs at max fps and looks awesome. dont see the point there.

im even saying it doesnt matter for the “real” hardcore. we are calling ourselves hardcore pc gamers. did we forget how to manipulate config files? i mean, we have a console and we have an autoexec.cfg. get the cvars from the internets and experiment by yourself. for me personally thats part of a good pc gaming experience. being able to dig deeper into the engine and manipulate values only developers are supposed to manipulate. you can even alter the FOV of the game. so, what else do you want?
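Since the comment is about autoexec.cfg tweaking: a cvar file of this sort is just name/value lines, trivially machine-readable. A tiny Python sketch of parsing one – the cvar names below are made up for illustration, not actual CryEngine variables:

```python
# Tiny sketch of parsing an autoexec.cfg-style file of cvars, the kind
# of tweaking described above. Cvar names here are invented examples,
# not actual CryEngine variable names.

def parse_cvars(text):
    """Parse 'name = value' lines, ignoring blanks and ; comments."""
    cvars = {}
    for line in text.splitlines():
        line = line.split(";", 1)[0].strip()  # strip trailing comments
        if not line:
            continue
        name, _, value = line.partition("=")
        cvars[name.strip()] = value.strip()
    return cvars

sample = """
cl_fov = 80        ; widen the field of view
r_fake_quality = 4 ; made-up example cvar
"""
cvars = parse_cvars(sample)
```

Which is also the other side of the security complaint earlier in the thread: anything this easy for a player to edit is just as easy for a cheater.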

Also tessellation can be used to make a big difference in the silhouettes of characters and landscapes. It really makes them look more natural. That said, it is usually one of the first things I turn off when I need more performance.

I wonder how many people would’ve complained if the game had been launched (with adequate backing by Microsoft’s Games For Windows initiative) as a DirectX 11 exclusive title on the PC. “It runs on 5 year old consoles, why can’t it run on a 10 year old operating system?” many a voice would have proclaimed.

In the end, what Crytek are doing now seems (at least to me as an outside observer) little more than giving in to the – frankly, irrational – demand to support a technological baseline with features they don’t even access in their engine, just to silence the complaints. I have not heard about additional features to be added, and I highly doubt they will rewrite any calculations to use compute shaders, or redesign their rendering path around hardware tessellation, since that would lead to diversification of their codebase and capabilities between all supported platforms, which kind of defeats the purpose of a cross-platform engine. CryEngine 3 would become just another blob of middleware with platform-specific quirks that licensees would have to anticipate, design, program, test and optimize against.

The cynical/pragmatic coder in me bets that they’ll simply add a new compile configuration in their Visual Studio IDE that links against DX11 instead of DX9, maybe flick the switch for 64 bit binaries while they’re at it, and hit compile. Diff it with the latest version, distribute as patch, voilà. The CG-programming enthusiast in me hopes that they will go the extra mile to utilize the features DX10 and 11 offer over 9 (cleaner API, additional shader support) so that in retrospect all the complaining voices would be vindicated. As it stands now, to me it’s only shrill noise coming from the incessant feedback loop that is the internet.

I waited this long*, I think I’ll hold on for DX12 now. I was into DX11 way back when it wasn’t the hip thing to do, you see.

*I didn’t really wait, of course. The utter boring blandness of Crysis, after suffering through the identically boring blandness of FarCry, made me decide Crytek can’t make a decent game, and that was the end of it.

I did, yeah. I finished Crysis. Well, there was a bug with the end boss, but I consider that as having finished it. Saw the final cinematic on YouTube, of course. FarCry got glitched at some point in the second half of the game, probably in the last quarter actually, and I couldn’t proceed any farther.

FarCry was the more interesting of the two, though I’d hardly call that interesting. It’d be more accurate to say Crysis was more boring. Or perhaps it’s just that the essential gameplay is so similar that it was less brain-numbing the first time around. Running through the jungle and encountering yet another Korean base, or what have you, is probably a good tech showcase, but it makes for incredibly stupid gameplay when you do it over and over and over and over again. Even the aliens bit at the end of Crysis was boring, but at least the change of pace in the visuals was interesting enough, and it didn’t overstay its welcome.

Personally, I thought Crysis 2 was hella pretty, and I played it on 360.

Hey, it’s a Crytek game that doesn’t turn to shit halfway through, the few vehicle bits are fun (VTOL bit in Crysis, I’m looking at you!) and the SP has lots of fun set-pieces. (The Ceph Pingers are really fun to fight, I found.)

I think most of the PC zealots are butt-hurt that their prized PC exclusive developer went to consoles. As an omni gamer, I don’t care. I’ll play anything good, on any platform. Which Crysis 2 definitely is.

Mind you, I thought Far Cry Instincts was way more fun than Far Cry. So take that on board if you want.

For me it’s not the fact that I think DX11 matters.
It’s the fact that they told us we’d have it that does.

At some point in the past two years I’ve felt really ripped off by a lot of companies I respected before.
In my mind that sort of twisted my perception of the whole business into some kind of shady, evil slew of corporations who are exploiting and deceiving me! So when I’m told something is going to be included, I feel as if I’m being robbed when it’s not there at this point in time. So on the simple principle of wanting to know what I’m going to get before I pay for it, I hold off on buying games now till I can actually see for myself if it’s going to be what I hoped it was. I have patience and I don’t really need to ALWAYS play the nicest game.