NVidia have written a little apology note to all suffering with Tomb Raider graphics issues. Although I’ve yet to receive chocolates. I mentioned in yesterday’s Tomb Raider review that I had some issues with running the game on prettier graphics, and it seems I’m not alone. Apart from the silly hair mode reducing NVidia cards to jelly, I had peculiar problems with the OSD occasionally causing the game to judder, and couldn’t play above the normal settings. Extraordinarily, as spotted by Joystiq, this is because for some reason NVidia didn’t receive final code of the game until the weekend before release, so didn’t have a chance to create an update to accommodate it all.

They weren’t the only ones. For reasons unclear, PC code of the game wasn’t made available to reviewers before release either, with an endlessly slipping promise explained by the developers’ “tweaking until the last minute”. However, it would surely have made sense to ensure NVidia had access, last-minute tweaks or not. It’s also not known at this point whether AMD – they behind the TressFX hair daftness – were given some sort of priority, since everything appears to work very well with their tech. Games have long picked one side or the other to befriend, but it seems openly detrimental to disadvantage one of the two producers of your customers’ graphics cards.

A statement left as a comment on the NVidia site states,

“We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not test Tomb Raider until all of the above issues have been resolved.

In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.”

So the take-home message here for NVidia users is to keep a close eye on their pages to look out for the release of the next driver, and make sure you keep your game patched to the eyeballs for when CD/Nixxes releases an update.

I think I prefer the look of the game without these. Not a fan of the way games try to emulate camera effects (and shoddy cameramen, since they seem hell-bent on getting as much dust, water and lens flare as they can in every frame).

I was actually impressed when I noticed the game wasn’t obscuring my vision with strawberry jelly whenever Lara got hurt. Shame it’s a bug.

I’m all for maintaining the artist’s intent, so I hope they fix this in a patch. Maybe add an on/off switch for those post effects, for the people who always complain about them on forums and such. They’ve been doing it ever since bloom got introduced in Prince of Persia: The Sands of Time and haven’t shut up about it since!

I’ve always found it odd that publishers knowingly release games that don’t work with the latest drivers. I mean surely they should be coding for stable, working drivers or else how are they managing to test their game?

Why should we be waiting for driver updates at all? Optimisations, fine, but when issues are as serious as this then what is the publisher doing releasing a game that doesn’t work properly?

The major manufacturers have development drivers they sell to developers, with extra options available like, say, triple tessellation if the developer wants to really work with that. These devkits are very stable and show off everything both the drivers and the hardware can do, but the problem is they cannot be copy-pasta’ed into game code, be it UE3, Source, or CryEngine. This is mostly because every game tends to rewrite how the engine uses the drivers, either out of feature priority or optimisation. Therefore developers tend to cobble together a jury-rigged driver and send it to the manufacturers saying “This goes here, this goes there, this isn’t working, can you fix it in two days?” This happens throughout development, and gets really busy towards the end, to the point that multiple driver candidates are being tossed around trying to squash one graphics bug. At release, the driver used to optimise the game is so disjointed that the manufacturer often needs a few iterations of drivers to fix the reference code and communication to a point where it won’t break everything else a user has.

That’s probably why an established driver works well with previous releases, but is shit for a new release. Conversely, it’s also probably why a driver released specifically for one game works alright, but utterly breaks a few others.

I could be wrong, but I think this is a reasonable generalisation to keep in mind.

If a game studio releases a game with all sorts of new graphical bells and whistles, they get an edge over the competition. If a graphics card manufacturer can boast that their cards can display new graphical bells and whistles of currently hot game X, they get an edge over *their* competition.

So there is an incentive for companies to cooperate on pushing the limit, but of course when you are on the bleeding edge, sometimes a bug knifes you. Or something.

Not at all, all it means is that the code has changed in the meantime, and this can cause compatibility issues. I’ve tested online games, and sometimes upon getting a new build you have to file bugs like “3D models no longer display on Firefox”. So it’s not like CD have intentionally sabotaged Nvidia or anything.

It is a strange situation that Nvidia had to wait so long for final code though. Maybe that was a term from AMD in exchange for working on the TressFX? We’ll never know… I doubt it will be long before a new Nvidia driver comes out to fix a lot of it anyway. And my GTX 680 is running it just fine.

We clearly don’t, but it also happens all the time with Nvidia’s “The Way It’s Meant To Be Played” games which perform much better on Nvidia hardware because Nvidia works together with the dev while AMD gets the code at the last minute. Now AMD did the same thing back to Nvidia.

It’s very much “an eye for an eye” sort of thing, but maybe now Nvidia will realize now that these “hardware exclusives” are pretty shitty, only end up hurting customers, and thus come to an agreement with AMD to not do so in the future? Well, we can dream, at least.

Get used to this crap: AMD are the ones dominating sponsorship of PC games now with their Gaming Evolved program, so they will lock out Nvidia GPUs. Nvidia appear to have retired their TWIMTBP program, which used to provide decent drivers for the latest PC games.

Nvidia hasn’t retired it, they’re just not as active. I believe Witcher 3 and ArmA 3 are Nvidia titles. Also, it’s okay for Nvidia to sponsor games but not AMD (ironic, since their Gaming Evolved titles are all great ports like Hitman, Deus Ex, Sleeping Dogs, Tomb Raider, and soon BioShock)? Personally I think both companies should keep their noses out.

Effectively they have, as their arrogant CEO decided Tegra was the way forward, and who needs to spend $10M every year on the 250+ engineers they used to employ to QA the latest PC games in the TWIMTBP program! Now Nvidia have lost all the next-gen console GPU contracts and also sidelined the TWIMTBP program, meanwhile AMD are stealing market share and winning customers back. Nvidia have decided F2P garbage is the best way forward, bundling that with their future GPUs.

Nah, it was a program. It ensured that Nvidia worked with the devs to have the best performance, while locking out AMD, and thus ensuring that Nvidia hardware would perform better. And now we saw AMD do the same thing back at Nvidia.

If corporations were people, Nvidia might at this point have thought to itself, “Wow, now that I see how it feels, and how it hurts consumers, I realize that I’ve been a huge dick all this time with that TWIMTBP program.”

Did I step into some kind of interdimensional portal or something? In what weird dimension would that be NVidia’s fault? That’s like having Mozilla make a public apology because their browser isn’t rendering my page properly, because they didn’t get a final build of my website before release and couldn’t release an update for their browser.
How about fixing the effin’ game?

It’s something I don’t really get either, but NVidia and AMD seem to put out a lot of patches to their drivers for specific games, when I would assume that people make games to match the standards (à la OpenGL/DirectX) that the cards support.

Speaking from experience it’s entirely possible that the game *is* written to the standard, but for the particular edge case that the game hits the nVidia drivers don’t implement the standard correctly. Which would be why it works fine on ATi cards.

It would explain why nVidia are complaining about not having received “the latest code” from the developers – nVidia haven’t fixed the problem because they didn’t know exactly what they needed to fix yet.

Although the phrase “these issues cannot be completely resolved by an NVIDIA driver” would imply that there are still other bugs in the game code, which may be making things less clear.
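The “written to the standard, but the driver trips on an edge case” failure mode described above is easy to illustrate with a toy example. Nothing here is real graphics code; the miniature “spec” and both implementations are invented for the sketch:

```python
def spec_normalize(values):
    """Reference behaviour of our toy 'spec': scale values so the
    peak becomes 1.0; an all-zero input is defined to stay all zeros."""
    peak = max(values)
    if peak == 0:
        return [0.0] * len(values)
    return [v / peak for v in values]

def buggy_normalize(values):
    """Matches the spec on every typical input, but the all-zero edge
    case was never handled -- it divides by zero."""
    peak = max(values)
    return [v / peak for v in values]
```

Both functions agree on everyday input like `[1, 2, 4]`, so the bug ships unnoticed; a game that legitimately submits all zeros (perfectly valid per the spec) then crashes on one implementation and runs fine on the other, which is how a standards-compliant game can still fall over on a single vendor’s driver.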

Games are coded to standards; when the driver doesn’t recognise the game that’s running, it does a reasonable approximation of complying with those standards. Running an optimised game involves looking at things like a flush command that, if ignored, gives you 10% better rendering performance, because the dev who added it only considered a subset of the hardware configs and in some cases it hurts rather than helps performance. Add in throwing away triangles you know won’t render to the visible screen, optimised tweaks to use less detail when it’ll only get thrown away later in the pipe, and a whole host of other things (like rewriting a shader to be better optimised for your architecture with the same output, then detecting the game and hotswapping it in at load time). Drivers are not doing what the game asks; rather, they are doing the least work they possibly can to render something almost identical to what a standards-compliant rendering of what the game asked for would look like each frame.
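The per-game special-casing described above can be sketched in miniature. Everything here is invented for illustration (the profile table, the command names, and `optimize_command_stream` are hypothetical stand-ins, not real driver code):

```python
# Hypothetical sketch: a driver keeps per-game profiles and, when it
# recognises the executable, rewrites the submitted command stream.
PER_GAME_PROFILES = {
    "TombRaider.exe": {"skip_redundant_flushes": True},  # invented entry
}

def optimize_command_stream(executable, commands):
    """Return the command stream the driver would actually execute.

    Without a matching profile the driver plays it safe and runs every
    command literally; with one it may drop work it knows is wasted
    (here: the second of two back-to-back 'flush' commands)."""
    profile = PER_GAME_PROFILES.get(executable, {})
    if not profile.get("skip_redundant_flushes"):
        return list(commands)
    optimized, prev = [], None
    for cmd in commands:
        if cmd == "flush" and prev == "flush":
            continue  # consecutive flush does no extra work; skip it
        optimized.append(cmd)
        prev = cmd
    return optimized
```

For example, `optimize_command_stream("TombRaider.exe", ["draw", "flush", "flush", "draw"])` collapses the doubled flush, while an unrecognised executable gets its commands back untouched. It also shows why these profiles are fragile: if the game’s final code changes what it submits, the shipped profile may optimise the wrong thing.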

Also, modern GPUs (like their CPU brethren) are clocking dynamically, and different sections are under different loads, so re-ordering commands to best use all the resources while staying under the thermal/voltage limits is at play; even where GPUs aren’t out-of-order designs, the game can be detected and work re-ordered on the CPU before the GPU picks it up, at the driver level. There are many complex things going on: games are being coded to be as fast as possible, drivers are working to optimise away as much as they can avoid doing, and in the middle the standards are there and are kind of being applied by both sides. It is no surprise that sometimes things go wrong for big games, which are going to influence future purchasing decisions when benchmarked, so it is important for all manufacturers to show the best numbers they possibly can as soon as possible. Code on both sides of the table (dev and driver teams) could be slightly off and causing issues, and it sounds like nVidia are expecting both their driver optimisation and CD’s code to need another pass before they have a stable and fast implementation.

Out of the 4 cards I’ve ever owned, only one has been non-Nvidia. Current card is Nvidia, yet I still think this serves them right. All the stunts they’ve pulled, especially the proprietary PhysX CRAP. They have no grounds for complaint here.

I would say only the consumer suffers, but to call low performance in your video game “suffering” would be foolish.

Is this really NVidia’s apology to make? Surely Square should be apologising. I mean the game is clearly heavily optimised for ATI, but surely the onus is on the developer not the graphics card manufacturer to ensure good performance and communicate with NVidia if there are performance issues. Unless NVidia ignored the bug reports as the game was pushing ATI as a brand.

I’m an ATI owner so I’m pretty blown away by the graphics performance in the new TR, but I remember the problems with Rage on ATI. I never blamed ATI for them, but id Software.

The issues are pretty substantial – running at Ultra settings (which includes tessellation) I am getting complete driver/GPU crashes resulting in a need to hard reset the whole system. I will be waiting for the next set of drivers and a patch before playing more, even though it seems I can avoid the issue by turning off some features.

Also interesting to note that the game doesn’t work at all for me with the EVGA Precision OSD enabled – on launch I just get a big black window.

What I have played of the game so far though, is awesome. Not quite as ‘interactive’ as you would hope but a really awesome and cinematic experience, especially in Surround.

If AMD did get preferential access to code I wouldn’t be too surprised. It’s not very ethical and sucks for customers of the other vendor, but I have no doubt that the same has happened for NVIDIA before now.

The hair situation bogs me down. I’ve a laptop with an AMD 6990M 2gb in it, and with the fancy hair situation it runs quite poorly, 20-30fps; with the hair off it runs flawlessly, I couldn’t say the framerate but I expect it’s at or above 60. Is the fancy hair supposed to cause so drastic a drop in performance? I thought it was supposed to just ‘work’ on AMD cards.

I’ve got the 6990M 2GB as well in my lappy, and yes, turning on the hair crap drops my framerate like a rock. I basically turned off that and tessellation and it’s running really well now (tessellation seems to slow me down more than it’s worth in most games, so I generally turn that off anyway). I don’t think I’m getting 60FPS out of it, though. What are your other graphics settings? I mean, the card is about equivalent to a 6850, so while high end for a laptop it’s really just midrange, so I couldn’t expect to max things out for long, but really the portability and flexibility make it a worthwhile tradeoff for me.

We probably often see errors in games that are caused by poorly programmed drivers, and we never see an apology from the makers of the video card. In these instances we always blame the game devs.

So it is somewhat weird to get an apology here from the video card maker, for a game that apparently is not as smooth as it should be on this particular hardware. Is this an admission of error from NVidia? It doesn’t sound like one. It sounds like taking the blame for something they never did themselves.

It’s cool, I suppose.

Are the new consoles ATI or NVidia? Maybe devs will change love interest now. Most games used to be made for NVidia first and ATI second, so maybe things could change.

To be fair, Sony has been very open about cross platform play. In fact I definitely see Planetside 2 releasing on the PS4 and being able to play with the PC version of the game. The only reason the Xbox and PS4 won’t be able to play against each other is because Microsoft is very adamant about keeping their online service a walled garden. It’s easier for them to justify charging owners a premium to use it when you make it out to be something exclusive and good.

Been playing this on my laptop (with a GeForce 640M LE); it runs just fine at 60+ FPS at almost all times on medium settings, but the Shantytown section really destroys my framerate. Even on the lowest settings that part brings my FPS down to around 30.

The crashing is the developer’s fault, not Nvidia’s. Nvidia drivers are stable with everything I’ve got installed on my PC – close to a hundred games, most of which were written before the 600 series even existed – so there’s nothing wrong with the drivers themselves. A little bit of QA from the developers/publisher could have easily avoided this.

TBH though when you hear CD only gave nvidia the game 2 days before release there’s a niggling suspicion that the devs have deliberately sabotaged the game on nvidia cards. Why else would you leave it that late?

Also, are we now living in a world where you need to install new drivers for every new game release? I doubt the average PC gamer updates their display drivers more than a couple of times per year.

Not quite sure why it’s such a big deal. I’m running old 300 series drivers on a GTX 580 because I’m lazy. Set the preset to Ultimate, and turned off tessellation, high precision, and of course TressFX. Near solid 60fps at 2560×1440 with a few drops into the 50s. If I drop it down to 1080p it’s fine.

None of those features makes a drastic difference in visuals, and they’re all huge performance sinks. The game seems fine to me. Unless you’re benching with Titans or a Crossfire setup, it seems like no big deal. Wait for a driver patch; it’s still very playable.

Or, like me, decide not to purchase until hearing confirmation that the situation has improved. I’d been considering taking the plunge this weekend or next (after the more-positive-than-I’d-expected RPS WIT along with other reviews), but if it’s not ready yet, I don’t suppose I’ll bother. It’s not like I’ve gotten anywhere in Dishonored yet, anyway.

Wow, I’m actually very impressed. Usually it’s AMD with driver issues (in fact, 90% of the time it is), which is why my current card is nVidia (and I will never go back to AMD), yet on the one rare occasion their drivers aren’t up to scratch, they APOLOGISE! AMD, take note. This is how you keep customers.

Having a look at Square’s support forums, it’s painfully clear that this isn’t nVidia’s fault. Both console versions have rampant game-stopping bugs, save corrupting bugs etc etc which make people unable to progress/even launch the game. Clearly this game was pushed out the door without being anywhere near finished. It’s not often a game is shipped to consoles with multiple showstopping bugs, methinks.

There was a PC patch which ruined everything instead of fixing anything; it was pulled and all references to it removed. Does not bode well… In any case, I’d imagine they are working their asses off to make the game even work at all on consoles; enabling the pretties for nvidia cards is probably a distant second.