Yeah, he's talking about a graphics downgrade. Ubisoft is a console publisher. They will not let Ubisoft Massive release The Division on PC looking better than it does on consoles. Consoles are their bread and butter, and anyway, 95% of us PC gamers are pirates.

Well, to be honest I thought Ubi had turned a corner, but after them doing the same with Watch Dogs I won't get caught out by them again. Hell, they recently patched WD and it's now a POS I won't play; before the patch it didn't stutter, now it does.
This kind of nonsense is bad for business. Their business.

Two things lead to this, and I hate to say it, but we PC gamers are partially at fault.

The first driving force for publishers to do this is to give a relatively uniform experience on all platforms. That way you don't have all the PC Master Race people going around saying "look how much better the game looks on PC!", and all the console gamers who don't understand how underpowered current consoles are going "why does it look better on PC? I paid $500 for this console, it should be the best!"

The second thing is what I call the GTA:IV effect. If a publisher lets the user crank up the graphics, and the game then runs poorly on max settings even on high-end rigs, you get PC gamers bitching about the game being un-optimized. It doesn't matter that the game is actually optimized and plays just fine on reasonable settings on mid-range and low-end rigs. It seems the only thing that determines whether a game is "optimized" is whether max settings run at 100FPS+ on a high-end single-GPU rig.

The sad reality is, if Rockstar had simply capped the graphical sliders at half of what they allowed, no one would have complained about the game being un-optimized. But instead, they allowed us to max out the engine's options, and people bitched that the game was un-optimized because their single-GPU computer couldn't max it out at 100FPS. So now we get games with purposely crippled graphics to appease all the idiots who insist that anything that doesn't run smoothly maxed out on a single GPU is un-optimized. This is what we get. So next time you see someone complaining that a game is un-optimized because their rig can't run it maxed out, tell them thanks for dicking over the PC game industry. This is their fault.

Finally, I'd just like to give my opinion on them purposely crippling the graphics: I don't give a damn. I could not care less. If the game is fun I'm still going to play it; any true gamer will. I play Watch Dogs on my 7850K's integrated graphics and love it. The game is fun, and I enjoy sitting on my couch and playing it with a controller. It doesn't look nearly as good as it does on my main rig, but that doesn't matter; the game is still fun.

Screw that. I spent hard earned money to become a member of the master race. I assume no responsibility for inferior products. They are beneath my epic hard drive space.

As long as we don't need a $3000 GPU to run The Division maxed out, there is no need to cripple it to make it pass as an optimized game. And if they are doing this to delude console players into thinking consoles are still close to PC, that is a big shame; even console owners know they will never catch up with PC. That was never the goal of consoles.

But instead, they allowed us to max out the engine's options, and people bitched that the game was un-optimized because their single-GPU computer couldn't max it out at 100FPS. So now we get games with purposely crippled graphics to appease all the idiots who insist that anything that doesn't run smoothly maxed out on a single GPU is un-optimized.

Not that I disagree with most of what you're saying (or at least the point in general, since I feel you're blowing it up a lot), I just want to say you are downplaying the GTA IV issue. At the time of GTA IV's release on PC I was rocking a 4870x2, which was released just months before that game came out, and max settings still killed my setup. So it wasn't just single-GPU users having issues with it. Also worth mentioning, since you mention 100fps: I wasn't even anywhere near 60fps. Though all I did at that point was turn down the settings a bit and enjoy the game.

Yes, and now with Watch Dogs I can't achieve smooth play using 2x GTX 680s. I know they are not the latest and greatest, but they are still way above what was announced at some point: that you could max out, or at least play on high, with a GTX 660.

What's the point of announcing a new graphics engine, Snowdrop, to show us super-duper graphics, and then just releasing the game with standard graphics as usual? Why release new graphics cards if the only thing we can test them on is 3DMark to see their full potential?
P.S.
I don't care much about top-notch visuals; I can always download modded packs that make a game look much better than the developers made it.

They should do the same thing the creators of GRID Autosport are doing and make the HD textures available separately, so if your PC can handle them you can install them; if not, you run the same version as the consoles.

This seems to be the source of it all, and of course it depends whether you believe it or not.

WhatIfGaming seems to be a fairly new site, and they could just be trying to spin up controversy to drive hits. On the other hand, if the claimed testimonial in this article were completely false, you'd think Ubi would have said so by now and insisted they pull it and print a retraction, or taken legal action against them.

I have to say though, I don't have much faith in Ubi lately. They seem to be poster boys for the console industry. What I don't get though is, if they DID decide to disable graphics features in both Watch Dogs and The Division to level the playing field after they found out consoles couldn't handle them, why did they even leave those scripts in the game files?

To leave that script in just creates HUGE potential for a PR disaster with PC customers. Are they really that cocky that they don't care at all what we think? And even if they don't, this is potential lawsuit material for false advertising. Why risk it?

And please, don't get me started on GRID Autosport. There's a LOT more to a game than texture detail, and they mucked up just about everything about that game. Worst game from the GRID/DiRT franchises yet.

Yes, and now with Watch Dogs I can't achieve smooth play using 2x GTX 680s. I know they are not the latest and greatest, but they are still way above what was announced at some point: that you could max out, or at least play on high, with a GTX 660.

Have you tried TheWorse mod? I have been playing WD maxed out on my 290 without much of an issue. I have some hiccups, but it's nothing that bothers me too much.

Screw that. I spent hard earned money to become a member of the master race. I assume no responsibility for inferior products. They are beneath my epic hard drive space.

I agree. PC gamers plop down a shitload of their hard-earned cash to play their games not only on maxed-out settings, but with graphics superior to consoles. WhyTF should AMD or nVidia bother updating their chipsets if game developers are just going to put all of their efforts into console gaming?

If I were AMD I would approach the game developers/studios and be like: "Our bread and butter comes from sales of PC hardware upgrades, whether CPUs, graphics adapters, or motherboards, NOT consoles. Our loyal and vast user base expects not only a certain level of performance increase, but a superior experience as well. YOU are making money using OUR hardware. OUR customers are not happy because of YOU! Our two parties can either work together to give our paying customers what they expect, or we at AMD will stop driver support for your software, which in turn will hinder your ability to release quality titles and, in the end, will only hurt you!"

Not that I disagree with most of what you're saying (or at least the point in general, since I feel you're blowing it up a lot), I just want to say you are downplaying the GTA IV issue. At the time of GTA IV's release on PC I was rocking a 4870x2, which was released just months before that game came out, and max settings still killed my setup. So it wasn't just single-GPU users having issues with it. Also worth mentioning, since you mention 100fps: I wasn't even anywhere near 60fps. Though all I did at that point was turn down the settings a bit and enjoy the game.

And Crysis 1 was? Please, gamers are spoiled brats now. 8800s in SLI couldn't comfortably hold above 60fps during the Crysis 1 demos. There is no way to justify the attitudes gamers have now. Game engines are becoming closer to rendering engines, and derp players are having a hard time understanding what this means.

Not that I disagree with most of what you're saying (or at least the point in general, since I feel you're blowing it up a lot), I just want to say you are downplaying the GTA IV issue. At the time of GTA IV's release on PC I was rocking a 4870x2, which was released just months before that game came out, and max settings still killed my setup. So it wasn't just single-GPU users having issues with it. Also worth mentioning, since you mention 100fps: I wasn't even anywhere near 60fps. Though all I did at that point was turn down the settings a bit and enjoy the game.

The first thing you need to realize is that when GTA:IV was released it wasn't properly supported by SLI or Crossfire (I don't think it's properly supported by SLI even now). So your 4870x2 was really just a 4870. That is partly why I was sticking to the single-GPU argument. I also don't believe that a lack of proper SLI/Crossfire support is in any way the game developer's fault; it is AMD's and nVidia's job to make sure their multi-GPU technology properly supports new games.

On top of that, Rockstar acknowledged that max settings were too much for high-end rigs. They said they left the option to crank up the settings in the game so future hardware could take advantage of them. This is a good thing; I wish more games would do it. Yet everyone bitched about it and it bit Rockstar in the ass, and that is likely why we'll never see a developer do it again, unfortunately.

The test of whether a game is optimized is not whether it can be maxed out on current high-end hardware. If that were the case then, as you basically said yourself, all that would be needed to "optimize" GTA:IV would be to artificially cap the settings at 75% of what was allowed. The true test of how well a game is optimized is how low-end the hardware it can run on while still looking decent. What "looks decent" differs from person to person, but IMO it means looking at least as good as the console version.

In GTA:IV's case, I was able to play the game on my HD4670 at settings higher than the consoles used, and at a higher resolution. Yes, the HD4670 was a newer card, about the same age as the HD4870 you played with, but benchmarks actually put it as slightly weaker than the HD3850. So basically, GTA:IV could be played on the previous generation's high-end GPU and still look better than the console version. That isn't bad optimization. Even more impressive, paired with my HD4670 was an Athlon X2 4400+, a CPU that at the time was over three years old!

Yes, and now with Watch Dogs I can't achieve smooth play using 2x GTX 680s. I know they are not the latest and greatest, but they are still way above what was announced at some point: that you could max out, or at least play on high, with a GTX 660.

I'm not sure where you saw that you can max out WD with a GTX660 but Ubisoft never said that. They said that Ultra would require at least a 780 and that all the footage they showed was done using a Titan.

If I were AMD I would approach the game developers/studios and be like: "Our bread and butter comes from sales of PC hardware upgrades, whether CPUs, graphics adapters, or motherboards, NOT consoles. Our loyal and vast user base expects not only a certain level of performance increase, but a superior experience as well. YOU are making money using OUR hardware. OUR customers are not happy because of YOU! Our two parties can either work together to give our paying customers what they expect, or we at AMD will stop driver support for your software, which in turn will hinder your ability to release quality titles and, in the end, will only hurt you!"

I understand your angst and feel it too, but AMD is SO not in a financial position to be leveraging ANYONE. And until someone sneaky and clever enough to infiltrate these big development teams proves they are intentionally downgrading, rather than experiencing the performance issues they claim, no one is going to have a leg to stand on legally either.

This is a bigger problem than people think, and I'm sure Ubi's somewhat suspect legal team has already considered the repercussions. It really looks like they are flaunting this right in the faces of PC users, something even worse than when Epic declared us all pirates. And meanwhile, console customers are fine with just having cheaper systems to play on.

In fact, I only see this getting worse. It sends a huge message to a lot of console players considering a gaming PC not to even bother. It's as if Sony and MS read the recent predictions of PC game sales surpassing console sales this year and decided to start greasing the palms of big publishers to tell their dev teams to make graphics equal on all platforms.

Next step might be Cliffy B becoming head of the Xbox division, with someone equally anti-PC at Sony, and then we're talking a war that cannot be won.

I agree. PC gamers plop down a shitload of their hard-earned cash to play their games not only on maxed-out settings, but with graphics superior to consoles. WhyTF should AMD or nVidia bother updating their chipsets if game developers are just going to put all of their efforts into console gaming?

If I were AMD I would approach the game developers/studios and be like: "Our bread and butter comes from sales of PC hardware upgrades, whether CPUs, graphics adapters, or motherboards, NOT consoles. Our loyal and vast user base expects not only a certain level of performance increase, but a superior experience as well. YOU are making money using OUR hardware. OUR customers are not happy because of YOU! Our two parties can either work together to give our paying customers what they expect, or we at AMD will stop driver support for your software, which in turn will hinder your ability to release quality titles and, in the end, will only hurt you!"

And yet the next generation of consoles is powered by AMD. Trust me, they don't give a sh*t about you.
