Nope, you won't be able to run Ghosts at max settings with a GTX 660. Though the IW engine is not as power-hungry as the Dunia engine used in Far Cry 3, or even the Frostbite engine used in Battlefield 3, the GTX 660 is just not powerful enough to run the game at max settings. I'm using (2) watercooled EVGA GTX 680 SCs in SLI & I even had a hard time running Far Cry 3 at max settings without the frame rate dropping to around 55-60 fps.

Well, apparently, the game engine of CoD: Ghosts (based on the Quake engine from 1996!) seems to be indeed a "little" power-hungry when compared to other PC games with (better?) graphics, e.g. BF4 (released on 31 Oct 2013):

The same with Crysis 3, which was already released and has great graphics with much lower PC requirements! PLUS the user (customer!) got the option to change graphics settings based on his PC (minimum and maximum requirements).

Also strange that Activision/IW put the NVIDIA® GeForce® GTS 450 on the same level as an ATI® Radeon™ HD 5870… this must be a joke!! ;-)

And what's up with the 40 GB of free space you need for this game? Is there an HD movie included?

Like the title says, I wonder if I can play Ghosts on max settings; maybe hard to know, but just guess.

NVIDIA GeForce GTX 660

6GB DDR3

Intel Core i5-3570K ( ivy bridge )

It depends on what resolution you are playing at.

I have just upgraded my dad's graphics card from a Gigabyte Windforce GTX 660, and that card was able to run Battlefield 3 maxed out at 1080p.

Yes, the engine for COD is more resource-hungry than the ones used in Battlefield and a few other games, but I would strongly suspect that you should be able to run it maxed out, or pretty close to it, at 1080p.

If you struggle to, though, just turn some settings down or drop the resolution; it's that simple.

I play most games with no AA since most games don't need it on my monitor, but seeing as my PC can max any game out at 1080p, I have no issues with this, to be honest.

That's pure BS. The Frostbite engine in Battlefield is more power-hungry than the IW engine. Of the 3 (Dunia, Frostbite & IW) it's the IW engine that uses the least graphics resources, but that doesn't mean it's not power-hungry. The older version of Frostbite in Medal of Honor 2010 used less power than the newer Frostbite in Battlefield 3.

This is a dose of reality, my friend. Don't listen to people running Nvidia 400 or even 500 series cards (this includes the high-end GTX 580 single- & GTX 590 dual-GPU cards) claiming they can max out Battlefield, cuz it's not true. The low-end GTX 660 (including the Ti version) is less powerful than the older GTX 580 & GTX 590 & can't max out Battlefield 3. Run the game at max & you'll end up lagging. It takes a toll particularly with AA, MSAA & V-Sync.

The GTX 460 is a slow & obsolete graphics card. It can't run Battlefield 3 at high settings without taking a massive hit on frame rate. Even at the time of the 400 series the GTX 460 was not known for being powerful; the GTX 480 fared better. Long ago I used to own (2) Sapphire Radeon HD 4870X2s (dual GPU) running in quad CrossFire. The 4870X2 was more powerful than the GTX 480, but even with such power it was impossible to run even a game like MOH 2010 (which has an older & less power-hungry Frostbite engine) at higher settings without crashing & burning. The GPUs would get so hot that if you didn't stop gaming after 5 minutes you'd take a massive hit on frame rate & get a BSOD.

Depends on what your version of 'running' is; if it's a solid 60 FPS, then no, a GTX 660 won't sustain 60 FPS at 1920x1080 @ max with 8xMSAA. A GTX 760 or R9 280X will barely hold 60 FPS @ max in Ghosts; minimums will drop below that and you'll be stuttering. I can't imagine Ghosts being 'less' demanding than BO2. In contrast, at low-medium settings, somewhere in the 2xMSAA range, you should be able to keep your minimum FPS above 60.

As far as BF4 goes, LMAO if you even think an HD 7870 or GTX 660 is a 'good' idea. Certainly not based upon the beta I tried.

In FPS games the advantage goes to the player who can nail 120+ FPS without stuttering all over the place. In my case I'm going to be running GTX 770s in SLI driving a QHD (2560x1440) monitor, and I seriously doubt I'll manage that @ 8xMSAA in Multiplayer.

I think you fail to understand how FPS/refresh rate actually works, then. If you think you are going to get more frames to appear than your monitor can refresh, you are sadly mistaken.

Refresh rate is typically measured in frequency (Hz) which translates into the number of times per second your monitor can redraw the entire screen. Thus a refresh rate of 60Hz means that your monitor can redraw the entire screen contents 60 consecutive times during a single second; 85Hz is 85 times, and so forth. This is fairly straightforward, but remember, this is how fast your monitor can refresh the image on the screen, not how many FPS your system is actually producing or displaying. Let's examine the difference.
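The relationship described above is just arithmetic: the time between refreshes (or frames) is the inverse of the rate. A minimal illustrative sketch (the `frame_time_ms` helper is hypothetical, not from any game or driver API):

```python
def frame_time_ms(rate_hz):
    """Milliseconds between redraws (or frames) at a given rate."""
    return 1000.0 / rate_hz

# A 60 Hz monitor redraws every ~16.7 ms; a 144 Hz one every ~6.9 ms.
for hz in (60, 85, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per refresh")
```

If your system renders frames faster than the monitor's refresh interval, the extra frames simply never reach the screen (or show up as tearing).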

The mechanics of the game have zip to do with what your monitor can display: 60 Hz (60 FPS), 120 Hz (120 FPS), 144 Hz (144 FPS), etc. It's about how the game responds at higher FPS, and even, at 200 FPS, the rate of fire (which I personally haven't noticed myself). Some folks you see flying around often use an FPS hack to remove the ~200 FPS cap, and yep, many get banned for doing it too.

Again, it's the mechanics, and in particular you do NOT want stuttering, which is worse. Try setting your FPS cap to 60, then to 120, and compare; lower your MSAA, even to off ... then let me know.

Trust me, I loathe playing at anything less than 120 FPS. My current GPUs (a pair of GTX 560s) are a problem: I can get 200+ FPS, but since they stutter I cap at 120 FPS. When I'm buttery smooth and don't stutter I dominate. So for Ghosts I'm going for a pair of GTX 770s on my new 2560x1440 monitor.
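Capping FPS, as discussed above, boils down to enforcing a minimum frame time: if a frame finishes early, wait out the remainder before starting the next one. A rough sketch of the idea (the `run_capped` function is hypothetical; real games do this inside the engine loop or via the driver):

```python
import time

def run_capped(render_frame, target_fps=120, frames=10):
    """Call render_frame() at most target_fps times per second."""
    frame_budget = 1.0 / target_fps  # e.g. ~8.33 ms per frame at 120 FPS
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                          # do the actual work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # wait out the remainder
```

The point of capping below your maximum achievable FPS is consistency: a steady 120 FPS feels smoother than 200 FPS with dips.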

They should be dropping within 3 weeks for sure lol; with Black Friday/November (Newegg), keep an eye out. I just couldn't wait any longer, so I grabbed one a couple of months ago and have had a great experience so far with the ASUS DirectCU II, sitting at 1290 core and 7225 memory; it hasn't gone over 68 degrees under load.

Your GPU is better than both of mine combined now. As long as you have no stuttering, you're golden!

I'm just waiting for the damn prices to drop on the GTX 770's, hopefully very soon, since the 290X release...

Well, since the R9 290X is now out and the R9 290 is coming out today, I believe the prices should drop on the 770's and 780's.

They may drop even further since the GTX 780 Ti will be released in the middle of November.

Personally I am waiting for Maxwell to be released, and for 4K monitors to become cheaper, since Maxwell is supposed to handle 4K surround in SLI more easily than the current cards.

My 680's are technically perfectly fine for now, and will technically be fine even next year, but they are poor overclockers and are starting to show their age to me.

Most of the new GPUs can't OC worth a crap. Your GTX 680's are basically what the GTX 770 re-brands, just with different memory clocks. The GTX 770/780 don't have the stuttering (frame latency) issues that show up in AMD CF vs. SLI comparisons. It's a sad situation for those purchasing an R9 280X/R9 290X in CF, as they'll have a rude awakening. Further, I can't stand, nor will I purchase, reference cards, though the reference GTX 770/780/Titan all looked good and the noise/temps were fine.

Newegg has some R9 290X's ($550~$580 USD) listed for sale, but Nvidia is still stuck on their damn pricing, and yep, I'm keeping an eye peeled. The GTX 770 (to me) is not worth $400, period; not right now anyway.

If all I did was game 24/7, then yep, a 120Hz monitor with uber-low lag; I kinda like the ASUS VG Series VG278H Black 27" 2ms 120Hz, and it would be interesting to compare it to the BenQ Gaming XL2720T Black 27" 1ms 120Hz. The problem is that for anything other than gaming, TN panels blow. Other than that, I wish I could transplant my G500S right/left buttons onto my sluggish-clicking M65 mouse, but I do really like my K90 mechanical KB.

I'm an old fart and simply wanted a 2560x1440 color-accurate monitor like the one at my office. I do SQL/PHP coding and need the larger work area, and no way am I going to swap back and forth with a 120Hz monitor, nor do I want to cover my at-home desk with multiple monitors.

Yep, it can cause tearing, especially if your FPS is stuttering and your minimum FPS drops below whatever max FPS you set. Multiplayer is a whole different world from shooting bots in Campaign and making the rendering 'pretty.'

That should be an accurate assessment of the GTX 660. The card won't hold a sustained 60 fps at max settings & will stutter. Even worse, this new IW engine is likely to be more GPU-resource-hungry than the previous IW engine.

In addition, with the new 600 series driver (version 331.58) the MSAA option seems to have vanished &/or Nvidia removed it. At least I don't see it anymore as an option in the graphics settings of my EVGA GTX 680s. MSAA is a real frame-rate killer, particularly on slower video cards.

Now it's only FXAA. Nvidia seems to have dropped MSAA & TXAA in the new driver, cuz I'm not seeing them at all. Even FXAA says "Not supported for this application" in the options, & I tried different application settings, but FXAA is still not supported on the new driver.

Well, I've seen bad Nvidia drivers before, and 'new' doesn't necessarily mean 'better': fix one thing and F up everything else. I'd roll back my drivers. TXAA, especially 4x, is ideal and does a better job than MSAA IMHO, and MSAA is a hog.