This will unlock experimental DX11 support for WoW (full support will probably be implemented in a future patch). It made me jump from 80 FPS to 100 FPS, which isn't really noticeable since the human eye only sees at around 32 FPS. For you it may be more noticeable.

The idea of high FPS is not to simply get the max, going from 80 to 100 for example. You want to increase the low end of your FPS (think raids/battlegrounds with tons of spells and characters); that is where you will notice the FPS increases.

It made me jump from 80 FPS to 100 FPS, which isn't really noticeable since the human eye only sees at around 32 FPS. For you it may be more noticeable.

Great guide, but I have to nitpick here, since it's a pet peeve whenever someone starts talking about how many "FPS" the human eye can see. Our eyes, in fact, do not see in "frames," but in continuous light. From a biological perspective, there's therefore no known limit to how many FPS we can detect. Personally, I can notice the difference between 60 and 100 FPS (on a 120 Hz screen, of course), but my best friend says he can't really tell the difference between 30 and 60. It's all subjective.

Almost all of the last bunch of questions can be answered by browsing strunker's original post and a few of the posts after it, including some of mine... pretty much all the useful mask numbers are summed up in post #19.

Can't you guys just read them instead of asking questions that have already been answered?

As for Xeons... they're really weird chips, and I found out quickly that I don't know enough about them to give proper advice, other than telling you to seek details here: http://en.wikipedia.org/wiki/Xeon

I saw something in there about quad Xeons still only accepting single-CPU operations or something like that, so I don't know if they actually behave the same as today's quad cores, and the same goes for their HT. You'll probably have to do more reading about it yourself...
If it's actually like having two quad-core HT processors, then that would make 16 threads... There's probably no need to allocate WoW to the last few threads, because the whole point of this stuff is just to get the game off the threads that the OS would be using, not to stick WoW on the highest-numbered threads. You could probably just pick a number that works for a single quad core with HT (84 or 252) or without HT (14), and it might still effectively mask the first core so that the game avoids it.
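If the box really does expose 16 logical processors, the mask arithmetic is the same as for the quads, just with more bits. Here's a small sketch of my own (not from the OP) that assumes the interleaved physical/HT numbering used in this thread; a real dual-Xeon's core enumeration may differ, so treat the 16-thread number as illustrative only:

```python
# Build an affinity mask from a set of logical-processor indices.
# Assumption: logical CPUs alternate physical/HT (0 = C1, 1 = C1-HT,
# 2 = C2, ...) as in this thread's numbering; real hardware may differ.

def affinity_mask(cpus):
    """Each logical CPU n contributes bit 2**n to the mask."""
    mask = 0
    for n in cpus:
        mask |= 1 << n
    return mask

# Quad core + HT, skipping the first physical core and its HT twin:
# physical cores 2, 3, 4 are logical CPUs 2, 4, 6.
print(affinity_mask([2, 4, 6]))        # 84, the quad value from the OP

# Dual quad Xeon + HT (16 logical CPUs): same pattern, physical cores
# 2 through 8 would be logical CPUs 2, 4, 6, ..., 14.
print(affinity_mask(range(2, 16, 2)))  # 21844
```

The same helper reproduces 255 (all eight logical CPUs of a quad with HT) as `affinity_mask(range(8))`.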

I'm sorry, but I actually went and read the AMD specs for my processor and did not find any info on HT.
Which I should have known means they have no such thing; that was the only thing I was missing.
15 it is.

BTW, great thread! I will go and check my TCP settings also. I play a lot of PvP and I was disappointed at how low my FPS is even though my PC is so strong.

Quick thing: I have W7 and DX11. I tried adding that line you posted and sure enough my FPS went up a few notches, but now I can't see any water unless it's on low. I used to have it on ultra. Any idea what's going on? Yes, I had the water working right before I added the line.

Great guide, but I have to nitpick here, since it's a pet peeve whenever someone starts talking about how many "FPS" the human eye can see. Our eyes, in fact, do not see in "frames," but in continuous light. From a biological perspective, there's therefore no known limit to how many FPS we can detect. Personally, I can notice the difference between 60 and 100 FPS (on a 120 Hz screen, of course), but my best friend says he can't really tell the difference between 30 and 60. It's all subjective.

This bugged me too, only because there have been practical studies which put the minimum refresh rate of a display necessary for the human eye to accept what they see as "real" much higher than this number the guy pulled out of thin air with no basis.

It's true that we see basically a continuous exposure but Showscan was developed by Doug Trumbull in the late 70s based on biometrically measuring a viewer's response to changes in frame rate. Refresh rates higher than 72fps are where no change in response was measurable in the viewer. And that's objective data.

Of course not responding in a measurable way does not equal an inability to perceive (though it is an odd concept). It's pretty easy to conclude, however, that refresh rates above those we actually respond to are practically irrelevant while refresh rates below those we actually respond to aren't insignificant, like the author seems to think.

I just want to note that the MSMQ key DOES NOT EXIST IN WINDOWS VISTA BY DEFAULT. If you cannot find the MSMQ key, then please do this instead:

Open up Notepad and paste in the following:

Save it as MSMQ.reg (yes, .reg, not .txt) and then double-click it. It will ask if you're sure, blah blah; click Yes. You can then delete the file. Reboot.

---------- Post added 2011-01-18 at 11:12 AM ----------

I can't edit my post...

Also, for some reason mmo is putting a space in the word PARAMETERS, so please remove the space before saving the file.

I assume this applies to W7 as well? And what happens if I accidentally saved it without fixing the word "parameters"? I totally spaced on fixing it, but now I can't find where it is saved so I can fix it.

Quick thing. I have W7 and DX 11. I tried adding that line you posted and sure enough my fps went up a few notches but now I cant see any water unless it's on low. I used to have it on ultra. Any idea what's going on? Yes, I had the water working right before I added the line.

strunker, you are a genius. And so is the guy who figured out which settings to use for a hexcore. WoW is running smooth like silk; my task manager looks like the one tregan posted.

lol ty..

And yeah, the hexcore calculation should work... 56 will be the proper number.

TY for the images and the testing done for Hexcore.

If it were possible to edit the original post I would, because I think it would be worthwhile to put that in the TL;DR portion at the top so that people do not have to sift through this entire thread to get the information.

---------- Post added 2011-01-18 at 03:14 PM ----------

Originally Posted by Sevyvia

Okay, I'm probably being an idiot here, but I've read the OP and I cannot for the life of me find AffinityMask or processAffinityMask in my config.wtf. The only thing relating to cores in config.wtf seems to be coresDetected "2". I use Win7 if that matters.

Do I just type in a new CVAR of "SET processAffinityMask "5" " myself?

Yes you can do that.

However, if you don't have a dual core with HT / quad core / hex core, this will not help you much, as WoW will already be running over both of your available cores by default.

---------- Post added 2011-01-18 at 03:15 PM ----------

Originally Posted by Plasmon

The command isn't there in the file by default, so if you want it, you have to just add it. However, it sounds like you have a dual core processor, and in that case WoW already utilizes your CPU cores in the best configuration so you don't need to do anything and won't benefit from adding the CVAR. (See post #19)

If you upgrade to more cores someday, the line to add to the bottom of your config.wtf file is:
SET processAffinityMask "84"
(or whatever affinity mask number is best for you)
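For anyone scripting this, here is a hypothetical helper of my own (not from the guide) that adds or updates a CVar line in a config.wtf-style file. The path is an example only; point it at your real WTF\Config.wtf, and close WoW first, because the client rewrites the file on exit:

```python
import os
import tempfile

def set_cvar(path, name, value):
    """Replace or append 'SET name "value"' in a config.wtf-style file."""
    with open(path, "r", encoding="utf-8") as f:
        # Drop any existing line for this CVar so we don't duplicate it.
        lines = [l for l in f.read().splitlines()
                 if not l.startswith(f"SET {name} ")]
    lines.append(f'SET {name} "{value}"')
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

# Demo on a scratch copy; use your real WTF\Config.wtf path instead.
path = os.path.join(tempfile.mkdtemp(), "Config.wtf")
with open(path, "w", encoding="utf-8") as f:
    f.write('SET coresDetected "4"\nSET processAffinityMask "0"\n')

set_cvar(path, "processAffinityMask", 84)
print(open(path).read())
```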

Thanks for the guide. I do have one question regarding the CPU.
I use an AMD Phenom II X4 965 3.40 GHz and I'm not seeing where this CPU belongs on your list. Dunno if it has HT or not (don't even know what HT is). What affinity mask should I set it to? 56, like Shinrou with his X6 1090T?

Thanks in advance

Originally Posted by Raqubor

Alright, I'm a lazy bum, so can someone tell me what to set for AMD Phenom II x4 965 Black Edition?
I would appreciate it greatly!

As far as I can tell, none of these AMD processors have any type of hyperthreading technology, which means they just have the four (quad) / six (hex) logical cores.

So, for the phenom x4, you would use 15.

Same as the "i5 Quadcore which does not have HT " in the OP.

---------- Post added 2011-01-18 at 03:23 PM ----------

Originally Posted by Robula

So could you, or anyone, tell me why setting "CVar processAffinityMask 255" to force WoW to use all 4 cores + the 4 HT 'virtual' cores would or would not work, as opposed to your recommended setting of 84.

For the record, WoWWiki suggests setting 85 for Core i7 w/ HT.

In short, it is not recommended to run anything specifically over the "virtual" cores, virtual being the hyperthreaded cores.

You will see no significant gains by running anything over the HT cores, and doing so can actually cause the CPU to bottleneck, which will hinder performance.

I did FPS tests with a setting of 255 and actually saw a drop in FPS.

---------- Post added 2011-01-18 at 03:26 PM ----------

Originally Posted by sylpheed

I was wondering how this changes in a multi-CPU setup (I have 2 quad Xeons with HT: 8 physical + 8 virtual cores total). Also, is this applicable under Mac OS? Dunno if Grand Central Dispatch or any other related tech does this already for us.

That is a really really good question.

I do not believe this will apply on a Mac. As you said, GCD has handled the majority of application threading since Snow Leopard; I'm not sure if you can override that at all.

I am also not sure if WoW is written to take advantage of GCD. An application's code has to be written with blocks for GCD to schedule it.

That part would be a really good question to ask perhaps on the Blizz technical forums. Kind of curious what the answer would be.

What OS are you using for your dual-quad setup?

---------- Post added 2011-01-18 at 03:29 PM ----------

Originally Posted by Azioth

I'm a bit confused... Here you said to set an i5 without HT to 15, but WoWWiki says 15 to use all the cores and 7 as optimal?
Which one should I use?

I am not sure where they are getting 7 from, but that doesn't appear logical.

The values for the i5s have been tested, and proven to function as expected.

Perhaps that page should be edited once this is proven out.

---------- Post added 2011-01-18 at 03:31 PM ----------

Originally Posted by surgio

The idea of high FPS is not to simply get the max, going from 80 to 100 for example. You want to increase the low end of your FPS (think raids/battlegrounds with tons of spells and characters); that is where you will notice the FPS increases.

Very true. Except in my specific case my system does not dip below 75 FPS, and that is only because I have vertical sync enabled, which limits your FPS to the refresh rate of your monitor, 75 Hz in my case.

So 0101 is actually 2 + 16, which would be 18.
To get 80 it's the inverse:
00001010

So really your cores pan out:
C1|C1v|...|C4|C4v
as originally stated.

That was how I had originally placed the cores, and that seemed logical and correct to me.

When this guide was originally posted on my guild's forums, before I brought it here, someone suggested that was incorrect and said to reverse it.

Interesting.

I wish it were possible to edit the OP. There are lots of corrections/additions I would like to make to it.

---------- Post added 2011-01-18 at 03:39 PM ----------

Originally Posted by BurnetRhoades

This bugged me too, only because there have been practical studies which put the minimum refresh rate of a display necessary for the human eye to accept what they see as "real" much higher than this number the guy pulled out of thin air with no basis.

It's true that we see basically a continuous exposure but Showscan was developed by Doug Trumbull in the late 70s based on biometrically measuring a viewer's response to changes in frame rate. Refresh rates higher than 72fps are where no change in response was measurable in the viewer. And that's objective data.

Of course not responding in a measurable way does not equal an inability to perceive (though it is an odd concept). It's pretty easy to conclude, however, that refresh rates above those we actually respond to are practically irrelevant while refresh rates below those we actually respond to aren't insignificant, like the author seems to think.

I don't think they are insignificant.

I was mostly speaking from personal experience, and I do not have super eyes.

The human eye is actually, and obviously, not perfect, even for those of us who have 20/20 vision.

In any event, I would remove that line if possible, since it seems to be aggravating some people.

---------- Post added 2011-01-18 at 03:40 PM ----------

Originally Posted by Kalix

Quick thing: I have W7 and DX11. I tried adding that line you posted and sure enough my FPS went up a few notches, but now I can't see any water unless it's on low. I used to have it on ultra. Any idea what's going on? Yes, I had the water working right before I added the line.

Thank you

What graphics card are you using?

Is the water okay again if you remove the DX11 line?

Can you go to Start / Run / dxdiag and post a screenshot of what you see in that config utility?

thanks

---------- Post added 2011-01-18 at 03:41 PM ----------

Originally Posted by havix

I have a question about SET processAffinityMask "84".

Every time I open WoW, I look back into the config file and my value has been reset to 0. I change it back to 84, open and close WoW, and it is already back to 0. Any ideas?

Can you please open your Task Manager and take a screenshot of your Performance tab? Just curious to see how many cores are being detected by your OS.

---------- Post added 2011-01-18 at 03:43 PM ----------

Originally Posted by ropo

Could you be a bit more specific?
How do I find the "0cc5b647-c1df-4637-891a-dec35c318583"?

Start / Run / regedit

Once inside Registry Editor, hold down Ctrl + F (the Control and F keys at the same time). Alternatively, you can click Edit at the top left of the window and click Find.

This will open a small dialogue box; copy and paste that string value 0cc5b647-c1df-4637-891a-dec35c318583 into the Find box.
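If clicking through the Find dialog is tedious, the same search can be done from a command prompt. This is just an alternative of my own, not from the guide; reg.exe ships with XP and later:

```shell
:: Search the registry for that GUID from a Windows command prompt.
:: /f = string to find, /s = recurse into subkeys. Searching all of
:: HKLM can take a while.
reg query HKLM /f "0cc5b647-c1df-4637-891a-dec35c318583" /s
```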

I'm running WoW on Win7 with an ATI HD 4850 card, which supports DX10.1. My question is: if the card does not support DX11 natively, is there any benefit to running WoW with the line SET gxApi "d3d11" in order to run WoW in DX11 mode?

For any future questions addressed specifically to me that no one else here can help with, please private message me. There are a lot of smart people commenting in here, though, so I doubt that will be needed. It's becoming hard to track all the responses/questions.

Does anyone know of a way I can get edit privileges on my original post?

I frequent these forums a lot, but I never post here, so I'm kind of unfamiliar with how to proceed as far as that goes.

---------- Post added 2011-01-18 at 03:48 PM ----------

Originally Posted by spyke

I'm running WoW on Win7 with an ATI HD 4850 card, which supports DX10.1. My question is: if the card does not support DX11 natively, is there any benefit to running WoW with the line SET gxApi "d3d11" in order to run WoW in DX11 mode?

Nope.

The DX11 mode is only experimental anyway, and with DX10 you will see no benefit.

You need DX11 installed on your OS, which you should have with Windows 7, but you also need a DX11-capable graphics card to take advantage of it.

Could you please clarify this for me: you recommend leaving the primary core unused, and at the same time in the post above you said "So, for the Phenom X4, you would use 15."
Looking at the chart posted on page #1, the number for "Quad-core, non-primary core" is in fact 14. Which number would be optimal, then?

I noticed some people asking for the right processAffinityMask value for the Phenom II X4 965 BE, and I saw it says to use "15", but I'm kinda surprised since my FPS in a Stormwind mass is 45 with "15". And of course I read the first post, and it says "15" or "84"; I tried to put in "84" but it keeps changing to "0" when I launch WoW.

The majority of Windows programs, whether you run them on XP/Vista/7, will default to running on core 1.

The point of setting the affinity mask is to have WoW run on whatever cores you specify. In my specific case, I have WoW running over physical cores 2, 3, and 4, so that if I am multitasking while playing and running other things, I don't have to worry about any CPU bottlenecking.

For most people, none of these tweaks are "necessary". It is mostly just for those who like to run as optimally as they can. Efficiency is a quality we breed in people here, and I carry it over to every aspect of life.

---------- Post added 2011-01-10 at 03:40 PM ----------

Yes. WoW does have three main threads now, as of 3.0 I believe, or 3.3. One of those patches split the game into 3 main threads, with a bunch of other filler threads.

I did a bunch of tests when I first wrote this article last year on my guild's website. Below is a quote from that thread...

I don't see any reason to run WoW over the virtual cores. In fact, in most cases it is recommended not to run anything over the virtual cores and to let the OS dictate what processes are scheduled on them. In the test below I actually had lower overall FPS with 255 than when I forced it over the last 3 physical cores.

80 = Forces WoW to run on physical cores 3/4; cores 1/2 are not used by WoW. Total cores used: 2.

85 = Forces WoW to run on all physical cores 1/2/3/4. Total cores used: 4.

84 = Forces WoW to run over physical cores 2/3/4. Total cores used: 3. This is what I am predicting to be the best option, as it will leave core 1 open (the default core most Windows applications run over) for other Windows functions, meaning the system will remain stable while the other 3 cores handle all the WoW processes. Since WoW has tri-core support and a main thread isn't going to be going over the 4th core, it seems almost wasteful to give the game access to any more than 3 cores.

255 = Forces WoW to run on all physical cores plus all of the HT cores: 1/1a/2/2a/3/3a/4/4a. Total cores used: 8.
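To see why those particular numbers pick out those cores, you can decode the bits. This is my own sketch, assuming the interleaved core numbering used above (C1, C1a, C2, C2a, ...):

```python
# Decode an affinity mask into the logical cores it enables.
# Labels assume the interleaved order used above: bit 0 = C1, bit 1 = C1a
# (the HT twin of core 1), bit 2 = C2, and so on.

LABELS = ["C1", "C1a", "C2", "C2a", "C3", "C3a", "C4", "C4a"]

def cores_for(mask):
    return [LABELS[bit] for bit in range(8) if mask & (1 << bit)]

print(cores_for(80))   # ['C3', 'C4']              -> physical 3/4 only
print(cores_for(84))   # ['C2', 'C3', 'C4']        -> physical 2/3/4
print(cores_for(85))   # ['C1', 'C2', 'C3', 'C4']  -> all four physical
print(cores_for(255))  # all eight, HT cores included
```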

_____________

I would go with either 84 or 85. Both seemed pretty solid, and I noticed a substantial improvement in load times with 84, so that is what I am sticking with.

---------- Post added 2011-01-10 at 03:44 PM ----------

And yes, there is no TCPAck in Vista/7; perhaps I should have specified that in the post.

However, there is TCPNoDelay.

TCPNoDelay is directly related to the Nagle algorithm, and disabling Nagle is still a huge decrease in ms latency. I highly recommend everyone does it, either manually or using the Leatrix tool from wowinterface.com.
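The registry tweak applies system-wide; at the code level, disabling Nagle is just a per-socket option. A minimal illustration using standard Python sockets (nothing WoW-specific, just showing what the setting does):

```python
import socket

# Nagle's algorithm batches small writes into fewer packets, trading
# latency for throughput. TCP_NODELAY disables it on a single socket;
# the TCPNoDelay registry value does the equivalent for the system.

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

# Confirm the option took effect (non-zero means Nagle is off).
print(sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY))
sock.close()
```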

Just what I needed to know; I will be using 84. I'm at work, so: are you saying all we have to change in the config file is the CVar to 84, or do we have to include the binary stuff?