When I used to run my Phenom II X4 965 @ 3.8GHz I had drops below 20FPS on Ultraxion; when I switched to my 2500K it ran perfectly smooth at 60FPS.
That Phenom is now in my girlfriend's computer and I can see her struggling to get fluid gameplay in LFR sometimes.

That's why I said "relatively". And don't give me the old "anything above X is fine, do you need more?" line. If you're buying a CPU, you want the best bang for your buck, and the brand shouldn't be part of the choice by default, at least not as a higher priority than performance/price. If all he plays is WoW and he wants more performance, he's best off with an Intel CPU. Period.

He might be, but he doesn't have the money for an Intel. He doesn't want to switch out his mobo (or he just doesn't have the cash for it), so in that case the 6300 is fine.

And when you OC the 6300, it is on par with (maybe even better than) the i3 3220/3225.

To a certain extent. I think it is only useful when we are talking about FPS below 60. And I can only show you one example, with Skyrim at 1080p, where the OCed 6300 gets 5 fps more than the 3225/3220. (I have tried looking for more reviews but I just can't find any with these kinds of graphs.)

Because it's not 100 extra. In fact, Intel offers more performance for less.
For example, using the AnandTech link:
The FX-6300, which retails for 120 euro in the Netherlands, gets 79.4 fps.
The i3-2100, which retails for 105 euro in the Netherlands, gets 94.1 fps.
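If it helps, the bang-for-buck point those two numbers make can be worked out directly. A quick sketch; the prices and fps figures are just the ones quoted in this thread, nothing else:

```python
# fps-per-euro comparison using the AnandTech fps figures and
# Dutch retail prices quoted above (the poster's numbers).
cpus = {
    "FX-6300": {"price_eur": 120.0, "fps": 79.4},
    "i3-2100": {"price_eur": 105.0, "fps": 94.1},
}

for name, specs in cpus.items():
    value = specs["fps"] / specs["price_eur"]  # fps you get per euro spent
    print(f"{name}: {value:.2f} fps per euro")
```

With these numbers the i3-2100 comes out ahead on both raw fps and fps per euro, which is the whole argument in one division.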

That's why I said "relatively". And don't give me the old "anything above X is fine, do you need more?" line. If you're buying a CPU, you want the best bang for your buck, and the brand shouldn't be part of the choice by default, at least not as a higher priority than performance/price. If all he plays is WoW and he wants more performance, he's best off with an Intel CPU. Period. I'm really sick of this endless debate: whenever someone suggests a certain brand, the inevitable brand war ensues, especially if people start making up false data. Really, if a certain processor is better at a game, stop defending the other one, because the OP doesn't benefit from this. You just end up getting his thread closed or w/e.

You're mistaking me for an AMD fanboy, I'm not. In fact, people on this forum helped me to put together the build I'm getting which includes an Intel processor. Trust me, I honestly don't have a dog in this fight. I really couldn't care less if one put the other out of business forever.

I'm just saying that if the OP is only going to play WoW, he can max it out for less. It doesn't sound like he's a hardcore player going for every possible fps, so why buy a $230 CPU to max it out when you can get very similar performance out of a $130 CPU? Sounds like he's on a budget and wants the best bang for his buck, just like you said. If that happens to be an AMD CPU, which will do what he wants it to do and cost less than an Intel CPU, why not explore that option?

I guess you're fine recommending the dual-core i3, and honestly I looked into getting it myself, but I heard so many mixed reviews about how it actually performs (I've heard that it bottlenecks GPUs, that it doesn't really perform as well as the benchmarks make it look, etc.) that I've kind of come not to trust it. Besides, moving forward it doesn't make much sense to invest in a dual-core anything, according to most of the stuff I've read.

But I am totally willing to admit my noobishness on the subject. I don't really mess with reading benchmarks and stuff because they don't quite reflect the performance you'll probably see at home anyway, unless you go invest in an i7 OCed to 4.7GHz (yes, I know they do it to avoid bottlenecks...). That's why I just go on YouTube and look up the builds I'm interested in to see what the real-world performance is. All I'm saying is that the AMD-based systems I've seen there perform just as much to my liking as the Intel ones. I'm not saying one is better, not recommending one over the other, just saying that to me it's all good either way. Might as well go the cheaper way if you're on a budget and only interested in playing one game in particular.

Really? Is it already bottlenecking my system? :/ All my friends told me graphics cards are like 80% of the performance in games, and that I should just focus on getting a new graphics card...
I'm a student, so I don't have a ton of cash lying around; I don't think I can afford both a CPU and a GPU upgrade. :/

80% of "games". Every game is different in regard to what it uses. Take RAM, for example: most games don't profit from more than 2 gigs, but PlanetSide 2 won't run well below 6.

WoW is very heavily CPU dependent. It uses the GPU just for particle effects, shadows and shaders. Everything else relies on the CPU.

I would just rather see for myself how it looks and feels rather than read about what might potentially happen. Like I said, to me there's very little difference between 80 fps and 119 fps. All I said to the OP was, "Hey, maybe you don't have to spend so much. Check out these videos of cheap systems, AMD isn't as bad as that guy said really, look..."
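For what it's worth, the "80 vs 119 fps" point looks even smaller when expressed as frame times, which is closer to what you actually perceive. A minimal sketch; the two fps values are just the ones from the post above:

```python
# Convert frames per second into the time each frame takes to render.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps  # milliseconds per frame

for fps in (80, 119):
    print(f"{fps} fps = {frame_time_ms(fps):.1f} ms per frame")
```

The gap between the two is roughly 4 ms per frame, which is why the difference is hard to notice unless you are chasing every last frame.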

And I can never see benchmarks of systems that I'm actually interested in getting, which is why I think YouTube is a better way to gauge it. You get to see exactly how it will actually hold up instead of just trying to imagine what it looks like.

But again, I admit to being new to this. Maybe it's because I've been playing WoW at 20 fps on lowest settings on my MacBook for the last 3 years that when I see somebody playing at 50 fps on Ultra (not 119, *gasp*, I know...) it looks really good to me, and I'd be happy with that game experience, thereby negating the necessity to overspend on a computer that will deliver it. That's really all I'm trying to say. I'm not saying that AMD is great and Intel sucks, or that I'm right and you're wrong.

I'm currently playing WoW at High settings with around 45-60 fps in leveling zones, and only around 18 fps in 25-man raids, so I'm looking to upgrade!
Basically, I only play WoW (like 90% of my gaming on the PC is WoW) and I'm looking for the cheapest card that will run WoW at Ultra settings smoothly...

Don't bother. You're getting 10 fps at most in 25-man raids. The open world will be the same.

---------- Post added 2013-01-30 at 02:38 PM ----------

Originally Posted by n0cturnal

When I used to run my Phenom II X4 965 @ 3.8GHz I had drops below 20FPS on Ultraxion; when I switched to my 2500K it ran perfectly smooth at 60FPS.
That Phenom is now in my girlfriend's computer and I can see her struggling to get fluid gameplay in LFR sometimes.

That is a huge lie. You can't get a stable 60 FPS in a 25-man encounter. Not with a 2500K, NEVER. Maybe 45 with an OCed 3960X.

That is a huge lie. You can't get a stable 60 FPS in a 25-man encounter. Not with a 2500K, NEVER. Maybe 45 with an OCed 3960X.

Here's what I wanna know: how did people back in 2004 manage to pull off 40-player raids? Even with the technology we have now, everybody seems to be having trouble at 25, and people always say that WoW's graphics engine is still pretty much the same as it was then (I don't know this for sure, to be honest, it's just what I've heard, I really have no clue how it works)... so what kind of NASA top-secret system did you have to have back then to do 40-mans?

Even if you're new and your comments are merely suggestions, they remain false, which is always bad. The comparison I posted was a same-priced head-to-head where the Intel just flat-out won. No extra price or i7 at 4.7GHz mentioned anywhere. Neither do 'feelings' really help the OP. A 15 fps lead always represents more performance; that's why you use benchmarks.
YouTube videos are very unreliable, and you usually can't see the actual detail settings they're playing on.

Originally Posted by RicardoZ

Here's what I wanna know: how did people back in 2004 manage to pull off 40-player raids? Even with the technology we have now, everybody seems to be having trouble at 25, and people always say that WoW's graphics engine is still pretty much the same as it was then (I don't know this for sure, to be honest, it's just what I've heard, I really have no clue how it works)... so what kind of NASA top-secret system did you have to have back then to do 40-mans?

Simple -

Graphics were worse.
Fewer textures, details and effects.
WoW takes advantage of two cores at most, and most CPUs then had two cores, making the jump from two to four/eight not huge. I have my E8400 at 4.0GHz; it runs almost the same as my 2700K at 5.0GHz. Speechless myself.
Systems haven't really gotten that much better; I see cards from the 4xxx series from Nvidia and the 5xxx series that beat current best cards. Of course the 7970 that just recently came out is better, but the price tag is almost 2.4k dollars where I live.

Here's what I wanna know: How did people back in 2004 manage to pull of 40-player raids? Even with the technology we have now everybody seems to be having trouble at 25, and people always say that WoW's graphics engine is still pretty much the same as it was then (I don't know this for sure, to be honest, it's just what I've heard, I really have no clue how it works)...so what kind of NASA top secret system did you have to have back then to do 40 mans?

Back in vanilla there was almost zero AoE damage or healing in raids, you also didn't have several HoTs running on every member of the raid, and the bosses had a maximum of 8 debuffs on them. That is a lot of extra info the client now has to account for, and most of it puts a load on the CPU.
We have also had massive changes to the graphics engine: new lighting, weather effects, shadows, more spell details and so on.

Not to mention that most people still played on CRT when WoW was released, 1024x768 or 1280x1024.

Back in vanilla there was almost zero AoE damage or healing in raids, you also didn't have several HoTs running on every member of the raid, and the bosses had a maximum of 8 debuffs on them. That is a lot of extra info the client now has to account for, and most of it puts a load on the CPU.
We have also had massive changes to the graphics engine: new lighting, weather effects, shadows, more spell details and so on.

Ehm... not sure you actually played vanilla. About the graphical engine you're right, but that has already been said, in addition to the screen resolution the game was played at. But 8 debuffs at most on a boss? I remember our Onyxia, when she was flying in the air, had probably 40 DoTs. Visual effects are another thing, but that doesn't matter, as there were more players to make up for it.

Even if you're new and your comments are merely suggestions, they remain false, which is always bad. The comparison I posted was a same-priced head-to-head where the Intel just flat-out won. No extra price or i7 at 4.7GHz mentioned anywhere. Neither do 'feelings' really help the OP. A 15 fps lead always represents more performance; that's why you use benchmarks.
YouTube videos are very unreliable, and you usually can't see the actual detail settings they're playing on.

That's usually listed in the description, and I've found that if you ask, the authors of the videos will tell you. I'm not saying that the Intel didn't win; of course it did. It's like somebody running a 5-minute mile, then somebody else running a 5-minute-10-second mile, and then deciding that the guy who ran the 5:10 is a horrible noob who shouldn't even be competing and runs like a wet meatloaf. Sure, somebody else won, but it doesn't mean the other guy totally sucks and is entirely useless, especially to somebody who can't afford running lessons from the gold medalist.

Ehm... not sure you actually played vanilla. About the graphical engine you're right, but that has already been said, in addition to the screen resolution the game was played at. But 8 debuffs at most on a boss? I remember our Onyxia, when she was flying in the air, had probably 40 DoTs. Visual effects are another thing, but that doesn't matter, as there were more players to make up for it.

WoW Patch 1.7.0 (2005-09-22): The debuff limit has been increased to 16 (from 8). In addition, the client will now display all 16 debuffs.
TBC Patch 2.0.1 (2006-12-05): Debuff slot limit increased to 40.
WotLK Patch 3.0.2 (2008-10-14): There is no longer a limit on the amount of Debuffs a target can have on them at any time.

WoW Patch 1.7.0 (2005-09-22): The debuff limit has been increased to 16 (from 8). In addition, the client will now display all 16 debuffs.
TBC Patch 2.0.1 (2006-12-05): Debuff slot limit increased to 40.
WotLK Patch 3.0.2 (2008-10-14): There is no longer a limit on the amount of Debuffs a target can have on them at any time.

OP: If you want a performance boost, don't go Intel. Buying a new board and then a CPU would cost immense amounts of cash. Go for the FX-8150, grab a cooler, clock that baby to 5.0GHz and run stuff well enough.

Back in vanilla there was almost zero AoE damage or healing in raids, you also didn't have several HoTs running on every member of the raid, and the bosses had a maximum of 8 debuffs on them. That is a lot of extra info the client now has to account for, and most of it puts a load on the CPU.
We have also had massive changes to the graphics engine: new lighting, weather effects, shadows, more spell details and so on.

Not to mention that most people still played on CRT when WoW was released, 1024x768 or 1280x1024.

Honestly, I didn't notice a difference until 4.0 came out. I started playing in July of 2010, which was a couple of months before that patch, and the game was very smooth on mid-high settings on my MacBook's integrated Intel graphics (which I still use). Then once Cataclysm hit, it took my decent framerate along with it, and I was bumped down to the slums of 20fps low settings. That's why I get excited about getting 40 fps in Stormwind on an FX-6100 with a 7770.