Finally, I could upgrade my CPU, as I'm not a fan of the desktop CPUs that AMD plans on releasing, and would rather go back to a 4-core. I would probably go for a 4790K? I would obviously need another motherboard too. Or is it worth waiting for the Broadwell CPUs?

The games I will be playing over the next 6 months or so will be The Witcher 3 and Dragon Age: Inquisition. If anyone can give me some useful advice, I would appreciate it.

I personally would not get another R9 290, just because your current motherboard would run one card at x16 and the other at x4. Yes, it would still increase performance, but not by as much as on another motherboard (one that would run x8/x8 or x16/x16). The R9 290 is still an excellent video card by itself.

You already have pretty much the best CPU your motherboard can handle, so if you wanted to stick with AMD (such as an FX 8320 or FX 8350) you would need a newer motherboard that supports AM3+ CPUs. There are AM3+ motherboards that will accept your current Phenom II, so you could upgrade to an FX-series CPU at a later time (if you are trying to limit your expenses all at once).

You have the "sweet spot" for RAM right now with 8GB. You could go to 16GB, but that might be overkill at the moment.

My opinion would be to save up more money and go with a newer Intel motherboard and an Intel Core i5 4690K. I say the i5 4690K is the better buy because it is less expensive than the i7 4790K but still overclocks well. The i7 comes with Hyper-Threading and more L3 cache, two things that most games really don't make use of (yet). I know someone here will probably say get an i7, but if you want to keep costs down and still get excellent performance, the i5s are the sweet spot for gamers. Plus, if you felt like upgrading in the future, you could drop in an i7 for better performance.

I personally would not get another R9 290 just because one would run one at x16 mode and the other at x4 mode with your current motherboard.


I've always wondered about the purpose of the SLI/CrossFire bridge and never looked it up until now. And to be honest, I didn't realize the R9 series no longer uses the bridge.

Current generation (XDMA)
The Radeon R9-285, R9-290 and R9-290X graphics cards (based on Graphics Core Next 1.1 "Volcanic Islands") no longer have bridging ports. Instead, they use XDMA to open a direct channel of communication between the multiple GPUs in a system, operating over the same PCI Express bus which is used by AMD Radeon graphics cards.

PCI Express 3.0 lanes provide up to 17.5 times more bandwidth (15.754 GB/s for a ×16 slot) compared to the external bridges (900 MB/s), rendering the use of a CrossFire bridge unnecessary.
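The 17.5× figure in the quote checks out against the two quoted numbers (a quick sanity check, using only the values given above):

```python
# Bandwidth figures as quoted: PCIe 3.0 x16 slot vs. the legacy
# external CrossFire bridge.
pcie3_x16_gbs = 15.754  # GB/s for a x16 PCIe 3.0 slot (quoted)
bridge_gbs = 0.900      # GB/s (900 MB/s) for the external bridge (quoted)

ratio = pcie3_x16_gbs / bridge_gbs
print(f"PCIe 3.0 x16 is {ratio:.1f}x the bridge bandwidth")  # -> 17.5x
```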


PCIe 2.0 x4 would be crippling for an R9 290 that no longer makes use of a bridge. Even if the bridge only carried a fraction of the bandwidth requirements, it seems it would still help on the newest PCIe 2.0 systems. I'm guessing AMD decided to move on and let the past be the past. Can't say that I blame them for it, though.

Turns out my R9 290 is playing up anyway: it's crashing the ATI drivers and producing artifacts at 72°C (it's not overclocked) even though it's rated for 95°C. So I'm swapping it for the excellent new GTX 970.

I will probably go for the Intel upgrade, as I know AMD CPUs are leagues behind Intel at the moment, and it's probably what's stopping me from playing Rome 2 properly. I wasn't aware of the x4 PCIe slot, so thank you for notifying me of that. I think I would still like an i7, though, even if its extra features aren't widely supported yet, for future-proofing really. Perhaps I could just go for a lower-clocked model, as I'm familiar with CPU overclocking.

I'm probably ruling out the monitor upgrade, since ideally I would need two cards to drive it at WQHD resolution; I imagine one GTX 970 would be just about enough to run, say, The Witcher 3 at max settings without AA/supersampling.