Getting back into the PC building game

Assuming you can pull an HDD out of a Mac the way you can with a PC, I reckon it'd be better to get a cheaper SSD now instead of that Barracuda, and use your old HDD in the new PC if you need the extra storage space for music and the like.

Dirtbox wrote:
Migrating from an HDD to an SSD is a huge pain in the arse; you're better off getting the boot/program drive to begin with and expanding with an HDD later.

Or just enable RAID on the motherboard when you build it, before you install the OS (make sure the motherboard supports Intel Rapid Storage Technology). Then, when you eventually buy an SSD, pick one no larger than 60GB and enable RST in your Intel drivers in the OS.

The SSD will then cache the HDD data you access most frequently, which is better than installing the OS on an SSD and your apps and data on an HDD.

Plus, you can do this without reinstalling the OS or even changing drivers.

Sadly, I very much doubt you can get the features you are looking for to work in your budget.

As a starting point, the case and PSU you've chosen aren't any good. The case is old technology (AC'97/USB front panel; it should be HD Audio/USB 3.0), making it a poor match for the newer motherboard. The PSU isn't 80 Plus certified and only has one PCIe power connector, which isn't enough to power SLI graphics cards. The case is also too small for the 10.7" GTX 660 card you selected.

If the current next-gen rumours are to be believed, that value is less than half the memory bandwidth of the slower DDR3 setup used in the Durango (64GB/sec).

If you've read the NDA'd technical specs for the coming systems, then that value will either not bother you at all, or it might concern you about your new PC's shelf life for gaming: you'd either need a new GPU with at least 4GB of GDDR5, or a system based around socket LGA2011 rather than the socket 1155 setup.

Durango and the PS4 are rumoured to have a minimum of 4GB of main memory at a minimum of 64GB/sec bandwidth. So an Ivy Bridge system might need a GPU with 4GB of GDDR5 to match the consoles' specs (to be able to run next-gen cross-platform games with any advantage).
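As a rough illustration of the bandwidth gap being described: peak theoretical memory bandwidth is just bus width times transfer rate. The parts in this sketch (a typical dual-channel socket 1155 setup, a GTX 660's GDDR5) are my own illustrative examples for the era, not figures taken from the thread, apart from the rumoured 64GB/sec console number:

```python
# Peak theoretical memory bandwidth: (bus width in bits / 8) * transfer rate.
# The parts below are illustrative examples for the era, not figures taken
# from the thread (apart from the rumoured 64GB/sec console number).

def peak_bandwidth_gbs(bus_width_bits, mega_transfers_per_sec):
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return bus_width_bits / 8 * mega_transfers_per_sec * 1e6 / 1e9

DURANGO_RUMOURED = 64.0  # GB/s, as quoted above

pc_ram = peak_bandwidth_gbs(128, 1600)   # dual-channel DDR3-1600 on socket 1155
gtx660 = peak_bandwidth_gbs(192, 6008)   # GTX 660's 192-bit GDDR5

print(f"Dual-channel DDR3-1600: {pc_ram:.1f} GB/s")     # 25.6 GB/s
print(f"GTX 660 GDDR5:          {gtx660:.1f} GB/s")     # 144.2 GB/s
print(f"PC RAM vs rumoured Durango: {pc_ram / DURANGO_RUMOURED:.0%}")  # 40%
```

Which is where the "less than half" comes from: the system RAM a socket 1155 build feeds its CPU with is only about 40% of the console's rumoured bandwidth, even though the graphics card's own GDDR5 is far faster.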

Buy a Cell processor and use that for everything, because you don't need anything else if you've got a cell because the cell is the best because sony told me it was the best one for playing games and it can do loads of stuff that no one even knows about yet because they said so at E3 that year and it's too powerful for using for games but you can use it for playing games because it's cool but you can make nuclear bombs with it instead if you wanted because sony were talking about that once and saddam wanted to make a giant robot to crush the west and he was going to put a cell into it's brain because it is the best processor ever made.

Durango and the PS4 are rumoured to have a minimum of 4GB of main memory at a minimum of 64GB/sec bandwidth. So an Ivy Bridge system might need a GPU with 4GB of GDDR5 to match the consoles' specs (to be able to run next-gen cross-platform games with any advantage).

Now do you understand?

The next console generation isn't really going to get going for another year or so anyway. I'm pretty confident my overclocked i5 2500K and 2GB 660 Ti will at least be in a similar enough ballpark to keep me going for another couple of years.

Keeping up with the consoles isn't going to be that difficult... reasonably budget kit will be powerful enough to iron out the low frame rates and tearing of over-ambitious console versions within a year of the consoles' launch.

@redneon, is the budget excluding or including VAT? (Since the link was to the business site.) If it was excluding, it might be worth mentioning that Chillblast still have a system for less than the cost of the parts: no monitor, but a 3570K, GTX 670 and 240GB SSD, including the OS, for £1k inc VAT.

Vizzini is in cloud cuckoo land again. All the consoles will be doing for the first few years is playing catch-up to the current level of PCs. After that, the developers and publishers will be going out of business because they can't afford the development costs of the games they'll be making to raise an already-too-high ceiling.

If I were to go with an AMD CPU, would I be best off getting an ATI GPU too? I just wondered, with them being from the same company, whether there are any benefits. I'd rather have an Nvidia GPU though, if I'm honest, as the Linux drivers are much better than the ATI ones.

redneon wrote:
If I were to go with an AMD CPU, would I be best off getting an ATI GPU too? I just wondered, with them being from the same company, whether there are any benefits. I'd rather have an Nvidia GPU though, if I'm honest, as the Linux drivers are much better than the ATI ones.

I build PCs for a living, supplying a number of businesses and schools. I have based this on AMD stuff as that is what I know; they work very well and tend to offer a few more features at a given price. The computing industry is going multi-core, and as software catches up, AMD's stuff just gets better.

I've owned a heap of AMD GPUs and CPUs over the last 15 years, and while the CPUs have all been great, I've never once had an AMD GPU that didn't have some annoying little problem, whether it's a dodgy driver installation or not changing resolution properly or fast enough; silly stuff like that. Still easily the best bang for buck, though, and definitely great cards. Just be aware that at least one stupid OCD-twitching annoyance is par for the course with them.

@Sandbox

Watching the video you posted and looking at the comparative technical specs, the AMD CPU does compare more favourably on features and memory bandwidth to Intel's LGA2011 systems, and is still cheaper than the lesser Intel Ivy Bridge parts.

The only reservations I'd have about recommending AMD/ATI stuff are reliability and driver support. In my experience, Windows tends to degrade in performance more quickly on AMD CPUs; for some inexplicable reason, hotfixes and service packs don't tend to cripple ageing Intel systems in the same way.

redneon wrote:
@Sandbox Now, that's interesting. I have to admit, I hadn't even considered getting an AMD CPU but maybe I'll look into the FX8350...

I've just watched this. You have to take their results for Arma II and Far Cry 3 with a lorry load of salt.

Their results show that at 1080p you will get double the frame rate with the 8350 versus the 3570K. The idea that even a stock 3570K is struggling with the processing demands of Far Cry 3 or Arma II is not credible. It gets weirder than that, though: at 1440p the GPU load has increased significantly, so the CPU should matter less, but according to these results switching from the 3570K to the 8350 will get you roughly triple the fps.

In fact, it's worth highlighting just how ridiculous the Arma II figures are:

Stock
AMD 1080p: 51.92
Intel 1080p: 25.56

Overclocked
AMD 1080p: 59.96
Intel 1080p: 29.87

Stock
AMD 1440p: 37.8
Intel 1440p: 13.12

Overclocked
AMD 1440p: 54.48
Intel 1440p: 15.92

According to these results, the FX-8350 at stock gets 51.92fps at 1080p, yet overclocked to 5GHz it gets 54.48fps at 1440p. By these results, an overclocked FX-8350 is making more difference to the frame rate than a GTX 670 GPU.
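Plugging the quoted figures into a quick script makes the oddity plain: the claimed AMD advantage actually grows as the resolution (and therefore the GPU load) goes up, which is the opposite of what a CPU-bound difference should do. A minimal sketch using the numbers listed above:

```python
# Sanity-check the quoted Arma II figures: ratio of the FX-8350 result to
# the 3570K result at each setting, using the numbers listed above.

results = {
    # (resolution, clocks): (AMD fps, Intel fps)
    ("1080p", "stock"):       (51.92, 25.56),
    ("1080p", "overclocked"): (59.96, 29.87),
    ("1440p", "stock"):       (37.80, 13.12),
    ("1440p", "overclocked"): (54.48, 15.92),
}

for (res, clocks), (amd, intel) in results.items():
    print(f"{res} {clocks}: AMD/Intel = {amd / intel:.2f}x")
```

That prints roughly 2.0x at both 1080p settings but 2.9x and 3.4x at 1440p; a genuine CPU bottleneck should shrink, not grow, as the GPU takes on more of the work.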

I would love it to be true that a couple of Windows fixes for AMD FX CPU scheduling have made GPUs redundant for gaming, and I look forward to multi-CPU AMD motherboards in the future.

I suspect the fact that their results disagree with both logic and the benchmarks we've seen from every other testing site means they've screwed up somewhere in their testing.

Getting back to the games that were affected by memory performance, only one title exhibited differences significant enough to be noticeable during real-world play. Even then, the average frame rates were so high that your eyes (and displays) would need to be about twice as fast as ours to realize the real-world benefits of faster RAM.

The game in question, F1 2012, consistently averages more than 100 FPS, yet also scales well with memory improvements. Really, that's only important to sustain if you're using AMD's HD3D and Eyefinity technologies at the same time, encouraging frame rates two times the 60 Hz refresh rate of most monitors. If you don't have a trio of stereo-enabled screens, large performance bumps above and beyond already-high frame rates are really only good for bragging rights.

It sounds like you don't understand how each subsystem in a computer works, and how a bottleneck in one impacts the others.

Memory module speed and memory channel bandwidth are not the same thing. Faster modules would make virtually zero difference if they are already bottlenecked by channel bandwidth.
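To make that distinction concrete, here's a toy calculation with my own illustrative DDR3 numbers (assuming the standard 64-bit DDR3 channel width): a faster module on a single channel can still deliver less total bandwidth than slower modules spread across two channels.

```python
# Module speed vs channel bandwidth for DDR3 (64-bit channels).
# Illustrative numbers only: a faster module on one channel can still
# deliver less total bandwidth than slower modules across two channels.

def ddr3_bandwidth_gbs(channels, mega_transfers_per_sec):
    """Peak bandwidth in GB/s: channels * 64-bit width * transfer rate."""
    return channels * 64 / 8 * mega_transfers_per_sec * 1e6 / 1e9

print(ddr3_bandwidth_gbs(1, 2133))  # single-channel DDR3-2133 -> 17.1 GB/s
print(ddr3_bandwidth_gbs(2, 1333))  # dual-channel DDR3-1333   -> 21.3 GB/s
```

So swapping in faster modules buys you nothing once the channel configuration itself is the bottleneck.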

The test video that sandbox linked, and that FutileResitor called into question, was of frame rates below 60fps. Memory bandwidth can be an important bottleneck in modern computer games if the engine is able to scale with CPU, memory size, and memory/HDD bandwidth, in addition to using the GPU. Adding an irrelevant link to something different doesn't change the point.