Question: Is it better to build only to a given specification, picking the parts that are best for the current spec, at the current time, with current technology? Or is it better to try to "future proof" the build by picking some things (like a bigger case, PSU, motherboard, etc.) that exceed what the current specification technically requires?

What I mean is: if the current need is "8 GB of RAM, one particular card, and everything fits in a case of size Q", should a person plan for just 8 GB/card/case Q, or should a person plan for "get 8 GB now, but I might want 16 GB later; the card is good now, but build to allow two of them; and case Q is suitable now, but case W might fit better later"?

Basic argument for: Future proofing is GOOD! If you buy "future proofed" parts now, you might spend a bit more, but when you upgrade you'll definitely save money, because buying a whole new item would have cost so much more! So spend more now, and you won't have to spend anything later. It just makes so much sense, why would anyone think differently?!?

Basic argument against: Yeah, right. You're going to guess what technology will look like at the time you want to upgrade? Sure, you bought that expensive motherboard that can handle 32 GB "just in case" you wanted to upgrade. Except the chipset on that motherboard is now horribly obsolete, and it doesn't handle the latest RAM speeds, so you have to buy a new one anyway if you want to upgrade. And that PSU? You specced twice what you needed, but components now use half the power they used to, so it's STILL way over-specced, and you STILL spent more than you should have. Tech doesn't reward "future proofing". Buy for what exists NOW; don't try to guess where it's going to be when you want to upgrade your system. It just makes so much sense, why would anyone think differently?!?

So, what's your position?

_________________
Using: iMac 24" dual boot (Mac for work and banking, Windows 7 for gaming), Logitech LZ, Apple keyboard.
Looking for: short (no numpad) backlit keyboard, maybe with extra keys on the left.
Also looking for: a mouse that doesn't require online DRM to configure, such as the Razer Naga that I used to be able to use, but no longer can.
Currently building new system: check it out

My background: 30 years of I.T. (as of this year; I started in 1982). Worked for companies such as Honeywell Bull, Monenco, BNR, Nortel, Veritas, as well as complete unknowns and even myself. I've seen future-proofing actually work, where 5- and even 7-year predictions happily came true. I've also seen future-proofing fail completely when assumptions were wiped out by a single technological advance. I approached my most recent PC building attempt with a "well, future proofing could work" attitude. I'm slowly being converted to the "future proofing is for idiots, build for now, not for later" camp. This attitude runs counter to my training and to my personal preferences, but it seems to be the one that makes sense in the current tech climate. I admit it's a struggle that could be helped by cogent arguments from people who have their own opinions based on their own experience. Which is the purpose of this thread: to elicit comments from people who might have actually thought about this.

This post was done as a background to my build post in this forum. I've always been one to over-analyse stuff. This... is part of that.


In general, you really can't. There are some specific things you can do for future re-use:
- get a nice case and fans
- get a nice PSU. CPU and mobo power use is trending down. GPU is down a bit. Don't see that reversing.
- get a nice optical drive.
- get a nice CPU cooler.
- get a nice monitor.

Beyond that, it depends on your apps. For my mid-range gaming use, I find I replace the GPU about every 2 years and the CPU lasts about 3-5 years. Intel seems to change their platform every 2 years now... so this is the last year for socket 1155. Next year brings socket 1150 for at least 2 years... but after that, it's a crapshoot. DDR3 will be with us for the next 2 years. DDR3-1600 is typically the go-to speed for IVB; there's a slight gain moving up to 1866 and diminishing returns after that (AMD gets more benefit from faster RAM). The sweet spot seems to move up one speed step every 2 years, so maybe buy DDR3-1866 if you plan to use it in another platform 2 years from now. <shrug>

A $200 GPU seems to last me 2 years and a $300 GPU lasts about 3 years. Go figure.
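The "go figure" has a tidy explanation: both come out to the same cost per year of use. A minimal sketch of that arithmetic, using the figures from the post:

```python
# Cost-per-year of a component, given purchase price and useful lifespan.
def cost_per_year(price, years):
    return price / years

# A $200 card lasting 2 years vs. a $300 card lasting 3 years:
print(cost_per_year(200, 2))  # 100.0
print(cost_per_year(300, 3))  # 100.0
```

Either way, GPU spending works out to roughly $100 per year; the pricier card just stretches the replacement interval.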

I vote for get what you need now, not what you might need. Unless you really do need the best of the best, paying more for the top end rarely pays off. Look at all the expensive "Extreme Edition" or equivalent CPUs from the past. Then look at how quickly the mid range from the next generation beat them for 1/3 or less the price. Even counting the cost of a new MB and RAM, you usually come out ahead in the long run.

Future-proofing is fine if you've got a rational argument for upping a specific spec above what you currently need.

~95% of the time, when people talk about "future-proofing", what they actually mean is they want to justify paying for something that's vaguely better than what they perceive as bog-standard or low-end. But maybe if having a "superior" computer helps them feel like a special snowflake or something, they get a health benefit out of it.

The way prices are today, you usually get better value for your money over the long run by upgrading a cheaper computer more often. But if you hate building computers and aren't satisfied with ready-made products, maybe it makes sense to overspend on a computer you won't have to touch for a longer period, to spare yourself the agony of researching parts and handling a screwdriver. I'd wager very few SPCR posters are in that situation...

edit: what are optical drives even for? I thought this thread was about the *future*!

Future proofing can only last from one "tick" to the next "tock", because Intel tends to change CPU sockets, which forces motherboard upgrades that normally don't pay off.

AMD, afaik, tries to keep current sockets for about 2-3 years.

So "future proofing" covers the lifespan of your CPU socket. That might be between 2 and 4 years.

I tell my people to buy what they actually need, performance-wise. If they are "gamers", I usually tell them to buy the best stuff they can afford; that might be an "ego problem", much like the question of how much horsepower your car has versus how much of it you actually use.

Future proofing usually means you spend a disproportionate amount of money on day 1 for horsepower you don't yet need, and down the line you end up with a PC that barely cuts it when you do need the power.

Had you "future-proofed" a build in 2008, you'd now have a Core 2 Quad with a Caviar Black HDD and a GTX 280. If you had instead built a Core 2 Duo system with a Caviar Blue and a Radeon 4850, then sold it some two years later and bought a 2500K, an SSD and a 560 Ti, then factoring in resale of the original system, you'd have spent the same amount of money, but you'd be living with a kickass system now.
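A rough sketch of that math. All the prices below are hypothetical round numbers for illustration, not figures from the post:

```python
# Compare two strategies over roughly four years (all prices hypothetical).
# Strategy A: "future-proofed" 2008 high-end build, kept the whole time.
futureproof_2008 = 1500   # e.g. Core 2 Quad + Caviar Black + GTX 280

# Strategy B: modest 2008 build, sold after ~2 years, then replaced.
modest_2008 = 900         # e.g. Core 2 Duo + Caviar Blue + Radeon 4850
resale_2010 = 350         # assumed resale value of the modest build
upgrade_2010 = 950        # e.g. 2500K + SSD + 560 Ti build

strategy_a = futureproof_2008
strategy_b = modest_2008 - resale_2010 + upgrade_2010
print(strategy_a, strategy_b)  # 1500 1500
```

With these assumed prices, the totals come out even, but strategy B leaves you on 2010-era hardware rather than a 2008 build that has aged four years.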

Always buy the PC you need NOW. If you need more, upgrade and/or sell the old PC and buy a new one. You'll spend less money on hardware that is up to date.

The current price in Germany for an i5 3570K is about 200 Euros, while an i7 3770 is just 60 Euros more. Most would reasonably object that the 3770 is far too powerful for most routine computing tasks, but 60 Euros is not such a large price gap if you consider its purchase as insurance against greater software demands that might exist in the not-so-distant future. If you used the build for just one more year because of the more powerful processor, you would recover the 60 Euros. But that's an "if" and a risk, and therein lies the dilemma Smiling Geek has pointed out. It makes sense to be ahead of the curve, but how far ahead?
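As a back-of-the-envelope check on that "one more year" claim (total build cost and lifespan below are illustrative assumptions, not figures from the post):

```python
# Break-even check for a 60-euro CPU premium (i7 3770 over i5 3570K).
premium = 60           # extra cost of the i7, in euros
build_cost = 800       # assumed total build cost, euros (illustrative)
lifespan_years = 4     # assumed lifespan of the base build (illustrative)

cost_per_year = build_cost / lifespan_years    # 200 euros/year
extra_life_needed = premium / cost_per_year    # 0.3 years
print(extra_life_needed)
```

Under these assumptions the premium pays for itself if the faster CPU extends the build's useful life by only a few months; the gamble is whether it actually will.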

Apart from this "risk assessment," I wonder whether the choice of a more powerful or advanced component simply serves to gratify the ego and not much else. There are people who buy fast cars and fast computers in a misdirected attempt to satisfy psychological needs that these things can't truly satisfy.


I've never seen faster CPUs compared to large pickup trucks in quite this way before.

Excellent points raised in this thread so far. Thanks for that. Now to figure out what I really want, and if I'm just compensating for some smaller body parts, or actually trying to meet a real (or perceived) system requirement.


The current price in Germany for an i5 3570K is about 200 Euros, while an i7 3770 is just 60 Euros more. Most would reasonably object that the 3770 is far too powerful for most routine computing tasks, but 60 Euros is not such a large price gap if you consider its purchase as insurance against greater software demands that might exist in the not-so-distant future.

But how much better is the i7 3770 compared to the i5 3570K? They are of the same generation and have the same "features", such as HD decoding. If future demands make the i5 3570K obsolete, then it's likely that the i7 3770 will be as well. Save the 60€, I say!

The new system boots much faster, but that's not just hardware, the older system has 3x the number of loaded apps, which naturally slows boot time. The new system does some things much faster, like working on huge image files with multiple layers in Photoshop... but that's rare. Ditto video encoding, again rare. All routine tasks -- including everything associated with SPCR production & maintenance, web browsing, email -- feel pretty much the same on the 2 machines. Starcraft II is a bit nicer to play on the new machine, but the old one hardly limits me (not that I'm highly skilled at games). They are equally quiet -- inaudible in normal use.

XP is getting long in the tooth & I really need to update that machine with a clean Win 7 install, but it's a chore I've been putting off. My feeling is that once that's done, the two machines -- built with high end parts 6 years apart -- will be too close in performance to differentiate with just about everything I do. Esp. if the Velociraptor HDD is replaced with a modern SSD.

From my PoV, the demands of typical user apps have not really pushed hardware boundaries except in 3D gaming for at least half a decade. Extensive expansion of voice, gesture & touch interfaces and 3D apps for general use (like database interfaces) might change this, but Windows 8 (which features many of those) is handled easily now by mid-level Intel-based PCs, so if some big software change is going to come along to force everyone to upgrade, I have no idea what it could be.

Futureproofing is a weak concept, imo. There are usually sensible things you can do to improve your odds in anticipation of future changes, but "proof" against them?

I've got to agree with Mike on this one. Obviously giving yourself a little bit of a performance overhead is nice for both current and future operation, but it only makes sense when it is economical. (I guess there is a whole other debate to be had about the definition of economical.)

I'm still using an Intel Core 2 E8400 machine with an nVidia 8800GT video card [edit: these were both $100-$150 USD when I got them]. Now and then I think about upgrading, but then I start wondering why I would bother. I played WoW up until a few months ago, and occasionally play other CPU limited games like Starcraft 2. Everything runs just fine, although I often turn the settings down to ensure that I get >25fps during computationally stressful periods. Oh, and this is on Ubuntu, so I think I have a pretty hefty FPS penalty from using either OpenGL or DX translation.

As far as RAM goes, I don't find many applications which require 8GB, and 99% of the time 2GB is sufficient without swapping. Again, this is on Ubuntu, so YMMV (I have no clue what the memory usage of Win7 looks like). I think whatever is economical at the time is the target for RAM. If it's only $2 more to get 2 DIMMs instead of 4 with the same total capacity, that might be worth it for future upgradability. Unless you know you want to run a RAM disk or something, I think RAM is one of the things people waste way too much time thinking/talking about.

There is no way to predict the future demands of applications. If you take a conservative look, things will still be single-thread-limited for years to come. There are limits to parallelism in most tasks, so this view is not unreasonable. If you subscribe to this view, you might benefit in the future from faster single-thread x86 performance. But, if you subscribe to the view that we are fast approaching a transition into massively parallel applications, it may be that CPU performance should be sacrificed for a fancier video card. Who knows? Should you really buy hardware for a future you can't predict?

The only thing I can think of that really makes me want to upgrade are the new SATA & USB interfaces. And stuff like that either is available, or isn't. There was no way to future-proof that 5 years ago.

Agreed on all the points about upgrading the CPU usually not being worth it - when it bottlenecks, you need a whole new architecture, not a few more MHz. RAM is cheap - buy at least 8 GB now, leave room for 16. Gobs of RAM never go out of style. Keep the ability to upgrade or add a graphics card - that's easy and non-invasive, and can make an older CPU work for newer games for a while.

The only thing I'll add otherwise is ports - it's always nice to have every port possible to plug in a new toy. This goes double when you build (or suggest) a new computer for friends & family - nothing quite as much fun as explaining why there isn't a FW port to plug in the camcorder they got for Christmas. So at the moment, I say gigE, FW, TBolt (ideally), eSATA, and USB3. That's more about future use cases than future compute power, but comes up more often in my experience. Not like browsing Facebook has gotten more strenuous on the CPU in the last few years.

The current price in Germany for an i5 3570K is about 200 Euros, while an i7 3770 is just 60 Euros more.

If you shop at reputable shops, you will find that the 3770K is pretty much 95€ more expensive than the 3570K. That's about 120 US$. A lot of money for a CPU that's only faster if you use heavily threaded applications, and by far not everyone does. If you do, you know it and will not hesitate to spend the 95€.

If you don't know, you are most likely better off investing the 95€/120$ into a better graphics card, bigger SSD, better monitor, better keyboard or whatever else you haven't topped out yet.
