I just wanted to post a commentary on the state of the desktop and see what insight people have to offer on the effect the shrinking desktop and desktop-component market is having on our hobby.

People have been saying this for years, but for one reason or another the desktop and desktop OEM component market continues to shrink. With laptops taking up 60% of the market and counting, and consoles snatching the bulk of the gaming market, what niche is left for desktops? Software doesn't push hardware to the limit the way it used to, and with gaming continuing to move to consoles, most people feel that laptops do everything they need in good time.

Desktops once had the advantage of price and performance over laptops, but with inexpensive bare-bones netbooks further driving down laptop prices, desktops no longer have the same price advantage...and it's getting more difficult to say that we can throw together a decent desktop cheaper than Dell. Yes, our builds are arguably better than Dells, but no one other than us cares.

Most enthusiasts say that they don't care what the rest of the world is doing and that they will continue to build their own systems, but without the same fundamental and compelling reasons to build or buy a desktop that we had 5 or 10 years ago, who will be the next generation of desktop enthusiasts?

We are noticing a slow-down in the roll-out of new generations of products; instead we see more and more re-labelling and hair-splitting of product lines, with a noticeable lag in innovation.

The effects of a shrinking PC gaming market are clear: new big titles trickle out at a rate of two a year. Look at Blizzard's marketing tactics.

On the other side of the coin, our hobby is really about pushing the existing hardware to the limit and creating some snazzy-looking builds in the process, and truthfully, we might be happy if we were still just monkeying with Athlons. But there's no denying the boost we get from maxing out the latest game, or figuring out the foibles and tricks of a new chipset.

I hope I'm not coming across as too 'doom and gloom', but I'd like to know how other people feel about the future of 'our thing'.

I think advancements in laptop tech (e.g. ExpressCard) will allow some degree of laptop customization. Right now, as the tech stands, desktops have an important role to fill; but if laptops can be customized easily by the end user after purchase, you will see the desktop market fade.

Right now, I think there is a big enough market for desktops due to the end user customization options, and the PC industry is still making a killing off selling upgrades. I think they would be shooting themselves in the foot if laptops took too much of the market, as they have traditionally been harder to upgrade.

punk_zappa

August 24, 2009 03:53 PM

Gaming is one of the main reasons for upgrading. If PC releases become few and far between while console games are released constantly, IMHO, that would be it. I don't see upgrading to a better CPU and GPU just for surfing the net as justified.

SugarJ

August 24, 2009 03:53 PM

I think there must be much better markups in the laptop market. Look at how much you'd pay for a current mid-range laptop ($900-1000, say) versus how much the equivalent desktop components would cost (probably $500-600). There has to be a bit more in the design/engineering of the units, I'm sure, but there still has to be a bigger margin in them as opposed to selling desktop components.

Babrbarossa

August 24, 2009 05:07 PM

One thing that could happen, other than the obvious lack of games and thin driver support, is that watercooling parts get more expensive and harder to find as demand drops. This may be a good thing for the purists, though. I have a couple of friends at work who used to be overclocking/watercooling hobbyists until alternative cooling systems became an over-the-counter thing; they used to machine all of their own waterblocks, and even made flow meters.

It would be a shame, though, since so few people have access to that kind of equipment.

MpG

August 24, 2009 06:40 PM

Honestly, I'm of the mind that the last ten years have been one big bubble for desktops, simply because we were making them do things that they really shouldn't have been doing. Until recently, our laptops sucked, virtualization technologies were poor, and graphical horsepower was always struggling to cope with our available resolutions and gaming engines. Now we've got laptops and netbooks (and I believe even Intel granted that many netbook sales weren't "new" sales, but "instead-of-desktop" sales). Virtualization setups can drastically reduce the number of real desktops in businesses, while our multi-core chips have more than enough horsepower to handle an office full of machines that do nothing more challenging than WYSIWYG. And even mid-range GPUs can usually handle games on medium detail.

Honestly, we don't NEED to buy new desktops anymore. The economy has likely made us realize that a little sooner rather than later, but I think people are really cluing in to the fact that they don't need to upgrade, and that many upgrades have very dubious benefits.

I think companies are realizing this too, to a certain extent. Look at Nvidia's push for non-gaming applications. I believe OCZ was recently quoted as saying that SSDs are (or will be) their number one product. In fact, it's amazing how many memory companies have suddenly jumped on the SSD bandwagon. It's irritating in a way, but I think they're all seeing the writing on the wall. When the latest generation of DDR3 sticks exceeds JEDEC specs by 100%, you know you're selling luxuries, not needs. :haha:

Case in point: I just recently reloaded Crysis: Warhead and played through the entire game at 2560x1600 on Gamer quality, with absolutely zero problems, on just a single GTX 280. And over the next half-year, Nvidia and ATI are going to try to sell me a bigger card? Oh, they'll succeed eventually, but they sure won't catch me lining up to pay the Day 1 price-gouge. :biggrin:

And for aftermarket stuff: even watercooling is hitting a bit of a stall, because the benefits just aren't what they used to be. My i7 920 honestly doesn't care if it's at 60°C or 80°C, and my GPU is the same at 35°C or 55°C. It's fun, but I don't delude myself that I'm getting any extra performance from an extra few degrees. That, and the used market is really coming into its own these days; with a little patience and research, all you need to buy new is tubing and clamps.

My 2c.

Squeetard

August 24, 2009 07:59 PM

You are speaking as a gamer and enthusiast, Barbarossa. The workplace desktop market is just fine. And as long as it only takes a few BIOS tweaks and unlocks, the overclocking/enthusiast market will be fine too.

Squeetard

August 24, 2009 08:05 PM

We are also hitting the limit of silicon performance. Thank god for that. AMD and Intel are actually working on getting us more performance per watt and per GHz now, and researching new transistors; the lessons learned now will surely carry forward into the new technology.

gingerbee

August 24, 2009 08:28 PM

Quote:

Originally Posted by Squeetard
(Post 244464)

We are also hitting the limit of silicon performance. Thank god for that. AMD and Intel are actually working on getting us more performance per watt and per GHz now, and researching new transistors; the lessons learned now will surely carry forward into the new technology.

I know, isn't it exciting? :clap:

JD

August 24, 2009 08:37 PM

I can't really see myself using my laptop over my desktop, though I suppose it's feasible eventually, especially as PC gaming increasingly becomes console ports. At the same time, my PC still looks better than the 360, IMO. It seems consoles always forgo AA, or is it just me? It probably doesn't help that my 360 runs at 720p on a 1080p TV, but my 360 lacks HDMI and my TV doesn't do 1080p over component.

As for laptops not being upgradable: that means more profit for the companies. People either have to go to the manufacturer to have it done, or they buy a new one. There's far more profit there than in letting users buy their own parts and install them. It's kind of like how Apple runs things: they make you buy Apple-certified components, and eventually you give in and just replace the whole machine.

However, I believe the reason things haven't changed much lately is that Intel/NVIDIA didn't exactly have any competition to drive further innovation. Only this past year has AMD/ATI actually started to gain some traction again and put out competitive products.