Future Intel CPUs may come soldered to motherboards, but what does that change?

We're still waiting on Intel's next-generation Haswell CPUs to launch, but details and rumors are already swirling about processors that are even further down the company's roadmap. The latest scuttlebutt is that Broadwell, the follow-up to Haswell due in 2014, will forgo standard socketed desktop processors. If the rumors are true—and both ZDNet and SemiAccurate report that they've received confirmation from PC OEMs—Broadwell desktop CPUs will need to be soldered directly to motherboards, and won't be easily swappable or upgradeable by users and system builders.

CPUs soldered to motherboards aren't anything new. While Intel-powered desktops normally use a land grid array (LGA) package to allow for OEM and end-user upgrades, laptops, all-in-ones, and other more highly integrated systems already often use soldered-on CPUs in a ball grid array (BGA) package. However, this would be the first time that this limitation would be imposed on standard desktop processors.

While this move wouldn't have much of an impact on the vast majority of desktop users, most of whom simply don't perform their own processor upgrades, there's been quite a bit of hand-wringing among power users who feel that their ability to upgrade and build their own systems is in jeopardy. But are the days of the desktop as we know it really numbered, or is all of this worrying much ado about nothing?

Broadwell may not be a general-purpose desktop part

Broadwell desktops may be more like the tiny "Next Unit of Computing" than traditional mid-towers.

The most plausible explanation for this move is that Broadwell may not be intended for traditional tower computers at all.

For the last few years, Intel's processor line has been upgraded on a predictable annual schedule, referred to by Intel as "tick-tock." Every two years, a major new processor architecture ("tock") is introduced—Haswell fills this role, as Sandy Bridge did back in 2011. New architectures are then followed up the next year by a "tick," in which Intel makes some smaller tweaks to the processor's performance and introduces a new power-saving manufacturing process. Generally, tocks are the parts that bring the big performance improvements, while ticks bring smaller performance increases along with reductions in power usage—reductions that are more useful for battery-powered laptops than they are for desktops.

It could be the case that Intel intends Broadwell to focus primarily on mobile computers like laptops and tablets, while continuing to sell older (but not significantly slower) Haswell parts for traditional socketed desktops. It would put technophiles in the position of not always having the latest-and-greatest powering their desktops, but its practical effect would be negligible. Broadwell's integration of the chipset into the CPU package would likely require new motherboards anyway, making the upgrade path for users who update their hardware regularly that much more expensive.

This wouldn't be the first time that Intel's mobile and desktop product lineups had been split: Intel sold Pentium M CPUs in laptops alongside Pentium 4 chips in desktops for years. A return to this path for Haswell and Broadwell is one that makes sense, since it frees up more of the company's manufacturing capacity to make higher-demand (and higher-priced) CPUs for laptops and tablets. Intel has also been known to skip architectures for certain segments of its product line when it makes sense—for example, the highest-end multisocket Xeon processors will eventually go straight from the current 32nm Westmere EX architecture to the 22nm Ivy Bridge EX, presumably because the power usage of a 32nm Sandy Bridge EX would have been too high to be palatable.

As for the motherboard-mounted Broadwell desktop parts that will exist, it could be the case that whatever Broadwell desktop CPUs make it to market will simply be targeted toward highly integrated systems like all-in-ones, or small form-factor desktops like Intel's so-called "Next Unit of Computing." These sorts of systems, which are rarely intended to be upgradeable in any event, could still benefit from the reduced heat output and lowered power consumption of Broadwell, while larger towers could still have a fast and fully upgradeable CPU in Haswell. Socketed processors may then come back for Skylake, Intel's next "tock" scheduled for release in 2015—SemiAccurate's report suggests this will be the case.

The future is murkier—whether Skylake's successor would take the same bifurcated approach as Haswell and Broadwell, and whether the LGA packaging for desktop processors would survive beyond Skylake, is unknown at this point. But Intel has tried some things in the past that could point the way forward.

Upgrading your hardware using software

Back in 2010, Intel launched a small-scale experiment with its Pentium G6951 CPU. The idea was that users could buy computers using this CPU, and then at some later date purchase a code that could unlock latent features in the processor. Users could thus "upgrade" their processor without physically replacing it.

Intel hasn't expanded this idea to its entire lineup—upgrade codes are available for only a few midrange Sandy Bridge processors—but like the Broadwell announcement, the move caused quite a bit of teeth-gnashing at the time. It wasn't cost-efficient compared to buying a faster processor in the first place, and some regarded the sale of a processor with intentionally disabled features as disingenuous, but it made a certain amount of sense for people who weren't comfortable prying open their own computer cases. Buy a cheap computer now, pay a smaller chunk of change down the line to give it an easy speed boost.
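From software's perspective, an unlock like this is visible right away: the processor simply starts reporting capabilities it previously hid, and the operating system picks them up from there. As a minimal sketch of what that reporting looks like (this is an ordinary CPUID feature query in C, not Intel's actual unlock mechanism), a program on an x86 machine can check whether a feature like Hyper-Threading is currently exposed:

/* cpu_features.c: a minimal CPUID feature query (GCC/Clang, x86).
   Build with: cc -o cpu_features cpu_features.c */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* Leaf 1 returns the CPU's basic feature flags. */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        fprintf(stderr, "CPUID leaf 1 unsupported\n");
        return 1;
    }

    /* EDX bit 28 (HTT): the package can expose multiple logical processors. */
    printf("Hyper-Threading exposed: %s\n",
           (edx & (1u << 28)) ? "yes" : "no");

    /* EBX bits 23:16: maximum addressable logical processor IDs in this
       package (only meaningful when the HTT bit is set). */
    printf("Logical processors reported: %u\n", (ebx >> 16) & 0xFF);

    return 0;
}

On a locked-down part, flags like these simply read as absent; after an unlock code is applied, the same query would presumably report the newly enabled capabilities, which is what lets the OS use them without any hardware swap.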

In a world where CPUs come soldered onto motherboards, this approach could be used to give users the ability to upgrade their PCs without actually upgrading their PCs. It would require a fairly radical rethinking of Intel's tangled product matrix—I can't imagine motherboard makers would be willing to produce two dozen different motherboards for each separate processor SKU—but it gives those motherboard makers and OEMs a way to offer various products at different performance levels and price points without actually needing to sell different hardware.

This approach could potentially cause cost-increasing problems for OEMs, though—if you're selling a dual-core processor that can be upgraded into a quad-core processor, you've got to make sure that all of your systems have cooling fans and power supplies that can handle the demands of a quad-core chip. Depending on how Intel consolidates its product lineup, it could also be costly for the chipmaker: if it sells a load of CPUs that can be upgraded via software to have more cache, more cores, Hyperthreading, or other features that end users rarely take advantage of, that could add up to a whole lot of wasted silicon (and thus, money).

Still, selling CPUs and other products with disabled features isn't exactly a new phenomenon: many chips are sold with disabled cores or less cache in order to address multiple price points without actually having to design new purpose-built hardware. The best example of this is probably AMD's triple-core CPUs, which are just quad-core parts with one of the cores turned off. Manufacturers often take this approach early in a new processor's manufacturing run so that they can still sell a CPU even if all of its cores aren't working as they should (a trick to increase yield rates and avoid wasting silicon), but in many cases enthusiasts can unlock these disabled features without issue. In our hypothetical scenario, Intel would merely be institutionalizing a practice that has long gone on behind the scenes.

Whatever happens, the modular desktop won't last forever

The desktop as we know it, complete with replaceable processors and user-upgradeable components, will probably still be around for at least the next few years, whatever happens with Broadwell. In the long term, though, desktops with replaceable CPUs will probably become a thing of the past.

One of the biggest, most pervasive trends in computing is increasing integration. It arrived early in phones and tablets, where cramped physical space and tight power budgets helped drive the integration of the CPU, GPU, memory, and other components into monolithic systems-on-chips. We've seen it move into laptops as well, where Ultrabooks and a more general push toward lighter, thinner systems have driven OEMs to shave off millimeters wherever they can. Small form-factor desktops and all-in-ones (many of which actually use mobile CPUs instead of desktop CPUs anyway) have followed suit—it is arguably only a matter of time before the trend claims the run-of-the-mill desktop, which isn't exactly a segment with high enough growth to continue meriting its own separate processors and parts indefinitely.

More than that, selling desktops with all of the parts integrated into the motherboard more accurately reflects how the vast majority of consumers use their PCs—most simply don't upgrade a computer over its useful life, and if it gets too slow, stops meeting their needs, or simply breaks, they'll either turn to an authorized repair center or replace the system outright rather than digging around inside and upgrading it themselves.

Like many of you, I've been upgrading components and building my own systems since childhood, and I get as annoyed as anyone by laptops with the RAM soldered to the motherboard or non-standard hard drive connectors and form factors. But that's the way things are moving. In general, having fewer parts means that there are fewer things that can break, and that manufacturing costs (and, hopefully, costs to the consumer) are lower. These are the things that most users and businesses care about—the death of the user-replaceable CPU would be a sad road marker on the highway of technological progress, but the amount of impact it would actually have on computing is probably less than we'd like to think.

302 Reader Comments

I only built my first PC in 2009. I was planning on building a new one in 2014-2015. I'm not sure I particularly care about the CPU being integrated into the motherboard, but I would very much like to continue building my own desktop PCs, especially for high-end gaming. The CPU is probably the part I am least likely to upgrade. I've put more RAM and new hard drives in my current machine, but I'll be damned if I have to put on that fiddly heat sink again.

In terms of upgrades, integrated CPU/mobo combos wouldn't be much of a change for me; how often are you upgrading between two CPUs with the same socket? I don't know that I've ever done so, at least not recently. I suppose you could upgrade from an i3 to an i7, but in general when I am looking to upgrade it's not from Sandy Bridge to Sandy Bridge - it's to a new generation of CPU, and that generally means a new socket. Maybe I'm in the minority there, though.

Where it becomes more of a thing to me is in the case of repair. I've burned up processors multiple times, and having to replace the entire mobo just because I need a new chip would really bug me.

Apart from vendor lock-in, is there an advantage to soldering on the CPU?

To tell the truth, I don't think I've ever dropped a new processor into an existing motherboard. Definitely not in a system I've built for myself. I tend to hang on to machines for so long that, by the time I'm ready to upgrade, the motherboard (and the RAM, and the video card, more often than not) is pretty much obsolete as well.

This is no issue for my personal building habits. Typically when I am ready to upgrade a CPU the socket has significantly changed, so a motherboard has to be purchased anyway.

My current build was put together a few years ago with an i7 950, so I don't see myself upgrading any time soon. By the time I am ready there will be very different sockets even if they didn't solder CPUs to motherboards.

So OK, this looks like it's not a big thing. However, the thing is, if the CPU is gonna be permanently attached to the motherboard, you most likely won't be able to buy a high-end MB with a mid-range CPU, or vice versa.

Gut reaction is FUCK YOU YOU CAN'T TAKE THAT AWAY FROM ME, but then I took a step back and realized that every time I've replaced a CPU since I first built my system, I've replaced the motherboard as well. That said, if RAM starts getting soldered on, or expansion slots go the way of the dodo, I'm going to pitch a fit.

So OK, this looks like it's not a big thing. However, the thing is, if the CPU is gonna be permanently attached to the motherboard, you most likely won't be able to buy a high-end MB with a mid-range CPU, or vice versa.

Exactly. This move, in a word, limits choice...intentionally. They're doing this because it helps their bottom line, not yours.

Another reason for me to avoid buying Intel. AMD has had a much more enthusiast-friendly motherboard/CPU system. Generally, with a BIOS update, a new CPU works in the same socket. Or one could upgrade their motherboard and use the same processor until they get money for a new one.

I still upgrade my CPU (e.g. from a 2nd-gen i5 to a 3rd-gen i7), and the performance gain is significant. I would really like to still have the option to swap out major components.

Twice in my PC-building experience I've had the mobo fail. I replaced the motherboard with a same-socket one, replugged all the drives, memory, and CPU of course, and it ran like a champ (the OS/drivers were finicky, but that's different).

Having to replace the entire shebang if something goes wrong saddens me.

I've built my computers since the dawn of the 2000s. During their lifetimes, the only modifications I ever made were to add a few sticks of RAM (in a curious turn of destiny, I always happened to buy them when RAM prices were at their highest levels) or a new hard drive or SSD, and eighteen months ago I replaced my video card when my venerable 8800 GTX ended up unable to show me a single frame. That was the first time I ever made such a change. Each time I thought about replacing the CPU, the motherboard socket happened to require CPUs that are a bit too old to be sold at decent prices, and replacing the whole CPU/MB/RAM/GPU made more sense. And anyway, the cheap and efficient upgrade has always been the GPU, not the CPU.

Even if it seems a bit infuriating to lose the ability to change the CPU and only the CPU, my guess is that the people who will be the most annoyed are all the journalists testing and benchmarking hardware for PC hardware sites. The same ones who put emphasis on having perfectly tool-less, serviceable cases, whereas most users will spend maybe two hours assembling their PC every other year.

On a side note about this trend toward miniaturization, I think the next Ars System Guides should seriously consider mini-ITX gaming rigs. Nowadays, they aren't much more costly than classical ATX towers, and USB has perfectly fulfilled its promise of letting users plug in just about everything that might not fit inside a computer case.

I don't care if they include it on the motherboard... the only thing that concerns me is if the board dies and needs to be replaced, and it happens to be out of the manufacturer's warranty. This could mean a huge cost to me, now having to spend probably $500+ for another board with this CPU on it because the board failed or something else on it died.

Most of you guys saying that you don't do CPU upgrades are probably running Intel machines. They change their sockets constantly, so yeah, there's never really a good upgrade without a socket change.

I've been running AMD for a while since I'm a cheapskate, and I've done multiple processor changeouts for each socket. I typically buy on the new end of a socket, and then when the prices come down for the higher end processors for said socket, I'll buy that one (usually looking at $50-70). My computers have always been slowly evolving machines. I rarely do completely new builds... then again, I'm a cheapskate.

So that being said, this may be an Intel-only move, so there's really no point in me getting too worked up.

Edit: Also, with regards to the "fewer components to break" argument, I think that's a red herring. How often does the processor socket go bad? If my board fries, now I'm paying a fortune to replace the board and the processor instead of just one component. Processors aren't cheap enough to make that a good idea. Interchangeable parts are a good thing.

For a while now I've wondered why it wasn't already like this to begin with. CPU-to-mobo compatibility cycles about every 6 months nowadays. By the time you might think about upgrading a CPU, the motherboard is most likely already obsolete compared to the current CPUs.

Gut reaction is FUCK YOU YOU CAN'T TAKE THAT AWAY FROM ME, but then I took a step back and realized that every time I've replaced a CPU since I first built my system, I've replaced the motherboard as well. That said, if RAM starts getting soldered on, or expansion slots go the way of the dodo, I'm going to pitch a fit.

If you're integrating enough to solder the CPU to the motherboard, it makes no sense to keep RAM etc. separate. Basically, this level of integration would mean the end of desktop PCs in the classic sense, which is why I don't believe it will happen, as there is still demand for that sort of thing.

This would actually worry me not for the ability to replace the CPU per se, but because this would vastly limit the choice one has when building a computer. Mixing "budget" and high-end components can lead to a very good price/performance ratio with very little compromise for the tasks you want to run on a computer. That is much harder to do with everything integrated.

On one hand, the whole idea of integration annoys me, as I like to build my system piece by piece. On the other hand, it's not like it's going to happen overnight. I can see the day when system building goes by the wayside, but I doubt it's going to be anytime soon, and I'm going to enjoy it while it lasts. Who knows, my mindset might change by then and I won't care as much.

I mourn the recession of meticulous system-building to its inevitable niche, but there is no mainstream future in it.

Tightly integrated systems are not necessarily as anemic as they once were. Competitive parts can be found on factory-integrated boards, at competitive prices.

The advantages of tight integration, including well-tuned performance, are set to overtake the advantages of building a Frankenrig. Short of hitting a benchmark for its own sake (which is fun, to be sure), it is no good ignoring the many well-built pre-integrated systems available.

You've always had to know a lot about each and every part in the system in order to have a competitive machine. All that's changed lately is, you don't necessarily have to assemble it yourself anymore.

When it comes to desktop system builders, it seems like many of the comments here are missing an important point. The benefit of the way things are now isn't in being able to upgrade the CPU, it's in being able to select the CPU and motherboard separately. If I want to build a home server, I can save money by pairing a lower-end CPU with a motherboard that has useful server features like a large number of SATA ports.

The big problem I see with this is the generation of e-waste, and the fact that with processors soldered onto motherboards, the OEM will either have to offer a large selection of mobo/processor combos or you will be limited to processor x for the life of your system.

At least with the ability to swap out processors, the OEM can have one board that can handle a range of processors. What Intel is doing is trying to front-load the cost onto motherboard manufacturers, forcing them to integrate the processor with the motherboard at the factory.

Whether or not you install a new CPU in an existing motherboard isn't the point - this will reduce choice and flexibility, and force particular combinations on system builders and buyers. On the other hand, that isn't much of an issue for most buyers anymore.

I think the idea that this line simply isn't aimed at traditional towers is right - it's aimed at the highly integrated market. Intel will likely continue to produce some LGA (or other swappable form factor) chips for some time, but perhaps gradually they will become fewer, and isolated to high-end enthusiast/performance chips only.

It still seems likely to me that there will always be a market for some kind of "separates" system that allows building custom devices, but these will probably take a very different approach to what we have now, and with most computing moving to the cloud, the market for them will probably be vanishingly small (or "specialist").

How possible would it be for someone like Asus to buy the unsoldered chips and solder them to some daughterboard that plugs into their motherboards? Not too hard, I guess, unless Intel tries to stop them?

Then again, like most of you, I've never upgraded just a CPU. It's almost always CPU/mobo and RAM, reusing old HDDs and graphics, which then get replaced next.

Gut reaction is FUCK YOU YOU CAN'T TAKE THAT AWAY FROM ME, but then I took a step back and realized that every time I've replaced a CPU since I first built my system, I've replaced the motherboard as well. That said, if RAM starts getting soldered on, or expansion slots go the way of the dodo, I'm going to pitch a fit.

10 internet dollars says when it does finally happen, you shrug and don't care.

It won't matter either way. We're heading back to the old days of completely closed systems, hardware, software, and all. It's even worse with the whole "app store" model, because now the owner of the given ecosystem you work within can not only charge you for the hardware / OS / tools, but also take a piece of your profit to list an app and again when you sell the app.

It's all thanks to Apple (however, the cell phone scam / racket and the ISP monopolies are the same, too), who steadfastly remained a closed system. Would that they had died off in the '90s, but for Steve Jobs finding the magic formula to lure the users. It was brilliant from a business POV. Apple cashed in huge, and of course, who doesn't want to emulate success? That's why Microsoft has switched to the same model.

On my current system I went from a socket AM3 midrange $70 CPU to a highly overclockable Socket AM3+ 965 BE on the same motherboard. When I upgrade my motherboard, probably to a Crosshair V FZ, then I'll keep the same CPU in it until Steamroller or Excavator comes around. There's a possibility that socket AM4 might be coming with Steamroller, and if that's the case then I may wait for the 1090FX chipset and get an AM4 motherboard that I know will last for several years of CPU cycles.

At this point, if you have a motherboard with SATA3, DDR3, and PCIe 3.0, then there's going to be very little reason to upgrade that motherboard in the next 4-5 years. Possibly SATAe, but that's going to be pretty niche I think, and not worth a $200 mobo upgrade. There could be many reasons to upgrade your CPU in that time though, and I can't foresee myself giving up that option.

Also, imagine the costs when you're a big hardware integrator like Dell, with a lot of hardware support / warranty contracts. Either they will have to discard all their faulty motherboard + CPU combos, or send them for repair to specialized centres (it will no longer be as easy to recover a CPU from a broken motherboard if you need to desolder a thousand pins and solder them again, so a local technician will not be able to perform the repair).

I don't see why this is a big deal. Whenever it's time for me to upgrade the CPU I always get a newer motherboard. It's rare that I will keep a motherboard and simply upgrade the CPU. But then again I'm not one of those who feels the need to constantly upgrade their CPU just because a new one comes out.

All these posts saying it is not a big deal that the CPU is soldered to the MB are pretty much right; not many replace the CPU on the same board. But the trend is still unsettling. Who knows what gets soldered on next? RAM? Only on-board video? The CPU itself is not a big issue, but if integration moves further, it could be.

Plus, having to replace both CPU and board when only one has failed seems like a waste of both money and materials.

Also, imagine the costs when you're a big hardware integrator like Dell, with a lot of hardware support / warranty contracts. Either they will have to discard all their faulty motherboard + CPU combos, or send them for repair to specialized centres (it will no longer be as easy to recover a CPU from a broken motherboard if you need to desolder a thousand pins and solder them again, so a local technician will not be able to perform the repair).

How often has that happened to you? I've had that happen to me only twice in 20 years. One of those times it was 100% my own fault for doing something stupid.

All these posts saying it is not a big deal that the CPU is soldered to the MB are pretty much right; not many replace the CPU on the same board. But the trend is still unsettling. Who knows what gets soldered on next? RAM? Only on-board video? The CPU itself is not a big issue, but if integration moves further, it could be.

Plus, having to replace both CPU and board when only one has failed seems like a waste of both money and materials.

There's a legit reason for a non-replaceable CPU. Please give us a legit reason to implement non-replaceable RAM, video cards, and so on.

Gut reaction is FUCK YOU YOU CAN'T TAKE THAT AWAY FROM ME, but then I took a step back and realized that every time I've replaced a CPU since I first built my system, I've replaced the motherboard as well. That said, if RAM starts getting soldered on, or expansion slots go the way of the dodo, I'm going to pitch a fit.

10 internet dollars says when it does finally happen, you shrug and don't care.

It won't matter either way. We're heading back to the old days of completely closed systems, hardware, software, and all. It's even worse with the whole "app store" model, because now the owner of the given ecosystem you work within can not only charge you for the hardware / OS / tools, but also take a piece of your profit to list an app and again when you sell the app.

It's all thanks to Apple (however, the cell phone scam / racket and the ISP monopolies are the same, too), who steadfastly remained a closed system. Would that they had died off in the '90s, but for Steve Jobs finding the magic formula to lure the users. It was brilliant from a business POV. Apple cashed in huge, and of course, who doesn't want to emulate success? That's why Microsoft has switched to the same model.

Everyone loses in the end.

Apps are a bad example since you aren't restricted to apps found in the App store.

I don't see this as a big deal these days. While I mostly don't build anymore, the last time I replaced a CPU over a machine's useful life was never. In my last several builds I always bought the CPU and the motherboard together anyway. I don't see that changing in the imaginable future.

If anything, I'll probably stop building or go for some sort of barebones system where the CPU and mobo (at least) come together.