Posted
by
CmdrTaco
on Wednesday April 21, 2010 @09:18AM
from the consistency-is-stupid dept.

mr_sifter writes "According to a leaked roadmap, next year we'll be saying hello to LGA1155. The socket is 1 pin different from the current LGA1156 socket that Core i3, i5, and some i7s use. Sandy Bridge CPUs will be based on the current 32nm, second-generation High-k metal gate manufacturing process. All LGA1155 CPUs will have integrated graphics built into the core instead of a separate chip. This is an upgrade from the current IGP, PCI Express controller and memory controller in Clarkdale CPUs, which is manufactured on the older 45nm process as a separate die (but still slapped together in the same package). This should improve performance, as all the controllers will be in one die, like existing LGA1366 CPUs."

Nigel Tufnel: Look at this pin... still has the old tag on, never even used it.
Marty DiBergi: [points his finger] You've never used...?
Nigel Tufnel: Don't touch it!
Marty DiBergi: Well, I wasn't going to touch it, I was just pointing at it.
Nigel Tufnel: Well... don't point! It can't be used.
Marty DiBergi: Don't point, okay. Can I look at it?
Nigel Tufnel: No, no. That's it, you've seen enough of that one.

Basically they've run out of ideas on how to use those billions of transistors to make things faster or better.

How about a specialized CPU? Lots and lots and lots of weak single-threaded cores with their local non-shared memory, all running their own small program and connected to a very fast bus, allowing them to pass messages to each other. It would be ideal for many emerging applications, such as image recognition and AI in general.
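A rough software analogue of that design, purely illustrative: independent workers with private, non-shared state, exchanging data only through message queues (the queues standing in for the "very fast bus"). Python's standard multiprocessing module is used here as a stand-in; none of these names come from any real hardware proposal.

```python
from multiprocessing import Process, Queue

def worker(inbox, outbox):
    # Each worker keeps its own local (non-shared) partial sum and
    # communicates with the rest of the system only through messages.
    total = 0
    for item in iter(inbox.get, None):   # None is the shutdown message
        total += item
    outbox.put(total)

def scatter_sum(values, n_workers=4):
    """Scatter work to n_workers message-passing workers, gather results."""
    inbox, outbox = Queue(), Queue()
    procs = [Process(target=worker, args=(inbox, outbox))
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    for v in values:
        inbox.put(v)                     # scatter work over the "bus"
    for _ in procs:
        inbox.put(None)                  # one shutdown message per worker
    result = sum(outbox.get() for _ in procs)
    for p in procs:
        p.join()
    return result

if __name__ == "__main__":
    print(scatter_sum(range(1, 17)))     # 1+2+...+16 = 136
```

The point of the sketch is the isolation: no worker ever reads another's memory, so nothing needs cache coherence between them, which is exactly what makes this style of core cheap to replicate.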

The thing is, a general-purpose serial CPU is already as fast as it's ev

It's only useless for gamers because these are the early generations of the technology. More than likely, for the next several decades hard-core gamers will want a dedicated video card. But at some point soon you will wind up with "good enough" graphics on the main die to get by for the casual gamer on all but the most graphics-intense games.

More than likely for the next several decades hard-core gamers will want a dedicated video card.

The 3dfx Voodoo card was released less than 1.5 decades ago. Game console generations are getting longer and longer. My crystal ball says few people will be buying standalone graphics cards in 5 years.

Not if you convince the "proper graphics card" crowd to see it all as a CPU integrated into their graphics card.

I don't think it'd be very hard right now to convince an Alienware buyer to buy a computer that's essentially a graphics card with all the rest integrated around it. Except, maybe, the hard drive. And even there you could argue "it has an SSD for you to install one or two games at a time. You can buy a standard HD for the rest."

The only thing to leave outside would have to be the mouse (some elite pro killer ra

Too bad it's the same GMA crap. AMD has a better onboard chip and plans to work on getting it into the CPU, plus letting it boost an add-in ATI card as well. What will the Intel graphics do, just shut down when a better card is installed?

The i5-661 (with the fastest on-package graphics) is performance-competitive with AMD's latest integrated graphics. The slower on-package GPUs from Intel are behind, but not by much. Nothing Intel can't solve in its next processor (especially as AMD did not increase its IGP performance).

Low end systems become even cheaper to produce when the chipset on the motherboard does not need to include graphics support. Also, if your add-in video card fails, you can always run off integrated until you can replace it. You are right about a 'proper' video card being a better choice overall, but if you look at those $400 to $500 computer towers being sold all over the place, not a single one has a dedicated video card.

Now, AMD is moving forward with their Fusion project, which will add a GPU to s

But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

Don't look at the PC enthusiast/gamer market. Look at the desktop PC for basic business use. Cost is much more king there, as long as performance is acceptable. You gotta cut a lot of costs if you want to be able to slap down a whole PC for less than $200.

I wouldn't be surprised if in a couple more generations we're looking back at 'system on a chip' designs. No northbridge, southbridge, video controller, etc... Just a central chip on a board with power and interface leads.

Not sure if it's possible, but I'm guessing that if one added a graphics card, then the processing power of the graphics portion of the CPU could be used for other things. Granted, I wouldn't expect CUDA type performance, but I'd think a few new instructions that allowed programmers to specifically target unused graphics units for processing SIMD instructions would be welcome. Same thinking goes for the AMD chips. Basically an either-or choice: all-in-one chip, or increased computational power... which are
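The closest thing available today to that either-or idea is GPGPU offload with a CPU fallback. Here's an illustrative sketch of the pattern, not anything from the original comment: it assumes pyopencl as the stand-in API for reaching an otherwise idle OpenCL device (such as an integrated GPU), and drops back to the CPU's own SIMD units via numpy when no device or driver is present.

```python
import numpy as np

def vector_add(a, b):
    """Add two arrays, preferring an (otherwise idle) OpenCL device,
    falling back to the CPU when no device or driver is available."""
    try:
        import pyopencl as cl            # assumed installed; may be absent
        import pyopencl.array as cla
        ctx = cl.create_some_context(interactive=False)
        queue = cl.CommandQueue(ctx)
        # Offload the work to the device, then copy the result back.
        return (cla.to_device(queue, a) + cla.to_device(queue, b)).get()
    except Exception:
        # CPU fallback: numpy uses the host's vector units instead.
        return a + b

print(vector_add(np.arange(4), np.arange(4)))  # [0 2 4 6]
```

Either path returns the same answer; the difference is only which silicon did the arithmetic, which is exactly the transparency the comment is asking the instruction set to provide.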

I can see that integrated graphics in a CPU can be handy for some applications, like low-power mobile stuff and such.

But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

One could make the same argument about motherboards right now. A lot of them come with onboard graphics that takes up space on the board better used for SATA ports or some such, and yet people still buy them and stick video cards on them.

I've seen a lot of systems. If it has a video card, odds are better than 50/50 it's also got onboard video.

Simple. There is a HUGE segment of the population that doesn't need any more graphics capability than what even the crappy Intel integrated graphics offer. The current offerings are much better. Here is the maximum graphics requirement for about 80% of all Windows PCs: will it play back 1080p video? And that is the maximum they require.

A lot of people never play any video game that is more graphically intensive than Plants vs Zombies. A lot of people never play any video better than what is on YouTube.

Um, no. Cache is very important, especially with 64-bit code. In fact, x86 is a terribly die-area-inefficient architecture; we'd be a lot better off with a modern RISC, opening up space for more cache.

Your point would have been valid 10 years ago, but the die area used for the CISC instruction decoder on a modern x86 processor is negligible. In fact, the x86 instruction set is more compact than a pure RISC CPU's, so you can fit more instructions into the instruction cache (ARM processors have a THUMB mode with more compact 16-bit instructions because of this).

The key is modern RISC, not RISC. x86 is horribly inefficient. I'm not talking about the instruction decoder, I'm talking about the instruction semantics. x86 was never designed for today's high-performance CPUs, and the result is that the instruction set basically allows the programmer to do anything they want, even if it goes against modern CPU design optimizations. This forces the CPU to devote a large amount of die area to workaround logic that detects the thousands of possible dirty tricks that a programmer might use which are allowed by the ISA. For example, every modern RISC requires that the programmer issue cache flush instructions when modifying executable code. This is common sense. x86 doesn't, which means there needs to be a large blob of logic checking for whether the data you just touched happens to be inside your code cache too. The fact that on x86 you can e.g. use one instruction to modify the next instruction in the pipeline is just so ridiculously horribly wrong it's not even funny. There are similar screw-ups related to e.g. the page tables. I can't even begin to imagine the pains that x86 CPU engineers have to go through.
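To make the self-modifying-code point concrete, here's a purely illustrative sketch (assumes x86-64 Linux; not from the comment above): bytes are written into a buffer as ordinary data and then executed immediately, with no explicit cache maintenance, because x86 hardware snoops those writes into the instruction stream. On ARM or PowerPC the same sketch would need an explicit flush (e.g. GCC's __builtin___clear_cache) between the write and the call.

```python
import ctypes
import mmap
import platform

def jit_return_42():
    # x86-64 machine code for: mov eax, 42; ret
    code = b"\xb8\x2a\x00\x00\x00\xc3"
    buf = mmap.mmap(-1, mmap.PAGESIZE,
                    prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
    buf.write(code)                      # write instructions as data
    # No icache flush here: x86 guarantees coherence, at the cost of
    # the snooping logic described above.
    addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
    return ctypes.CFUNCTYPE(ctypes.c_int)(addr)()

if platform.system() == "Linux" and platform.machine() == "x86_64":
    print(jit_return_42())  # 42
```

That convenience is what every JIT on x86 quietly relies on, and what the extra "large blob of logic" pays for in die area.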

You can make an x86 chip reasonably small and very slow, or very large and very fast. x86 doesn't let you have it both ways to any reasonable degree.

The neat thing about the x86 architecture is that it has forced the chip designers to be really clever. E.g. the register limitations has forced them to find ways to make level 1 cache really fast; you'll be hard pressed to find non-x86 chips with faster level 1 cache. Similarly, the system call latency is fantastic. Most importantly the (quite) strong memory ordering provided by x86 means that x86 is pretty much unmatched when it comes to inter-CPU communication. Look at the hoops e.g. PA-RISC goes through

RISC typically needs more RAM than CISC (and it seems less than 10% of the die area is devoted to x86 instruction decoding, at least in high-performance processors), so you'll trade the space for more cache for the need for more main memory.

I love how everyone jumped so quickly on the instruction decoding bandwagon. Of course instruction decoding is cheap these days, even for x86. The problem isn't decoding, it's the huge amount of dirty things that instructions can potentially do after being decoded. Things that go against modern high-performance CPU design principles.

Is this ignoring the fact that most of Intel's chips for many years have basically been RISC processors with an x86 translation unit?

This doesn't really make sense. ALL CISC processors are pretty much RISC processors with a translation front end. This has been true since the 8086 and (especially) the 68000, when RISC wasn't invented. The whole point of RISC is that it was discovered that you can live without that front end. Look up microcode.

The original 8086 was a bit RISC-like in that some instructions were in hard-coded logic and didn't go through the microcode layer. Modern x86 is less RISC-like, because all instructions need to go t

I'm using a desktop that I recently built with a Core i3-530 and the built-in graphics are quite acceptable, even at 1600x900 (the monitor was free, so I won't complain about the odd resolution). The only place they suffer is in high-performance areas like games. The IGP is designed to process Full HD video.

More people will need this than you might think. Let's look at each piece of your claim:

I think that the issue here is where you place the line on a 'proper' graphics card.

By that I mean that today even integrated video cards are easily able to keep up with GUIs, play even Blu-ray movies, etc...

I'm not sure SVG/Canvas rasterization will really bog down modern integrated graphics engines. Or if it doesn't support it, it'll fall back to the CPU, and assuming you're not doing anything too CPU-intensive at that moment, it won't matter. You don't need a 5870 to run Office or IE.

Geometry: I'll admit that most office and web applications currently use much simpler geometry than a typical Xbox 360-class 3D game.

I would be shocked if the GPU integrated into Intel's next-gen CPU doesn't blow away what's in the Xbox 360, which after all is a medium-high end card from 2005. And yet, you'll notice there seems to be little push for next-gen consoles beyond the Xbox 360 and PS3. Integrated video will eventually be good enough for most applications including games, the question is not i

Don't forget that modern operating systems are able to store the contents of windows in graphics memory, leaving more room in system RAM for other items. This also speeds up the redrawing of less recently viewed windows (by keeping them cached).

I hate to break it to you, but a huge portion of PC users DON'T play anything more intensive than Farmville on their systems - if they game at all.

Even for myself - I do play games on my PC, but only on 1 of them. I've got 5 systems (Windows desktop, Linux desktop, Linux laptop, Mac desktop, and Windows desktop at work) and ONLY my Windows desktop at home ever sees any gaming. In the other 4 I really don't care what chip is in them because Chromium, Visual Studio, Safari, etc simply don't need it.

I hate to break it to you, but a huge portion of PC users DON'T play anything more intensive than Farmville on their systems - if they game at all.

Ya know... I've seen Cafe World bring a dual-core with 4 gigs of RAM to its knees. 60% CPU usage, with half a gig of RAM in use for Firefox alone... with no other apps running, nor even any other browser windows/tabs open. Hell, it's sluggish and choppy on the quad-core 2.8 in my living room. Farmville is a little better on most days, but still. Browser-based Flash game != low-powered app. Might not be graphics intensive, admittedly... but Zynga really needs a head check when it comes to resource usage.

Not all integrated graphics are made the same. Intel integrated are utter shit.

I haven't tried nvidia integrated graphics.

But the ATI 3300 HD series of integrated chips?

It is as good as a top-of-the-line GPU from about 3 or 4 years ago. 128MB of integrated memory, 32 stream processors, decent clock speed. There are a handful of high-end games it won't do well with, but it will meet the needs of most people, including those playing TF2.

It will probably be junk like usual. If they released on board graphics on par with something like a 9800 GT it would crush NVidia and AMD/ATI as there probably isn't enough of a market above that to keep them operating.

Then there will be Federal investigations and anti-trust lawsuits... they just don't need that kind of trouble.

I have an i3 cpu. Given the pricing, I don't expect great things from the integrated graphics, but it's certainly been capable for light to medium gaming, and as an office desktop (we're standardizing on it at work), it's fantastic. If you want to run Crysis or Dragon Age, go buy a $150 gaming card. Otherwise, as an integrated graphics package, it's all I need and much better than I'm accustomed to.

Same socket, but can it run all the newer processors? That at least happened to me with a Shuttle I had that I thought about upgrading - for various reasons with the board it couldn't, even with a BIOS upgrade. And there always seemed to be some sort of shift like AGP to PCIe, PATA to SATA, DDR2 to DDR3, USB 1.0 to 2.0, or some other good reason to upgrade anyway. Expansion cards are just silly expensive compared to motherboards, I'm guessing due to volume.

Yup, it can do it all... the ONLY thing I can't do is run DDR3 (it has four DDR2 slots), but other than that I can take care of all the new stuff (exceptin' USB 3.0 and SATA 6, of course... but not much on the market can do that yet either)

I fully agree however that a 3 year old AMD motherboard with a new CPU gives you just about the same experience as brand new system as long as the motherboard OEM provides ongoing support through BIOS updates. I'm a loyal Gigabyte customer for t

...the AM2+/AM3 socket on my AMD board continues to be useful for new AMD CPUs literally years after I originally purchased it.

Intel had a long run with the Socket 775 boards, and AMD pulled this stunt back with their Socket 939 to AM2 upgrade. AM2 is a 940 pin socket.

I do agree AMD did something right with their AM2, AM2+, AM3 sockets being interchangeable for many CPUs. Just some of the more interesting features get disabled when running an AM3 cpu on an AM2 socket.

I can't understand why they would force another socket design on customers. I am using a four year old motherboard and recently replaced my AMD CPU with a current model. It was a drop in replacement. Sure I could get some benefits from a newer MB, but I can make the upgrade at a time of my choosing. I can spread the cost, get the big boost from the CPU now and get a smaller boost from a new MB in a year's time.

Board manufacturers have to spend money implementing the new socket. Retailers are stuck with old stock that no-one wants because a new socket is around the corner.

It raises prices and hurts the end user. Why are we still seeing this behavior?

Uh, perhaps because renegades like me and thee - heck, we're probably filthy hackers, and we may even have links to organised crime - who upgrade our systems are an insignificantly small market, and Intel are happy to cede it to AMD in order to squeeze more profit out of the other 98% of their customers?

Because Intel sells motherboards and chipsets too. They don't want to sell you just a new processor, they want to sell you a new processor and a motherboard.

If Intel thought they could make more money by keeping their stuff backwards compatible, they would, but I'm sure the bean counters figured the amount of sales lost to AMD would be less than the profits they could make by forcing you to buy new motherboards too, and I would tend to agree with that.

I don't like it, I don't think it's good for consumers, but it makes sense from Intel's perspective.

Intel is riding high right now and thinks that everyone will fall right in step no matter what Intel does. They're getting greedy, pure and simple and it's about to bite them in the ass.
I was going to write something completely different after this, and then I read the last line of the article.
"Oh, one last thing: one of our sources states LGA2011 will launch with quad- and six-core CPUs (with Hyper-Threading, so eight and 12 execution units) although another source has stated eight-core CPUs are also on

The design of a CPU includes the way it interfaces to the motherboard. If you make a new CPU on the same interface (bus), you don't get full performance. And you can't optimize power either. And it buys you very little to not pair the two up. Very few people upgrade their CPU, they usually buy a CPU with the motherboard and don't change it until they get a new motherboard.

And heck, few people even buy their own motherboard anyway! People who build their own systems don't realize how few people do so now. It

A large part of the performance gain in new generation processors is actually the combination of the processor and chipset. The Core i5, Core i7, etc. processors did away with a separate memory controller -- that itself has been a huge power and speed advantage. Without upgrading the stuff supporting the chip, you don't get much benefit from an upgrade.

16 PCIe lanes is too low when the chipset lacks USB 3.0, and other things like SATA 3.0 and other new buses force motherboard makers to use switches and other stuff to fit in video + SATA 3.0 + USB 3.0, or to cut the video card down to x8.

My gripe with Intel is more about the price of their MBs, especially compared to AMD's. The cheapest AMD MB with an AMD IGP is listed at 54 euros at my favorite retailer (an Asus AM2+, not AM3, but performance is broadly the same), while Intel's cheapest MB is 84 euros (Gigabyte). Their low-end CPUs are also kinda expensive. And their IGPs also still kinda suck, even for playing video, and definitely for even light gaming.

The interesting thing these days is smaller size. Mini-ITX mainboards are becoming common, there are cheapish ones with AM2/AM3 or 1156 sockets, good cases (Silverstone...), huge HDs. Unless you really need a graphics card, you can build a very small and quiet PC.

There's always AMD's Fusion on the horizon. If they can execute well on that they have a chance to do what they did with the Athlon. Intel has yet to demonstrate that they actually have GPU tech that can compete with nVidia and ATI in this space. I really hope they do, Intel has had too long at the top of the market and they're getting all monopolistic again.

Someone help me here. Although I understand the basic need for new sockets sometimes, I dont understand the drive to implement them so often and frequently. If there truly is only a single pin difference, wouldn't it make sense to at least attempt to design the chip to meet existing sockets? It also seems like it would speed adoption of a new processor if it is socket-compatible with existing motherboards. This is the piece that confuses me. It seems like any time a new socket is required, it's bad for busi

I dont understand the drive to implement them so often and frequently.

The cynic in me says "Yeah, it's a money grab; new CPU = new socket, and the user shells out for new hardware, Intel wins big until everyone else reverse engineers the whole setup and sells clones." But then the rationalist in me knows that, socket aside, the new CPU is going to require a new chipset, especially if the package contains a GPU as well. So it's kind of moot. Plus, anything they can do to remove pins is generally a good thing. Generally. One less pin sounds like they've made something more efficient.

I really hope that AMD gets back on top and can compete with Intel on the top-level CPUs again. I am tired of the Intel fanboys crapping all over AMD for the last few years, and really the industry NEEDS AMD to get back on top and help drive the price of these Intel chips down. The price gap is so huge between AMD and Intel that it makes building a top of the line Intel machine very daunting for us working-class enthusiasts and system builders.

Maybe it's changed in the last couple of years but my understanding was that AMD was still the place to go for database type machines because of the bus speed but Intel was the way to go for number crunching app machines. At the time I did the research and tried to explain that to my manager but it didn't matter because the Core 2 Duo was kicking the pants off AMD on the personal computer so Intel was faster and that's what we were getting for all the machines.

I don't think the GP is upset at *Intel* in this regard; I think it's more a perfectly realistic consumer complaint: "I wish there were more competition in this space because that would be better for me as the consumer." AMD dropped the ball pretty badly after a very strong run with earlier Athalons. It'd be great to see them get back into the game and really help push things along again.

AMD dropped the ball pretty badly after a very strong run with earlier Athlons. It'd be great to see them get back into the game

(corrected one product name...)

That's not so simple. How much of "AMD dropping the ball" was because of the illegal, anticompetitive practices of Intel? Practices which, essentially, robbed AMD of the money needed for aggressive R&D and fab expansion.

Intel, through illegal practices, prevented AMD from benefiting fully from their lead with the K7 and early K8 Athlons. This illegally rerouted money weakened AMD's R&D and fabs, while strengthening Intel's at the same time.

Trying it is not enough. It's 2010, and AMD bought ATI almost 4 years ago (1 [arstechnica.com]), so there are no excuses. I would be glad to buy AMD+ATI integrated graphics instead of Intel, but it is a no-no until drivers for Linux reach their Windows counterparts performance-wise, and of course, I will not buy anything from AMD+ATI until then, not before. I buy products based on facts, not promises (I already made a mistake 3 years ago buying an AMD/ATI integrated graphics chip, still today without a proper driver for Linux WTF

I can agree it might be an oversight... which doesn't speak very well of Intel if AMD usually manages to do it. Heck, I've seen a very late i865 ASRock motherboard with Core 2 Duo support. And the latest Intel CPUs (as well as those upcoming in 2011) basically use just PCI Express and some interface to output video...

Yes, very few people want and expect to mess around with the insides... so why does Intel, seemingly, make some effort to outright block such a possibility?

So, there's no way to do this using the current socket/motherboard? My guess is that they do this purposely (at least some of the time) so that users need new hardware for their upgrades. It generates more revenue. I work in the software resale industry and the software vendors pull this crap all the time. (e.g. no backward compatibility forces more users to upgrade so that they can all work together)

Only because Intel chooses to obsolete old chipsets (or, more precisely, arbitrarily changes bus specs on new motherboards - I've seen an ASRock one for C2D with i865). AMD somehow manages to keep the latest versions of their CPU interconnect backwards compatible... you really want to say Intel isn't capable of doing so? (especially if Intel simply uses PCI Express for those chips, which is explicitly backwards compatible)

Don't concede. They wouldn't need a new CPU if there was intelligence in the design. CPUs should not be talking directly to anything but memory. All other communication (other processors, PCIe, South Bridge, etc) should be done via a point to point protocol. So then the only thing tying a specific CPU to a specific mobo or socket would be the memory technology. The graphics communication could talk to a PCIe device for the display driving (for the actual conversion of signal to DVI). So a legacy mobo

You would need a new motherboard regardless if they changed the socket or not.

Ummm, why? You can upgrade the CPU on an AM2/AM2+ motherboard with at most a flash of the BIOS. And the AM2/+ CPUs are typically backwards compatible (an AM2+ will run on an AM2, but with reduced functionality). So that AM2 board you purchased 4 years ago is still compatible with the latest processors (but not with DDR3). Given AMD's track record with sockets, I'd be surprised if the AM3 gets "phased out" within the next 5 ye

Also, AMD has HyperTransport in all CPUs, unlike Intel, which only has theirs in high-end CPUs.

So Intel's low-end CPUs are stuck with few PCIe lanes, to the point where USB 3.0 can get in the way of x16 video cards, making some boards use PCIe switches. And it's forcing Apple to use a Core 2 in their 13" laptop just to get good video without needing to add a full video chip + chipset.

Intel also uses this to lock out Nvidia. They should put their new bus in the i3/i5/i7 (low end), and not crap GMA video + 16 PCIe lanes.

There are different things to consider. On the AMD side of things, which everyone is using for comparison, you can often drop a new CPU into pretty much any AM2+ or AM3 motherboard with just a firmware update. You don't need to replace the RAM or motherboard, and you get the benefits of the new CPU. Going to a new MEMORY type would require a new motherboard, but with all of the new AMD processors, they support BOTH DDR2 and DDR3 memory.

With my current AM3 socket, I can upgrade to a 6-core AMD chip with just a BIOS update. Why can't Intel do that?

They could. But since only a tiny fraction of people ever upgrade CPUs, there's no reason to cripple your CPUs with support for old chipsets when you can just release a new one; every current AMD CPU has to support DDR2 RAM as well as DDR3, for example, and there's some evidence that requirement is significantly affecting AMD's memory performance with DDR3.

It would be different if AMD had better CPUs than Intel, but since Intel's are the fastest right now you can either buy the fastest CPU with the appropri

You've had to do this for a while. Don't you remember having to get a new motherboard to use newer CPUs, even though they had the same socket? Yeah, I do. That was very confusing at times, and at least with a new socket, you will have a better chance of knowing what will / will not work.

But upgrading CPUs has become much more attractive lately - you can go, say, from a cheap single-core (AMD still has some single-core Semprons; plus the single-core Athlon64 AM2 was quite popular for some time) in the original, cheap machine to a now-also-cheap quad-core, getting a huge boost for very little money (you might also upgrade memory while DDR2 is still cheap).

Of course Intel simply wants you to buy more; chipsets are also quite lucrative after all (maybe pointing out it's a horrible waste would work with current

But you were likely to buy it 3 years ago, in the form of an AM2 Athlon64. Now, and still for quite some time to come, you can slap in a quad-core. Or if buying some cheap CPU now, you would still be able to upgrade to a significantly faster one later on...

You can also donate just the old CPU, btw... somebody will need it (old sticks of RAM? You just buy a reasonable amount in two sticks at first and have two free slots for later expansion... yes, you have to take note to get a motherboard with 4 RAM slots, but that won

If your sound card blows, you could replace it. If your mainboard blows, you could replace it. Why replace everything when one goes kaput? Just as discrete CD-ROM controllers (ISA) went the way of the dodo, just like IDE controller boards did the same, just like the network cards got integrated...

Damnit, I just upgraded my old Athlon 64 3500+ with a nice new Core i5 750 as well. £320 for processor, mobo and 4GB memory. Good job I was hoping for it to last for a while, cos it sure looks like Intel don't want me to just upgrade my processor when it gets to be lacking.