The Intel "Lynx Point" 8-series chipset, which will form the foundation of platforms built around 4th Generation Core processors in the LGA1150 package, codenamed "Haswell", was detailed in a leaked company slide. A slightly older report this week focused on Haswell chips having DirectX 11.1 graphics and a reorganized display output logic, in which digital display outputs are wired to the processor package, while analog display outputs are routed to the chipset. The chipset talks to the processor's embedded graphics controller over a slightly less functional Flexible Display Interface (FDI).

The Lynx Point chipset is a platform controller hub (PCH), much like all the Intel client-platform chipsets released since the P55. A crude way to define its function would be to call it a "glorified southbridge": it handles all the connectivity of the system, while lacking the system's main PCI-Express root complex, to which graphics cards are ideally connected, as that has been relocated to the CPU package. The PCH does have a narrower 8-lane PCIe hub, but only to wire out x1 and x4 expansion slots and onboard controllers. Lynx Point connects to the processor primarily over DMI, although the slide doesn't detail the DMI bandwidth; most likely it's similar to Cougar Point's 4 GB/s. Lynx Point also lacks the supplementary 4 GB/s PCIe link from the processor that's found on the X79 chipset.

Getting into the fine print of its connectivity, we find that the PCH finally has all of its SATA connectivity on SATA revision 3.0 (6 Gb/s), whereas some of its immediate predecessors mixed SATA 6 Gb/s and SATA 3 Gb/s ports. All-6 Gb/s SATA is a particularly big revelation, because users of this platform will finally be able to set up complex RAID configurations using SATA 6 Gb/s drives, such as RAID 5 and RAID 10, which require more than two physical disks.
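As a quick illustration of the disk-count constraint, here is a sketch using the textbook RAID capacity formulas (the helper function and figures are illustrative, not any specific controller's behavior):

```python
# Minimum disk counts and usable capacity for common RAID levels,
# showing why all-6 Gb/s SATA ports matter for multi-disk arrays.

RAID_MIN_DISKS = {"RAID 0": 2, "RAID 1": 2, "RAID 5": 3, "RAID 10": 4}

def usable_capacity(level, disks, size_tb):
    """Usable capacity in TB for `disks` drives of `size_tb` each (textbook formulas)."""
    if disks < RAID_MIN_DISKS[level]:
        raise ValueError(f"{level} needs at least {RAID_MIN_DISKS[level]} disks")
    if level == "RAID 0":
        return disks * size_tb              # pure striping, no redundancy
    if level == "RAID 1":
        return size_tb                      # full mirror of one disk
    if level == "RAID 5":
        return (disks - 1) * size_tb        # one disk's worth of parity
    if level == "RAID 10":
        return disks // 2 * size_tb         # striped mirrors, half capacity

print(usable_capacity("RAID 5", 4, 2))   # 6 TB usable from 4x 2 TB drives
```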

This purism doesn't extend to USB, sadly. The PCH still has a combination of USB 3.0 SuperSpeed and USB 2.0 Hi-Speed ports in an unknown proportion. All USB 3.0 ports support USB 2.0 devices, but not all USB ports from this chipset are USB 3.0. The chipset also lacks a PCI Express 3.0 hub and retains PCI Express 2.0. This bit is significant, because makers of third-party USB 3.0, Thunderbolt, and SATA 6 Gb/s controllers are now encouraged to make PCIe 2.0 x2 controllers rather than wait to see whether chipsets in the foreseeable future get PCIe 3.0, which would let them make lower pin-count PCIe 3.0 x1 controllers.
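To see why a PCIe 2.0 x2 link is a workable stand-in for PCIe 3.0 x1, here is a rough bandwidth comparison (a sketch based on the per-lane signaling rates and line-encoding overheads of the respective PCIe generations; real-world throughput is a bit lower):

```python
# Approximate usable bandwidth per PCIe link.
# PCIe 1.x/2.0 use 8b/10b line encoding; PCIe 3.0 uses 128b/130b.

def pcie_bandwidth_gbps(gen, lanes):
    """Approximate usable bandwidth in GB/s for a PCIe link of `lanes` lanes."""
    raw_gtps = {1: 2.5, 2: 5.0, 3: 8.0}[gen]          # GT/s per lane
    encoding = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130}[gen]
    # one transfer carries one bit on the wire; divide by 8 for bytes
    return raw_gtps * encoding / 8 * lanes

print(f"PCIe 2.0 x2: {pcie_bandwidth_gbps(2, 2):.2f} GB/s")  # ~1.00 GB/s
print(f"PCIe 3.0 x1: {pcie_bandwidth_gbps(3, 1):.2f} GB/s")  # ~0.98 GB/s
```

The two links land within a couple of percent of each other, which is why a wider gen-2 link is a reasonable fallback when only PCIe 2.0 is available from the chipset.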

The rest of the connectivity is largely similar, except of course the display output. The PCH now only has to deal with analog display outputs. Gigabit Ethernet MAC, SPI, LPCIO, SMBus and HD Audio are carried forward unchanged. Haswell and Lynx Point are slated for the first half of 2013.

A whole new socket! How about that. CPU sales just aren't enough for Intel anymore; now they want it all. IMO this new-socket-every-year thing is getting old fast.


:shadedshu Then don't upgrade. Better yet, unplug yourself from the Internet and be done with it.

Seriously. Nobody's putting a gun to your head demanding you to upgrade. This whining about socket change gets old real fast. There are hundreds of reasons why a manufacturer needs to change sockets. If you can't deal with this, then design your own damn processor.

This will cut costs by eliminating the need for an additional VRM controller for the CPU, and for extra SATA/USB 3.0 controllers beyond the chipset. Integration takes one more step towards system-on-chip; the socket change is a good thing this time.

What bothers me is why they are still using 65 nm tech. If they simply used 22 nm like the CPUs, the chipset would be about 10 mm², eight times smaller than the current 80 mm², and could therefore be integrated into the CPU package at practically no cost, the CPU having a surface area of about 200 mm². Instead we have to pay $50 for this old 65 nm chipset. At best, Lynx Point looks like 45 nm in the photo. Old tech.
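The area figure in the comment above roughly follows ideal feature-size-squared scaling; as a sanity check (an idealized sketch: real chipsets shrink less, because I/O pads and analog blocks don't scale with logic):

```python
# Idealized die-area scaling with process node: area ~ (feature size)^2.
# Actual shrinks are smaller, since pad rings and analog circuits scale poorly.

def scaled_area(area_mm2, old_nm, new_nm):
    """Ideal scaled die area when porting from old_nm to new_nm."""
    return area_mm2 * (new_nm / old_nm) ** 2

shrink = scaled_area(80, 65, 22)
print(f"80 mm2 at 65 nm -> {shrink:.1f} mm2 at 22 nm")  # ~9.2 mm2, about 8.7x smaller
```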


Because 65 nm is cheap, as in nearly free for Intel, whereas 22 nm is expensive. Also, if they stuck the PCH inside the CPU, they couldn't launch 11 different chipset SKUs and charge different prices for them. Besides, what would Intel be selling to the motherboard makers? Nothing, and then they'd have to charge us more for the processors. Or maybe you just want 11 different SKUs of every single CPU model with a feature disabled here and there? I don't; it's bad enough as it is. So OK, there might not be 11 different chipset SKUs, but currently there are at least 7, and that's bad enough...


I agree. Look at AMD clinging onto the same old sockets and having the legacy northbridge/southbridge holding them back (on AM3+/AM3/AM2+/AM2) while Intel continues to change sockets and actually improve its CPU designs. If you think about it, Intel CPUs are plenty fast anyway, so there is no need to dump your platform every single year.

I would rather have a company that constantly pushes performance limits than a company that sticks to the same socket for 5 years, only to release a product at the end of those 5 years that can barely keep up with its own previous products.


Well, true, but at least they're pretty well motivated with their chipsets, having SATA 3 and USB 3.0 on board by default and cutting costs since they don't have to buy from third parties. Then again, they're slow, but they're economical and keep the competition rolling; otherwise Intel would stick to one thing forever and move even slower if there were no company like AMD.

P.S. Improving the processor technology isn't enough; one must improve the platform overall. Everyone knows that jumping from LGA1156 to LGA1155 was a bad move, as everyone on the 1st gen got pissed. There was no point in changing the socket; the chipset alone could have been the breakthrough. BUT I AM INTEL, I RULE THE MARKET, SO YOU HAVE TO BEAR WITH ME.


People don't get pissed enough at Intel. Look at all the zombies above who like to suck Intel's corporate dick: "It's OK, we like buying and installing a new mobo every time there's a new CPU"... Seriously, they don't even realize how Intel doesn't give a rat's ass about how economical or practical its solutions are. They just reinvent everything because they don't give a shit about their end users. And those, in turn, applaud the corporation's moves. :shadedshu

It's the same with Intel's personnel: when a manager doesn't meet his quota of burn-outs, they sack him for being too soft. Again, the zombies will intone, "No one forced them to go work at Intel. If they can't do the job, they should leave." Cluelessness and rabid fanboyism.


Not economical? I held onto my Q6600 for about 4 years, and it was only just starting to bottleneck my GPU. Intel is plenty economical if you play it smart. I spent $240 on my i5-2500K, and I guarantee it will last for at least 2-3 more years. Sure, Intel releases tons of products on new platforms every year, but here's the kicker: they always offer more than previous generations did. The FX series barely beats Phenom II in anything except memory bandwidth (which still falls short of Intel's offerings) and applications that use 8 threads. It also consumes substantially more power, required most people to upgrade their motherboards anyway, and offered worse performance in applications using 6 or fewer threads. Compare that to Nehalem -> SB: SB was cheaper, substantially faster, and used much less power.

Nobody is "sucking Intel's corporate dick", and to treat Intel like some terrible corporation that has lied and stolen everything is nothing short of naive. They (like everyone) don't necessarily play fair, but if they released garbage products, they wouldn't be leading the industry. The fact is AMD hasn't done anything special since Athlon 64, and that was mostly because Intel took a chance with NetBurst (much like AMD has with BD) while AMD just carried on. If AMD would just release something to the tune of a Phenom III, where they improved the IMC, dropped to 32 nm (or lower), and managed to scale it up to 8 cores, I think they would crush Intel. But they have sunk so much money into BD that it's not a possibility.

As for this chipset, it looks pretty promising. I want to see motherboards supporting the new features. How SATA 3 and USB 3.0 aren't standard yet is beyond me.

This argument is old... if you don't like it, don't buy; it's just that simple. There is a cost for the performance increases that Intel provides, and it usually requires a new socket. Their older stuff still hangs in quite well, though, so it's usually a win-win situation. Those who upgrade every 2 years do so because they want to and/or because they can, not because their hardware is obsolete. AMD-ers still using that 8-year-old board and wondering why they're in second place? The answer's simple... there are only two companies; if there were more, you'd be even lower.


While I agree that the chipset looks promising, and I agree the Q6600 is still a very respectable solution even today (I own a Phenom X4 9550 and I still get by with it), you have to grant AMD some credit for its lineup, because smart play or not, it's very economical in every way; you can go for medium-graphics gaming even on a budget thanks to them.

Of course, if someone asks me to build a computer, I will ask my customer about it. Some don't really care about graphics performance, while some don't care about the upgrade path... there are even some who don't really care about core count or power consumption.

But most of them (not all of them) care about cost. And when someone tells you about the best gaming computer for $299, it means $299 and not $350. So yeah, I have to make the right choice. Also, I barely sell used computers, because if that were the case, I could grab a Core 2 Duo with a discrete graphics card and it would surely beat a Llano A8 in gaming.

Bulldozer's power consumption is disappointing, but according to this, things will change in the coming months; in less than 6 months they have started to improve the monster.


AMD definitely does have its strengths, and products like Llano and Trinity are very impressive. But those are different products with a different audience in mind. When it comes to the desktop, Intel or older AMD parts currently have the best offerings. You mentioned cost, and that is--for me--exactly why: all BD products at their current prices are inferior in price/performance to Intel's offerings, or even AMD's own Phenom II line.

As for the new revisions, if they don't offer any performance increase, they will just be what AMD should have launched in the first place. Piledriver is their next real chance to offer something of value.

After re-reading, I am disappointed that Intel is still mixing and matching USB and SATA ports. Since both are backwards compatible, there appears to be no logical reason to mix and match the older parts with the newer ones.


Yes, but as far as I know, most motherboards mix USB 2.0 with USB 3.0, even mobos with the greatest AMD chipsets, I believe.

With SATA II and III there's also some mixing here and there, depending on the mobo model.

I would use the USB 2.0 ports for keyboard, mouse, and older pendrives, and the 3.0 ports for hard drives and whatever else. Thought of that way, it's not so bad to have the older ports, except perhaps when they take the place of faster ones. It's a mobo manufacturer thing; they are the ones who decide, after all, how to take advantage of the chipset's capabilities.

They still manufacture AMD motherboards based on the 760 chipset, which is, in my opinion, very old. But it's also very cheap.