Look closely at some of those slides. See that "number of instructions" chart? It goes down to 1971 or so (presumably the 4004).

Question: Does this chart include BiiN (billions invested in nothing?), or how about the 432? You didn't think the latest 860 was the only Intel chip by that name, did you (they had a failed RISC architecture by that name as well)? How about the not-so-recent and hardly-mourned Itanic? They also seem to leave out the latest huge improvement, AMD64, completely.
Reply

Well, this is the chip that does real harm to AMD. It can be really cheap to produce and offers enough speed for almost everyone. So it will compete directly with the new AMD Athlons.
An interesting chip for the budget range, in any case. I just hope AMD gets their new production facilities running sooner rather than later. I really hate monopolies, even when they make excellent products...
Reply

Already said it once, will say it again: desktop cores are about maxed out with regard to actual need. The gains in speed hardly matter any more: yes, it's faster, but it's not as big a deal as it used to be. Core 2 revolutionized things.

I would like to see a 50-core, 20GB-cache CPU running on 100 watts. When I can get that in my home system, I'll be happy (and it's just around the bend).

I'm more excited about getting USB 3.0 and SATA 3.0 on motherboards. I'd also be just as excited if Intel put some of that nanotech towards DRAM and more effort towards SSDs. How about a processor that is closer to the southbridge, maybe one that integrates with an SSD? I want to see Intel revolutionize the internals as a whole, not just parts of the whole. Another example: this could be done by removing the need for RAM. Reply

x86 is a lousy instruction set. For Intel to base a GPU on it, when their competitors are using instruction sets better suited for the task, puts them at a disadvantage.

It's why Atom can't compete in a lot of markets. The x86 instruction set is hideous, and you can't really execute it well directly. The original Pentium actually did, and they say this is based on the Pentium, so they're apparently not decoupling it.

Where compatibility is king, you'd have no choice, but where it's not very important, you've got to be wondering why Intel went down this road. It's definitely going to put them at a disadvantage against ATI and NVIDIA. I don't know how the compatibility is going to offset that. I guess we'll see. Reply

I think they went down this road because it's the one they know.
They're a CPU company, so they're trying to make a GPU with that filter in place.
I agree that it's dumb, but once we moved into the realm of programmable GPUs it was inevitable that they would take this approach.
Intel has a long history of trying to steer all processes toward the CPU (MMX, SSE, SSE2, etc.). They've always been jealous of their "territory".
Their overall goal remains integrating everything into the CPU.

NVIDIA and ATI are taking the opposite approach and trying to steer functions that have traditionally been CPU-based toward the GPU, to take advantage of that parallel processing.
They're at an advantage because all they have to do is offload processes that are currently tying up CPU time to their GPUs and they can call it a win.
Since they're able to hand-pick the processes that can take the most advantage of the GPU's approach to processing, they have a much easier time of it.
Like I said, Intel *is* dumb for taking this approach, but it's right in line with their philosophy.
If they can produce a somewhat decent level of performance in Larrabee, that ends up being a proof-of-concept that the CPU can replace a GPU.

This picture is pretty interesting; it looks exactly the same as Lynnfield to me. It would mean Lynnfield in fact also has 3 memory controllers on die, but with 1 disabled (something I speculated about already, since if you compare Lynnfield and Bloomfield die shots the memory controller looks exactly the same). Plus an unused QPI link. Reply

Any info on future chipsets?
And: any chance Intel will be offering a revised 32nm Socket 775 CPU?
I'm doubting they will, but thought I'd ask anyway.
Or: even just a revised chipset for either Socket 775 or Socket 1156, one that includes an onboard USB 3.0 & SATA 3.0 (6 Gb/s SATA) controller.
And: did they specifically state whether there'd be a future multi-core 32nm Socket 1156 (non-IGP) CPU usable with current P55 motherboards?
Also: any info on a possible UEFI implementation by Apple, that might then allow using unmodified "generic" PC video cards in (future) Mac Pros?
http://www.uefi.org/about/ Reply

"Larrabee rendered that image above using raytracing, it's not running anywhere near full performance"
Does that mean Larrabee isn't running at its peak to render the scene, or that the game is running at low FPS? Reply

They might mean it is running at 1000MHz instead of 2000, and/or 24 cores instead of 48.

Raytracing requires a lot more performance than rasterization. An Intel staff member said on a blog years ago that Larrabee was not intended to be a raytracing GPU; it's just that it can perform pretty well as one. Other GPUs would get about 1-2 fps or something unless they cheat, and then they may get 5 fps. 99% of games will still use rasterization for a long while, as it is faster.
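To give a feel for why raytracing costs so much, here's a toy ray-sphere intersection test in Python (a made-up illustration, nothing to do with Larrabee's actual code). A raytracer runs something like this for every pixel, against every object, for every bounce, while a rasterizer amortizes its geometry work across whole triangles:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None on a miss.

    origin/direction/center are (x, y, z) tuples; direction must be
    unit length. Solves |origin + t*direction - center|^2 = radius^2.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 because direction is normalized
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

# A ray fired straight down the z-axis at a sphere 5 units away:
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # hits at t = 4.0
```

Multiply that square root and handful of multiplies by a couple million pixels, dozens of objects, and several bounces per frame, and the fps numbers quoted above stop looking surprising.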

I am getting one, slow or not, bugs or not ;oP. Intel tends to be pretty good at removing most hardware bugs you would ever see, though. The rumor is most of the bugs are ironed out now. Reply

You can get 8 cores and 16 threads - just get a dual processor configuration. Since the real Nehalems use QPI, you're not bottlenecked like you were with the previous generation.

But, what do you need this for? I thought I had pretty demanding needs, and I really don't know how I could possibly use all that. If you have a compute farm, or something, yes, but for home? You must be building a nuclear bomb or something. Reply

Need? Ha! Ha! How does need enter into the equation? Some people build cars as a hobby, others, like me, build computers. Hell, John Carmack builds rockets! Does John Carmack *need* a rocket?

I built multi-socket systems for my own personal use for two hardware generations after I had a legitimate use for them. I would probably still be doing so, if Intel hadn't adopted their "Tick-Tock" strategy... Reply

Sounds plausible. A hobby project involving molecular simulation was one of my self-justifications for purchasing my first dual socket system as a grad student. CUDA is probably more cost efficient for that sort of thing today... Reply

Do you really care about official support for something better than PC 1066? It works perfectly anyway.

I'd like to see them get the power use of the x58 down to a practical level. I don't expect it to reach Lynnfield levels of power use, since it doesn't have so many compromises, but, they need to move the chipset to 45nm to realize some power savings so it's not SO bad as it is now.

Jasper looks like a real killer. The six-core processors aren't going to really matter much in the market: they're going to be expensive, niche products. Jasper will probably really do well, though.

When they integrate graphics into the platform, it will kill the fairly useless Lynnfield, and probably become a VERY successful product. I think they should have held off on the Lynnfield, and just introduced this product. I think the Lynnfield just increases market confusion, but once they add the graphics core to it, I think people will really go for it. Reply

We're enthusiasts! We are precisely the "niche" that Gulftown targets. Fewer cores and greater levels of integration do not interest us, unless the integration leads to increased performance. Frankly, only six cores is a bit of a disappointment at 32nm. We would rather have had eight. When shopping for a primary system power efficiency only interests us in so far as it allows greater sustained performance, and allows the system to run cool when "idle" (i.e. doing mundane tasks like web browsing or watching videos[1]). When we put on our IT hats our choices may be different, but that doesn't reflect our passion for hardware that sends us to sites like Anandtech in the first place.

When the OP asks whether Intel will officially support PC 1066, he is indirectly asking whether the memory controller has improved to support even higher bandwidths than Nehalem. Since the processor is not out in the wild he can't directly ask that question. (It is always possible that the officially supported RAM speeds might increase without influencing stable overclocked speeds, though.)

---
Footnote [1]: Compared to the amount of processing power we can bring to bear, it is an idle task for us. Reply

For me, stuff like this is more like reading about a Porsche. It's fun to read about, but it's not something I would realistically consider.

You'd really pay $1000 for a processor? These days, with obsolescence being so rapid, it doesn't make a lot of sense for home use, and you'd find it difficult to see the difference.

It's changed a lot. For example, even in 1987, IBM released an 8086 machine (the PS/2 Model 30). The chip was created in 1978. To put that in perspective, that would be like releasing a 1 GHz Pentium III now. Then again, maybe it's not so different with the Atom.

I personally prefer ILP to TLP, and would prefer two killer cores to six slower-clocked ones. But, certainly, they both have their uses.

The Bloomfield can easily handle memory at higher clock speeds than 1066. Intel could say tomorrow that it handles 1333 MHz memory, with zero change to the processor. I don't think it's very meaningful. Reply

No, but I would pay $300-$400 for the lowest-clocked Gulftown. If I needed to, I could then overclock it beyond the stock specs of the $1000 version... though I would probably leave it at stock clocks for stability and longevity reasons most of the time. See my comments later in this thread about multi-socket systems, and how rapid architectural improvements have changed my spending habits.

There is no reason why ILP and TLP cannot be improved in parallel, pardon the pun. Clock speeds, however, have hit a brick wall. Software desperately needs to go parallel, and there is only so much parallelism in the instruction stream that can be obtained without intentionally redesigning to make your software overtly parallel.
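To make "overtly parallel" concrete, here's a minimal Python sketch (the function names are invented for the example) of the restructuring involved: carve the problem into independent chunks, fan them out to a pool, and combine the partial results. The carving is the part that software has to be redesigned for; the pool itself is boilerplate:

```python
from concurrent.futures import ThreadPoolExecutor

def sum_of_squares(lo, hi):
    # The per-chunk work: no shared state, no ordering dependencies.
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into chunks, compute each independently, combine.

    (With CPython's GIL you'd use a ProcessPoolExecutor for real
    CPU-bound work; a thread pool keeps this sketch simple.)
    """
    step = (n + workers - 1) // workers
    bounds = [(i, min(i + step, n)) for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda b: sum_of_squares(*b), bounds))

print(parallel_sum_of_squares(1000))  # same answer as the serial loop
```

The point of the sketch: the speedup comes from the problem decomposition, which no compiler will do for you. That's the redesign the instruction stream alone can't provide.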

Benchmarks tend to agree with your assertion that memory speeds aren't a pressing issue at current clocks, with current architectural designs, and with existing desktop software. In the server space there are already Xeons with official support for DDR3-1333, and I am sure most if not all desktop Nehalems can have their memory controller overclocked to run DDR3 speeds well in excess of 1333 MHz. I agree it is a non-issue. However, I also understand the OP's desire to know if there are any improvements in Gulftown beyond the number of cores. The answer is that Westmere adds seven new instructions -- six for AES and one (PCLMULQDQ) for carry-less multiplication. Reply
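For the curious: carry-less multiplication, which the new PCLMULQDQ instruction computes in hardware (it's mostly useful for CRCs and GCM-mode crypto), can be sketched in a few lines of Python. This is a toy bit-by-bit version, obviously nothing like the single-instruction hardware path:

```python
def clmul(a: int, b: int) -> int:
    """Carry-less (polynomial) multiply over GF(2): ordinary long
    multiplication, except partial products are combined with XOR
    instead of ADD, so no carries ever propagate."""
    result = 0
    while b:
        if b & 1:
            result ^= a  # XOR in the shifted copy of a
        a <<= 1
        b >>= 1
    return result

# (x + 1) * (x + 1) = x^2 + 1 over GF(2): 0b11 * 0b11 = 0b101
print(clmul(0b11, 0b11))  # 5, not 9 as with a normal multiply
```

That "no carries" property is exactly what makes it awkward to emulate with regular integer instructions, and why a dedicated instruction is a win.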

I doubt you will see a Gulftown in the $300-$400 range. The cheapest one is expected to launch at $1,000. Gulftown is mainly designed as a DP server chip that they happen to be offering as a high-end desktop part as well. Maybe the generation after Gulftown, when they shrink to 22nm, you could see that sort of pricing, or if Intel releases a 4-core, 8-thread 32nm part later on. Reply

When is your next copy-and-paste article going to be printed at Tom's? Just wondering what kind of insight you will be offering, since your comments around here continue to be "brain damaged". By the way, still waiting on your article comparing the 920 to the 860 to prove all your statements. Since all of the websites disagree with you, I am guessing it is hard to copy and paste information to support your lost cause.

You know, for a loser like you to be casting insults, is almost amusing. Almost, because you're too stupid to be witty.

I'll be working on an article on the original IBM PC when I have time. With the different technologies, market positioning, good/bad, things borrowed, etc... when I have time. Thanks for asking. If you want, I can send you an advanced, signed, copy, since you are a fan.

It does take soooooo much time. And your idiot remarks about copy and paste just show what a hypocrite you are. Show an example. In reality, I rewrote those pages so many times I was almost afraid to reread them, for fear I would spend another hour or two getting it just right. I wish I had the talent to write it once and get it right, but I didn't even attempt it. Finally, when I was relatively happy with the content and style, they edited it and dumbed it down so people like you could understand the vocabulary used. If you want me to put up the original, just let me know and I'll send you a link. You are a fan, after all, so, for you, I will.

Actually, I am very satisfied with Anand's tests showing how much faster, 3.5%, the Bloomfield is. Well, I'm not really, since he had the uncore on the Bloomfield clocked faster. I don't know why these guys don't get it right; it's like they don't understand you have to get things precisely accurate, instead of saying it doesn't matter much. That's my frustration.

If they want to send me the equipment, I'll be happy to test it for them. I swear I'll send the stuff back too. Really, I will. Well, I will for the Lynnfield. The Bloomfields, well, I do have a large cat, and occasionally some raccoons visit the house, and I can't be responsible if the raccoons somehow take the Bloomfields. Naturally, even they have the sense not to want the brain-damaged Lynnfields, so I'm not so worried about them. Reply

The hate is having to pay a premium for graphics that suck. Intel has been claiming it will take over graphics since before 3dfx departed from the scene. The idea of not only losing a core or two, but losing it to an Intel GPU, will make all but the most committed fanboys run to AMD (or possibly current i7s and NVIDIA).

On the other hand, Intel might realize this (actually, I assume most of them do; the question is what happens to the poor saps who say the emperor has no clothes). If most home Dells come with some sort of Larrabee thrown in, even if it's no more than a few extra x86 cores, this will allow software to be written for a least common denominator that can run plenty of (low-power) threads. Eventually, this will make a big difference.

Just don't expect to ever use that GPU (and get anything beyond "minimum suggested hardware" performance).
Reply

The same reason the market seems to hate it. It doesn't make a lot of sense for a lot of the market.

I don't hate it, I just don't think it's a useful product in a lot of instances.

There are sites like this that built it up a lot, and then didn't want to admit they were wrong, but, so far, it's not doing so well. Why would it?

I think the technology will really take off, though, when they can integrate the graphics. The brain-damaged design right now seems to be because it's just an interim chip. Probably even Intel knew it sucked and wouldn't have much impact, but when they get the video on it, everything will change. The inferior PCIe implementation all of a sudden stops being bad, since you have video on board. The price goes down a lot; power use, already good, goes down a lot (considering you need a discrete card now); and all of a sudden you have an inexpensive, low-power system with outrageous performance in that segment.

It's like looking at Florida in college football. They're great for a college team, but if they played in the pros, they'd be abused. As soon as Lynnfield, or really what follows it, finds its proper place as a low-cost, low-power system (Celeron), it's going to be extremely competitive, and I think almost unavoidable for a lot of the market. As it is, it's just not a useful platform for a lot of people. Anyone with brains will get a 920 and overclock it, and get better performance, better flexibility, and better upgradeability. Anyone without any will not know that the Lynnfield is faster than the Athlon, and will not want to pay as much for a 2.66 GHz chip when they can get a 3.2 GHz chip from AMD for the same price.

Yes, people who know more will understand, but they are going to get Bloomfield. So, where is the market? The i5 750, sure, it's a nice price point, and a decent processor. But, really, it's not a big segment. Once they get onboard video, I think it's going to make Core 2 instantly obsolete, and, because it's 32nm, and they are only using one chip for the chipset, they probably can sell it inexpensively.

So, yes, I think Lynnfield is basically crap. I think it's an interim chip without much use as it is, at the price it is (except for the i5 750). But I think the next iteration that uses this technology and adds video will address a large segment very well. But that's the future. For the present, I don't think Lynnfield really matters much. I think AMD outdid them with the $99 quad-core Athlon. Reply

How do you know that the market hates it? Lynnfield has only been available for 2 weeks. You sure seem to feel the need to defend your 920 purchase. Yes, you paid a couple of bucks more for the same performance; just get over it.

And I wouldn't call AMD's $99 quad-core "outdoing" anybody. What they are doing is 1) ruining themselves some more and 2) killing their price/performance for years to come. I know they don't have options, but they're right back where they were with the last of the K6's - they're the buy-em-if-you-need-it-cheap company. Reply

Most people that talk like you are viewing the world through their own limitations, of which you probably have many. I don't have to rationalize anything I do, and don't have a bias one way or another. I just like the truth, and what's real to be said.

So, don't be patronizing. You're too stupid, and you're wrong. I am interested in technology, for the sake of technology, and it irritates me when sites like this don't really tell it like it really is, and instead impart a bias on it. The worst part is, jackasses like you eat it up without question.

I disagree with you on your assessment of AMD. Really, you don't like to think much beyond the superficial, and that's your main problem. AMD was in a much better situation with the K6 line, since it used much less power, was much smaller, and could be sold much cheaper. It was in some ways superior to the Katmai, because of this, and also, the K6-III was actually faster, clock normalized, than the Katmai on 16-bit integer code.

The present AMD processors flat out suck. They are the same size, consume more power, and perform miserably. I guess one plus though is, the K6 platform, Super 7, was horrible, with the MVP3 being the most stable chipset. The ALI was dreadful, the MVP3 still bad. Now, they have a nice platform, but the processors are dreadful. From that perspective, I think the Athlon was a very, very important and excellent decision by AMD.

I'm inclined to agree with your sentiment that AMD better get their design competitive again, or the other stuff like platform and outmaneuvering Intel will not hold out forever. But, given what they have, and can realistically do in the near term, I think it's a very, very smart move.

Lynnfield confuses the market. I think they would have been better off just increasing clock speeds on the Core 2 line. Those chips are cheaper to make, have an IGP, and are a mature platform. Lynnfield would have made a lot of sense on 32nm.

You're right, two weeks isn't a lot, but when you already suspected a product wouldn't sell, and then it doesn't, it's a pretty strong confirmation. It's like running an experiment you already know the results of.

I don't think the technology is all bad, I think it's positioned wrong. Apparently, so does the market. Intel may have thought they could fool people by lobotomizing the platform, but, really, no one is fooled. When they get rid of the Core 2, and this becomes their low-end platform, it's going to be a killer in that segment. Again, I think it's just positioned wrong, for now, and will settle in when they put the video on it, and the costs come down. Reply

You know, for a long time I thought TA152H was just an Intel-hating troll, but I think I have just changed my mind. Pretty much everything he says makes sense, even down to the $99 Athlon.

Although I have to disagree that Lynnfield was completely pointless; it was a stop-gap measure, trying to slow down the Phenom revolution with a much cheaper platform price than what Bloomfield could ever reach. Performance is high enough to warrant a premium - I still can't understand how AMD can make any money selling their $99 quad-core.

He's not an Intel-hating troll; his statements just slam Lynnfield and even Gulftown. What kind of a statement is "you're building a nuclear bomb"? I find that anyone who's out of college and mentally mature doesn't post that kind of nonsense, even jokingly.

Lynnfield is a good processor for those who want anything above a Core 2 Duo - Core 2 Quad (the price is similar, anyway), assuming they are Intel fans (me included!).
BUT: I have to say it's total crap of Intel to require another processor socket. That's a big setback, I think. Reply