CPU cooler manufacturer Arctic (aka Arctic Cooling) may have inadvertently leaked a very long list of 4th generation Intel Core processors built for the new LGA1150 socket. Longer than any currently posted list of Core "Haswell" processors, the leak includes model numbers of nine Core i7, seventeen Core i5, five Core i3, and two Pentium models. Among the Core i7 models are the already-known i7-4770K flagship, the i7-4770S, and a yet-unknown i7-4765T. The Core i5 list is exhaustive, and it appears that Intel wants to leave no price point unattended; the Core i5-4570K could interest enthusiasts. In comparison, the LGA1150 Core i3 list is surprisingly short, indicating Intel is serious about phasing out dual-core chips. The Pentium LGA1150 list is even shorter.

The list of LGA1150 processor models appears to have leaked through the data-sheet of one of Arctic's coolers, in the section that lists compatible processors. LGA1150 appears to have exactly the same cooler mount-hole spacing as the LGA1155 and LGA1156 sockets, so upgrading your CPU cooler shouldn't need to be on your agenda. Intel's 4th generation Core processor family is based on the brand-new "Haswell" micro-architecture, which promises higher per-core performance and significantly faster integrated graphics than the previous generation. The new chips will be built on Intel's now-mature 22 nm silicon fabrication process and will begin to roll out in the first half of 2013.

Why is it that all I can think about while looking at this is that my first-generation i7-920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of déjà vu, back to the Pentium 4 days and Intel trying to push clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster machines (see the explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing.


I know exactly what you mean. It seems like things have stagnated. There are mostly just architectural changes, each marginally better than the last.

Now there are tablets and phones with graphics superior to the Xbox 360 and PS3 (if you're unaware of this, check it out; one good example is the Unreal Engine, but there are a few others).

These devices have much more powerful graphics than laptops with IGPs. I will be doing another build this year, probably my final build, and I will keep it intact to remember the days when we could build our own computers.


It's because people don't need more speed. You don't need more speed; you said so yourself. The speed race for consumers ended with the Core 2 Duo. Computers are getting faster, but applications that actually make use of that speed are getting rarer. This is a good thing, because people no longer have to spend tons of cash on computers, and the industry can focus on better performance-per-watt. Why is this depressing? Did you really expect everyone to have a supercomputer in their home, even if that would be absurdly wasteful?

I'm a gamer. It's depressing because gaming technology isn't advancing. With all these idle cores and all this RAM, there should be more than enough hardware to run a pseudo-AI in games, yet it isn't being done. Where is the slew of simulators we saw in the 1990s that pushed the bounds of what was possible? Where are the Far Cry-style titles that attempt to render insane amounts of foliage? Where are all the physics-based games that Ageia promised? Why did gamers kill great innovative titles like Spore over DRM? Most of the innovation is coming from indie developers (e.g. Minecraft), but they don't have the resources to take game design to the next level. Case in point: look at Conan O'Brien's review of Minecraft.

We're inventing ways to make computers dumber and slower (to the point that they're virtualized on "clouds"), not smarter and faster (an idea that generated tons of optimism up until about 2008). Someone needs to modify AMD's logo to read "Gaming Devolved" and stamp it on the entire industry.

Intel is focusing on integrating more graphics processing power into its processors rather than just making the processors themselves more powerful. I can see the reasoning behind that: once they get a good architecture they can trim the fat, shrink it, slap it into the new iPad7, and make way more money than they could by selling us tech-jerks new hardware.

If we are going to get faster hardware, the ball really is in the software developers' court to make something that will bring current CPUs to their knees. Otherwise Intel will be more than happy to keep focusing on performance-per-watt over outright performance.


Back in the early 2000s, software was outpacing hardware. For a lot of what people wanted to do, a single-core Pentium 4 just wasn't enough anymore; consumers were moving into digital photo and movie editing, tasks that used to be reserved for high-end multi-processor workstations. Now the reverse seems true: hardware has outpaced software, and there isn't much consumer demand for faster hardware, because what we have (or have had for a while) already does everything pretty fast.

Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002..


True that. And if you take into account that 1366x768 is the most popular resolution, you don't even need a gaming-grade graphics card. About a year ago I built a PC for a relative with a Celeron G540, H61 board, 4GB RAM, and an HD 6670 GDDR5, and he couldn't be happier (he'd been "gaming" on a P4 3.2C + X800XT).

ALL so true......well, almost all. The pad era will either fade away or spawn a pad with the power of a laptop/desktop. Evolve or die, guys... There will always be modders and overclockers, just in new formats...