Agreed btw, I'm willing to bet that even the most ardent of Intel fans would think thrice (if not more) before spending ~$700 on Iris Pro, even if it's unlocked, because you're better off spending a grand on an 8-core monstrosity (-:

Depends on your workloads. The L4 cache would boost IPC, and typically you can get the quad-core chips to clock higher than the 6+ core chips. So for a less parallel task (gaming, etc.), a Haswell with Iris Pro would make more sense than the 8-core Haswell-E.

Of course I wouldn't object to Intel releasing Haswell-E with some L4 cache, which would make my above point moot. :)

Those chips usually have significantly more L3, which has much higher bandwidth than L4. If I had to pick between more L3 and a large L4, I'd probably pick the larger L3. As for games: I doubt they'll need that much performance even five years down the line.

IDK. If memory serves, these Iris Pros have a 128MB L4 cache. Furthermore, if you have a dedicated video card, that huge L4 cache is still available to the processor. I would rather have the large cache than extra cores that will more than likely go unused.

If the gains from a large L4 cache were "worth it", Intel would be putting it on all their CPUs as a general way to improve performance. If you're going with a dedicated GPU, I can see this only being worthwhile if you've got some specialized task in mind which benefits disproportionately from using the eDRAM as a cache.

A decade ago you could've made the same argument about L3 cache (and years before that about L2$, and before that L1$). On a 22nm process it doesn't make sense to add a large L4 to all (high-end) CPUs, but at 14nm I can easily imagine it making sense to reserve some 35mm² for a 128MB L4 on-die (or half of that for 64MB). We probably won't see it in Broadwell, but in Skylake it is possible.

I still have absolutely no interest in iGPU graphics, especially when Nvidia is demonstrating better performance AND lower power consumption on 28nm with GM107 in mobile form. I'd personally just rather NOT pay the iGPU tax associated with buying a high end processor, and use that money towards any video card I want to get.

It's a pretty safe bet that a Broadwell with Iris Pro will have lower power consumption than a Broadwell + any non-integrated GPU... not to mention all the extra materials, resources, and physical space the non-integrated GPU takes.

It's not a safe bet. 20nm GM107 Maxwell should be about 25-30% more efficient than its current iteration, and the 840M + CPU combo is already way more efficient than a CPU with Iris Pro running full tilt.

While the 840M is certainly more efficient, I wouldn't take those numbers at face value. The 840M is paired with a 15W chip (i5-4200U, a dual-core i5) while the Iris Pro is running on a 47W chip (i7-4750HQ, a quad-core i7). That does not explain the whole power gap, but I suspect the power draw of a theoretical Iris Pro + i5-4200U would be sub-25W. Also, while Skyrim is typically a CPU-intensive game, I suppose the low FPS numbers indicate Ultra settings, making it mostly GPU-bottlenecked. Perhaps that is why Nvidia chose to pair the 840M with a low-power CPU.

Look closely: that Nvidia image you linked to compares a 47W CPU/iGPU with Iris Pro to a 15W CPU/iGPU with an 840M... which is irrelevant, since of course a 47W CPU system will draw more than a 15W ultrabook-class CPU system, even when the latter has a non-integrated GPU. The one that takes more power in that comparison also has far more CPU performance.

so let me revise my statement:

It's a pretty safe bet that a Broadwell with Iris Pro will have lower power consumption than a Broadwell with similar CPU performance plus any non-integrated GPU.

It's Nvidia's slideware as usual. It doesn't actually measure anything or make an adequate comparison; it just releases slides meant to draw users toward their camp. Once there, users will figure out for themselves that they were sold something bad. The only thing I'd want a dedicated GPU for, preferably from ATI, PowerVR, or Nvidia over anything Intel makes, is the eye-candy features that have always beaten Intel's robust-but-basic office GPUs, which lack support for true graphics in any contemporary D3D/OGL application. Intel keeps overpromising to improve that whenever the next fancy thing, like the ray-tracing apps "just around the corner" since 2006, comes up.

It's the iGPU that ensures AVX instructions actually bring a performance boost. The same goes for OpenCL applications, which, while nearly nonexistent at the moment, will run faster on an iGPU that shares the same memory address space as the CPU than on a discrete GPU in most consumer/prosumer use cases.
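The shared-address-space point is easy to see with a back-of-envelope model. Everything below is an illustrative assumption (the bandwidth and GFLOPS figures are made up, not measured): a discrete card pays a PCIe round trip for the data, while an iGPU sharing the CPU's memory can skip the copies entirely.

```python
# Toy model: total time = transfer time (both directions over PCIe,
# zero for a shared-memory iGPU) + kernel compute time.
def total_time_s(data_gb, compute_gflop, gpu_gflops, pcie_gbps=0.0):
    """Return end-to-end seconds; pcie_gbps=0 means zero-copy (iGPU)."""
    transfer = 2 * data_gb / pcie_gbps if pcie_gbps else 0.0
    return transfer + compute_gflop / gpu_gflops

# 1 GB of data through a light 50 GFLOP kernel:
igpu = total_time_s(1.0, 50, 800)                 # assumed 800 GFLOPS iGPU, zero-copy
dgpu = total_time_s(1.0, 50, 2000, pcie_gbps=12)  # assumed 2 TFLOPS card over ~12 GB/s PCIe

print(f"iGPU: {igpu*1e3:.1f} ms, discrete: {dgpu*1e3:.1f} ms")
```

Under these assumed numbers the slower iGPU still finishes first, because the light kernel never amortizes the copy cost; a heavy kernel would flip the result.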

Intel would need to bring some seriously big changes to their iGPUs to make that even remotely possible. AMD is a much better bet in that regard because of hUMA and their more compute-friendly GCN architecture.

You're thinking of this from a desktop-centric perspective. The high-end iGPU is primarily for mobile. It would be weird to just cut it out of the processor for desktop. Also, you could stream 4K stuff easier with it than with weaker iGPUs. Most people are still on SB/IB.

Agreed - that's going to replace my 3770K! The eDRAM will keep the cores happy during heavy BOINC number crunching and will help significantly with iGPU number crunching (Einstein@Home) by feeding that increased number of shaders. DDR3-2400 is barely enough for HD 4000 with just 16 shaders!

The real story is whether they maintain socket compatibility. They're not really known for that, but since none of these announcements seem centered on consumer chipset features, there's a chance they're not planning a new chipset for Broadwell.

Actually, they usually do keep the same socket if it is just a die shrink of the current architecture. When 32nm Sandy Bridge shrunk to 22nm Ivy Bridge it kept the same socket; then when IB changed to Haswell it was a new architecture, still on 22nm, so the socket changed. Broadwell is just a die shrink to 14nm, and it will not have DDR4, PCIe 4.0, or SATA Express, so I don't see any reason why it would not use the same LGA 1150 socket.

When Intel goes from 14nm Broadwell to the new Skylake architecture on 14nm, which will bring DDR4, PCIe 4.0, and SATA Express to mainstream desktop users, we will have to see a new socket for the new features.

Broadwell will also still have only 16 PCIe lanes on the mainstream socket, just like Haswell does. That's nowhere close to the 40 lanes Haswell-E has on the enthusiast socket. We will have to wait for Skylake's 14nm architecture changes, and even then the mainstream LGA 1151 socket will get 20 PCIe lanes, still well short of the 40 lanes the enthusiast socket gives.

I hate how Intel does that. They purposefully cut the lanes so you'll buy the higher-end enthusiast boards and chips to get the full 40 lanes.

LGA 2011 isn't enthusiast-centric; X58/X79/X99 are really hybrid boards that let a regular PC double as a workstation, similar to the Titan series GPUs. Fact of the matter is, for an enthusiast, especially a gamer, you really don't need anything beyond PCIe 3.0 x8 in a dual configuration (SLI/Crossfire); you will never bottleneck at the bus level. Now, for workstations... let's just say that four $3-5k GPU cards in some cases still isn't enough.
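The no-bottleneck claim is easy to sanity-check from the published PCIe 3.0 figures (8 GT/s per lane with 128b/130b encoding); the short sketch below just does the arithmetic:

```python
# PCIe 3.0 signals at 8 GT/s per lane; 128b/130b encoding means
# 128 of every 130 transferred bits are payload.
per_lane_gbs = 8e9 * (128 / 130) / 8 / 1e9   # ~0.985 GB/s per lane, per direction

x8  = 8  * per_lane_gbs    # each card's share in a dual-card x8/x8 split
x16 = 16 * per_lane_gbs    # a single card on the full mainstream 16 lanes

print(f"x8: {x8:.2f} GB/s, x16: {x16:.2f} GB/s")  # x8 is ~7.9 GB/s each way
```

Roughly 7.9 GB/s per card each way in an x8/x8 split, which is why dual-GPU gaming setups don't saturate the bus in practice.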

It is called synesis, or notional agreement, and is a regular feature of British English; it is perfectly grammatical in American English as well, although in America it puts a more explicit emphasis on the collective nature of the noun. Usually in American English you will see the mass noun become a compound noun with the addition of a word like "members" or "employees", etc., if you want to use notional agreement.

NO, just no, stop with the damn graphics on the CPU die already. I can understand this for OEM applications, but it does not belong in mid- to high-end consumer products. A lot of us are made to pay for a poor video solution that we never use and most likely never will. The stupidity of the industry in the last 10 years has become a contest of who can price-gouge: selling individual units at inflated costs to basically price-fix the entire market with micro-improvements, while holding on to outdated, poor standards that have become a stifling agent.

Actually, the lowest price for the consumer involves leaving the integrated graphics in place. Despite what the AnandTech comment section might lead you to believe, the vast majority of Intel's customers do not want or need discrete graphics, and your preference is the minority. Maybe not the minority of enthusiasts, but the minority nonetheless.

Producing separate versions of each core without the GPU just for you would almost certainly involve charging MORE because of the additional research, engineering, and manufacturing lines that would be required for these separate, minority-oriented products.

Also, I am very sorry that you are disappointed with the iterative development process. I happen to think we've (cumulatively) seen some pretty drastic improvement in the last decade.

Seems pretty pointless to invest in iGPUs in high-end CPUs, but I think with new technologies like HSA we might be able to maximize the usage of the whole CPU/GPU. Now the question is which standard is going to be the most popular and/or supported by Intel.

Actually, it's not *that* daft. Imagine an engineer who does mostly CPU-heavy work but still wants to see it rendered in 3D: they probably don't need enough GPU horsepower to run Skyrim on Ultra settings, but they need more than the software emulation you get with no GPU at all. Hence these high-end CPUs with mid-range iGPUs are a cost-effective way of getting that (assuming the company is being sensible and avoiding the Xeon price gouge).

An i7 Broadwell with Iris Pro, 16GB of RAM, and a fast SSD sounds like a sweet CAD/3D workstation. On top of that you could probably get some Minecraft or TF2 in on the go, with £0 spent on a graphics card.

It will be interesting to see what they come up with here. Any ideas (or rough guesses) on launch dates? I saw no reason to upgrade to the 3x/4x series by Intel, but something like this will be a lot more tempting.

Several points to consider here. The fact that Pentium, even the quad-core, does NOT have AVX or AVX2 is a severe problem for a mass-market CPU today. So is the fact that people don't seem to realize Intel has at best a couple of tick/tocks to get to generic UHD-2: 7680 pixels wide by 4320 pixels high (33.2 megapixels), which requires 16 times the number of pixels of the current 1080p HDTV. It is the generation after 4K, which is set at 3840 pixels wide by 2160 pixels high.
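The pixel counts behind those multiples check out; a quick sketch of the arithmetic:

```python
# Raw pixel counts for the resolutions mentioned above.
uhd2_8k = 7680 * 4320     # UHD-2 / "8K": 33,177,600 px (~33.2 MP)
uhd_4k  = 3840 * 2160     # UHD / "4K":    8,294,400 px
hd1080  = 1920 * 1080     # 1080p HDTV:    2,073,600 px

print(uhd2_8k / hd1080)   # 16.0 -> 8K carries 16x the pixels of 1080p
print(uhd2_8k / uhd_4k)   # 4.0  -> and 4x the pixels of 4K
```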

The Japanese public broadcaster NHK is planning to give a demonstration of "8K" resolution content over a single 6MHz-bandwidth UHF TV channel at the National Association of Broadcasters (NAB) Show coming up in Las Vegas, Nevada, April 5 to 10.

This will be the first demonstration of an 8K Super Hi-Vision over-the-air broadcast outside Japan, according to the NAB Show organizers.

Engineers from NHK are also set to present details of a long-distance, single-channel, over-the-air 8K test broadcast conducted recently in Japan.

In order to transmit the 8K signal, whose raw data requirement is 16 times greater than an HDTV signal's, it was necessary to deploy additional technologies.

This was in addition to image data compression. The broadcast uses 4096-QAM modulation and MPEG-4 AVC/H.264 video coding.

Flagship broadcasting events such as the Olympic Games or the soccer World Cup often do much to stimulate switchover. Tokyo's successful bid to host the summer Olympic Games in 2020 is seen as the event which may mark the beginning of the 8K era...

When all of this 8K stuff is ready... well, then I'll just use my system with an i7-7770K to run it... So, I'll get several years of satisfactory computing and media consumption with my shiny new i7-5770K.