195 Comments

I don't understand the logic of selling a high-end CPU with the best IGP. It seems like anyone running one isn't going to stick with the IGP for games, and if they aren't gaming, then what good is that high-end IGP? Maybe the entire "Core i" line should use the HD 4000.

I feel like you missed that part. He's not saying that only gamers use high-end CPUs. He's saying that gamers using a high-end CPU won't care about the high-end iGPU because they won't use it. Also, non-gamers who need a high-end CPU generally won't see the benefits of the included high-end iGPU. So, he proposes that the better niche for the high-end iGPU would be on the more affordable CPUs, because then budget-minded gamers could buy an affordable CPU that has a relatively powerful iGPU integrated into it.

I think it's a long time away from approaching 560m performance. If you're going to do any remotely serious gaming on a laptop it's still best to get a dedicated graphics card.

I'm still sticking to gaming on a tower, so these CPUs (esp. the AMD Llano) make sense for me in laptops. I don't ever see myself gaming on a laptop unless I completely get rid of the towers in my house... which won't happen anytime soon (if ever).

I felt the same way when I was shopping recently. I WANTED to buy a Llano-based notebook (inexpensive, better graphics vs. Intel). The problem is there's no such thing as a slim and light Llano. Every OEM sticks you with the same configuration: six pounds and 15.6" turd-768 resolution screen. It's bizarre.

For the sake of competition, I hope Trinity will get some better design wins.

If you look at the gaming charts, the resolution may go past x768, but the settings are on LOW, and they don't give us a minimum frame rate, so the answer is: low-end, low-res is all that Llano can handle. So AMD forces the giant, heavy monster on you as a selling point.

I agree with you there. To get those "$100 mid range GPUs" on a laptop you need to bump up the cost by around $400 to get to one that simply can have one. Most laptops currently do not have discrete GPUs.

I am glad to see that integrated graphics from both Intel and AMD can now be compared with low end cards like the GT520 and GT440 without it becoming a laugh. Also that they are actually completing the tests well now. That is a rather major step. I remember some reviews of integrated graphics that resulted in a lot of either "could not complete" or "the bar is too small to fit a number on" entries.

The IGP provides the QuickSync implementation. It would be insane not to include the silicon for it on the high-end system. In addition, moving forward you can get compute work out of the GPU, so why would you ever not include it?

Because gaming isn't the only thing that uses graphics cards. For instance, more and more video editors use the graphics card for doing video decode/encode/applying effects. So having a high performance graphics engine to go along with the high performance CPU can be a really nice thing.

It's not only the extra money. Apparently the 2500K does better in a fair number of games than the 2600K (in part, I think, due to Hyper-Threading), and the graphs seem to support that (although maybe not for the reason I mentioned).

Looking at the 3700 series, though, it beats out both the 2500K and 2600K, so I think that one is going to be of special interest to gamers moving forward.

If the statement "the significant gains we're seeing here with Ivy will pale in comparison to what Haswell provides" is true, then I'm looking forward to Haswell very much. I'll finally be able to dump the discrete GPU, as I only use relatively modest display resolutions, and instead pour the money into an even quieter cooling solution. Silence, sweet silence :)

You crack me up... that was truly a funny response. Seriously though, Llano isn't that bad for a generic/cheap build. I did pick one up to build a machine for my mom. The mobo and the proc were justified by the price. I knew it wouldn't be powerful, but it's fairly energy efficient, has decent graphics, and the money I saved went toward the SSD. Most people I build/fix computers for don't come close to using them to their potential, so price becomes the biggest factor. Would I buy one for myself? No, I'll stick with the i7 I currently have, and when I build my next machine, it looks like it'll be an Intel also.

"Nice to see AMD winning where it actually matters for most consumer applications."

I don't see how you can look at these (or any) benchmarks and call it a win for AMD. Intel is smoking them. A few useless integrated graphics benchmarks and you call it a win? Hey, I hear RIM is looking for a new PR rep; they could really use a guy like you. ;)

IGP performance is nice, but no comments about the subjective quality? I have seen HD Graphics 2000 vs. Radeon IGP side by side, and the graphics quality was night and day, with the Radeon being the day... I don't know what's needed to do integrated graphics properly, but it seems Intel still lacks it...

Yes, people's standards are different. For gamers the Intel IGP might suck, but it's more than enough for me. If I bought Llano, the graphics core might just be wasted silicon because I don't really do gaming. Buy one only if you need one.

You mentioned in your intro about the Intel-Apple exclusivity agreement being up and Apple constantly pushing Intel for better GPU performance. Do you think Ivy Bridge has made sufficient gains in GPU performance to keep Apple on board? Have you had a chance to test Ivy Bridge's IGP OpenCL performance since that seems like a particular area of interest for Apple?

I think it's certain that they will. They chose a weaker CPU in favour of a stronger IGP (9400 and Core 2 Duo) before, but now we're at a point where the HD 4000 would be more than adequate for Mountain Lion and probably onwards. Plus, Intel is way ahead with 22nm and the resulting power draw as well as CPU performance, and I think Apple uses QuickSync for AirPlay, which is Intel-only.

The top-end models, which would probably be paired with a discrete card, get decent integrated graphics, while the low-end ones, which will probably run standalone, get cut-down IGPs. Odd. If anything, I think on the top end people would want models with less die space used on integrated graphics, with that headroom used for higher clocks or lower prices; even the cut-down IGPs can do QuickSync.

Also, a suggestion for the full review: we know pretty much what to expect from the HD 4000 performance-wise, but what about image quality? AMD and Nvidia improved things generation after generation, and I doubt Intel got it right with their first or second serious foray into lower-midrange graphics.

Seriously, this is too much. It's fine that you have an opinion, and I might not have a problem with it if you posted it once, but you post the same damn thing on every article, whether it's related or not, and usually multiple times. Someone please do us all a favour: ban this guy and delete his comments.

CPU perf pretty much as expected; GPU perf somewhat disappointing. I thought they'd at least aim to match Llano, but I guess it is OK for 1MP laptop screens if the mobile parts perform close enough (and there are a couple of big ifs when it comes to image quality and drivers). Any opinions yet about QuickSync encoding quality?

You are not being serious, are you? The CPU gained 10% in CPU-sensitive benchmarks and the GPU gained 40-60%. Even discounting that 10%, it's still a 30-50% gain, and even that discount isn't really warranted, since games aren't as sensitive to CPU changes as applications are.

What are you talking about? As long as AMD has a better iGPU, there is plenty of reason for them to be a viable choice today. And if their iGPU gaming performance holds up against Intel, there is more than just hope of them getting back in the game in terms of high-performance compute tomorrow.

I'm pretty sure even 16x AF has a sub-2% performance hit on even the lowest end of today's GPUs. Is it different with the HD Graphics? If not, why not just enable it like most people would? Even on something like a 4670 I max out AF without thinking twice about it; AA still hurts performance, though.

AF has a greater performance impact on low-end GPUs, typically about 10-15%. It's less on the HD Graphics 3000 only because its 16x AF really works at much lower levels. It's akin to having an option for 1280x1024 resolution but performing like 1024x768, because it looks like the latter.

If Ivy Bridge improved AF quality to be on par with AMD/Nvidia, performance loss should be similar as well.

AF requires more samples in cases of high anisotropy, so I guess the TMU load increases, which may also increase bandwidth requirements since it could force higher LOD in these cases. You'll only see a performance difference if the AF causes the scene to be TMU/bandwidth limited instead of, say, ALU limited. I'd expect this to happen more as you move up in performance, not down, since the ALU:TEX ratio tends to go up toward the higher end... but APUs can be more bandwidth sensitive, and I think Intel's IGPs never had a lot of TMUs.

Of course it's also very scene dependent. And maybe an inferior AF implementation could end up sampling more than a better one.
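The intuition about TMU load in this sub-thread can be sketched with a toy model: assume trilinear filtering fetches 8 texels per pixel, and N-x anisotropic filtering takes up to N trilinear probes along the line of anisotropy. This is a simplification for illustration only, not how any particular GPU actually counts samples:

```python
# Toy model of worst-case texel fetches per pixel under anisotropic
# filtering. Assumes 8 texels per trilinear probe and up to N probes
# for N-x AF -- an illustrative simplification, not real hardware.

def max_texel_samples(af_level: int) -> int:
    """Upper bound on texel fetches per pixel at a given AF level."""
    texels_per_trilinear_probe = 8
    return texels_per_trilinear_probe * max(1, af_level)

for level in (1, 2, 4, 8, 16):
    print(f"{level:>2}x AF: up to {max_texel_samples(level)} texel fetches")
```

The worst-case fetch count grows linearly with the AF level, which is why AF can push a scene from ALU-limited to TMU/bandwidth-limited, and why the hit shows up mostly where texture units or bandwidth are already scarce.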

Except the quality is the same as competing AMD products, if not worse because of driver issues, but you lose 20-30% performance in every scenario versus the last-gen Llano APU. The facts are in this very review.

""It's just a driver issue, AMD/Intel will fix it!""It's just the review units sent out, AMD/Intel will have a BIOS update at the official release that improves performance!""If you overclock it to hell and back, it can almost sort of maybe compete with Intel/AMD!""Oh look, there's a new update out that improves performance! Sure it's only 1% performance, applicable in only certain scenarios, but it's better than nothing!" Reply

Aside from that NOT being what I said at all... you do realize you justified the reasoning in your post, right? They're bribing Intel. That doesn't mean they did nothing wrong, it's a BRIBE. Besides, Intel is just as guilty as Microsoft of OEM threatening and hand-holding in the 90s.

They don't really care to. The point of a business is to make money, not have the best products. The latter only gets solved when AMD gets serious in competing with Intel on power/performance again.

Except... Intel's IGP drivers on Windows are bad already. They are a lot worse on the Mac. Historically Intel has never supported its IGPs to *any* great length, and even had to throw up a compatibility list for its IGPs so you know what games they could potentially run.

AMD's CPUs are going to die... sucks to be an AMD fanboy. However, whatever they are doing with their dedicated GPUs, they are doing something right. If they can manage to pull their act together on the driver side, I think AMD would live on as a GPU company.

I'm sorry, but Llano APUs will stay on top for quite a while; Intel is still at heart a CPU company, while Llano is part GPU. If AMD can get drivers of the quality of Nvidia's, they will most likely do extremely well on that front.

I really enjoyed the added compilation benchmark. This site has the most comprehensive collection of benchmarks that I've seen, it's a one-stop shop for most of my reviews. Keep up the great work!

Would be great to see power benchmarks of the IGP, especially vs. Llano and the HD 3000. Let's see if the graphics improvements have come at the price of yet more power consumption, or if Intel has managed to keep that down.

Until AMD goes out of business. Then Intel gets lazy again, and the price of even a mid range CPU creeps back up above 600 dollars. You might be too young to remember the 500 dollar price tags on the first gen P3s, when Intel had no effective competition from AMD.

It's not in the consumer's best interests for AMD to die off.

And, FYI, their GPUs are top notch and excellent, across the entire market. Downside is, they're basically carrying the company right now and that's not sustainable.

It doesn't really matter if the IGP isn't that great; most people don't buy them for their graphics power. I get the feeling that maybe Intel is just putting them out there to keep its base solid against AMD, not that it needs it, and I'm an AMD fan. I found something the other day that will possibly change how tomorrow's processors work, using light instead of electricity.

It would be cool to see a 4GHz-clocked Nehalem shuffled into the mix. I'm sure I'm not the only one rocking an i7 9xx wondering how much actual productivity gain is to be had with the new tech. I personally don't like to upgrade until the new gen's retail performance outdoes my previous overclocked performance by a solid 15%.

I understand that will be part of the new chipsets which haven't been tested here, but I'm also very interested. As a matter of fact, I have a few HTPC customers waiting for Ivy Bridge for this sole reason.

I don't find the silence about 23.976 fps playback very promising. This is the new chipset: "Keep in mind that this is a preview using early drivers and an early Z77 motherboard"... "Intel Z77 Chipset Based Motherboard".

I see three possibilities:
1. They are not going to fix it with Ivy Bridge.
2. They are not ready with the drivers.
3. They are ready and everything is fine, but they are keeping silent because they need to sell old chips.

Intel's always the best, EXCEPT WHEN THEY'RE NOT! Athlon 64. Since AMD's sticking with Bulldozer's base architecture for at least a couple of generations, they won't be competitive for a while, but that doesn't mean they'll never be competitive.

At the very least, AMD needs a less power-hungry successor to Bulldozer. The Xeon review mentions that they should be in a position to do this, and could at least clock the thing a lot higher and still use less power than Bulldozer. Regardless, that IPC deficit is a killer - the following page is so telling of the architecture's current limitations:

1. General curiosity: You stated you did not get sanction or support from Intel for this preview. I believed that sort of thing isn't allowed before the release date. How do exceptions like this work?

2. Specific: I observed most of the discrete GPU tests were at 1680x1050. Any reason for this? I guess it is because this is just a preview. Am I right? Any other reason?

1. If you want to officially review the chip, you sign an NDA and Intel provides you with it. Here he got access to it from a partner, who probably broke their agreements, but Anand never signed any agreement, so he can publish whatever he wants.

2. I would think so, and in GPU-bound scenarios I wouldn't expect much change at all.

1) Generally what happens with previews and first looks is that the company producing a product (Intel) will send out press samples to reviewers if the reviewers will sign a Non Disclosure Agreement (NDA). When the NDA expires (generally the same time for everyone), the reviewers can post their findings to the public.

This is done (I assume) to give reviewers enough time to thoroughly review a product without having (theoretically) to worry about having information leak until the company wants it to get out.

If, on the other hand, a reviewer acquires a product via other means so there is no NDA that they have to sign in order to get the product... well, they're not under NDA, so they're free to disclose whatever they want.

WHY!!!!? Does Intel HAVE to disable Hyper-Threading on the sub-$300 CPUs? It's not like having it ENABLED costs them anything more at all. It would just be providing their customers with a better product. This shit is infuriating. It's there on the chip no matter what; HT should just be on every single Intel chip, no matter what. That shit pisses me off SOOOO much.

Intel's products get cheaper with smaller dies and with competition. Without competition, their dies cost the same to make, but they rob and loot your pockets and make obscene profits off you because your hated AMD no longer exists as an alternative supplier of good chips.

Intel is strong in software everywhere except the graphics drivers department. No wonder others call Anand a pro-Intel site. I don't want to believe it, but the articles continue to claim Intel is hard at work on graphics drivers when they are clearly not. They are better than they used to be, but still far from good.

Graphics Quality on Intel IGP are not even close to what AMD offers.

Even if Haswell doubles the performance of Ivy, they will still be one generation behind AMD.

I continue to wonder why they use their own GPU on desktop/laptop and not on mobile SoCs. They could have used PowerVR on desktop as well; developing drivers for one piece of hardware would simplify things and hopefully provide a bigger incentive to increase software R&D.

>>No wonder why others call Anand a pro Intel site

What should he do, fake the benchmark results to make AMD look better than they are? Anand can only report his findings, and he does this truthfully. Some people do not want to accept reality and prefer to shoot the messenger. Direct your frustrations towards AMD, not websites which report the results of benchmarks.

From past benchmarks you can see the results at AnandTech are not that different from other websites': AMD is getting destroyed on CPU performance and the performance/watt metric.

>>I continue to wonder why they use their own GPU on Desktop / Laptop and not on Mobile SoC. They could have used PowerVR on Desktop as well,

FYI, they are dumping PowerVR in the near future as well. Already covered on many websites; google it. PowerVR was a temporary fix, or rather an attempt at a fix, which was more of a hassle and didn't work in the marketplace anyway.

They are now committed to improving their own iGPU and drivers. This will take time for sure; Intel marches to its own beat.

The simple fact is that with the much weaker Sandy Bridge iGPU they outsold AMD 15 to 1, so even though the Ivy Bridge iGPU has not surpassed AMD yet, Intel should continue to do really well.

>>i dont want to believe it, until all the article continue to label Intel are hard at work on Gfx drivers when they are clearly not.

You can believe whatever you want to believe, but this is not about beliefs, it's about facts. As a user of Sandy Bridge and Linux I know better than most just how much Intel drivers suck. In fact, their Linux iGPU drivers suck much worse than the Windows version (hard to imagine, but true) and weren't truly ready until Mesa 8.0, more than a year after the release of the hardware.

But I also know they are working on things like SNA, which in early tests already offers a ~20% performance boost.

No word on when it will be consumer-ready, but Intel is working on and steadily improving the driver side as well. Perhaps not at the pace you want. You do not have to accept reality if it is so difficult for you; don't blame websites for reporting reality, however.

I am almost grateful Intel is not 'good enough' on the GPU side as yet. It keeps AMD alive another year. Hopefully.

PowerVR has lower performance and fewer features, so would not be a good PC solution. I'm also sure that Intel would rather have its own solution, it's just that it can't yet compete with PowerVR at the low power arena. I imagine that if Intel succeeds in mobile space it will try to create its own low power 3D core.

As for graphics drivers, I'm sure Intel is hard at work at them, but probably has fewer people than AMD on that. As far as I can see, it's no longer the case that reviews with Intel graphics keep talking about what didn't run correctly, which means that things are getting better.

I totally agree. Intel is again going to hobble the lower end with HD 2500 graphics, so that people who don't need the i7 CPU have to buy a discrete video card. I really wish review sites would hammer Intel for this and pressure them to include the better integrated graphics. It's not like the HD 4000 is so good that people will buy an i7 just for the graphics.

The HD 4000 takes up more die space, which means it costs them more. That's all Intel cares about; they don't give a shit about what people need at the lower end.

They were forced to start using HD 3000 graphics in all their lower-end chips because of Llano. The 2105 basically replaced the 2100 at the same money so they would be less embarrassed by Llano. That's what competition does.

I like this tick. The CPU performance goes up by as much as I expected and the iGPU side goes up significantly.

If I had the spare change to throw around, I'd upgrade from my 3.8GHz i7 860. But as it is now, an upgraded CPU wouldn't do much for me in terms of gaming performance and I rarely do CPU intensive tasks these days. The chipset and native USB 3.0 are nice, but I'll wait for Haswell next year and get a good GPU or two instead.

"Generational performance improvements on the CPU side generally fall in the 20 - 40% range. As you've just seen, Ivy Bridge offers a 7 - 15% increase in CPU performance over Sandy Bridge - making it a bonafide tick from a CPU perspective. The 20 - 40% increase on the graphics side is what blurs the line between a conventional tick and what we have with Ivy Bridge."

"Being able to play brand new titles at reasonable frame rates as realistic resolutions is a bar that Intel has safely met."Reply

The review is good, I really like that you added the compilation benchmark for chromium -- good job!

I'm a little disappointed in the lack of overclocking information. What is the point of reviewing the K edition of this chip without even doing a simple overclock with a comparison to 2600K in terms of power draw and heat?

That is because this is NOT a review...it's just a preview. I'm sure they will do some overclocking testing in the full review later. Those results would be more meaningful then anyway as this is still early hardware/drivers.

It would have been interesting to see. Personally, I don't care for IGPs, as they sit disabled anyway. Right now, it seems like it's a 7% clock-for-clock perf increase, which is very poor for one process node. Knowing where the clocks can be will let everyone know exactly how much faster the CPU can be over SB.

Agreed! I was just about to post that same comment. It doesn't make much sense to compare it to a lower-clocked SB product. Well, unless you wanted to make the IB look better. Now I'm going to sift through Anand's past reviews to see what kind of gains the 2700 has over the 2600.

Great review. You guys know your stuff. I've been waiting for a review like this since Ivy Bridge was announced.

However, I'll still "cling to my Core 2" since it does the job now, and I'll postpone my upgrade till next year. You make it seem like Haswell is a good reason to wait. I bought the system in early 2010, and I usually upgrade every 2-4 years. 3 years sounds just right. I'll be investing in SSDs since you talked me into it though, it seems a better upgrade at the moment.

The Ivy Bridge 3770K is a direct replacement for the Sandy Bridge 2700K, which is only a small upgrade from the 2600K, yet the 2700K is still missing from the benchmarks that would allow a direct architectural comparison.

Intel badly needs PowerVR in its graphics core... will they finally use a multicore Rogue Series 6 core in the next generation (Haswell?) for some decent performance in their IGP? They developed easily the fastest graphics core in the ARM SoC tablets/phones inside the iPad 2/iPhone 4S; now it's time to save Intel (one of ImgTec's biggest shareholders, along with Apple). Intel needs to ditch this old, weak IGP core architecture and get with the times...

The AMD Llano, even with its terribly weak CPU core, still clearly outpaces this new improved Intel HD 4000 core in these non-GPU-limited tests. If AMD had a faster CPU they would be even further ahead in graphics capability, which appears CPU limited in many cases too (see the discrete GPU tables to get an idea of Intel's CPU advantages).

Where are the in-game checks on Intel's notoriously poor image quality, much like when Radeons are compared to GeForces, to ensure these are even producing an acceptable image for the performance they give and not cutting corners?

Happy with the lower power and the CPU performance gains of Ivy Bridge. Disappointed in the weak old graphics once again, which fail to match Llano even with a far stronger CPU dragging them along...

Is this some kind of joke? It may be comical, but it sure ain't funny. Intel themselves had slides circulating around showing at least a 2x performance increase over last generation. Now they show up with not even half that, and Anand falls to his knees in praise. Seems a little fishy to me; where have I seen this before... Right, the primary elections in the US! Same shit: the elite give the mainstream media their marching orders, and the mainstream media sets out to brainwash the mass population with that message. And you continue to lead the charge on downplaying image quality and functionality, ever since you became Intel's mouthpiece. Where are the days of proper image quality comparisons and feature benefit to consumers? That's all dropped off the radar because Intel has abysmal and atrocious graphics capability and know-how. They're the WORST in the industry, and yet here we have good ol' Anand patting his buddy on the bum, ensuring that Intel will never have a need to actually compete. They can just hand off money to the pieces of shit in the world and have them manipulate the perception.

Maybe I misread the article or read a different one. It came across to me that Anand was mainly comparing the HD 4000 to the HD 3000, in which case there is generally a notable increase in performance. It's not 2x the HD 3000, but a quick search trying to find these slides you mention showing such an increase came up with nothing. I only found one on Tom's, which was a leaked slide comparing the HD 2000 to the HD 4000. If you could link some of those, that would be great. Also, in just about every case where the HD 4000 was (almost inevitably) beaten by AMD in graphics performance, it was pointed out.

I wonder how much of the improvement in the performance-to-power ratio is due to the tri-gate technology. In some ways I was expecting a bigger jump, around 20%, but since they also dropped the power by 30W, that says a lot. Looking at this from the perf/power perspective makes it a bigger deal than it sounds from a 5-15% CPU gain.

Still... for some reason I feel a little disappointed. I thought tri-gate would change things even more in conjunction with the 22nm process.
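The perf/watt point in the comment above can be made concrete with some back-of-the-envelope arithmetic. The TDP and performance numbers below are illustrative assumptions, not measured figures from the review:

```python
# Rough perf/watt comparison between two CPU generations.
# All figures are illustrative assumptions: baseline performance 1.00
# at an assumed 95 W TDP vs. +10% performance at an assumed 77 W TDP.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance divided by power draw."""
    return relative_perf / watts

snb = perf_per_watt(1.00, 95.0)  # assumed older-generation part
ivb = perf_per_watt(1.10, 77.0)  # assumed newer-generation part

improvement = ivb / snb - 1  # fractional perf/watt gain
print(f"perf/watt gain: {improvement:.0%}")  # prints: perf/watt gain: 36%
```

Under these assumptions, a modest ~10% performance bump combined with a lower TDP compounds into a perf/watt gain in the mid-30% range, which is why the perf/power view looks like a much bigger deal than the raw CPU gain alone.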

Well, the dilemma for Anand is apparent. If he stops writing these previews that are nice to Intel, someone else will get the opportunity and all the info. He can write two bad previews and the info and early chips just stop coming. Intel and Anand have a business to run, and there is a reason Intel gives Anand the chips (indirectly).

He have a "deal" with Intel, the same way we have a deal with Anand when we read the review. We get the info - bended/biased - and then we can think ourselves. I think its a fair deal :) - we get a lot of good info from this preview. The uninformed gets raped, but its alway like that. Someone have to pay for the show.Reply

1) Your 65nm CPU would get the shit blown out of it by IB at the same clock speed in single-threaded applications. Assuming 15% improvements in each of the tick-tocks since Conroe, a 1.8GHz IB would probably be about the same as a 3GHz Conroe.
2) Discrete graphics vs. integrated graphics: Intel isn't trying to compete here, so it's a stupid comparison.

"There's not enough of an improvement to make existing SNB owners want to upgrade, but if you're still clinging to an old Core 2 (or earlier) system, Ivy will be a great step forward."

Basically all the laptops bought for business in the last few years have been C2D. I think with Ivy it's a great time to upgrade them all and see a good improvement. Same for family members too. I can't wait to try them out! Thanks for the review, Anand.

This article blows because there are no overclocking results. We're not looking for a fine-tuned overclock. Just give us the rough and dirty! My money is on 5GHz with minimal effort using an air cooler.

I have to disagree with you on the i7-920 being such a huge leap. As someone who goes through virtually every CPU lineup from AMD/Intel, I'd have to say the C2D (or quad) 6x series was the biggest leap forward in the past decade. Before that it was the A64 and X2 variants (although we didn't get a lot of use out of those secondary cores).

But I agree, this sacred aura of "this is not sanctioned by Intel" is a pain to read. It makes these articles a little bit difficult to start reading :)

But how profitable, and how good a business do you have, if you don't have "good connections"? Charlie uses his for underhand information, Anand his to get info before the others. It's easy for us to interpret Anand's articles because we know the obvious: it has to be profitable for both Anand and Intel. But what about Charlie? What are the motives of the people leaking info to him? It's not quite so obvious and transparent.

"Sure, he was comparing Intel graphics to Intel graphics, except he wasn't, because he himself threw Llano in there to compare."

By the same token, if he had not included Llano results, people would be wondering where they were and complaining that they weren't included. Puts Anand in a catch-22 when deciding whether or not to include Llano.

There is validity to the complaint about the numbers being incorrect; those should be looked at and corrected. As for glossing over the results with no mention of Llano being more capable: again, this was mainly to compare Intel vs. Intel in a preview of their new chip and the improvements they've made since last gen. Sure, he could've been more thorough on the AMD vs. Intel side, but that's not really what this article was about. We could also go to a steakhouse and complain there's not a large vegetarian selection too.


I'm an electrical engineer doing intensive SPICE simulations. Since they require a lot of floating-point calculations, I want to know if it's worth waiting for Ivy Bridge instead of buying a laptop with a quad-core Sandy Bridge right now. I expected Ivy Bridge in March, and I've been waiting since last December :(. Buying now would be very comfortable, as I'm in the simulation phase of my project. Buying later, I believe, would make more sense in terms of pure performance. But how much sense is the question...

Thanks for sharing

PS: another factor is the use of 1600 memory instead of 1333, which might make a difference for another piece of software I use.

I wonder how Ivy Bridge performs in terms of floating-point calculations, as I do intensive electrical simulations. I urgently need an upgrade and would definitely go for Ivy Bridge, but I've been waiting a long time now and Ivy Bridge may again be delayed. Does anyone have advice about it?

No, overclocking is just out of the question for me. I want to buy a professional laptop (Lenovo W520), so there's no way to tweak it. Fact is, the memory will be 1600MHz and the processor a bit stronger, with maybe a better memory controller. At one month from the release, it's worth waiting for it. I just want to make sure that in my particular case it's really worth it, because I'm tired of my heavy old laptop. I'm buying this damn machine just for working, after all. At home, my E8400 is still up to date for what I do with it.

@Arno I'd consider a few aspects:
- Do you need to use precision external gear, like we audio people do with sound cards, and hence need ExpressCard or Thunderbolt connectors? Then I'd expect the May/June launches to bring those professional laptops and Ultrabooks.
- If portability is important: real-world Sandy Bridge battery life is near 4 hours, while Ivy Bridge may extend real usage to around eight hours for similar performance.
- Furthermore, USB 3.0 will be native, which matters since most Renesas controllers have been far from perfect; only their recent (Feb/March 2012) releases seem to have finally nailed efficiency. Problems with USB 3.0-equipped Sandy Bridge laptops abound in forums, and that's among professional brands.
- If you were asking about Sandy Bridge vs. Ivy Bridge desktops, you could buy the former now and later upgrade to the latter CPU. But on the mobile platform, Intel has stated that upgradeability on their current chipset platform (H67M, also named Cougar Point) is not going to be feasible, despite it being easily possible technically.

Therefore, there are many reasons pointing toward waiting. Since sales are very low, many are choosing this route.Reply

Thanks, Nexing, for your answer. Actually, I totally agree with you on:

Portability => IB is a die shrink and must be more power efficient for an equivalent task load; the tests seem to prove it. Moreover, I will work a lot on trains or outdoors (visiting customers), so it is definitely a plus.

USB 3 => Your feedback is very interesting. I myself think that "native" must be better than "add-on" USB 3, and that was also a reason for me to wait last December, when I was already thinking of buying something new. Now I'm quite sure that it was the right thing to do.

For the rest, more than external gear, I need a processor that's good at floating point calculation. I do intensive electrical simulations, so I definitely need it.

I've made my decision: I will wait. This laptop will replace both my desktop and laptop for work (and work only, because for internet or usual office tasks I definitely think a Core 2 Duo can handle it), so better to get the best. I will manage the present emergency I have, praying for Lenovo (or Samsung?) to offer a new Ivy Bridge laptop as soon as possible. Let's make a bet: Lenovo has it ready to release and is just waiting for the official launch date....

@arno I'm GLAD you didn't read more carefully, because you posted the question, and Nexing's answer focused me on something I still wasn't considering as a major factor in my decision: USB 3. Between your question and the response, I also got a better picture of how specific use cases are affected by the tech. So, I'm a waiter (tho I don't serve food 8^D ).Reply

Basically no 2D graphic designer or web designer needs a discrete GPU for their work. The IGPs handle that workload fine (mainly because most of the processing needed for Photoshop, InDesign, Illustrator, or Dreamweaver is CPU based). A discrete GPU gives you better performance in the very limited 3D features Photoshop offers, which are situational at best for the vast majority of graphic designers.

3D artists, and those who pump a ton of effects into video editing, would benefit from discrete.Reply

Great article, but... where are the temps??? The few benches I have seen don't mention overclocking, and if they do, they don't mention temps. I'm hearing this chip can boil water! I would think that would be as important as anything else...Reply

Is it possible to fully load the IGP with an OpenCL application and not affect CPU performance at all? From what I've read, it appears the IGP shares its cache with the CPU, so will that affect performance?Reply

Generational performance improvements on the CPU side generally fall in the 20 - 40% range. As you've just seen, Ivy Bridge offers a 7 - 15% increase in CPU performance over Sandy Bridge - making it a bonafide tick from a CPU perspective

Should be: "Generational performance improvements on the GPU side generally fall in the 20 - 40% range"Reply

They give the drivers their own tweaks and bug fixes, but I doubt they could do something like add T&L without the manufacturer's support. In fact, they didn't, unless they have bigger driver teams now. Reply

"Personally, I want more and I suspect that Haswell will deliver much of that. It is worth pointing out that Intel is progressing at a faster rate than the discrete GPU industry at this point. Admittedly the gap is downright huge, but from what I've heard even the significant gains we're seeing here with Ivy will pale in comparison to what Haswell provides."

Personally, I believe on-board graphics will never be on par with a dedicated graphics part. And it is obsessive-compulsive and ridiculous to compare the performance of the HD 4000 with discrete graphics and complain it's not as good.

The HD 4000 is meant to provide graphics for business and multimedia computers. And for that purpose it is outstanding.

If you want gaming or engineering workstation performance, get a discrete graphics card. And stop angsting about how bad onboard graphics is compared to discrete graphics.Reply

No one *needs* integrated graphics. But not everyone needs discrete graphics. The higher performance an IGP has, the fewer people overall will *need* DGPs.

Not all games need dedicated graphics cards, just the multi-million-dollar rehashed CODs that choke retail stores. There are literally thousands of other games around that only require a small amount of graphics processing power. Flash now has 3D-accelerated content, and almost every developer using it will target IGP performance levels. Almost all casual game developers target IGPs as well; they're not selling to COD players. Sure, most of those games won't need a high-end CPU either, but people don't buy computers just to play casual games. They buy them for a massive range of tasks, the vast majority of which will be CPU bound, so faster would be better.

Also, as an indie game developer, I hit performance walls with CPUs more often than I do with GPUs. You can always scale back geometry/triangle counts and trim or cut certain visual effects, but cutting back on CPU-related overheads generally means you're cutting out gameplay.Reply

"there's also the question of which one (CPU or GPU) approaches "good enough" first."

I was worried that my A6 3420 laptop would feel sluggish in Windows and general tasks, especially compared to my 2500K desktop system. However, I've been very pleasantly surprised and think it works just fine in Windows.

I was also very impressed that the iGPU lets me play most newer games comfortably. I was able to OC the A6 3420 in my Samsung Series 3 to 2.0GHz. It runs Crysis 2 on low at 1366x768 in the 25-30 fps range. To me that's not really playable, but I was surprised it could even run it. Other games like SC2, Arkham Asylum, CSS, and WoW have all run like a champ, most of them even on medium settings!

So I think if you want a cheap laptop (mine was $399) and the ability to play some games while still doing general tasks well, we have already hit that "good enough" stage in the CPU department. It will be interesting to see if Windows 8/Metro does anything to change this.Reply

Intel needs another Larrabee. It keeps cobbling together these graphics cores, which are always well short of the mark. Either Larrabee 2 or a licence from Nvidia, but something has to be done about it in the long (possibly mid) term. It makes perfect sense and, to me anyway, has an air of inevitability about it.