I have a 2011 MBP with Intel graphics driving the 27" Cinema Display and it runs Mac OS and two Windows instances in Parallels just fine. I can only imagine a dedicated GPU is needed for play. For work, Intel's integrated stuff is fine, and Ivy Bridge is supposed to be getting better.

I'm using a late 2006 MBP and the ATI Radeon X1600 card with 256 MB of VRAM is just not quite hacking it for some of my work. The laptop runs hot enough to leave red marks on my thighs when processing graphics flat out.

Does anyone know just how much faster the Intel Ivy Bridge processor with integrated graphics might be? I could stand to have 10 to 20 percent faster graphics than I presently have, but even just as fast would be fine if the laptop ran cooler.

If Apple is planning on making the MBP more compact and lighter, that would be nice, however that also means the internals can't be throwing off the kind of heat my current MBP does.

The HD 4000 is seeing double the HD 3000's results in 3DMark Vantage tests. The 3DMark 06 results are nearly double the 3000's and quadruple your current ones. The X1600 is ancient; even integrated graphics smoke it now. You should see ridiculous improvements, well beyond the 10-20% you are talking about.

You mean like this *points to 13" MBP* Apple said they couldn't fit a discrete card in there and keep things cool and maintain battery life. If they are removing the optical drive in the next gen, they should be able to do all that.

The Ivy Bridge GPU should get a 40-50% boost over Sandy Bridge. That would give it a score of around 547.
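As a rough sanity check, the arithmetic behind that projection can be sketched in a few lines (the HD 3000 baseline of 365 is an assumed figure for illustration, not a number from this thread):

```python
# Projecting an Ivy Bridge IGP score from a claimed 40-50% uplift.
# The Sandy Bridge HD 3000 baseline of 365 is an assumption for
# illustration only; substitute whatever benchmark result you trust.
hd3000_baseline = 365
low = hd3000_baseline * 1.40
high = hd3000_baseline * 1.50
print(f"Projected HD 4000 score: {low:.0f}-{high:.0f}")
```

A 50% uplift over that assumed baseline lands at about 547, which is presumably where the figure above comes from.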

So essentially, the GPU is equivalent to the 6630M in the Mini and 6490M in the last-gen 15" MBP. This performance is adequate for most games and will run most of the higher-end titles on low settings - it should be able to run Battlefield 3 for example.

It would be nice to see them moving up all the time, but they will probably shave off 10W by going with an IGP, and this is the first Intel IGP that will handle OpenCL, so that's a decent boost for FCPX on the entry laptops.

It will be a shame to see Nvidia blocked, as I think their GPUs and drivers are better than AMD's, but I'm sure the chosen options will be good. They could even lower the price of the entry 15" laptop or put the money towards a 256GB SSD on the entry model.

One other thing to keep in mind is that IGPs tend to get more video memory than Apple's low-end dedicated GPUs, and the allocation goes up if you increase your RAM.
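A sketch of that scaling behavior (the tier boundaries below are invented for illustration; the real split depends on the OS and driver, and OS X in particular fixes it per model):

```python
# Sketch of how a shared-memory IGP's usable VRAM can scale with system RAM.
# These tier boundaries are made up for illustration; real allocation
# policies vary by OS and graphics driver.
def shared_vram_mb(system_ram_gb):
    if system_ram_gb >= 8:
        return 512
    if system_ram_gb >= 4:
        return 384
    return 256

for ram in (2, 4, 8):
    print(f"{ram} GB RAM -> {shared_vram_mb(ram)} MB for the IGP")
```

The point is simply that, unlike a fixed 256 MB dedicated card, the IGP's pool grows with a RAM upgrade.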

I could see a dramatic move to SSDs happening this year at the expense of GPU increases, and I would welcome it big time. I am sick to death of that stupid beachball. It would help remove some framerate drops from games too, so although the framerate won't go up much at all, it will be easier to sustain at peak.

Just imagine if they shifted every Mac model over to SSD drives: 128GB on the entry MBA and Mini, 256GB on every model above that with options for 512GB on the highest models.

Side-step the graphics issue for the most part, dramatically improve boot times and loading times, cut power draw by 10W to boost battery life, and have thin and light enclosures all round with instant-on. And USB 3, of course.

Pro is an example of Iconic Branding. It is meaningless marketing-speak for those who think product acquisition determines what sort of a person they are.

They are very nice laptop computers. "Pro" ain't got no meaning other than branding.

Yes, it's marketing, but I am not sure what you mean by 'Iconic Branding'. I also don't think it is entirely meaningless. It signifies that the machine is of a higher spec. But whether you actually need a higher-specced machine as a 'pro' is a silly question. If you are a plumber, or a web developer, a lower-specced machine is usually ample.

Most 'pros' in need of top performance wouldn't be buying anything but the top-specced machine anyway, so it's all pretty much academic, imo.

Quote:

The HD 4000 is seeing double the HD 3000's results in 3DMark Vantage tests. The 3DMark 06 results are nearly double the 3000's and quadruple your current ones. The X1600 is ancient; even integrated graphics smoke it now. You should see ridiculous improvements, well beyond the 10-20% you are talking about.

The HD 3000 was barely as good as the (ancient) 9400M, and the HD 4000 being 'twice' as fast puts it more or less on par with the 320M (not even the GT 330M, which is two generations old).

I'll need more than that awful unreadable site's benchmarks to convince me otherwise.

Here are some benchmarks on a website that wasn't designed by a toddler.

Note the competing devices are bargain-barrel $40 GPUs, which it barely keeps up with. If you're happy running games at ultra-low resolution with low-quality textures, you might see 30-50 fps (note that minimum framerates are more important than average ones).
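The point about minimum framerates can be made concrete with a toy calculation (the frame times below are invented sample data):

```python
# Why minimum framerate matters more than the average: a few slow frames
# barely move the mean but define how choppy the game feels.
# These per-frame times (milliseconds) are invented sample data.
frame_times_ms = [20, 20, 20, 20, 100, 20, 20, 20, 20, 100]

fps_per_frame = [1000 / t for t in frame_times_ms]
# True average over the run: total frames divided by total elapsed time.
avg_fps = len(frame_times_ms) * 1000 / sum(frame_times_ms)
min_fps = min(fps_per_frame)

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
```

A run that averages close to 28 fps here still stutters down to 10 fps on the slow frames, and the stutter is what you actually feel.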

No chance of Apple ever using PowerVR graphics chips in the MacBooks? If a quad-core graphics chip can power a Retina display iPad, I'm pretty sure there is a more advanced model available for laptop-class graphics.

Quote:

I have a 2011 MBP with Intel graphics driving the 27" Cinema Display and it runs Mac OS and two Windows instances in Parallels just fine. I can only imagine a dedicated GPU is needed for play. For work, Intel's integrated stuff is fine, and Ivy Bridge is supposed to be getting better.

OS X is now OpenGL 3.2+ compliant across the OS. Moving forward it will be 4.x compliance, thus requiring baseline GPUs to have said compliance.

Sandy Bridge stops at OpenGL 3.0.

I have no doubt the Air series will move forward until the wall finally requires Intel to stop fooling people and ship a real discrete-class GPU, à la AMD's APUs, ending their current approach of strapping on a knock-off GPU solution.

The Air with Ivy Bridge will have the Intel HD 4000 IGP.

Very few Ivy Bridge mobile chips support the HD 4000 graphics setup. The rest use the HD 2500.

All AMD APUs will be OpenGL 5.x/OpenCL 2.x compliant by the time Intel is OpenGL 3.2/OpenCL 1.1 compliant.

Sorry, but that's not acceptable. Apple's solutions are moving more and more to OpenCL-optimized environments for its entire suite of sold applications, never mind their OS environments. Kneecapping its customers with Intel's integrated graphics won't do for the Pro line of their laptops.

AMD is meeting its timelines for the AMD Radeon Mobile 7000 series release dates, starting right after WWDC.

I'm expecting Apple to put an AMD Radeon 7000 series mobile GPU in their new MacBook Pro solutions [presently they use the 6000M series, and the low-end model uses the Intel HD 3000].

I expect them to have one low-end, entry model with an Intel HD 4000 and the rest with discrete 7000M series GPUs from AMD.

Anything less would be absurd and would damage the brand in the laptop market space.

The onboard GPU is a joke; it's always a joke. It makes me want to cry when I see new MacBook Airs and MacBook Pros without a dedicated AMD or Nvidia GPU. I can't believe Intel insists on damaging their brand by putting 5-year-old GPU performance in brand-new CPUs. I'm not asking for the onboard GPU to be a top-of-the-line part, but if I'm paying extra for it, it should perform like an entry-level $100 GPU and not an afterthought.

The fact that the Intel HD 3000 is slower than a $29 throwaway GPU makes me wonder why Intel bothers doing this with the desktop parts at all. The laptop parts at least get some energy savings out of the deal. Apple doesn't even support the first-generation MacBook Air (3-4 years old) in the current OS X because the Intel video is too weak. This should illustrate the point to Intel: stop making video parts with just enough performance to pass certification tests.

Blame goes to Apple, not Intel. Try to get the same performance per watt from a modern discrete GPU; you won't even get some of those fans running.

I really hope this is just what the article says it is: a rumor, because if not even the mid-range MBPs have a real GPU, I'll not be buying a MBP at all. I don't want a 17" laptop, I'd very much prefer 13" over 15", and I'm not willing to get stuck with the kind of craptastic GPUs Intel makes for a laptop I'm planning to use at least 3 years.

No matter how many times they say their 'new' GPU architectures will 'double performance' or whatever, they always suck when you try to put them to work. Sandy Bridge was supposed to have twice the graphics performance as well with the HD 3000, and while low-resolution, low-detail benchmarks were pretty favorable to Intel's claims, as soon as you ran it at higher resolutions or detail levels, performance completely fell apart. In practice, my late 2008 aluminum MacBook performs similarly in games and other graphics apps to, for example, my colleague's mid-2011 MBP with an HD 3000.

In addition to that, the Nvidia 9400M in my MB actually supports OpenCL and fully GPU-accelerated video decoding, while the HD 3000 only supports partially accelerated video decoding. It's miserable. Which isn't surprising, considering the fact that the HD 3000 is based on the same old architecture as earlier Intel GPUs that also sucked, and most of the performance increase it brought came from being better integrated with the CPU, not from the GPU itself being a lot more powerful.

Even if I'm not interested in playing BF3 or some other graphics-heavy game on my laptop, I'd still like to be able to play some less demanding games, like the new SimCity that is due next year. I think you can almost forget running that with an HD 4000 if it is only 50% faster than an HD 3000. The HD 3000 doesn't even run 4-year-old games that aren't very graphics-intensive at good frame rates, such as Anno 1404, which runs perfectly on my 2008 MB.

If the low/mid-range MBPs only have integrated graphics, I could just as well get the highest-spec MBA, which would likely be faster for many tasks than the base 13" MBP models, unless the MBPs will also have an SSD by default.

Well, to be honest, it's not like anyone uses a MacBook for gaming. Indeed, any laptop, for serious gaming.

For the kinds of games likely to be played on a laptop, Ivy Bridge is fine.

If you want BF3 at ultra settings in high res though, well, dedicated graphics is the only way... for now.

Quote:

Originally Posted by ninadpchaudhari

Guys but y only NVidia ?

in fact AMD has a gr8 name in gaming industry ...

then y only Nvidia ??

Quote:

Originally Posted by jnjnjn

You must be jealous you can't buy one.

J.

Quote:

Originally Posted by boriscleto

So I haven't been using Aperture on my 2009 mini for the last 2 years?

Quote:

Originally Posted by Patranus

Who gives 2 shits what Apple puts into its refreshed MBP as long as there is a performance gain.
They could put a graphics card from a Macintosh Color Classic and if they could get better performance out of it than what they are getting now, why would anyone care?

Really don't get why people focus on specs. A computer is a tool to get a job done. It doesn't matter how it does the job as long as it does the job better and faster than the previous generation.
No, a computer isn't a tool to measure your penis size.

Pros may use their laptops for editing, but anyone who relies on them for rendering or any serious work is just misguided. I used to be a specialist, and I always put customers' need for a laptop into question. Laptops are portable, desktops are powerful; that's all there is to it.

Well, sort of. But two things regarding Apple have changed this for pros in both audio and image work: high-end MBPs have shown themselves to be very suitable for heavy lifting in these areas (especially with an external monitor), and Apple desktops have been extremely underwhelming for too long. Pros who don't need the slots have not been motivated to go the desktop route, since the desktops have been long in the tooth for so long. When that changes it will change, but whether it's the most powerful rig or not, OS X pros who aren't rendering haven't bought many desktops in recent years, so the MBP market for them has been very strong. You'd be surprised how many highest-end photographers with Macs don't have a desktop anymore. A MBP with an awesome monitor has been the route for many.

'Pro' is not a brand, it's a model. Or rather, it is a model classification. Apple is an iconic brand. The 'MacBook Pro', many would argue, is an iconic product.

I would say it's both a brand and a model. A simpler example would be the iPhone 4S. That is the official branding (a type of product manufactured by a particular company under a particular name) and the model, which refers not only to the speedier version of the iPhone 4 but also signifies, at least to us, that it's the 5th-gen iPhone.

His point was completely valid. Why would they have to release them? We know what the old generations do, and we can speculate on what Ivy Bridge's HD 4000 will do by using existing benchmarks on the web. No matter how much magic you think Apple has, they can't make the HD 4000 outperform a dedicated solution. Fact.

I've been using a 17" MBP with the NVIDIA 9600/9400 GPUs since early 2009 and have been considering getting another 17" MBP when the Ivy Bridge models come out later this year. I like the proposed changes, such as elimination of the optical drive and conversion to SSDs but I will not upgrade if Apple doesn't have at least a capable OpenGL/CL compliant card on board. I would prefer this to be an NVIDIA card but a good AMD card would be acceptable. If Apple can't deliver on a good high-performance machine then I may have to go with a Clevo chassis based portable running desktop components and a FreeBSD OS.

Apple managed to fit a GeForce 5200 into the 12" PowerBook and get good battery life around 6-7 years ago. You'd think they'd be able to take some miniaturisation cues from the iPhone and fit a dedicated GPU into a 13" chassis too.

Quote:

Note the competing devices are bargain-barrel $40 GPUs, which it barely keeps up with. If you're happy running games at ultra-low resolution with low-quality textures, you might see 30-50 fps (note that minimum framerates are more important than average ones).

Yeah, their page doesn't look the best, but benchmarks are benchmarks, whether you can be bothered w/them or not. Also the cards that the 4000 is equivalent to are $60-$80 cards, not $40. Minor difference yes, but we are also talking about laptops and battery life is viewed as much more important to Apple than it is to most Windows based makers. You are bringing up Crysis and Metro benchmarks, neither of which are on OS X I don't think.

Does it play WoW fine? Yep. Will it play Diablo 3? You betcha. I could give a shit otherwise. I still think this rumor is bogus and not only will we be seeing AMD in the new MBP line, but we will see a discrete card in the 13" if they get rid of the optical. Which will make this whole discussion irrelevant.

Quote:

Originally Posted by simtub

No chance of Apple ever using PowerVR graphics chips in the MacBooks? If a quad-core graphics chip can power a Retina display iPad, I'm pretty sure there is a more advanced model available for laptop-class graphics.

Oh look, every freaking mobile model uses the best possible GPU. Maybe b/c many desktop purchasers will be using discrete anyway? Before you make claims like the one I quoted about the 4000 and 2500, try having some idea what you are talking about. The 2500 is a desktop option only.

Quote:

I have a 2011 MBP with Intel graphics driving the 27" Cinema Display and it runs Mac OS and two Windows instances in Parallels just fine. I can only imagine a dedicated GPU is needed for play. For work, Intel's integrated stuff is fine, and Ivy Bridge is supposed to be getting better.

I really don't understand why threads about GPUs always degenerate into discussions about gaming. Really, guys, gaming has nothing to do with it. First, this is a Mac, and one would have to be nuts to buy one for hardcore gaming. Second, GPU use has long ago been woven into apps and the OS, and this acceleration will only become more deeply embedded in the software of the future.

In any event, one thing that needs to be understood is that I really believe we are close to the point where on-chip GPUs will be the low-cost solution for midrange performance. However, that isn't Intel at this stage. So even though it would be foolish to release any sort of Pro-marketed computer today with only integrated graphics, that won't be the case in the future.

Were Apple to go to integrated-GPU-only platforms, they would have to start looking closely at AMD where GPU performance is important. AMD simply has a sounder product roadmap for the future. Even today, AMD hardware supports standards that Intel doesn't. In very real terms, Intel GPUs are nothing but junk for professional use. To say otherwise is to disregard the needs of professionals, which, by the way, can vary widely.

Now that more info has come out, it's pretty clear it won't be a major shift. The only change is that instead of only the lowest models having integrated graphics, it's looking like half the midline will as well. And the upper-tier MBPs aren't losing GPUs, so there's not much to argue about.

Quote:

Now that more info has come out, it's pretty clear it won't be a major shift. The only change is that instead of only the lowest models having integrated graphics, it's looking like half the midline will as well. And the upper-tier MBPs aren't losing GPUs, so there's not much to argue about.

What is with some of you? This is not more info. This is a rumor, and not even a very credible one. There's a good chance it's just one of the blogs fishing for page hits.

Quote:

What is with some of you? This is not more info. This is a rumor, and not even a very credible one. There's a good chance it's just one of the blogs fishing for page hits.

Nah mate, Nvidia and ATI have struggled with their GPUs for years now... Regardless of how accurate this story is, the truth is that in the mobile and tablet space PowerVR and ARM have eaten their lunch (and dinner).

In the laptop space nothing comes close to Intel's Sandy and Ivy Bridge power and efficiency for mainstream computing.

All that's left for Nvidia and ATI is the higher-end PCs and even then they've struggled for far, far too long... Driver issues, endless updates, chip failures, fabrication issues all the way back to 90nm, let alone <20nm, and so on.

While not doomed, they are relegated now to the niche, when just 3 years ago they were poised to truly be revolutionary with tech, GPGPU and so on. Suffice to say now we rely on phone graphics to power most of our games, phone graphics that will soon dominate console graphics, and "CPU" aka Intel graphics that will in a few years obliterate discrete graphics from mainstream computing.

My Nvidia 320M is a very capable graphics device, but the fact I can't play an age-old game like Psychonauts (literally, it's impossibly laggy), and almost NO GPGPU applications aside from Apple and others... Their time has come. The blame game of developers, Apple, Windows, drivers, OS X, OpenGL, etc. etc. is no longer acceptable. When I can get a superb once-in-a-lifetime experience with Mass Effect 3 on obsolete hardware (Xbox360), Nvidia and ATI... Today I say goodbye. (Yes, I know Nvidia/ATI are used in consoles, but that's almost the last time they did anything fantastic)

Quote:

All that's left for Nvidia and ATI is the higher-end PCs and even then they've struggled for far, far too long... Driver issues, endless updates, chip failures, fabrication issues all the way back to 90nm, let alone <20nm, and so on.

While not doomed, they are relegated now to the niche, when just 3 years ago they were poised to truly be revolutionary with tech, GPGPU and so on. Suffice to say now we rely on phone graphics to power most of our games, phone graphics that will soon dominate console graphics, and "CPU" aka Intel graphics that will in a few years obliterate discrete graphics from mainstream computing.

Intel seemed to be having trouble with 22nm fabrication too. GPGPU supposedly hit a lot of limitations in areas where it was expected to be big. My point wasn't that we couldn't trend toward integrated graphics in Macs. My point was that they're just riding speculation for page hits. As soon as tech blog articles started to surface about Kepler, we saw a rumor that they'd be used in Macs. Now the rumor is that the Kepler rumor was true, but they moved on. I tend to doubt rumors that claim to have an actual play-by-play and pander to whatever seems likely to bring page hits at that given moment. Anyway, I'm not arguing against the idea that discrete graphics may be a thing of the past within a few years in most machines, given that integrated graphics have made their way into light desktops in the past few years as a cost-cutting measure.

Quote:

Nah mate, Nvidia and ATI have struggled with their GPUs for years now... Regardless of how accurate this story is, the truth is that in the mobile and tablet space PowerVR and ARM have eaten their lunch (and dinner).

They struggle no more than any other chip maker. PowerVR has been doing very well, but that has more to do with Nvidia and AMD not focusing on those markets. Point in fact, heads rolled at AMD recently because they missed the portable market completely. That, however, had nothing to do with their technology.

Quote:

In the laptop space nothing comes close to Intel's Sandy and Ivy Bridge power and efficiency for mainstream computing.

Again, this is simply wrong; most people would be better off with an AMD Llano due to its far better GPU.

Quote:

All that's left for Nvidia and ATI is the higher-end PCs and even then they've struggled for far, far too long... Driver issues, endless updates, chip failures, fabrication issues all the way back to 90nm, let alone <20nm, and so on.

You really can't be serious when companies like Intel have had the very same issues. As for ARM, all you are hearing about is their success stories; they have had some pretty dramatic failures in the market also.

Quote:

While not doomed, they are relegated now to the niche, when just 3 years ago they were poised to truly be revolutionary with tech, GPGPU and so on. Suffice to say now we rely on phone graphics to power most of our games, phone graphics that will soon dominate console graphics, and "CPU" aka Intel graphics that will in a few years obliterate discrete graphics from mainstream computing.

Why do you think AMD bought ATI, and further, why did ATI not resist the takeover? It was because it was obvious that SoC technology was, and is, where the industry is going. This move has given AMD a significant advantage in certain markets.

Quote:

My Nvidia 320M is a very capable graphics device, but the fact I can't play an age-old game like Psychonauts (literally, it's impossibly laggy), and almost NO GPGPU applications aside from Apple and others...

There are also plenty of games not playable on iOS devices. You made a bad buying decision in going with the 320M, crying in the forums about it does not help support your position.

As to GPGPU acceleration, it is used by almost anybody that can benefit from it. Seriously, I'm not sure why people can't grasp this: GPU acceleration is only of value if there is a fit with the application. If you look at something like Safari, there is partial use of GPU acceleration there.

Quote:

Their time has come. The blame game of developers, Apple, Windows, drivers, OS X, OpenGL, etc. etc. is no longer acceptable. When I can get a superb once-in-a-lifetime experience with Mass Effect 3 on obsolete hardware (Xbox360), Nvidia and ATI... Today I say goodbye. (Yes, I know Nvidia/ATI are used in consoles, but that's almost the last time they did anything fantastic)

So what you are saying is that you can't get good results on PC hardware for gaming? Many would argue with you there. I, on the other hand, would say who cares; I'm running a Mac, which has no share of the gaming market. That, however, does not mean that I don't need a decent GPU on any machine I buy. Like it or not, a decent GPU has a huge impact on system performance and feel.

One other thing related to GPGPU computing: OpenCL has been extremely successful for an Apple initiative. Adoption has been fairly massive, so please don't squeak about the lack of OpenCL-based programs.

Charlie Demerjian is the author of the rumour. For me this source is even less reliable in its predictions than DigiTimes speaking on the iPad HD, especially when the rumour has something to do with NVIDIA. Does his name ring a bell? Do you remember the statement that Apple switching MacBook Airs to ARM processors is "a done deal"? Yep, same guy.

Charlie was (is?) ATI's attack pet from the time before AMD's ownership of the Canadian company. Usually his rumours came in two parts:

1. NVIDIA has some contract secured with company A.
2. NVIDIA loses contract with company A due to quality / yield / drivers issues.

Part two followed part one without an independent source confirming either statement. I don't say that the rumour here is totally in Charlie's imagination; it's just that when Charlie says it's a done deal, it actually means some tests were conducted.

Of course, maybe I'm being biased here, but AppleInsider, could you please mention the name of the source in the article in cases like this? It would save me from reading.

Quote:

The HD 4000 is seeing double the HD 3000's results in 3DMark Vantage tests. The 3DMark 06 results are nearly double the 3000's and quadruple your current ones. The X1600 is ancient; even integrated graphics smoke it now. You should see ridiculous improvements, well beyond the 10-20% you are talking about.

Quote:

You mean like this *points to 13" MBP* Apple said they couldn't fit a discrete card in there and keep things cool and maintain battery life. If they are removing the optical drive in the next gen, they should be able to do all that.

It's going to depend on the application and how old a computer you're replacing. I have a 5 year old MBP and performance isn't really much of an issue for my work. I could use modern integrated graphics and not miss the dedicated GPU. Besides, I don't think there's any way that Apple's going to offer only integrated graphics across the board. They will certainly continue to offer discrete graphics.

The other important factor to consider is that there are tradeoffs. For people who don't need the performance of discrete graphics, getting rid of the GPU means that you have more power left for other things. You also save money and have room for a bigger battery. You could use that extra money and battery power to use a faster CPU or more cores in the CPU. So for some purposes, you could end up with a FASTER system by dropping the discrete GPU. (but, again, I don't see that happening across the board).
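The battery side of that tradeoff is easy to put in rough numbers (every figure below is invented for illustration, not an Apple spec):

```python
# Back-of-the-envelope battery tradeoff from dropping a discrete GPU.
# Every number here is an invented illustration, not a real spec.
battery_wh = 77.5          # assumed battery capacity in watt-hours
base_draw_w = 15.0         # assumed system draw with integrated graphics only
dgpu_extra_w = 10.0        # assumed extra draw with a discrete GPU active

hours_igp = battery_wh / base_draw_w
hours_dgpu = battery_wh / (base_draw_w + dgpu_extra_w)
print(f"IGP only: {hours_igp:.1f} h, with dGPU: {hours_dgpu:.1f} h")
```

Under these made-up assumptions, shedding 10W of GPU draw stretches roughly 3 hours of runtime to roughly 5 on the same battery, budget that could instead go to a faster CPU or a smaller enclosure.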

Quote:

Originally Posted by mgsarch

His point was completely valid. Why would they have to release them? We know what the old generations do, and we can speculate on what Ivy Bridge's HD 4000 will do by using existing benchmarks on the web. No matter how much magic you think Apple has, they can't make the HD 4000 outperform a dedicated solution. Fact.

His point was that there's no point in getting excited about a silly rumor from an unreliable source. If you want to make a hypothetical argument, that's one thing, but acting like this rumor has any validity is a waste of time.

"I'm way over my head when it comes to technical issues like this"Gatorguy 5/31/13

Higher-end 15- and 17-inch MacBook Pros launched early last year relied solely on AMD graphics…

None of the 13-inch models have ever had discrete graphics.

And as long as Apple keeps that trend going, I won't ever buy another MacBook again. Currently they have the excuse of not having enough space in the 13" due to the gigantic and useless optical drive. Should they remove it and CONTINUE to not offer a dedicated GPU option for the 13" model, I'll have to switch back and retain my iPhone as my only Apple product.

iPhone 4S 64GB, Black, soon to be sold in favor of a Nokia Lumia 920
Early 2010 MacBook Pro 2.4GHz, soon to be replaced with a Retina MacBook Pro, or an Asus U500