Apple is already using GCD in Snow Leopard apps like Mail, Finder, and Safari (the list goes on). Developers can use GCD right now; it's simply up to them when they update and release a GCD-capable application.

In other words, if an app can use GCD, it can use it already; there is no need to wait for 10.7 and so on.

It will take a couple of years for developers to adapt to the new technology. So give it a while, and you will definitely notice the new tech, because it lets an app (especially a power-hungry one) use multiple cores and run much faster.

GCD challenges a lot of what developers thought they knew about threading. In a good way, but I wouldn't be surprised if only a small minority ever really gets comfortable with it and learns to use it well.

I would go out on a limb and suggest that probably even Apple's own apps do not exploit GCD to its fullest, given the large legacy codebase that has to be adapted to an entirely new way of thinking about organizing code tasks.


So in reality, what was the big hype for Snow Leopard? 64-bit support that the average consumer won't even take advantage of?

Snow Leopard was the groundwork for future operating systems and software.
It is not hype; it is the way technology is going. Multithreading is where we are headed.
Just because you don't see a benefit now does not mean there is none at all.
Give it time, and allow more developers to start making use of the new operating system features.


+1

Programming for multithreading is extremely difficult. Apple greatly simplified this process, which is what the hype is about. It's here; now developers are able to take advantage of it in future iterations of their software.

Multicore support generally means rewriting the entire program, so it's not something that will happen instantly.


This may be hard to understand (instant gratification and all that), but these are all underlying technologies that need to be put into place at some point, and will actually be used later on. If Apple hadn't put these things into Snow Leopard, then applications using them would simply arrive two years later than they will now.

And that's also a reason why Snow Leopard was only $29: so that developers can use the new features without having to care about older systems. Anyone who isn't willing to pay $29 for Snow Leopard isn't going to pay for the applications that I write anyway.


That's right. And that's the correct way to think about it. Apple seemed to get this right with the SL pricing.

As individuals above have been saying, it is not that technologies like GCD and OpenCL are useless; it is that they are new. As such, they haven't yet been adopted by developers, despite the theoretical performance benefits. Contrary to popular belief, new APIs, instruction sets, and programming techniques don't just magically increase application performance on their own: there is a learning curve for developers in employing and benefiting from new low-level stuff.

Case in point: the continuously evolving vector extensions to the x86 architecture, such as SSE/SSE2/SSE3/SSE4. It took years for developers to fully leverage the potential of SSE2. Moreover, it has been more than two years since the introduction of SSE4, yet only a handful of applications (if that) are even capable of utilizing the instruction set.
