Imagination Technologies announced on Tuesday that its next-generation PowerVR Series6 GPU core family, which could find its way into Apple's future iPhones and iPads, will offer 20 times the performance of the current generation.

The PowerVR G6200 and G6400 GPU IP cores are the first in the PowerVR Series6 GPU family from Imagination Technologies. The company said its latest mobile graphics cores set a new benchmark for high performance with low power consumption.

The G6200 will feature two compute clusters, while the G6400 will sport four. Imagination Technologies said the cores will find their way into smartphones, tablets, PCs, consoles, cars, TVs and more.

Of course, one of the biggest users of Imagination Technologies' designs is Apple, which builds the company's graphics processors into the custom ARM-based chips that power the iPad, iPhone, iPod touch and even the Apple TV. The A5 chip found in the iPhone 4S has a GPU clocked at 800MHz, which is 73 percent faster than the A4 processor that powered the iPhone 4.

The new Series6 GPUs are based on the PowerVR Rogue architecture, which Imagination said will enable devices to provide "ultra-realistic gaming" and more complex applications. The Series6 GPUs can deliver 20 times or more the performance of current GPU cores through an architecture that is five times more efficient than previous generations.

"Based on our experience in shipping hundreds of millions of GPU cores, plus extensive market and customer feedback, we have been able to set a new standard in GPU architecture, particularly in the areas of power, bandwidth and efficiency -- the key metrics by which GPUs are now judged," Imagination Chief Executive Hossein Yassaie said. "We are confident that with the Rogue architecture we have a very clear technology advantage and an exceptional roadmap for the PowerVR Series6 family which our partners can depend on."

Imagination announced last June that the "Rogue" processors were being licensed to six partners. Apple was not named among those partners, but is a major shareholder in Imagination Technologies. AppleInsider first reported in 2008 that Apple had purchased a 3 percent stake in the company, and its share grew to 9.5 percent in 2009.

Though there haven't been any concrete indications about Apple's next-generation mobile processor, it's possible that a so-called "A6" chip, rumored to appear in Apple's third-generation iPad, could feature Imagination's Series6 GPUs. Apple is rumored to launch its next iPad in March, a year after the iPad 2 was introduced.

If true, holy cow! Twenty times more power ... Sheesh. This will make gaming pretty amazing.

Sometimes benchmarks like that mean the peak performance of all the registers etc. on the chip adds up to 20 times the predecessor's - which does not necessarily translate into 20 times the overall performance, especially when you consider bus interconnects and the rate at which bits can be pumped into and out of the unit. CPUs, for example, have a given number of registers available to process bits on every tick of the clock - but if the running threads don't have data that needs a given register on a given tick, that component goes unused for that tick.

In addition, there may be capabilities in the chip that only a subset of applications will use - anyone remember AltiVec?

Not trying to be a downer here, and I certainly don't know the technical details of either the current or the future chips - just pointing out that typical bench-marketing has a long history of taking the single data point that looks best in a presentation and promoting that, when in reality the truth is something more like twice the overall performance in a real-world setting, with some operations seeing no benefit and special cases with rewritten code getting the full advantage.

Also keep in mind that the weakest link in the chain often determines the user experience, and fully utilizing the new chip may require other components to be upgraded as well before the full benefit is realized.
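
The "maybe twice the overall performance" intuition above can be made concrete with Amdahl's law: if only part of a frame's work actually enjoys the headline speedup, the overall gain is far smaller. A minimal sketch (the fractions below are made-up illustrative numbers, not measurements of any real chip):

```python
def amdahl_speedup(accelerated_fraction, component_speedup):
    """Overall speedup when only part of the workload benefits (Amdahl's law)."""
    serial = 1.0 - accelerated_fraction
    return 1.0 / (serial + accelerated_fraction / component_speedup)

# Hypothetical frame where 60% of the time is shader-limited work that
# sees the full 20x, and 40% is bandwidth/CPU/driver overhead that doesn't.
print(amdahl_speedup(0.6, 20.0))  # ~2.33x overall
```

Even a generous 60/40 split turns a 20x component speedup into roughly the "twice overall" figure the comment suggests.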

It sounds like a GPU pissing contest. Is all that power actually necessary in a smartphone? I can understand if they're going for battery life, but 20X power seems to be going a bit overboard. I suppose I just can't grasp what that power is going to be used for, since it's mostly games that need to push lots of pixels around. Is it the HD display that requires all that GPU power?

On-the-fly augmented reality, matching 3D models against real-world objects for millimeter-scale precision in physical positioning.

Want to fix your car? Bring up the virtual mechanic, who will highlight where to look and for what.

Want to locate a name at The Wall (the Vietnam Veterans Memorial)? Point the phone camera at it and it will highlight the name.

Want to find a person in a crowd who wants to find you too? Their phone will help your phone do it, and it will be real-world visual, not bird's-eye-view pin positions.

Just think of the MMOG applications!

It takes a distinct lack of willingness to imagine to think that mobile GPU power isn't usable.

Of course, with claims like those you're better off waiting for benchmarks. Anyone would still be dumbfounded by a 20x performance increase.

I would have guessed that Apple would use an SGX543MP4+ (like the one in the Sony PlayStation Vita, along with a quad-core ARM Cortex-A9) and hold off on the SGX600 series for 2013 products.

Hey, if they plan to use the SGX600 series for 2012 iOS products, maybe there's hope that Apple will also incorporate dual-core ARM Cortex-A15s (instead of quad-core Cortex-A9s), which would be more power-efficient and better-performing.

And iMacs, and Power Macs, and MacBook Pros, and MacBook Airs, and several components (other than CPUs) in iPods, iPhones and iPads. And data centers, and IT services, and really, they're actually a research company.

Also, don't be surprised if, relatively soon, Intel starts fabbing and even co-designing the SoCs for iPods, iPhones, iPads and Apple TVs as well.

and we're starting to see some full game ports like GTA 3. Maybe we'll get a large back-catalog of PC games ported over to iOS. It should easily handle the full Modern Warfare titles although games will probably be downsized to prevent hefty download times or even split into parts.

Agree. Intel got fat, happy, and lazy over the past two decades. An entire generation of Intel employees has no idea what it means to actually compete. Intel crushed AMD by designing their compiler to create sub-optimal code that crippled anything other than genuine Intel CPUs. They won the "megahertz race" against PowerPC by ramping their clock speed while lengthening their instruction pipeline, with no effective performance improvement for end users.

And now all they're doing is milking the old x86 design for as long as they can get away with it. They're running into the wall with respect to Moore's Law. Sure, they might still be doubling transistor density every two years, as Moore's Law predicts. But the benefit to the end user from that doubling is negligible.

Oh wait. I said "all they're doing is milking the old x86 design." That's not true. They're also getting crushed in the GPU market, with no foreseeable end to that particular misery. It's getting so bad that they actually faked an Ivy Bridge graphics demo. On stage, at CES, with the tech media in full attendance. An Intel stooge pretended to play a game being displayed on a large screen behind him when, in fact, it was a pre-rendered video playback. The VLC media player control panel was the giveaway. He even admitted that the demo was being run "from backstage."

And one more thing: Intel is dead on arrival in the mobile market. The hottest market. The market that is being overrun by ARM-based devices. And no amount of fake demos or outright lies can hide that fact.

Intel Atom - Big in netbooks. But netbooks are a dying breed, a holdover from the 20th century.
Nvidia's Tegra - An inefficient, power-hungry ARM implementation. Used in quite a few failed Android pads.

If Intel doesn't up their game, they'll be left behind, along with Microsoft. Stuck in the past, when desktop computing was the freshest thing cracking. Trying to relive the glory days of the '80s and '90s and failing. Times change.

Intel makes chips for PCs and Macs.

Intel is positioning itself to compete directly with ARM. They will eventually make chips for Android devices, and hopefully for Apple.

Interesting ideas and no offence, but this seems like all pie-in-the-sky futurist stuff to me.

Sorry, I can't let this stand. The iPhone itself, with its insane screen, processing and graphics power, and connectivity, would have been considered "pie in the sky" vision stuff five years ago. Things are moving so fast, and developers are finding ways to use whatever can be offered to them. Hiro may not have the specifics (though they look interesting to me), but I have no doubt that power will be used - and fast...

Disagree. Technology is progressing just fine. The iPhone is not pie-in-the-sky technology.

I was in Japan. They have 75 mg LTE with 32 mg as the norm.

OK. I'm not sure what we disagree about, though.
Of course the iPhone isn't pie in the sky now - it's here. Five years ago, however, I'm sure rational people in the industry would have told you that the iPhone 4S's current specs would be a) impossible in a small form factor with decent battery life, or b) pointlessly over-specced (why would a phone need the screen resolution and graphics power of a laptop?).

I'm not sure what this has to do with how many "mg" the LTE network in Japan can deliver...

From Anandtech's story on this:
"Finally, as PowerVR is an IP vendor, there isn't any kind of timeline on availability as this is up to their customers. The only SoC announced to use Series 6 so far is ST-Ericsson's Nova A9600, which is not scheduled to arrive until sometime in 2013. Given the fabrication ramp-up schedule for most SoCs, any Series 6 equipped SoC is still a year out if not more; in the meantime there are still a number of ARM A15 + SGX543/544 scheduled for later this year. As for larger and more capable GPUs such as the SGX545 the release gap has been closer to 2 years, so DirectX 11.1 SoCs in particular are almost certainly 2014 products assuming PowerVR gets a DX11.1 GPU design out this year."

Well, the dual core version Apple uses now wipes the competition away, and that's including the graphics on the new quad core Tegra 3 SoC.

There's "lack of willingness to imagine" and there is "my imagination is working so hard I have no clue what's real anymore."

The extra graphics power touted here is basically all about games.
This other stuff won't arrive until iPhone version 15 or so.

Not just games. It works with OpenCL as well, which is very important for computing tasks. I use 3D CAD on my iPad 2 now, and a much improved SoC would be very welcome. The four times as many pixels in the new screen would consume a lot of that power just to stay even. That would leave five times as much power.

Apple does need a much faster memory path for this, as graphics performance is now limited by it. I would imagine that this will be taken care of.

But there's no such thing as enough computing power. Developers use up all the computing power available, and then cry for more. Right now, even the most advanced mobile games don't have the graphics sophistication of desktop, or even console, games, so several times as much graphics horsepower is needed to try to match them. As iPads are being considered as console substitutes, with AirPlay, they will have to match the competition. And now that there are two game controllers for the iPad out, things are falling into place. The last major area is more power.
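
The "four times as many pixels" point is simple arithmetic: a 2048x1536 panel has exactly four times the pixels of 1024x768, so fill rate and memory bandwidth must scale accordingly just to hold frame rate. A rough back-of-the-envelope check, counting only 32-bit RGBA framebuffer writes and ignoring overdraw and texture traffic:

```python
old_px = 1024 * 768    # iPad 1/2 resolution
new_px = 2048 * 1536   # rumored next-generation resolution
print(new_px / old_px)  # 4.0

# Bytes per second just to write one RGBA framebuffer at 60 fps
bytes_per_pixel, fps = 4, 60
print(new_px * bytes_per_pixel * fps / 1e9)  # ~0.755 GB/s, before overdraw
```

With typical overdraw and texture fetches multiplying that figure several times over, the comment's concern about the memory path looks well founded.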

If the power of the GPU can be harnessed for CPU calculations, the possibilities are endless. It can help process any intensive task the device needs to do. I'm thinking speech and visual recognition, augmented reality applications, live analytics, etc.

On the iPad it would be perfect; on the phone I don't think it is practical - maybe on a portable gaming device. As the iPod is losing ground to the iPhone, they might make that more of a gaming device. It would be an easy switch, as the platform is already developed and rocking. I think most users would prefer the phone to be just that rather than a gaming device. What use is a smartphone when the battery is drained by all the games one plays and there is no power socket to charge it up? lol..

All of that in a super-thin iPhone. I have already noticed a distinct increase in heat production from my iPhone 3GS -> 4 -> 4S. Yet I have not seen anything truly spectacular that the 4S's 2x CPU power / 7x GPU power helps with, other than a single game.

I would much rather see Apple work on seriously increasing battery life beyond what is currently considered acceptable. As it stands, Apple seems more interested in taking advantage of new battery breakthroughs just to make its devices thinner. How about keeping a device the same thickness and using a better battery to give us battery life that wouldn't require a charger at home, in the car, at work and in our bags!

Of course, all this power is irrelevant if Apple doesn't someday decide to increase the size of the iPhone's screen. And yes, I have actually seen people post here and on MacNN who seem to believe that Apple never needs to increase the screen size.

Man, another unbelievable trend: Nvidia and ATI, which now rule the roost that Voodoo and SGI once ruled.

With PowerVR back from obscurity and enjoying a second wind like Apple's, unless something goes badly wrong they are going to give Nvidia and ATI a big run for their money, and Intel will very soon be eating their dust.

I've said it over and over again: by iPad 4, with OpenGL ES 3 or whatever, running Unreal Engine 4 full-spec, it should be better than the PS3 and Xbox 360, except portable, quiet, and more versatile. Alongside an Apple HDTV, which will probably be 2K res specifically for iOS kinds of stuff...

I love Unreal Engine 3 (full-spec for console and PC, not iOS... yet). I think it is one of the most capable and beautiful gaming engines, alongside Valve's Source. What Epic has been able to cram into 5-year-old hardware, which really is ancient stuff, is impressive. OpenGL ES is alright, but it looks like DirectX 6. Time to blow it wide open by the time the iPad 4 rolls around - I'm talking god rays, better shadowing, 4x MSAA/CSAA/etc., really nice polygon and texturing capabilities, and so on. Essentially Core i7, DirectX 11, ATI 5870 2GB VRAM kind of graphics on a tablet. Should be here by 2014 if not earlier.

And the Deus Ex: Human Revolution menu - beautiful... Imagine apps running in 3D like that, with all the shadowing, textures, and 4x MSAA/CSAA.

And don't forget physics. The Xbox 360 and PS3 have quite basic physics. Some of the best stuff from Havok and PhysX, on iOS platforms, will be impressive when the platform supports it. A genuinely destructible (or constructible) environment - think those slow-mo-reverse film scenes - for gaming and other apps, physics applications and calculations. Wow. Mind-blowing stuff.

How ironic that a top-line PC can now do so much but is so horribly crippled by Windows, is hot and heavy, and is bugged by endless driver and hardware problems when trying to run the latest games and software. I switched from PC gaming to the Xbox 360 in September and have never been happier, though I get a little picky when I know what I see on the HDTV is what I had on my PC three or four years ago.

The only sure thing is that the next iPad will need as much CPU and particularly GPU power as possible if it's going to drive a 2048x1536 display.
I hope Apple doesn't go there, because I think we will see a performance hit, especially in 3D gaming.
We're talking about desktop resolutions being driven by mobile GPUs.

Who cares if there's a performance hit? I can't think of a BETTER way to use all that extra power than to have a Retina display. It seems to me the iPhone 4S/iPad 2 are barely being taxed as it is, and the next generation will have internals that are orders of magnitude more capable.

Also, these 'mobile GPUs' you speak of are much more efficient than desktop GPUs, and more powerful than the desktop GPUs of just a few years ago. Infinity Blade 2 on the ultra-thin iPad 2 still blows my mind. Imagine what the A6 paired with the next generation of mobile GPUs could do.

If your iPad starts feeling sluggish, I'm sure you will notice and also care.
Infinity Blade 2, while impressive, is basically smoke and mirrors. That's why it looks so much better than every other game - there are a lot of tricks going on in the background to make it look good. And sometimes there is a slight slowdown, and that is at 1024x768. Now imagine that with four times the resolution: suddenly the GPU will be bogged down. Things will be even worse for open-world games. I think the quality of new games will be at the same level or below this generation's, meaning no improvements in polygon counts, textures, etc. Which will be a bummer.
And while it's true that mobile GPUs are at least as powerful as desktop GPUs from a few years ago, remember that a few years ago we also weren't driving resolutions as high as what the new iPad might have.
I sure hope everything will run smoothly and we will have a similar or better experience with the new iPad, but I'm still a bit wary.

PS: What could be done is to force a program to render at a lower resolution (1024x768) than the rest of the OS. But I'm guessing this hasn't been built into iOS.
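
The lower-render-resolution idea in the PS is a standard trick: games often render the 3D scene at a reduced scale and let the display hardware upscale it. The payoff is quadratic, since fragment work scales with the square of the render scale. A quick sketch of the arithmetic:

```python
def shaded_pixels(native_px, render_scale):
    """Pixels actually shaded when rendering at a fraction of native resolution."""
    return int(native_px * render_scale ** 2)

native = 2048 * 1536
# Rendering at half resolution shades only a quarter of the pixels.
print(shaded_pixels(native, 0.5) / native)  # 0.25
```

So a hypothetical Retina iPad could render demanding games at 1024x768 and shade no more pixels than an iPad 2 does today, at the cost of some upscaling blur.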

Sometimes benchmarks like that mean the peak performance of all the registers etc. on the chip adds up to 20 times the predecessor's - which does not necessarily translate into 20 times the overall performance.

Some time ago they said it did 210 GFLOPS. That's 50% more than what the Nvidia 320M found in 2010 13" MBPs can do.

Any further doubts?

For anyone who thinks there will be a 20x increase in gpu power in the next 12 months, I have a bridge to sell you.

Possibly in twelve months. The first chips won't be this fast, but the second generation will be. Look at how much faster the current one is than the one the iPhone 4 and the iPad 1 used. My devices are much faster. It's very noticeable.

I do a lot of reading on my iPad 2. Despite what some people think, it's great for that. But there are times when the type isn't sharp. One area is magazines. Mostly they're quite readable without expanding the page size, but on some it's more difficult. That's true for the web as well. Quadrupling the number of pixels in a character will allow 6-point text to be as sharp as 12-point text.

My 4S is vastly sharper than my old 3G. The difference is astonishing. Things that were barely readable before are easy to read now. I expect the same thing to happen on an iPad.
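
The sharpness argument comes down to pixel density. A quick calculation for a 9.7-inch panel (the diagonal is an assumption carried over from the current iPad's spec):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch given a resolution and a diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1024, 768, 9.7)))   # ~132 ppi (iPad 1 / iPad 2)
print(round(ppi(2048, 1536, 9.7)))  # ~264 ppi (rumored doubled resolution)
```

Doubling each dimension doubles the linear density, which is why small type that is fuzzy at 132 ppi can render as crisply as larger type does today.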