That's what caught my eye. Sounds like the low performance chip will be running most of the time and they'll only fire up the quad core for really CPU intensive stuff. That could save energy, although possibly at the expense of performance when in 'low speed' mode.

I believe that the A5 and A6 have on-demand cores, meaning they only power up a second core when the workload demands it. That achieves much the same thing without requiring a separate chip. Time will tell which approach is better.
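The "on-demand" behavior described above can be sketched as a trivial policy: keep one core up, and wake the others only when there are enough runnable tasks to use them. A toy illustration (the function name and numbers are invented for this sketch, not Apple's actual logic):

```python
def cores_needed(runnable_tasks, max_cores=2):
    """Toy on-demand core policy: one core per runnable task,
    clamped to [1, max_cores]. Purely illustrative, not any
    vendor's real scheduler."""
    return max(1, min(runnable_tasks, max_cores))

# A mostly idle system stays on one core; bursts wake the second.
print([cores_needed(n) for n in [0, 1, 2, 5]])  # [1, 1, 2, 2]
```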

The problem for Nvidia is that the Tegra 2 didn't even really challenge the A5...it was pretty much dead on arrival in terms of graphics performance, which is supposed to be Nvidia's strong suit. They say that Kal-El is twice as fast as the A5 at tasks like video encoding. Considering they are benchmarking 4 cores against 2 with a test that is easy to split among cores, I would certainly hope so. The real test will be whether they can advance the field in single-core performance and especially in graphics performance. Just equaling or nominally improving on the A5 won't cut it...you know Apple's A6 will blow that away. When the benchmarks are finally published, it will be interesting to see what Kal-El is really capable of.

Yes, but history suggests that skepticism is appropriate. They made all sorts of performance claims in the past that weren't true, so it makes sense to take their current claims with a grain of salt.

Quote:

Originally Posted by Sevenfeet

And to answer your question, Apple has invested billions in their own processor designs (acquisitions, research, paying for fabs). It's unlikely they would just dump it now and go to Nvidia...not when they have been the performance/power leader for the last few years.

But IF Nvidia were able to offer a clear performance advantage, there would be nothing stopping Apple from switching.

"I'm way over my head when it comes to technical issues like this"Gatorguy 5/31/13

It does. The fifth core is a very low power ARM7 core that is used for low performance tasks (think of it as a processor for the original iPod), and when nothing heavy is running the chip will completely power down the whole Cortex-A9 core complex. Quite a clever idea.

Nota bene: the ARM7 core doesn't use the high-performance ARMv7 ISA that the Cortex-A8/A9/A15 use; it uses the ARMv4 ISA, which is much older and less performant. For code to run on ARM7 cores, it needs to be targeted at that particular architecture (or at Tegra 3, which includes it). Deploying on Tegra 3 must take this into account: handset manufacturers using it will probably do some optimizations to the operating system, but app developers almost certainly won't.
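For what it's worth, on Linux (which Android sits on) you can see which cores are currently online via sysfs: /sys/devices/system/cpu/online holds a range string like "0" or "0-3". Here's a small sketch that parses that format; the sysfs path is real, but the helper names are mine:

```python
def parse_cpu_list(s):
    """Parse a sysfs CPU list like '0-3,5' into a sorted list of ints."""
    cpus = []
    for part in s.strip().split(","):
        if not part:
            continue
        if "-" in part:
            lo, hi = part.split("-")
            cpus.extend(range(int(lo), int(hi) + 1))
        else:
            cpus.append(int(part))
    return sorted(cpus)

def online_cpus(path="/sys/devices/system/cpu/online"):
    """Read the set of online CPUs from sysfs (Linux only)."""
    with open(path) as f:
        return parse_cpu_list(f.read())

# Example: on a quad-core with one core hot-unplugged you might see "0,2-3"
print(parse_cpu_list("0,2-3"))  # [0, 2, 3]
```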

Even if it's a clever idea, in the real world I can't see many Android devices ever going into a mode where the main CPUs are powered down and only the companion core is running.

I'm delighted to see people pushing the ARM forward. I think Apple is pragmatic enough that if nVidia does come up with a really terrific design that blows away Apple's next candidate processor, they could just switch to the nVidia chip. They're all just ARMs, right? From a developer standpoint, nothing would need to change. We wouldn't even need to recompile.

Can someone with knowledge on the matter answer these questions: are these things pin compatible? Do they all have identical instruction sets, or does nVidia or Apple get to add custom instructions without breaking their licenses?

ARM is the overriding SoC architecture, which means that physically they are the same or similar. For example, Samsung used Intrinsity to redesign the logic for the Hummingbird chipset, which was based on the 45nm ARM Cortex-A8 using the ARMv7 instruction set; that customization allowed for a 5-10% reduction in instruction set management overhead. Now that Apple owns Intrinsity, we have to wait and see how they handle the existing licensing to Samsung, or whether Samsung now has to go out and find another customizer. It also depends on which GPU they choose to build into the architecture.

TI's OMAP processors use the 45nm Cortex-A8, but add the IVA 2 imaging accelerator, which supports hardware encoding of camera sensor data, paired with an integrated signal processor that handles image and video capture and improves battery life. They have also leveraged the PowerVR SGX530 GPU (the same one used in the iPhone 3GS), and they could also leverage ARM NEON for multimedia. They should be bringing out a new series based on dual 1GHz Cortex-A9 cores, but rumors are also out there that they are moving on to a quad-core architecture.

The "problem" is development in this space is pretty wild right now with ARM and Atom providing the architecture, but with a high degree of customizing going on by Samsung, Qualcomm, Apple and TI, as well as Intel on Atom.

nVidia will try to push as much graphics support as they can into their ARM-based chipsets to leverage their expertise there, while the Apple acquisition of Intrinsity will cordon off that innovation space for future Apple ARM-based chipsets, leaving Qualcomm and Samsung to figure out their own approaches.

I've never understood the obsession with calling pre-announced products "vapor". It screams of people with their heads in the sand, or of a child with its eyes shut and fingers in its ears, screaming to try to ignore what is happening.

I have never understood why people hang around forums trolling…
Really dude, get a life.

Anyway, will Intel be able to drive competition for this level of development with the Atom series?

No; still too much power use, too expensive, and slow due to legacy issues.

If Intel wants to make Atom mobile, they need to start doing a lot more R&D (or at least make it more effective) than they do now, AND they need to throw out everything not needed for Android and possibly Windows mobile.

PC means personal computer.

I have processing issues, mostly trying to get my ideas into speech and text.

Yes, the megahertz war that eventually compelled Apple to abandon 20 years of Motorola architecture to move to Intel.

It wasn't the megahertz war per se, but the fact that the PowerPC producers (Motorola/IBM) were in the business primarily to supply game platforms, whose volumes were an order of magnitude higher than the number of Macs being sold. This meant that, try as Apple might, they couldn't leverage higher speeds/better performance from them. It was that which drove Apple to Intel, because Intel was willing to work with Apple to deliver the kind of performance that Apple was seeking.

The evidence of this was produced recently in the "ultrabook" standard Intel offered to the other PC makers to provide a competitive platform against the now-popular MacBook Air. It was Apple driving the performance for Intel, not Dell or any other PC maker.

If you are going to insist on being an ass, at least demonstrate the intelligence to be a smart one.

Why does every product that comes out have to be framed as a challenge to Apple? The headline writers on this site are automatons.

Because Apple is king. You don't compare to the also-rans.

For instance, you'll be hard pressed to find many articles describing the iPhone 4S as a Samsung Galaxy S II-killer. I found one on DroidDen.com that clearly thinks the 4S is so bad that it loses to the Galaxy S II with a score of 1 to 8, only giving the 4S a point for tying on the camera specs.

Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"

Give me custom case and motherboard fabrication and I could design a strong competitor for that segment using stock components for everything else.

This isn't to say that the MBA isn't a fine machine. On the contrary, it's one of the best computers available in the world today. But it's not magic, it's a computer.

The fixation on aluminum cases is a bigger hurdle than the CPU, and those vendors would do a better service to their stockholders to stop whining about the CPU and focus on that much more real issue.

But aluminum is not the only way to make a case. For the segment that wants the MBA, they'll buy the MBA. The opportunity is in serving that segment through a different enclosure.

By not recognizing the real opportunity with enclosures, and by allowing themselves to be needlessly distracted by the fantasy that they need a different CPU, the designers in these companies exhibit a level of lameness that should rightfully shame them out of showing up to work in the morning. I have no idea why CEOs put up with such idiocy, failing to recognize how it makes them look idiotically impotent.

I see. So we're supposed to believe that you know more about computer design than HP, Dell, Intel, Acer, Asus, Toshiba, Sony, and Lenovo put together?

Wow. How do you fit all that knowledge in your head?

"I'm way over my head when it comes to technical issues like this"Gatorguy 5/31/13

Well, both Tegra 3 and the PS Vita use quad-core Cortex-A9 CPUs. If Apple can leapfrog them to a quad-core Cortex-A15 CPU/GPU on a 28nm process, they will be king for another year, with better hardware/software integration and maybe the ability to clock their CPU/GPU slower than the competition while maintaining their lead in battery life.

Both mobile phones and mobile gaming platforms will lose badly to the iPhone 5 and iPad 3...

It seems we're more or less on the same page. There are many reasons why Motorola couldn't keep up with Intel's advances, but ultimately it was performance that compelled Apple to switch, as you note.

Not being able to reach the promised 3.0 GHz even a year after the promised date had slid past. Not being able to get the G5 processors cool enough to go into notebooks. Apple was stuck with no way to move forward. When another company's stagnation is holding you back (which is ironically what is happening now with the Mac Pro line due to the series of delays with the Xeon processors), you have to look for other solutions. Intel had a better roadmap, and since OS X had been built for Intel as well since day one, it was a feasible switch.

That's pretty accurate... lol, Android is one choppy OS. When my finger swipes across the screen, the icons lag behind it, and the lag grows as the swipe continues. My finger got to the edge of the screen, but the icons my finger was under when I first touched the screen were only halfway across in the animation. They still haven't fixed this issue after all this time? The iPad and iPhone just feel natural to work with.

Rubbish. If your phone is doing that, then take it back for a replacement. I have a slow Android phone and it doesn't lag on any of the screen swipes.

That was every Android tablet on display at a Best Buy I stopped in. Not just one tablet. They had 7 different tablets and they all had the same problem. Android is choppy. It's like they just don't do the finishing touches that make it a polished product.

I dare you to go do the same experiment. Take any Android tablet currently on the market, and swipe your finger to move from page to page and then do it to an iPad. There is a huge difference in the level of polish. It's like Google just didn't care.

In this one specific case, I'd even go so far as to suggest you do too.

Making a slender computer isn't magic, and with good fabrication it's not even a technology issue. What we have here is simply a management issue, and the fact that all of those manufacturers are claiming they need a new CPU to compete with the MBA which uses an existing CPU rather speaks for itself.

In all fairness, it may be that they actually do understand that it's nothing more than a distraction to say that an MBA needs a CPU other than the one the MBA uses, and that they're just not willing to publicly admit that they don't have a handle on fabrication.

But whether out of ignorance or mere ruse, this whingeing about CPUs is an embarrassment for them.

Fine. Put your money where your mouth is. The Ultrabook market is a multibillion dollar market. Go ahead and make one and release it to the market.

Or, why not offer your super-advanced expertise to one of the existing vendors for, say, $10,000,000?

Please stop making yourself look foolish. It really bugs me when the idiot trolls think that their ability to type a sentence on a forum like this instantly makes them an expert.

"I'm way over my head when it comes to technical issues like this"Gatorguy 5/31/13

Yes, BUT it really depends on how much multitasking you actually have going on! If the system keeps bouncing between these cores (power -> low power -> power -> low power), I guess it really makes no sense, but if you only keep one application running then it would make sense.... But that's not what fandroids want to hear....

I guess this is why Apple never did anything to get SpeedStep working on their chips (at least in production operating systems). So no SpeedStep even when idling....
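The "bouncing between cores" worry is exactly why real frequency/cluster governors apply hysteresis: jump to the fast cores as soon as load spikes, but drop back only after load has stayed low for several samples. A toy sketch (the class name and thresholds are invented for illustration, not Nvidia's actual governor):

```python
class ClusterGovernor:
    """Toy big-cluster/companion-core switcher with hysteresis.

    Switches up immediately on high load, but switches down only
    after `patience` consecutive low-load samples, to avoid
    ping-ponging between power states. Thresholds are invented.
    """
    def __init__(self, up=0.75, down=0.25, patience=3):
        self.up, self.down, self.patience = up, down, patience
        self.on_big = False     # start on the companion core
        self.low_streak = 0     # consecutive low-load samples

    def sample(self, load):
        if load >= self.up:
            self.on_big = True
            self.low_streak = 0
        elif load <= self.down:
            self.low_streak += 1
            if self.on_big and self.low_streak >= self.patience:
                self.on_big = False
        else:
            self.low_streak = 0  # middling load: stay put
        return self.on_big

gov = ClusterGovernor()
trace = [0.9, 0.1, 0.1, 0.9, 0.1, 0.1, 0.1]
# Brief dips don't drop to the companion core; a sustained lull does.
print([gov.sample(x) for x in trace])  # [True, True, True, True, True, True, False]
```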

Quad core Tegra plus one companion (low speed) chip.

Nvidia is not king in the embedded space next to ImgTec. They don't stack up.

As I said, give me custom case and motherboard fabrication and I could design a strong competitor for that segment using stock components for everything else.

Yes, the offer stands. Find any manufacturer willing to risk an afternoon meeting, and we'll see what we can do.

It's heartwarming that you're defending the companies you normally vilify, and I guess I should be grateful that you're merely calling names here rather than threatening physical violence as you've been known to do, so kindly allow me to return the favor:

Could you please explain how a MacBook Air isn't possible without a CPU other than the one it uses?

Hint: the MBA already exists, and it pretty much rocks, stock CPU and all.

I never said that a new CPU was needed. I simply pointed out that your assertion that you know more than the rest of the computer industry is absurd.

And I would appreciate it if you would stop spreading blatant lies about me. I never threatened physical violence to anyone. Here or anywhere else.

"I'm way over my head when it comes to technical issues like this"Gatorguy 5/31/13

My contention was that it's self-evident that making a computer that competes with the MBA does not require a different CPU than the one the MBA uses. That's the claim by those companies, and we both agree it's silly.

You can continue arguing about that if you like, and no doubt you will, but it won't make any more sense no matter how much you type.

If you had said simply that it didn't require a new CPU, you would have been correct and no one would have disagreed with you. But you said you could do a better job than Dell, HP, Acer, Asus, Intel, Lenovo, and everyone else. That is, obviously, BS.

Nvidia's only chance here is if the GPU portion (their own creation) is demonstrably better than the competition's. Considering that Imagination is mopping the floor with the competition at the embedded level with PowerVR, I rather doubt Nvidia can really make a beachhead here.

Nvidia is simply using a stock MP A9 core. EVERYONE is going to have products with this same core, and larger-volume customers with in-house design teams, like Apple, will be able to tweak the cores to good effect.

As for the ARM7 chip on the side: bah. ARM has already announced that the A15 will run in a big.LITTLE configuration, with an A7 processor mated to an A15. Vendors won't even have to alter their software significantly.

I can't get all excited about knowing a quad-core A9 is coming when the A15 is so much better, being a deeper-pipelined, out-of-order design.

I think the A6 = MP A9. A7 = A15/A7 combo with Rogue graphics.

He's a mod so he has a few extra vBulletin privileges. That doesn't mean he should stop posting or should start acting like Digital Jesus. - SolipsismX

As a current owner of the Asus Transformer, the Prime is high on my tech shopping list. I really enjoy using it, and I'm not sure why all the bad comments about Android; it might not be as polished as iOS, but I've started using it a lot more than my iPad. No need for iTunes to install a program, codec support is awesome, Flash, better multitasking, and all those cool custom ROMs make my Transformer a pretty neat little multimedia machine. Of course the iPad 3 will also be high on my shopping list. Gosh, I love tablets.

When I looked up "Ninjas" in Thesaurus.com, it said "Ninja's can't be found" Well played Ninjas, well played.

Quote:

As a current owner of the Asus Transformer, the Prime is high on my tech shopping list. I really enjoy using it, and I'm not sure why all the bad comments about Android; it might not be as polished as iOS, but I've started using it a lot more than my iPad. No need for iTunes to install a program, codec support is awesome, Flash, better multitasking, and all those cool custom ROMs make my Transformer a pretty neat little multimedia machine. Of course the iPad 3 will also be high on my shopping list. Gosh, I love tablets.

If you mean desktop iTunes, I rarely use it to install programs; I download everything over the air.

Codec support - I use Azul and it plays everything I need it to

Flash - Dead

Multitasking - does what I need it to do.

I'm actually thankful for Android because it's delivering to all of us better and more competitive products regardless of platform.
