Posted by Unknown Lamer on Tuesday February 28, 2012 @06:10AM
from the four-cores-good-two-cores-better dept.

MrSeb writes with this news from Extreme Tech: "In a move that will shock and disgust bleeding-edge technophiles everywhere, Asus has announced at Mobile World Congress 2012 that its new Transformer Pads — the high-end Infinity Series — will use the recently-announced dual-core Qualcomm S4 SoC. The critically acclaimed Transformer Prime, the Infinity Series' predecessor which was released at the end of 2011, used the quad-core Nvidia Tegra 3. Why the sudden about-face? Well, the fact that quad-core processors don't really have a use case in mobile devices is one reason — but it doesn't hurt that the Krait cores in the S4 are significantly faster than the four Cortex-A9 cores in the Tegra 3, too. The S4 is also the first 28nm SoC, while Tegra 3 is still on 40nm, which means a smaller and cheaper package, and lower power consumption to boot. The S4 is also the first SoC with built-in LTE, which was probably a rather nice sweetener for Asus."
The Snapdragon S4 "Krait" CPU is still somewhat shrouded in mystery as far as hard specs go (Qualcomm has never been one to release docs), but it appears to be similar to the Cortex-A15 in performance; how it stands up to Intel's new Medfield designs remains to be seen.

In other words: after carefully considering all their options, they went with the one that offered the best overall package, whilst keeping the price point competitive?
Not nerd willy-waving, then?
Jolly good.

"Nvidia's processor won't be compatible with LTE radio chipsets for at least a few months and, with the One X due to launch stateside within 60 days, AT&T wants a version of the phone that supports 4G LTE."

Now, the above URL is about a quad-core Android phone planned for US release only as a dual-core, but the same datum likely applies - the

The problem is, the Tegra does not have an LTE radio/modem on the SoC. This means it's a separate chip, and the current chips do not play nicely with the Tegra. Qualcomm has pulled off a major upset with their new design, as the Snapdragon is faster, uses less power, and includes the needed LTE radio on the die. The only area where the Qualcomm offering loses badly is graphics - the Tegra beats the hell out of it there - but that's the only thing it wins.

Somebody please help me understand this - who cares what's under the hood? Quad-core, dual-core, snappy dragon, spanky man, whatever. Why not judge the phones by performance? I feel like we're stuck in the mindset of the '90s, where my P3-600MHz is bigger than your P2-400MHz.

We certainly need to start using some universal processor benchmark score. Even though it wouldn't always produce completely comparable results, it would be much more useful than MHz (which on x86 desktops at this point tells almost nothing) or throwing around these "snapdragons".
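The "universal benchmark score" idea is easy to sketch: time a fixed, deterministic workload and report work done per second, so bigger is better regardless of clock speed or core count. A minimal, hypothetical sketch in Python (the names and workload are mine, not any real benchmark suite):

```python
import time

def benchmark_score(workload, repeats=5):
    """Time a fixed workload several times and report the best run.

    Returns a 'score' in operations per second, so bigger is better and
    results are comparable across machines running the same code.
    """
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        ops = workload()
        best = min(best, time.perf_counter() - start)
    return ops / best

def fixed_workload(n=100_000):
    # A deterministic integer workload: sum of squares.
    total = 0
    for i in range(n):
        total += i * i
    return n  # number of "operations" performed

score = benchmark_score(fixed_workload)
print(f"score: {score:.0f} ops/s")
```

Real mobile benchmarks are far more involved (thermal throttling, CPU governors, and memory effects all skew results); this only illustrates the principle of a single comparable number.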

Why is everyone obsessed with the number of cores? The more processors you have, the more complex the scheduling your apps need to perform to actually work faster. It's better to have a single core that is twice as fast than two cores running in parallel.

Pfff... actually, the Tegra 3 has five cores: four of them are high-performance, and one is high-efficiency. The CPU is designed to shut down the four performance cores for nearly everything, and just use the high-efficiency core in order to save battery life.

So seriously, most of the time, the number of cores doesn't even matter, because unless you're playing a high-end game, the cores won't even be woken up.

So, unless I was buying a tablet specifically to play high end games on, why would I want to spend money on CPU cores that are going to sit there doing nothing? Surely a dual core CPU is a better move?

For 99.9% of the games, I'd say yes, if the dual-core represents better overall performance and compatibility with other things (like LTE) out of the box. The tablet is not made for the 0.1% that only wants to play the one game that makes use of the 4 cores... It's made for those that use these tablets for what they seem to be designed for (with a dock and everything...): light workstations and media centers (that's what gives them the 18-hour batteries).

I, for one, applaud this move. Core hype will get you nowhere in the long run.

Okay, you made a whole hubbub about choice, but still haven't explained why quad-core would be a good choice.
I think the debate is over whether or not quad-core is worth it, not if there should be a right to choose.

The whole "core" obsession on mobile devices seems to be nothing but marketing talk. At least, as far as I have been able to determine.

I have a Droid 3 which has a dual-core CPU and using System Tuner I found that the second core was always shown as "offline". Doing some research online I found that the second core is kept offline to preserve battery life. Supposedly, it only comes online if the load is particularly high.

But, no matter what I did on the phone, I could never get the second core to come online. Using one of the tweaks available in System Tuner, I can apparently force both cores to be online all the time. However, the second core is still shown as offline and I still can't seem to get it to come online via high usage. Also, battery life doesn't seem to have changed.
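For what it's worth, on Linux-based systems like Android the kernel reports hot-plugged cores through sysfs: `/sys/devices/system/cpu/online` holds a list like `0` or `0-1`, and `/sys/devices/system/cpu/cpu1/online` reads `0` or `1`. A small sketch of a parser for that list format, which you could cross-check against what System Tuner claims (assuming a shell or adb to actually read the files):

```python
def parse_cpu_list(text):
    """Parse the kernel's CPU list format (e.g. '0-3' or '0,2-3'),
    as found in /sys/devices/system/cpu/online, into a sorted list
    of CPU ids."""
    cpus = set()
    for part in text.strip().split(","):
        if not part:
            continue
        if "-" in part:
            lo, hi = part.split("-")
            cpus.update(range(int(lo), int(hi) + 1))
        else:
            cpus.add(int(part))
    return sorted(cpus)

# On a Linux/Android shell you could check the second core directly:
#   cat /sys/devices/system/cpu/cpu1/online   -> 1 if online, 0 if offline
print(parse_cpu_list("0,2-3"))  # -> [0, 2, 3]
```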

So, this wonderful second core seems to be entirely useless and nothing but an item for the marketing checklist on the advertisements.

Or... System Tuner is reporting the status of the second core incorrectly? Surely it's wrong somewhere: either it's incorrectly reporting the second core as offline, or its force option isn't working.

Not sure what you meant by "high load", but I guess that pushing a single thread very hard won't cut it. It would need to be multi-threaded (or multi-process, but since it's Java I don't really see it) high load.

The average user doesn't have the slightest idea how threading works, nor why having more cores might be overkill. To them, it's just yet another number that must be increased. They look to us geeks, with our multi-core and multi-socket systems, and figure that's where they want to be once the prices come down. They're like kids emulating adults, and just as stubborn when I try to explain that the average human does NOT need a 12-core workstation with 48 GB of RAM. It's hard enough convincing them that a gigabit router won't make their DSL go faster than a 10/100 one, and they completely lose it when they find out I use 10G fibre NICs.

This is what I tried to explain to my not-so-technical friend, who would ask me if the 4-core tablet was better than the 2-core one, and then ignore anything I said. It's a tablet; you don't multitask much on it. You're not running 50 torrents in the background while your virus scanner eats a whole core protecting you from yourself, and trying to play a YouTube 1080p walkthrough on your second display while you follow along in Starcraft II on the main screen. It's a fuckin' tablet. One app at a time. If that app is smart enough to offload background tasks to a 2nd core, I'm already impressed. It's a very different computing experience from a desktop PC, and even there, most people get by just fine with a dual-core desktop. The mere fact that almost every computing device today has a dedicated GPU means there's an extra "core" right there, in that it frees up the main CPU to do something else.

The average user isn't the only one who doesn't have the slightest idea of what hardware he really needs to get the job done. If "us geeks" knew better, then any synthetic benchmark would be automatically dismissed as irrelevant and useless, and the most important property of a computing rig would be its cost/performance ratio - with cost reflecting not only the hardware price, direct and indirect, but also operational cost. After all, it's irrelevant if a certain game runs at 100fps or 10000fps,

As the old saying goes "Don't throw out the baby with the bath water".

Until now, the ULPC category was never fast enough to do much other than Angry Birds, polling for mail in the background, and having a few web pages open. Now, with more power, you have the ability to output via HDMI (mini) and have multiple apps running on a simple dual-core. Where will quad+ cores come into play? Partly with all the pixels getting crammed into these small displays. Most of the 10" tablets are going to be coming stock

My internet connection is faster than a 10/100 switch, so it really did get faster when I put a gigabit switch into one of the branches where a couple of machines were stuck far from the router, with only a single cable run to share, because I hadn't had a gigabit switch handy before.

And then someone will come out with the ultimate answer to more cores: "New, Coreless Computing(TM) - no need to wonder how many cores is right for you! With our new Coreless Computing technology, you can beat all those pathetic multicore junkies!" - picture of pathetic nerd looking glumly at his now-obsolete shiny.

Of course, I have no idea what 'Coreless Computing' might be - maybe processor-per-cell memory? Neural network processors?

Why is everyone obsessed with the number of cores? The more processors you have, the more complex the scheduling your apps need to perform to actually work faster. It's better to have a single core that is twice as fast than two cores running in parallel.

Pfff... actually, the Tegra 3 has five cores: four of them are high-performance, and one is high-efficiency. The CPU is designed to shut down the four performance cores for nearly everything, and just use the high-efficiency core in order to save battery life.

Introducing the 16 core processor.

The first core walks by the process, assessing its potential time consumption.
The second core types this out into a report and forwards it to the other cores.
The third core skims the report before filing and ignoring it.
The fourth core empties the inbox of the third core, failing to note the process.
The fifth core is focusing on its career and promotion through middle management.
The sixth core notices that there is a process and tries to point this out to the third, fourth and fifth cores.
The seventh core is having a nervous breakdown.
The eighth cor0xDEADBEEF.
The ninth core is dealing with the problems from the malfunctioning eighth core.
The tenth core distracts the process by acting as a door-to-door salesman.
The eleventh and twelfth cores hold the process down whilst the thirteenth core goes through its wallet.
The fourteenth core takes the process's statement.
The fifteenth core actually runs the process.
And the sixteenth core is just along for the ride.

More cores means better multitasking since threads can run in parallel. Also, even for handheld devices you are unlikely to find, for example, a single-core CPU that is four times faster than each core of a quad-core CPU.

Another major advantage of multi-core systems is if a poorly written piece of software locks up it is highly likely to also be single-threaded and your system will chug along nicely despite the misbehaving program, allowing you to kill the process (by comparison, on a single-core system you're likely to suffer through five minutes of waiting for the system to respond before you are able to kill the process). Sure, in an ideal world this wouldn't happen but when it does happen it's nice to not be locked out of your system because of a single process misbehaving.

More cores means better multitasking since threads can run in parallel.

In my experience, many of the cases where people thought they needed to use threads were really cases of doin' it wrong. When you have properly designed asynchronous APIs, inter-process communication, a decent kernel scheduler, and a GPU to offload graphics to, you can do a non-blocking UI even on a single CPU, and the difference between single and multiple cores only manifests in performance improvements, often fairly marginal, and in hitting different bugs (which good software should not have, of course).
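The point about a non-blocking UI on a single CPU can be sketched with an event loop: one thread interleaves "UI" work with a pending background operation by yielding at await points. A toy example (the names are mine; a real UI toolkit's event loop works on the same principle):

```python
import asyncio

async def background_fetch():
    # Stands in for network I/O: yields control while "waiting".
    await asyncio.sleep(0.05)
    return "data"

async def ui_loop(events):
    # The "UI" keeps responding while the fetch is in flight,
    # all on one thread - no locks, no extra cores.
    fetch = asyncio.create_task(background_fetch())
    while not fetch.done():
        events.append("ui tick")
        await asyncio.sleep(0.01)
    events.append("got " + fetch.result())
    return events

events = asyncio.run(ui_loop([]))
print(events)
```

The UI "ticks" several times before the result arrives, even though only one thread ever runs.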

Also, even for handheld devices you are unlikely to find, for example, a single-core CPU that is four times faster than each core of a quad-core CPU.

What are you talking about? Two years ago phones ran like absolute pieces of crap. Even one year ago, for the most part. There was constant stuttering, things weren't that smooth, etc. While Apple's UI was smooth, actually running things at first? Not so much. Funny how people can't remember two years ago well.

Only since December or thereabouts have phones actually been getting powerful enough to handle everything smoothly.

Four processors on a smartphone is fairly trivial in cost, so no it doesn't

What are you talking about? Two years ago phones ran like absolute pieces of crap. Even one year ago, for the most part. There was constant stuttering, things weren't that smooth, etc.

Sounds like a typical Android UI experience:) I suspect that part of the drive for multi-core devices is to compensate for poorly designed software that couldn't respond to the user and wait for something else in the same thread.

Only since december or thereabouts have phones actually been getting towards powerful enough to handle everything smoothly.

I have a Nokia Lumia 800, that was released in November and was right then panned as hopelessly inadequate in the Self-Evidently-Important Core Count Smackdown charts. As it happens, I fail to find a case where I would want it to work much faster. Few applications take a little tim

I've been searching for someone to answer a simple question: what are the tasks at which you feel your high-end smartphone should be faster, that are not attributable to things like network roundtrips where a faster proc is irrelevant? "Four times faster than good enough" just does not equal "four times as good" to me, especially if it comes with a higher price.

Gaming at sufficient graphics quality to make full HD gaming possible when I hook up to larger displays. I want PS3 level graphics at the least, and that's coming via a) lots of GPU CUDA cores and b) lots of CPU helping out as with known hardware much of the effects rendering (physics etc) can be pushed back on the CPU cores. It's not like consoles don't use them, and that kind of customisation is the whole benefit of having known SoCs used widely.

So you don't want to buy a console and tap its mature and focused market of games; instead you expect game vendors to come forward to meet your use case and not just provide shrunken-down phone gameplay for mobile use. How many of them have already done so? You plug in your phone whenever you need to play, and carry the extra hardware in your phone, putting up with some overhead in size, battery life, etc., even if you don't use it most of the time?

Uh, presumably they'd be doing that because tablets are at the same resolution. And yes, thanks to Moore's law phones are approaching PS3 level graphics. Does this surprise you? Even on a 4-5 inch display that makes a difference, so it becomes acceptable for both.
Tegra 3 is pretty much PS3-level visuals, and fits in phones of this size. [androidcommunity.com] That's small enough, light enough, and it can do the job. Given that it drops down to one lower-powered core for the majority of the time and has good battery life, w

I can halfway understand the case for tablets, but are you saying that the same game UI can be played equally well on a big TV screen and on a phone, that furthermore has little in the way of input usable for typical 3D games? I guess I'm just not into gaming that much, and certainly not to a compulsory degree that would make me want to play big screen games on a phone.

Personally I want my phone to be successfully streaming media from the 'net, playing it through my bluetooth headset,

Unless HD video is involved, all this should not take a large share of scheduling slots on a reasonably good smartphone CPU. It worked on Nokia N800 with plenty of cycles to spare.

waiting for inbound calls

This is what the modem unit is supposed to busy itself with.

Given that web page rendering can max out a single core all by itself, how were you planning to avoid stuttering audio, interrupted audio feeds, or missed calls?

By boosting the audio service process' priority, the way it's done in all modern operating systems? Note that it does not take much processing time to feed the audio sink with buffers, it just has to be done on time.
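The "not much processing, just on time" point is easy to quantify: the refill deadline is simply buffer length divided by sample rate. A quick back-of-envelope sketch (the 1024-frame / 44.1 kHz numbers are illustrative, not from any particular phone):

```python
def buffer_deadline_ms(frames, sample_rate_hz):
    """How often the audio service must refill a buffer of `frames`
    samples to avoid an underrun, in milliseconds."""
    return 1000.0 * frames / sample_rate_hz

# A typical 1024-frame buffer at 44.1 kHz:
deadline = buffer_deadline_ms(1024, 44100)
print(f"refill every {deadline:.1f} ms")  # ~23.2 ms
```

Refilling a buffer is microseconds of work; the whole game is making sure a high-priority thread gets scheduled within that ~23 ms window, which any preemptive scheduler can do on one core.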

An incoming call should suspend all media playback and put the active application to the background, naturally. If your phone software does not do it, you got bigger problems than a lack of cores.

Sure, it _can_ be done - but it can be done better and quicker with multiple hardware threads.

I'd put it as "it can be done with more slack, and the warts will still show up at times". Certain popular devices seem to follow this design philosophy, indeed.

More cores means better multitasking since threads can run in parallel. Also, even for handheld devices you are unlikely to find, for example, a single-core CPU that is four times faster than each core of a quad-core CPU.

According to Amdahl you're looking at 1 / [( 1 - P ) + (P/N)] where N = number of processors and P is the percentage of a program that could run parallel. So if 75% of a program can be made to run in parallel on a quad-core processor we are looking at 1/[(1-0.75)+(0.75/4)] = 2.29, so we are looking at a maximum speed increase of 2.29 times the speed of a single processor not 4 times.
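The arithmetic above drops straight into code; a one-liner makes it easy to play with other values of P and N:

```python
def amdahl_speedup(p, n):
    """Maximum speedup from Amdahl's law: 1 / [(1 - P) + P/N],
    for a fraction `p` of parallelizable work on `n` cores."""
    return 1.0 / ((1.0 - p) + p / n)

# The 75%-parallel program from the comment, on four cores:
print(round(amdahl_speedup(0.75, 4), 2))  # -> 2.29
```

Note how quickly it saturates: even with infinitely many cores, a 75%-parallel program never exceeds 4x.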

Another major advantage of multi-core systems is if a poorly written piece of software locks up it is highly likely to also be single-threaded and your system will chug along nicely despite the misbehaving program, allowing you to kill the process (by comparison, on a single-core system you're likely to suffer through five minutes of waiting for the system to respond before you are able to kill the process).

I haven't seen this. In fairly modern operating systems you'll have multiple services in operation. This means you'll most likely have more threads in execution than there are cores, and context switches between threads within a core of a multi-core processor will still need to be made. I've had a misbehaving program slow down an embedded multi-core processor board because we were "unlucky" that the OS scheduler was running on one of the same cores, and other resources on the processor board were being committed by the errant process (e.g. memory, I/O ports, etc.), so the system is not as foolproof as you'd like, since memory takes time to swap and deadlocks across cores can happen when a computing resource is shared.

Sure, in an ideal world this wouldn't happen but when it does happen it's nice to not be locked out of your system because of a single process misbehaving.

It really is a speed versus power issue. In an embedded environment, where one would hope that the system was well tested prior to being released to the public, such a safety net is really not required.

...so the system is not as foolproof as you'd like, since memory takes time to swap and deadlocks across cores can happen when a computing resource is shared.

I didn't say it was foolproof, nor did I think it. I merely pointed out that one advantage of multiple CPU cores is when a runaway process tries to use every available CPU cycle (not any other resource, just CPU, some programs do stupid things like this) and the underlying OS allows it to do this it is a good thing to have more cores so you don't have to sit around and wait for the device to register user input (I had a Nokia smartphone which had a few programs that seemingly did this, they'd begin to proc

What you are describing is more a function of the OS's multitasking model (preemptive vs. cooperative).

The OS being used on the ASUS transformer is fairly modern and preemptive. By being preemptive the scheduler is a high priority process that uses an interrupt signal to trigger and perform context switches (switching between process threads). This is not as foolproof as you'd think because I've had Linux slow down to a crawl because of an errant spin-lock without a yield (sleep) which caused the scheduler to w
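The spin-lock-without-a-yield failure mode is worth illustrating. Here is a sketch of the polite version in Python (a loose analogy, since the GIL changes the details; `time.sleep(0)` stands in for a kernel `sched_yield()`): the waiter gives up its timeslice on each iteration instead of monopolizing a core:

```python
import threading
import time

flag = {"ready": False}

def spin_wait_with_yield():
    # A polling wait that yields its timeslice each iteration, so it
    # can't starve other runnable threads the way a hard
    # `while not ready: pass` loop can on a busy core.
    while not flag["ready"]:
        time.sleep(0)  # yield to other runnable threads
    return "done"

def producer():
    time.sleep(0.05)
    flag["ready"] = True

t = threading.Thread(target=producer)
t.start()
result = spin_wait_with_yield()
t.join()
print(result)  # -> done
```

In real kernel or embedded code you'd prefer a proper blocking primitive (futex, condition variable) over any spin, but if you must spin, yield.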

The more processors you have, the more complex scheduling your apps need to perform to actually work faster.

Well, sort of. All you need to do is make sure you run each process long enough between breaks so that the ratio of scheduler time to actual processing time is small (fractions of a percent, say). You need a scheduler even on a single core to get multitasking to work anyway. Getting coders to take advantage of multiple cores is the actual problem here since writing bug-free parallel code is often very hard.
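The "ratio of scheduler time to processing time" claim checks out with illustrative numbers (assumed for the sketch, not measured on any real system):

```python
def scheduler_overhead(timeslice_us, switch_cost_us):
    """Fraction of CPU time lost to context switching, for a given
    timeslice length and per-switch cost (both in microseconds)."""
    return switch_cost_us / (timeslice_us + switch_cost_us)

# A 10 ms timeslice with a 5 us context-switch cost wastes a
# negligible fraction of the CPU:
print(f"{scheduler_overhead(10_000, 5):.4%}")  # ~0.05%
```

So the scheduler itself is cheap even on one core; as the comment says, the hard part is getting application code to exploit the extra cores correctly.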

It's better to have a single core that is twice as fast than two cores running in parallel.

Again, sort of. For tasks that do not parallelize well, obviously a single 2x-speed core is better tha

On a tablet, I don't see us needing more than 2 cores for the time being, because of how we use them as toys. It's kind of hard to use a desktop or even laptop while laying on the couch, at least without some type of foldable stand or unnaturally ergonomic beer belly. As a stupid little "I don't wanna get off the couch" internet machine, a tablet kicks ass. For that basic usage, 2 cores is more than enough.

If/when we start using them as true laptop replacements, with a keyboard and stand/dock, that's whe

I thought 3 and 5 core tablets were supposed to be coming out, where the "odd" core is so underpowered it can be left on when the screen and other cores are off, using practically no battery but still letting the tablet run its background processes.

I'm surprised more emphasis isn't being put on improving "standby" battery time because that seems to be the real killer in so many mobile applications these days (like my 14h SGS2 battery of doom).

Tegra 3 has 5 Cortex A9 cores, of which one is low power and used for light loads. The other 4 make up the full-power quad-core, and each core is power-gated so they're only used when required. This means that despite it having 5 cores, battery life is as good or better than the dual-core Tegra 2.

The Tegra 3 does that... But the Prime never had that much of a battery problem, since it got up to 18 (more like 15) hours of battery if you had the dock. On the other hand, since there are very few devices with 4 cores, almost no applications make use of them, rendering them useless.

Your SGS2 is configured wrong. You should be getting a standby drain of about 1%/hour (or less) with sync enabled.

Two things are at fault here, of course:

1. The awful apps that keep the phone awake and active during standby - for instance: Facebook
2. Android, for not telling the user THIS APP IS KEEPING YOUR PHONE AWAKE, KICK THAT CRAP!

In your specific case: Check your battery usage (in your SGS2's settings), and find out which process is keeping your phone awake, either with the old battery history (Gingerbread and earlier, accessible via Spare Parts, apps like BatteryMonitorWidget or a dialer code that varies from handset to handset) or (ICS only - because someone at Google decided to remove the battery history) with an app like BetterBatteryStats.

The interesting part is usually partial wake usage. Eliminate the apps causing the most partial wake usage, and you'll have a power draw of next to nothing. Standby battery life with Google sync, a few IM clients (I run Skype and imo.im), Whatsapp, Viber and so on, should be around 4-5 days.

Standby battery life with Google sync, a few IM clients (I run Skype and imo.im), Whatsapp, Viber and so on, should be around 4-5 days.

I wonder what Skype did to achieve this. On the N900 and N9, the Skype engine is the monstrous wakeup hog that drains your battery in a day and exchanges packets with various hosts on the network all the time. Did they subscribe to some push wakeup mechanism where the app can be launched on incoming activity that needs user interaction?

Skype calls can be made and received when it's running in the foreground, but as soon as you switch away - even mid-conversation - the application goes offline. Want to check a detail in an e-mail so you can tell the person you're calling? You'll have to hang up first.

$8.5bn for Skype, shame they couldn't chuck a few dollars at getting multitasking for the phone working.

Considering that the S4 eval board, clocked 100MHz faster than the Asus Prime, did half-again better in a multithreaded computational benchmark thrown at it, I'd say that you're probably looking at the differences between an A9 and an A15 - and you might have found a CPU that's considerably faster single-core than Tegra 3's cores at clock. If so, there's an explanation for the move. The S4's cheaper. It consumes less power doing what it does at peak. And... if it's faster doing most of the things

3/5 cores - this is a Tegra thing, the snapdragon does it differently.

Tegra has a 'companion' core that is low, low power for standby tasks, then it switches the main cores on individually as they are needed. Note that the Tegra main cores are on/off designs.

The Snapdragon doesn't do this; it varies the power to each core individually, so they are all running in a low-power mode all the time until they need more. This means it doesn't need a 3rd core. It's debatable which is better, but the Snapdragon can ru

From the articles I saw yesterday, I gathered that there would be two levels of the new Asus pads: one with the Tegra and the other with the new Krait. Here is one article that talks about it: http://www.anandtech.com/show/5586/the-asus-transformer-pad-infinity-1920-x-1200-display-krait-optional

Of course we won't know anything for sure until Asus releases the product details.

The one outstanding product to come out of MWC, Nokia's 808 "Fuck everything, we're doing 41 megapixels" PureView, is ignored by Slashdot for whatever reason, while tiny product differentiations that don't warrant attention at all are posted.

No, that is the dumbest of them all. Symbian is dead for smartphones (anyone can see that); why on earth would they not add that to their Windows Phone line? It's like saying "look, we have this 500bhp engine, let's fit it into a horse carriage."

Sure, it's so dead that it sold more than six times as much as WP7 in Q4 2011 and almost half as much as the iPhones. Any blind and gullible person can see what they're told they should see, and the idiots will repeat it ad nauseam to show how perceptive they are.

Fact of the matter is, Nokia just released some interesting technology. This site is supposedly about "news for nerds", not news for gadget consumers who feel they need to prove themselves by buying whatever "experts" in the social media tell them

Ok, then tell me how many of those were smartphones. Symbian sold in low-end normal phones (where it actually fit); the Symbian adaptation for touch is a fiasco that even Nokia has already recognized (why on earth would they bow down to Windows Phone when they had at least 2 other OS choices?).

I don't know if you're trolling me or actually being serious, but there is a reason Nokia (even though they sell that many phones) has been downsizing left and right.

Spot on. Also, this is a more ingenious use of the resolution than just throwing in megapixels for the spec sheet value with no difference to actual image quality for the tiny lens and the respectively constrained matrix size. Without multisampling or other processing, higher resolution matrices may actually produce worse results because sensitivity of individual elements gets lower as they get smaller.

They're binning the samples, for an actual resolution of 5mp. And they have to; a lens that size is unable to create an image of sufficient resolution for anything like 40mp being useful. You go above 8mp or so and you'll only get better pictures of the lens blur.

And it's not at all clear that binning several individual photosites is better than simply having larger sites in the first place either. Of course, being able to write "41mp!! *woot* *Munchkin FTW!* " in your promotional material is a likely sales
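For the curious, the binning itself is trivial: average each factor x factor block of sensor samples into one output pixel, trading resolution for lower per-pixel noise. A toy pure-Python sketch (real camera pipelines bin on the sensor or ISP, not in software like this):

```python
def bin_pixels(image, factor):
    """Average `factor` x `factor` blocks of a 2-D grayscale image
    (a list of lists of sample values) - the 'binning' that trades
    resolution for lower per-pixel noise."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h - h % factor, factor):
        row = []
        for x in range(0, w - w % factor, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A noisy 4x4 sensor read, binned 2x2 down to 2x2:
noisy = [[10, 12, 20, 22],
         [14, 12, 18, 20],
         [30, 34, 40, 44],
         [34, 30, 44, 40]]
print(bin_pixels(noisy, 2))  # -> [[12.0, 20.0], [32.0, 42.0]]
```

Averaging N samples cuts uncorrelated noise by roughly sqrt(N), which is the statistical argument for oversampling; whether that beats simply having larger photosites is exactly the question the comment raises.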

Untrue. They've released a bunch of 38 MP images with staggering detail. It is, of course, not its primary function: they take up far too much space for a phone, and in the end you'd prefer noise free 5 MP images for phone usage. But of course, you're free to dismiss things out of hand, based on no evidence whatsoever.

If all it takes to "shock and disgust" "bleeding edge technophiles" is a technical decision to pick a CPU with faster cores instead of more of them, then these "bleeding edge technophiles" must not get out of Mom's basement very often and are in need of some serious therapy.

They run it to be able to run the radio on the same hardware. But that means that it won't necessarily run anything in bounded time, hence the need for more than one core. Oh, it also comes in 2 and 4 core packages.

The Tegra 3 isn't compatible with any LTE modems and won't be for several months, so Asus opted to use the S4 for all 3G/4G Transformers so they could have something for carriers to sell now-ish. The Wi-Fi-only models will keep using the Tegra 3. Either way, this isn't really something Asus can fix itself, since Nvidia never bothered getting its product to support any LTE modems.

Yes, and the summary is sensationalist (par for the course, I guess) and misleading (more annoying).
The story is just that Asus preferred the Tegra 3 when having to choose a platform without LTE, and when taking LTE into account, it preferred the new Qualcomm S4 combined AP and LTE solution.

Now when you say:

The Tegra 3 isn't compatible with any LTE modems and won't be for several months

It can be misunderstood as an issue on the Tegra3 side, which would be unfair. There's nothing special to do to hook a LTE baseband to a T3. It's all Android and very common interfaces. The issue is more

First, HTC suddenly drops its quad-core chip for a dual in a phone that was supposed to have a quad-core chip since it was leaked back in July.

And days later, Asus drops a quad in favor of a dual core.

Same chip was dropped.

Someone... is keeping a secret. There is a problem with the quad-core chip and 'something' new(er) that is appearing in the phones. I read that an LTE chip appeared in the "One X", while the quad core disappeared.

Is LTE and quad core not playing nice? Are there production shortages? Overheating issues, battery issues?

The whole story isn't out. I'm curious what it is. I've been waiting and salivating at the promised "quad" core offerings for smartphones. The Samsung SIII is supposedly going to have one, but from a different company - their own Exynos chip. So, we won't see that quad be cut in half.

Hopefully.

Regardless of what the non-power users say about not needing more cores, I see my dual cores maxed out regularly. I need the extras; I was willing to sell my life, I mean soul, I mean sign a new 2-year contract for it.

I bought a Prime as soon as they were available where I live. The first time I switched it on, it updated to ICS--great. I headed over to XDA-developers to see how to root it, and found stickies dedicated to the various problems that the devices have--random reboots, lockups, terrible WiFi performance, and so on. It seems that these problems are related to the serial numbers, too. ASUS even has an "official" support thread on XDA in which (what I presume is) an engineer fields questions about said random pr

Maybe not in sales, but the Tegra continuously lags behind many manufacturers both in performance and power consumption every generation. Nvidia's adventures in SOC Land might be profitable, but they're always the bottom of the barrel when it comes to performance.

Ummm Tegra 2 was the fastest platform for Android for quite some time. The G Tablets are still pretty blazingly fast. The issue is just that Tegra 2 was released for such a short time before Tegra 3 came out that it never got much saturation, and then Tegra 3 came out with a bunch of faster options close on its heels.

NVidia has great hardware engineers, but awful software driver people on their mobile platform. They have done a terrible job supporting their chipsets after release with Android, or getting good manufacturers to adopt them.

So is it fast or is it not? The terrible-software excuse was employed by Intel for many years as to why their GPUs sucked, since they weren't as fast as their promises...

HSPA(+) is more like 3.5G; it's been marked like that for years on actual shipping devices.

If what's commonly called LTE had things like... well, a voice spec, a standard way to handle video and all that, then it would be more than 3.5G. It's just another network for the Yank operators to ask for mo' money whilst not upgrading their HSPA networks to bearable levels where they could sell them without transfer limits.

A more proper name for the LTE networks currently out there would be Short Term Evolution, though! It'

Transformer Prime with a shitty GPS antenna; now the descendants, with hopefully fixed antennas, have an inferior CPU. I personally want lots of cores; they make sense for me. Asus is being really creative at finding ways not to take my money.