Viewsonic G Tablet - Tegra 2 Performance

Let’s get to the really important question - how does Tegra 2 as an SoC stack up against Snapdragon and Hummingbird? We’ll start with the CPU side first, so that we can take a good look at how the Cortex A9 stacks up against A8 and Scorpion. For Hummingbird, we’ll be using the Froyo-based Galaxy Tab that we have in our labs (review forthcoming) since it’s the only Galaxy device that we have available with 2.2 installed. We’ve also got the T-Mobile G2, running an 800MHz 2nd generation Snapdragon along with the Nexus One and EVO 4G with the original 1GHz Snapdragon. We tossed in the 1GHz OMAP3-based Droid 2 just for kicks.

So how does the A9 fare? Impressively. Anand said in his A9 intro to expect 30-100% performance improvements over A8-based designs, and that's exactly what we see. Comparing the Viewsonic G Tablet to the Galaxy Tab, we see a 75% improvement in JavaScript performance and a 2x performance increase across the rest of our computing benchmarks. We saw roughly the same gains compared to the Droid 2. That's a pretty good summary of A9 performance versus the A8. The one caveat is that some of our benchmarks appear to be single threaded, so we don't see the full effect of the dual-core A9. Multitasking workloads can see a bigger impact from the two cores, not to mention the power savings that come from two cores sharing a workload at lower frequency/voltage states.

Now, comparing the A9 to the 45nm and 65nm Scorpion cores is more interesting. Qualcomm gave Scorpion very good floating point performance along with some elements of speculative execution, so its Linpack performance ends up being similar to the A9's. Scorpion had slightly faster JavaScript performance than the A8, but the A9 still ends up roughly 50% faster. BenchmarkPi is 83% faster on the A9 than on Snapdragon, but the BrowserMark results are more variable. The Nexus One is roughly 50% faster than the EVO 4G despite having the same processor, probably due to the Sense UI overlay on the EVO versus the completely stock UI on the Nexus One. The A9's score is roughly 90% higher than the EVO's, but only 30% higher than the Nexus One's. So overall, we can say that the A9 is significantly faster than Snapdragon as well, but how much faster is very dependent on the task.

The move to dual ARM Cortex A9s is going to be a significant step forward in smartphone and tablet performance next year. In fact, the next three years will be full of significant SoC performance gains eventually culminating in some very, very fast PC-like smartphones.

So now with the A9 part out of the way, let’s look at the graphics side of Tegra 2. NVIDIA hasn’t given too much information about the GeForce ULP other than saying it’s the same OpenGL ES 2.0-supporting architecture as the original Tegra GPU with higher performance. NVIDIA claims a 2-3x performance increase stemming from higher clock speeds and more memory bandwidth.

I’ve heard rumours that the GeForce ULP is actually slower than the PowerVR SGX 540 in Hummingbird. To see whether that’s actually true, I put the Viewsonic through our two graphics benchmarks. The problem is, we can’t compare the results to any of our smartphones due to the change in resolution, so the only real device we have for comparison is the Galaxy Tab. This is fine, since SGX 540 is really what we want to compare GeForce ULP with.

So, the benchmarks. Neocore gives us some odd results: the Galaxy Tab hits the same 54 fps cap we've seen before on the Galaxy S smartphones, while Tegra 2 only manages 28.1 fps. Neocore is a Qualcomm benchmark and is likely optimized for tile-based architectures rather than NVIDIA's, so an advantage for the SGX 540 here isn't unexpected.

12/20/2010 - Updated Graphics Benchmarks

Before we initially posted this article, I ran the Quake 3 demo on the Viewsonic G Tablet and the Samsung Galaxy Tab three times each. I ended up with 49.0, 48.2 and 49.9 fps on the Viewsonic, and 31.9, 32.2, and 32.1 fps on the Galaxy Tab. Run the averages, and you get 49.0 fps for the Viewsonic and 32.1 fps for the Galaxy. Based on some input from Imagination and other Galaxy Tab users, we decided to retest the Galaxy Tab, since our results were a good bit lower than what they were reporting.
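For transparency, the averaging here is just the arithmetic mean of the three runs; a minimal sketch using the figures above (the variable names are ours, not part of any benchmark tool):

```python
# Arithmetic mean of the three Quake 3 runs reported above.
def mean_fps(runs):
    """Average a list of per-run framerates."""
    return sum(runs) / len(runs)

viewsonic_runs = [49.0, 48.2, 49.9]  # Viewsonic G Tablet (Tegra 2)
galaxy_runs = [31.9, 32.2, 32.1]     # Samsung Galaxy Tab (Hummingbird)

print(round(mean_fps(viewsonic_runs), 1))  # 49.0
print(round(mean_fps(galaxy_runs), 1))     # 32.1
```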

And on re-running the Quake 3 benchmark, I got results in the 44.9 - 46.0 range. I ran it over 50 times trying to replicate the previous scores, but under no conditions (settings, background applications, etc.) could I get anywhere near my previous result. I know for a fact that the settings were all correct and that there were no applications running in the background, so I really have no idea why I got a framerate that low, much less why it was repeatable.

In addition to retesting Quake 3 on the Galaxy Tab, I also ran both slates through GLBenchmark 2.0, which we recently added to our benchmark suite. The combination of the two gave me enough reason to write up an update to our Tegra 2 performance preview from two weeks ago. I was also planning to compare Quadrant’s 3D graphics score, but I was unable to get the SlideME store working on the Viewsonic to download Quadrant Professional.

With the revised Quake 3 benchmarks, Tegra 2 has a slight lead of roughly 10% over the SGX 540, bringing it more in line with what we expected. Looking at our GLBenchmark 2.0 results, we can see that Tegra 2 has the lead here as well, though by a slightly larger margin. In GLBenchmark Pro, Tegra 2 has a 30% lead over the Hummingbird, while in the more demanding Egypt benchmark, the gap shrinks to 20.4%.
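The percentage leads quoted throughout are simple relative differences. As a sketch using the retested Quake 3 numbers (the 44.9 fps figure is an assumption taken from the low end of the retest range above):

```python
def percent_lead(a, b):
    """How much faster a is than b, expressed as a percentage."""
    return (a / b - 1) * 100

tegra2_fps = 49.0   # Viewsonic G Tablet average
sgx540_fps = 44.9   # low end of the Galaxy Tab retest range

print(round(percent_lead(tegra2_fps, sgx540_fps), 1))  # 9.1, i.e. roughly a 10% lead
```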

So while the performance difference isn't nearly as dramatic as we originally thought, most of what we said before still holds true. It looks like Tegra 2 has the potential to be the best SoC for Android gaming, making it a really attractive platform for tablets. With all the rumours flying around about Google making Tegra 2 the reference platform for Honeycomb-based tablets, apparently we're not the only ones who think so. The question is how well it will handle existing 3D content that's likely optimized for Qualcomm and Imagination Technologies GPUs.

We’ll take a more in-depth look at Tegra 2’s performance as well as power consumption in our full review of the Viewsonic G Tablet later this month, and hopefully by that time we can give you the Quadrant benchmarks as well. But for now, we can say that Tegra 2 is the most powerful SoC on the market at present and makes for a very capable tablet platform.

Comments

What's up, my friend. But yeah, viewing angle is huge. That shot of the Gal Tab vs Viewsonic is taken from my vantage point on my desk. The Viewsonic is pretty awful as far as off-axis viewing goes. If you go more than 15 degrees off center in either direction, you get shafted and everything washes out.

This is something Apple doesn't get enough credit for, that being their very nice IPS screen. Hopefully another company implements one in their tablet offering, though most seem to be going with the AMOLED.

For the full review I hope you add a few more existing games into the benchmarks to further test the tile-based optimizations vs. Nvidia's architecture, and of course run the browser and usage tests with the stock UI vs. Viewsonic's custom one (though I think you mentioned you would be doing just that).

I don't see companies going for AMOLED screens for tablets for some time, because of the price. At least another year. Of course, if you consider something as small as the Streak to be a tablet, then we may see some sooner.

But I'm not so impressed with any of the AMOLEDs so far because of their low brightness, and inability to work outdoors on a bright day. My friend's Samsung is totally washed out no matter how bright he makes the screen. A couple more generations and that may be solved.

You'll never run into a person that intentionally holds his device in direct sunlight. Every human being I have encountered even since the first laptops were built, regardless of screen technology, always seeks refuge from direct sunlight via different angles, shade, indoors, or cupping a hand over the display.

I think your expectations are strange. We'll likely never see a display technology that looks the same in broad daylight as it does in a controlled environment. Sure, some displays are brighter than others in daylight, but I have yet to see one that wasn't viewable to the point where usability was threatened. I have both OLED and LCD phones and I've had no issues reading text or viewing pictures in direct sunlight. Sure, I notice a difference between the two, and certainly between indoor and outdoor, but I don't enjoy viewing LCDs in direct sunlight any more than OLEDs.

If they can improve it, great, but hopefully not at the expense of what makes OLEDs so attractive in the first place.

Every single human you've met? What about the non-humans, then? Strange way to put it. I've seen several AMOLED phones, and none of them were worth a damn in the sunlight. Every review of those phones says the same thing. I'm skeptical when someone says theirs works well enough, for that reason.

Yes, no one deliberately holds their phone in direct sunlight. But sometimes it can't be helped, and cupping your hand over it rarely helps much. But these are tablets we're talking about. No matter how big your hand is, it won't help on these screens. Even the small 7" tablets are too big for that, and the 10" models are much too big.

I'm certainly not saying that they should be the same in the sunlight as indoors. For you to assume I did is odd.

I do expect that at some point, when efficiency allows much higher brightness levels in AMOLED displays, we will see them working much better outdoors. But now, they are much dimmer than the best LCDs. Again, you can check the reviews and comparison tests. And it's brightness that determines how well a display will work outdoors.