39 Comments

That SoC impresses me like no other. I was initially concerned about the earlier battery life measurements, but it seems to be controllable, and amazingly so. NVIDIA really screwed up their reputation with Tegra 3, but it seems they're getting back on track. Now it just needs to actually hit more devices. I'm putting my money on Surface 3.

I would agree. I once owned an international HTC One X and the graphics performance was just not ideal; it even lagged behind the A5. It was a disappointment, as graphics should be NVIDIA's strength, but I guess they are doing much better with Tegra 4 and the K1 now.

Delicious graphy goodness this early in the morning? Must be my birthday.

Given that scant few games are really going to push the K1 to its limits at this point, I don't think there's much to be concerned with right now in terms of battery life. At the Peasant-Standard 30fps it holds up quite well.

The GPU IP block could be used in another SoC, but it's a bit unlikely that NVIDIA would license their GPU to Intel or that Intel would allow NVIDIA to manufacture x86 reference designs or acquire an x86 ISA license.

K1 gives competitive CPU performance and battery life versus Intel, but I don't want to have to use Android to get the GPU performance I want. I currently own several Android devices (phones and tablets), but I'm just getting tired of Android's limitations. The problem is that Windows tablets pretty much all suck for GPU performance. Sure, I can use one as a Steam streamer (which is great), but that isn't always ideal. Do I just need to wait for Broadwell? Will AMD, who already has an x86 license, ever build what I want?

AMD already has what you want: the Mullins APU is better than an Atom on the CPU side, and similar to the K1 on the GPU side (when compared at a 4.5W TDP, as the Shield has a higher TDP and gets as hot as pancakes).

They just haven't gotten any design wins yet; that's the sad part of the story.

If you look at the full Shield Tablet review, you'll see that the results from the "AMD Mullins Discovery Tablet" are the only ones from a mobile SoC to approach the K1's GPU results. But the K1 has a TDP of 5-8W (11W at full load), as opposed to Mullins' 4.5W TDP. So given that, I'd say Mullins is roughly equal with the K1 on the GPU front when you look at power/performance.
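To make the power/performance claim concrete, here's a minimal sketch of the perf-per-watt arithmetic. The TDP figures come from the comment above; the fps values are illustrative placeholders, not the review's measured scores.

```python
# Rough perf-per-watt comparison. TDPs are from the comment above;
# the fps numbers are assumed placeholders, not measured results.
def perf_per_watt(fps, tdp_watts):
    """Frames per second delivered per watt of TDP."""
    return fps / tdp_watts

k1_fps, k1_tdp = 60.0, 8.0            # K1 at its upper 8 W TDP (assumed fps)
mullins_fps, mullins_tdp = 34.0, 4.5  # Mullins at its 4.5 W TDP (assumed fps)

print(perf_per_watt(k1_fps, k1_tdp))          # 7.5 fps/W
print(perf_per_watt(mullins_fps, mullins_tdp))  # ~7.6 fps/W
```

With numbers in that ballpark, the two chips land within a few percent of each other on fps/W, which is the sense in which "roughly equal" is meant.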

Of course, Mullins has the advantage that it's an x86 design, and so the GPU performance won't go largely to waste like it does with the K1 (though at least you can use it for emulators on Android). The disadvantage is, of course, that AMD doesn't seem to have any design wins, so you can't actually buy a Mullins-equipped tablet.

That last point makes me rather irritated, since I'd love nothing more than to have a tablet of Shield's caliber/price (though preferably ~10" and with a better screen) powered by Mullins and running Win8.1.

Intel will NOT license x86. It's their "crown jewels," and they battled hard enough to limit who could use it after the initial easy sharing (to get the architecture started). If Intel could put their pride aside and use the K1 in their Atoms or even laptop processors, it would be a killer. Don't get me wrong, Intel HD cores have grown by leaps and bounds, but they still lag in hardware performance (they have a much smaller silicon area too) and particularly in driver development. If Intel wanted to kill AMD (hint: they don't) at their own game (APUs), they would license Kepler and integrate it. Imagine an Atom or even Haswell/Broadwell (or beyond) with an NVIDIA GPU integrated?

Dream laptop right there: a dual-core i5 with the K1's GPU, maybe running at a higher clock speed. Integrated, simple, but decently powerful without breaking the bank, with good battery life and drivers to boot.

Josh, isn't there a feature on Shield tablet where CPU/GPU clock operating frequencies get reduced when the battery life indicator is < ~ 20%? After all, it takes more than 110 looped runs (!) of the GFXBench 3.0 T-Rex Onscreen test to see any significant reduction in performance, and during that time, the peak frequencies and temps are pretty consistent and well maintained.

Note that if you look at the actual GFXBench 3.0 T-Rex Onscreen "long term performance" scores (which is likely based on something closer to ~ 30 looped runs of this benchmark), the long term performance is consistently very high at ~ 56fps, which indicates very little throttling during the test: http://gfxbench.com/subtest_results_of_device.jsp?...

To my knowledge there isn't such a mechanism active. It may be that there are lower power draw limits as the battery approaches ~3.5V in order to prevent tripping the failsafe mechanisms in the battery.

If I recall correctly, it was mentioned somewhere on the GeForce forums (in the Shield section) by an NV rep that CPU/GPU frequencies get reduced or limited once the battery life indicator starts to get below ~ 20%.

Interesting. I see in the settings menu that there is an option to enable CPU/GPU throttling and FPS caps after the battery drops to a certain level, but I've made sure to keep that off for all of these tests.

It would be easy to test whether the OS or some setting is throttling the CPU/GPU when the battery is getting low (< 20%): when you start to see the drop-off around loop 110, just plug in the charger and see if the throttling goes away as the battery recharges.

Looking forward to an update in the article if this proves to be true.
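The charger test proposed above can be automated by sampling the CPU clock around the ~20% battery mark. A minimal sketch, assuming a standard Linux cpufreq sysfs node (the Shield's GPU clock node is not assumed here, since its path varies by kernel):

```python
# Sketch: read the current CPU clock from the standard Linux cpufreq
# sysfs interface. Poll this before and after plugging in the charger
# to see whether the low-battery throttle lifts while charging.
def read_khz(path):
    """Parse a cpufreq sysfs file: a single integer in kHz."""
    with open(path) as f:
        return int(f.read().strip())

if __name__ == "__main__":
    try:
        cur = read_khz("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq")
        print(f"cpu0 clock: {cur} kHz")
    except (FileNotFoundError, PermissionError):
        # Not on a Linux/Android device, or sysfs access is restricted.
        print("cpufreq sysfs not available on this system")
```

If the reported clock jumps back to the peak frequency as soon as the charger is attached, that points at a battery-level throttle rather than a thermal one.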

How does performance mode impact GPU clock speeds? My Note 3 is driving me nuts lately; so many games are being released that won't run the GPU over 320MHz. I would say the VAST majority of 3D mobile games being released throttle the GPU between 210MHz and 320MHz.

"We see max temperatures of around 85C, which is edging quite close to the maximum safe temperature for most CMOS logic. The RAM is also quite close to maximum safe temperatures. It definitely seems that NVIDIA is pushing their SoC to the limit here"

What does the product's datasheet have to say about how close to the limit this really is?

TjMax is not quite the same as the maximum sustainable safe temperature. Operating in that region will make leakage worse and reduce the effective lifetime of a part. In addition, the recommended operating temperature for lithium-ion batteries is no higher than 50C, and operating at such high temperatures can seriously affect the usable lifetime of a battery: http://www.portal.state.pa.us/portal/server.pt/doc...

I don't understand why you talk about the battery temperature when we're talking about the temperature of the die in the SoC. Depending on the thermal design, the two can be nearly independent of each other. Temperature in every region has an influence on the leakage current, so I don't understand why you bring up this topic, especially because you probably have no idea how much of an influence it has, so we can just guess. The same goes for the lifetime. Just for clarification: we're talking about 'the maximum safe temperature for most CMOS logic,' not more, not less. And according to lots of microcontroller datasheets, the safe temperature is up to +125C. Of course, I don't know the figure for HKMG.

It's possible to thermally isolate the battery and the board, but in most devices this isn't done, as both parts tend to share a metal midframe to aid heat dissipation. As a result it's not possible to simply ignore battery temperature and focus on SoC temperature. It's likely that both the battery and the SoC, at the maximum temperatures observed in this test, are at the highest safe level.

The maximum safe temperature in most datasheets for something like a CPU or GPU would be the point where the device is shut off and/or reset, not a point to throttle to. While it's fully acceptable to run something like a CPU at up to 100C continuously with a TjMax of 105C, the MTBF will be noticeably shorter than if the same CPU were run at 70C or less.

Exceeding TjMax is far from the only way to damage an IC with heat. Thermal cycling from high to low temperatures is also a concern, and other components on the board will have reduced lifetime from high temperatures.

I have no doubt NVIDIA has carefully throttled this SoC and ensured that the MTBF of this device is within acceptable range, but it is still quite a high temperature.
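The MTBF point above is usually quantified with the Arrhenius model of thermally accelerated wear-out. A rough sketch follows; the 0.7 eV activation energy is a generic assumption for silicon failure mechanisms, not a figure from this thread or from NVIDIA.

```python
import math

# Arrhenius acceleration factor: how much faster wear-out proceeds at a
# hotter junction temperature. Ea = 0.7 eV is a common generic assumption
# for silicon wear-out mechanisms; real values vary by failure mode.
BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    t_use = t_use_c + 273.15      # convert Celsius to Kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# The 70C vs 100C comparison from the comment above:
print(round(acceleration_factor(70.0, 100.0), 1))  # ~6.7x faster wear-out
```

Under those assumptions, a 30C increase in sustained junction temperature cuts expected lifetime by more than a factor of six, which is why running near TjMax noticeably shortens MTBF even though it is "safe" in the datasheet sense.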

"The 95C maximum operating temperature that most 28nm devices operate under is well understood by engineering teams, along with the impact to longevity, power consumption, and clockspeeds when operating both far from it and near it. In other words, there’s nothing inherently wrong with letting an ASIC go up to 95C so long as it’s appropriately planned for."

"AMD no longer needs to keep temperatures below 95C in order to avoid losing significant amounts of performance to leakage. From a performance perspective it has become “safe” to operate at 95C."

When talking about 85C you stated "such temperatures would be seriously concerning in a desktop PC". Are you saying Ryan is likely too optimistic about that desktop device's 95C lifespan?

For a desktop it's usually entirely possible to keep temperatures well below 80C by throwing more surface area and CFM at the problem. The same page also cites a cost to longevity, and since upgrade cycles for desktop parts can greatly exceed the warranty period, allowing a ~95C core temperature can be much more expensive than louder fan noise or a custom cooling solution.

Looking at all of this: we've got a chip that runs fast, but also consumes insane amounts of power for a mobile device. It runs so hot that it has to throttle (for whatever reason), even though it happily runs at potentially damaging temperatures, even with an integrated magnesium heat-spreader, and even when running an "uncapped" test that is actually capped by the display refresh rate at performance noticeably below the off-screen tests. So hot it can't be put into a phone (1440p phones would be happy). Only when running at 30fps, and losing any significant advantage over the competition, could we say the battery life falls into tablet class. So what's the difference between this Tegra and an Adreno 330 given a 7W power budget and a heatspreader? Where are the comparisons? How does the iPad mini with Retina display compare, for example?

Everything I see is a chip with a far higher maximum power draw than the competition, and that's all.

If you look at the actual testing, Shield tablet is able to maintain steady temps and steady performance for > 110 (!) continuous GFXBench benchmark loops even in the max performance mode, which is pretty amazing for an 8" thin and fanless tablet. So the end of test throttling does not appear to be related to heat, but is most likely due to the very low battery % capacity that is left at the end of the test which triggers lower CPU/GPU clock operating frequencies.

At a 30fps framerate cap, the performance of the Shield tablet in the T-Rex Onscreen test is roughly 1.5x higher than the iPad Air. With an uncapped framerate, the performance of the Shield tablet in the T-Rex Onscreen test is > 2.5x higher than the iPad Air.
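A framerate cap compresses the measured gap between devices, which is why the capped and uncapped ratios differ so much. A minimal sketch, using placeholder fps values chosen only to be consistent with the ratios quoted above (not the actual benchmark scores):

```python
# How a framerate cap compresses a performance gap. The raw fps values
# are illustrative placeholders consistent with the >2.5x uncapped and
# ~1.5x capped ratios quoted above, not the real T-Rex Onscreen scores.
def capped(fps, cap=30.0):
    """Effective framerate once a cap (e.g. a 30fps limiter) is applied."""
    return min(fps, cap)

shield_fps, ipad_fps = 63.0, 20.0
print(shield_fps / ipad_fps)                    # uncapped: 3.15x
print(capped(shield_fps) / capped(ipad_fps))    # capped at 30fps: 1.5x
```

The faster device loses everything above the cap while the slower one keeps its full framerate, so the cap only shrinks the ratio for the device that exceeds it.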

You need to look further down on the front page. Sometimes two or more articles get posted on the same day, in which case the more recent article gets the large image while the second article gets the small image below it, making it seem like that article is very old when in fact it could have been posted just a second before the top article.