Nvidia, not AMD, will power the Nintendo NX


Every now and then, the rumor mill gets a big one really, really wrong. For over a year, conventional wisdom has predicted that AMD would win the Nintendo NX contract and cement its hold on the console industry. Rumors that Nvidia had sniped the win began surfacing several months ago, and now Eurogamer has seemingly confirmed them with input from multiple sources. Nintendo's NX will be powered by Tegra — and if Eurogamer's sources are accurate, we now know more about both the console hardware and the underlying SoC.

Let's start with the console itself. We now know that the NX will ship with a tablet that has two controller modules attached. One or both modules can be easily detached from the tablet, and the tablet itself can plug into a base station that connects to your television. Presumably the handheld controllers then connect wirelessly to the base station to allow for television gaming. One odd aspect of Eurogamer's write-up, however, is that it describes the brains of the NX as being within the controller rather than within the base station that connects to the television.

Image credit: Eurogamer

This could explain some of what we read about the NX months ago, when Nintendo's patent filings suggested a console whose performance could be improved by adding additional controllers or base station devices, but it doesn't explain why the company would put the brains of the device in a controller instead of in the tablet. This point is still unclear. As rumored, the NX will use game cartridges in lieu of discs, with a 32GB recommended cartridge size. That's small compared with modern games on the PS4 and Xbox One, and it's not clear if Nintendo intends to use some form of compression technology or if Nintendo NX games will simply target smaller capacities. The dev kits already shipping also reportedly use fans, though that could be indicative of early revisions rather than final hardware.

As of now, backwards compatibility with previous Wii and Wii U titles is completely off the table, and the new console won't run Android — it'll use a custom Nintendo operating system. Emulation of previous consoles is hypothetically possible, but the NX may not pack enough horsepower to do the job efficiently. While Microsoft has managed to emulate the Xbox 360 on the Xbox One, it's working with eight CPU cores clocked at 1.73GHz and a far more powerful GPU than anything that ever shipped in the Xbox 360. By making the NX a mobile console, Nintendo will have inevitably sacrificed top-end performance — and Nvidia's Tegra X1, with four Cortex-A57 cores at 1.9GHz, may not be powerful enough to emulate the Wii U's triple-core PowerPC architecture clocked at 1.249GHz. Generally speaking, you need significantly more cores, a higher clock speed, or both to pull off successful console emulation.
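A quick back-of-envelope check of that rule of thumb, using the clock and core figures above (this ignores ISA-translation overhead, which is what actually dominates emulation cost, so treat it as purely illustrative):

```python
# Per-core clock headroom the Tegra X1 would have over the Wii U's CPU.
# Figures come from the article; real emulators typically need several
# times this headroom once instruction translation is accounted for.
wii_u_clock_ghz, wii_u_cores = 1.249, 3      # triple-core PowerPC
tegra_x1_clock_ghz, tegra_x1_cores = 1.9, 4  # quad Cortex-A57

headroom = tegra_x1_clock_ghz / wii_u_clock_ghz
print(f"Per-core clock headroom: {headroom:.2f}x")  # ~1.52x -- a thin margin
```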

The NX SoC: Tegra-powered

As for the Nintendo NX's SoC, it's going to be based on a Tegra derivative, though exactly which chip is still unclear. Most of Nvidia's Tegra designs have used stock ARM cores, though it's possible that Nintendo might tap Nvidia's own Project Denver. Nvidia's Drive PX2 combines a quad-core Cortex-A57 cluster with a next-generation version of Project Denver, but which CPUs Nintendo might use (and which GPU core they'll pair them with) is still unknown.

Here's my own bet: While I don't claim to know which CPU core Nintendo will utilize, I'd bet they'll go for a unified design — either a standard big.LITTLE configuration from ARM (probably tapping the Cortex-A72 over the A57) or a Project Denver option. Given that the NX isn't expected until 2017, it makes sense for Nintendo to wait for a 14/16nm process node (by then, yields on 14/16nm should be well on their way to mature). We expect Nvidia to use a Pascal derivative for the GPU side of the equation as well.

This would be in contrast to Nintendo's earlier consoles, which tapped older process nodes even when they were new (the Wii U launched in 2012 on 40nm when 28nm was already well in-market). The distinction, in this case, is that battery life is going to be key to the new platform's attractiveness, and the benefits of 14/16nm over 28nm are well established.

What’s less clear is how much of its own DNA Nintendo will bring to the table. The Wii U was based on established technology from both IBM and AMD, but there were Nintendo-specific customizations to the final silicon. The PowerPC-derived CPU had an unusual, asymmetric cache configuration and the SoC sported a significant amount of on-die EDRAM. There were also some fixed function blocks associated with the Wii U’s GamePad and its wireless capabilities. Presumably we’ll see a similar situation here, where Nintendo pays for some degree of additional capability or specific performance, but layers that on top of a custom Nvidia solution.

As for the intrinsic capabilities of the NX, it should be well in line with what Nintendo said it was targeting — fast play at 900p and 60+ FPS. Nvidia’s existing Tegra hardware can already deliver on that promise, so a next-generation 14/16nm chip should have no problems. It’s not clear if games will run at one resolution on the tablet and a different one when hooked up to the TV, or what kind of additional horsepower, if any, might be available once the device is plugged into the base station. In theory, Nvidia could ship a base station with one GPU in it, include another in the tablet, and combine them in an SLI-style interface to use multiple graphics solutions to render games. With close-to-the-metal operating system access and Nvidia’s overall reputation for smooth gameplay on multi-GPU configurations, this isn’t out of the question — though no information that’s leaked yet has supported this configuration, and we want to stress that it’s nothing but our own hypothesis.

Much will depend on the Nintendo NX's price. With the Xbox One Scorpio shipping for Christmas 2017 with a 4K target, and the uprated capabilities of the PlayStation Neo, Nintendo may have difficulty matching the current-generation consoles in a mobile form factor. Packing Xbox One or PS4 performance into a handheld isn't going to be possible — both those platforms consume well over 100W, while the typical device power for a tablet is in the 5W – 8W range. Nintendo might burst up to 10-15W, but only for short periods, not sustained gaming. The gains from 28nm to 14/16nm aren't going to be large enough to close that gap, which means Nintendo is once again going to march to the beat of its own drum rather than tackling Sony and Microsoft in head-to-head battle.
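To make that gap concrete, here is the power arithmetic implied by those figures (the console draw and tablet envelope are the rough numbers from the paragraph above, not measurements):

```python
# Why a handheld NX can't match the PS4/XB1: the sustained power
# budgets are more than an order of magnitude apart.
console_power_w = 100        # PS4 / Xbox One draw "well over 100W"
tablet_tdp_w = (5, 8)        # typical sustained tablet power envelope
burst_w = (10, 15)           # plausible short-burst ceiling, per the article

gap_low = console_power_w / tablet_tdp_w[1]   # best case: 12.5x
gap_high = console_power_w / tablet_tdp_w[0]  # worst case: 20x
print(f"A console draws {gap_low:.1f}x - {gap_high:.0f}x a tablet's budget")
```

No realistic process-node shrink closes a 12x–20x deficit, which is the article's point about Nintendo not competing head-to-head.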


They've already stated they don't plan to kill off the Wii U with the NX. Given that, there are only so many options available to them. The most likely scenario I can think of is that the Wii U will continue to get new releases at a lower performance level than what the NX will be capable of, but it will also continue to get older emulated games on the store.

The NX on the other hand will probably get more titles, along with better-performing Nintendo titles that the Wii U is incapable of running. Being a handheld console may also allow them to put handheld series on the new console, which could significantly boost that part of their product line as well.

It gives them 2 different groups of consumers to cater to: those who continue to enjoy the emulated games and first-party titles that people love at lower quality, and the gamers who want better performance along with 3rd-party titles on a Nintendo system. I can think of no better way for Nintendo to keep the Wii U relevant without making the people who bought the system feel like they got cheated out of their purchase.

Joel Hruska

I don't recall Nintendo saying it wasn't killing the Wii U. They said they weren't retiring the Wii U this fiscal year. The NX launches right as Nintendo's new fiscal year starts, which means they can keep that promise, keep their March 2017 launch date, *and* kill the Wii U all at the same time.

As a fan of AMD I’m a bit sad they missed out on this opportunity. But Nintendo has their reasons. I hope they come back stronger than ever with their new console.

DecksUpMySleeve

#Efficiency, and probably a better pricing offer by Nvidia to grab one more AMD segment.

Daniel Anderson

With low-level APIs, including those that consoles employ, you don't see the inefficiencies of GCN that you would with serialized APIs, due to the parallel nature of the architecture. Jaguar is another story.

However when it comes to price I can see that. Nvidia is in a position where they can lose money for the sake of not allowing their competitor to pick up additional wins.

Joel Hruska

Word on the street is that NV flat-out paid for the contract so as not to see AMD lock up all the business. Historically, NV has been the company that refused to sign console deals, either because the margins weren’t good enough (that’s what they said about Sony) or because Microsoft was flatly unwilling to work with them (nasty lawsuit left very bad blood between the two back in original Xbox days).

With that said, AMD's decision not to build 14nm versions of its lower-power core might have made this a competition for the graphics IP between NV and a company like ARM or Imagination Technologies.

Zunalter

I look forward to playing the $60 version of Mario: Angry Birds on the new mobile console.

Skywax9016

It would be 3D in VR at 60 FPS, loaded with particle, lighting, shadow, AA, AO, GI, ray tracing effects.

Zunalter

Yea, on a tegra, keep dreaming.

Skywax9016

That depends on how taxing the game Mario:Angry Birds is to the GPU

DecksUpMySleeve

Quite curious how this will pan out. 4K(attempting) handhelds, crazy world we live in.

Joel Hruska

No way they’ll be 4K.

Last time I checked, the power requirement for 4K is 2.5x – 4x the power requirement for 1080p. If Nintendo is targeting 900p at 60 FPS within a 5W TDP, that's going to be difficult enough. Extrapolating out from my recent power tests, you'd still be looking at 9W – 14W per frame of 4K. You want 30 FPS? That's between 180W and 280W of system power consumption.

To put some hard data to this: At 28nm on my older (less efficient) PSU testing, Maxwell needed 18.5W per frame of 4K animation. I usually test with SSAA on, however, so let’s assume that turning off SSAA doubles your frame rate. That would mean 4K without AA would’ve drawn about 9.25W/frame. Now, Pascal is about 65% of Maxwell’s power consumption, which takes us down to 6W/frame. The implication is that you *might* do 4K in a 180W console, which is a TDP and power consumption that consoles have targeted before — but remember, TDP on a handheld is in the 5W – 8W range.
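For anyone who wants to follow along, that estimate chain works out like this (every input is a rough figure quoted in the comment above, not lab data):

```python
# Replaying the estimate chain: 28nm Maxwell measurement down to a
# hypothetical Pascal console at 4K30.
maxwell_w_per_4k_frame = 18.5          # measured with SSAA on
no_ssaa = maxwell_w_per_4k_frame / 2   # assume SSAA-off doubles fps: 9.25
pascal = no_ssaa * 0.65                # Pascal at ~65% of Maxwell power: ~6.0
power_at_30fps = pascal * 30           # ~180 W for 4K at 30 FPS
print(f"{pascal:.1f} W/frame -> {power_at_30fps:.0f} W at 30 FPS")
```

180W is console-class power, not handheld-class, which is the whole argument.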

There’s just no way. Not on the handheld. A base station console could be more powerful but Nintendo has historically prioritized small form factors and low power. I’m doubting they’ll aim for 4K this generation.

DecksUpMySleeve

Hence 'attempting'. With the right engine it could breach 2K, I'd think; a plugged-in-only boost, perhaps.

MLSCrow

Why is anyone even thinking about 4K or even 2K, when Nintendo is aiming for 900P at 60fps? 900P! That’s not even 2K (1920x1080P), which is a shame. Let NV have that crap if they’re willing to pay for it themselves. It’ll just make them look bad and AMD look like the better option for the following gen console, which Nintendo might think at that point, “We should’ve stayed with AMD”. One thing NV can never beat is an APU from AMD. With graphic capabilities relatively equal, NV doesn’t have the rights to create x86 CPU’s, which, are much better performers, though generally haven’t been as power efficient, however, with the new architectures, tick/tocks, whatever improvements that are made to x86 architectures, specifically in the area of power efficiency (e.g., low powered SoC’s Bristol/Stoney Ridge) I don’t believe an x64 only CPU is going to ever cut it. How any powerhouse video game console company could even consider not at the very least achieving the current standard 1080P at 60fps, when all of the options are available to do so with existing solutions (meaning cheaper to mass produce than creating a new, never proven, custom design), is beyond me. Nintendo should realize the trend they’ve been carving of lackluster sales of the WiiU compared to the Wii is in the wrong direction. I know that, I’d rather sit, stationary, with a solid performing machine, with great graphics and sound vs a mobile solution that plays on a tablet, which decreases in performance when I detach it from the main console so that I can play while I’m taking a dump. I’d rather take the toilet break, grab something to drink, and be back at a legitimate gaming console, even if it means only being able to easily play in one spot at any given time.

Sir Williams

isn’t x64 better than x86 since x86 is just a rebrand of x32?

BreadFish64

x64 is x86-64 as far as I can tell

Joel Hruska

So, just in case you care (I assume he meant x86-64 in his reference).

IA32 = 32-bit x86 code. That’s Intel’s name for the architecture. When people say “x86” these days it typically means 32-bit code.

x86-64 / AMD64 / EM64T = All of these are references to 64-bit x86 code.

1080p is actually (almost) 2K. 2K and 4K refer to the number of columns, not lines. 2K is 2048×1080 while 1080p is 1920×1080; there is a slight difference in aspect ratio.
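A quick pixel-count comparison makes the columns-vs-lines distinction concrete (the 1600×900 figure for 900p is an assumption; the article never specifies it):

```python
# "2K"/"4K" name the column count; "1080p"/"900p" name the line count.
resolutions = {
    "900p":   (1600, 900),    # assumed dimensions, not from the article
    "1080p":  (1920, 1080),
    "DCI 2K": (2048, 1080),
    "UHD 4K": (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# UHD 4K is exactly 4x the pixels of 1080p, as the thread discusses:
assert 3840 * 2160 == 4 * 1920 * 1080
```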

MLSCrow

Correct, I made a typo. 1920 is indeed close enough to 2K, but just shy, as is 4K with its 3840×2160 (or what's being deemed 4K). What I was ranting too fast to state was that 4K requires 4x the power to run at the same fps as 1080P. 4K is 4x (1920x1080P), which made me think that 1080P is "1K", but I know better than that, heat of the moment *shrugs*. Still, Nintendo isn't even aiming at 1080P, which is a shame, just 900P. Who even games at 900P?

I'm aware of the upscaling, but I'm a PC gamer and have never upscaled anything, nor would I ever. Console gamers have had no choice for some time, but now that console hardware and APIs are performing as well as they do, there is no need for 1080 upscaling, and the next gen will be 4K playable, hopefully also without upscaling. Xbox had to, in order to keep up, because they made the mistake of ordering a smaller number of GPU cores than Sony (a mistake they promised to make up for with the next console). Nintendo has been a lost cause for performance ever since the N64. Their last performance king… SNES… *cries*.

DecksUpMySleeve

4K is being discussed because it was part of the rumours being discussed on multiple sites.
As to the NV chipset, we'll see how it works out; I think the efficiency will bode well in a handheld.
I further think the NX will be more of a hit than the Wii U was, though that's also to be seen.

Phobos

I think it’s just a replacement for the 3DS, it does look good but if that is their next gen console good luck. Part of the problem with Nintendo is the lack of support from third party developers. For how long can Nintendo rehash Mario, that ape shit kong and link? It needs some fresh material.

Phobos

4k lol, can I have some of that weed too.

DecksUpMySleeve

Repeating others' rumours, not my own.

Phobos

If they come from Nintendo I sure do want some of their weed.

Medallish

It was my understanding that Nintendo needed to court more 3rd-party developers; that was the problem with the Wii U. Developers were hitting a wall with last-gen consoles and were gearing up for PS4 and Xbox One-type hardware, while the Wii U was barely faster than the 360. I don't see how this, or the 32GB cartridges, is supposed to help them at all. Indeed the whole move from discs to cartridges is a very strange choice imo. We'll see what they end up doing, but a gaming tablet that uses cartridges? That's just not what I look for in my consoles.

jtibbs

A cartridge in today's production would basically be like a flash drive. It can hold much more storage space than a disc could, not to mention be much smaller than a disc. It all depends on how they make the cartridges and how they work. I do think 32GB of storage space is a bit small though.

Joel Hruska

Rumor is that they're using NOR flash, not NAND flash. NOR is much less dense, which would explain the 32GB limit. I'm also not too certain of the performance.

NOR flash is sort of its own animal, so it's tricky to compare them. No one, to the best of my knowledge, has ever built a NOR SSD.

Sweetie

NOR can withstand a lot more erase cycles, right? I don’t see the advantage to using it here. I bought a flash cartridge for my GBA that used NOR years ago. Wiki says NOR has low read latency, though.

Micron:

“NOR flash devices, available in densities up to 2Gb, are primarily used for reliable code storage (boot, application, OS, and execute-in-place [XIP] code in an embedded system) and frequently changing small data storage. NOR flash provides systems with the fastest bootable memory solution, is easy to implement, and requires minimal ongoing management due to the underlying cell structure. Because of the cell structure, NOR flash is inherently more reliable than other solutions.

Serial NOR’s low pin counts and small package solutions make it a good fit for applications like PCs, ultrathins, servers, set-top boxes (STBs), printers, Blu-ray drives/players, modems/routers, wearables, and hard disk drives (HDDs). Parallel NOR flash delivers fast system boot times and high-speed, low-latency XIP operation, making it ideal for applications like digital still cameras (DSC and DSLR) that need “instant-on” performance, as well as other process-intensive applications like networking routers/switches, home gateways, and STBs.”

It could be to allow for fast read speeds, to possibly run games straight off the cartridge without installing onto the console. This would allow them to cut costs and have a smaller hard drive on the device itself for saves, etc. It would drive up the cost of a cartridge, but if you buy a console you're gonna buy games.
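Taking Micron's quoted 2Gb ceiling at face value, the arithmetic hints at why NOR would both explain the 32GB cap and drive up cartridge cost (a rough sketch, assuming single-die parts at maximum density):

```python
# How many max-density NOR devices a 32GB cartridge would imply,
# using the "up to 2Gb" figure from the Micron quote above.
cartridge_gbytes = 32
nor_device_gbytes = 2 / 8        # 2 gigabits = 0.25 gigabytes
devices = cartridge_gbytes / nor_device_gbytes
print(f"{devices:.0f} max-density NOR devices")  # 128
```

Real cartridges would presumably use multi-die packages or denser future parts, but the per-gigabyte cost disadvantage versus NAND is clear either way.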

Sir Williams

A cartridge is like an SSD: you sacrifice storage space for ridiculous speed. If done correctly you probably won't even have to install the game on the system's HDD. It would be a true return to plug-and-play.

dellagustin

I would say they are aiming to be a second console for people, plus a portable.

I wonder if they will be able to attract 3rd parties to this kind of console, especially now that PC, PS4, and Xbox One have very similar architectures. The NX would then be the one that requires more effort to port to.

BreadFish64

I don't see why they can't make a 64GB cartridge too.

Phobos

So is it the successor of the 3DS?

jtibbs

It's supposed to be its own independent thing. They haven't explicitly said it's a replacement for the handheld or the Wii U. However, it's beginning to look like it might be able to do both. Just barely though; take that theory with a huge grain of salt.

oGMo

But then they said that about the DS.

Phobos

Well to replace the 3DS it looks very promising but as a next gen console it doesn’t look like it.

Abram Carroll

Replacement for the 3DS and Wii U in one item.

booyaboo2

If this is true it will be a disaster for Nintendo: you'll be able to play mobile games but no AAA games. This is either utter BS or a majorly idiotic product.

dc

I suspect that most people buy Nintendos either for little kids or stay-at-home moms, so I doubt that having great specs will be a big deal. While this market doesn't appeal to me, one has to recall that hundreds of corporations make a great living selling stuff to kids.

Zunalter

and nostalgia obsessed adults.

Abram Carroll

Nvidia has streaming tech that actually works; they can offer better-than-console graphics with that. It will run UE4 natively, and Nvidia works with software devs like the best; that is, devs like Nvidia's tools.

booyaboo2

on a half a tflop of fp32?? hahahhahahahahaha lol, it will run mario bros and android games brilliantly lol what a joke. as one other poster wrote it is a console for small kids and such.

The final specs aren't out yet; it would probably be on the Pascal architecture and a 14nm process, which could push up to 512 GPU cores.

Abram Carroll

Close to XB1 if it's Tegra X1. If it's Pascal-based it could pass the PS4.

BTux

Oh no! I'd truly rather use an AMD APU/CPU/GPU over Nvidia. I hope this post is wrong.

Abram Carroll

AMD is old tech and their drivers and software suck.

BTux

Respectfully, I disagree. I just purchased a Polaris GFX card and am amazed by the price/performance ratio. Their drivers are perfectly fine. As a Linux and Windows user I haven't had a single problem. I do miss PhysX but am beginning to not care about it. Vulkan FTW

Lueiriueiur lueiriuediuR

Who cares about nintendo, ps and xbox is what matters!

forgerone

If Emily Rogers was a bona fide “Nintendo Insider” then she isn’t now. Nintendo would have fired her.
EVERY rumour regarding NVidia Tegra being used in NX can be attributed to Emily Rogers.
Rogers said in her “OPINION” NX would use Tegra.
And Hruska just hates to see AMD succeed so his very lame attempts at stock manipulation through lying in his pseudo journalistic endeavors just makes the author pathetic.

Joel Hruska

Uh. I’ve never heard of Emily Rogers.

I’ve been hearing rumors that NV won the NX for months. Again, not from anyone named Emily Rogers. Furthermore, Eurogamer was dead right on the PS4 Neo.

Finally — “hates to see AMD succeed.” Really? I mean, *really?*

Sweetie

“Finally — “hates to see AMD succeed.” Really? I mean, *really?*”

I don’t have that impression, unlike a few other tech sites which will remain nameless.

Joel Hruska

Thank you. :)

ralph richardson

I seriously doubt this article, unless the base station itself has plenty of power in order to run a system with the power of a PS4. Possibly the base station runs a full-graphics version of the game while offloading a dumbed-down version to the handheld so you can take it on the go. If that's the case, Nintendo could have a serious winner on their hands. As of now, I seriously doubt Nintendo is maxing their new system's graphics ability at handheld level. That would be suicide.

Joel Hruska

Eurogamer has a very good track record with leaks and they were dead on target with the PS4 Neo. This also matches what we’ve heard directly from Nintendo.

And let's not forget — the last generation, the 3DS became the anchor of Nintendo's business, while the Wii U sold like crap. Nintendo has sold nearly 4x more 3DS units than Wii Us, and the 3DS is only one year older.

bschool

Never underestimate Nintendo's ability to sabotage itself. Nintendo is proud of "marching to the beat of its own drum," but that drum is stuck in the 90s; they need to get with the program. Who knows, maybe they'll prove the critics wrong… probably not though.

Lorfa

You have to have a new game console that uses xpoint+5 GHZ 24 core xeon type CPU, and graphics power twice that of the new TitanX, 1024 GB+ HBM2, 10 TB storage, super optimized software platform and games. It has to run 250 fps steady at 16k res HDR. Otherwise it’s complete poo, like why bother.

Joel Hruska

Nice troll. :P

Also, have fun with your $25,000 console. The rest of us will be over here, playing games on 1/10 – 1/100 the budget. :)

Lorfa

Oh it will only be $300. (Again otherwise it’s poo). I’m just curious what it would take for a new console to avoid the typical criticism. Do you think that would do the trick?

Joel Hruska

Well, sure. I mean, there was a time when consoles really did tie PCs on launch day. It just required MS and Sony taking an enormous loss on the launch *hardware.* Sony lost more than $200 on every PS3 on launch day.

If you’re willing to run a huge deficit, you can do that, but neither company was.

Sweetie

Sort of. Cell was awful in terms of general-purpose CPU power and excelled, narrowly, in streaming. So, forget about complex AI.

The starkest case I remember is the NES. One could get a cheap NES console that had better graphics, sound, and CPU performance than a very expensive Apple IIe (albeit with much less RAM and a smaller resolution). To me, the Famicom was quite a cutting-edge console when it was released, specs-wise, in 1983.

Joel Hruska

Cell was sort of like an idiot savant. If your workload fit its characteristics, it was capable of unmatched performance in 2005. Making your workload fit its characteristics could give a man cancer via sheer frustration and there weren’t any “average” cases. You either got great performance out of Cell or your code ran terribly. There wasn’t much in-between.

Consequently, not many games took full advantage of what the chip could do. That’s my understanding.

Sweetie

I am under the impression that the CPU simply was incapable of things like advanced AI because it lacked the necessary components, in favor of streaming performance. So, even with highly optimized code one would be likely to have a very pretty but comparatively shallow experience. But, I’m no expert. I’m just relying on what was said. But, the point was made that Cell would have been unsatisfactory as a general-purpose PC CPU.

Joel Hruska

“I am under the impression that the CPU simply was incapable of things like advanced AI because it lacked the necessary components”

I think that if this were true, you’d have run the AI on the main PPE (Cell had a standard PowerPC core at its heart) and not the SPEs. But I can’t really speak to whether AI, specifically, was harder to optimize on Cell than other forms of code. Cell was considered very hard to use, period. So you’re completely right that it would’ve been a very unsatisfactory general-purpose CPU. It’s a bit similar to a GPU in some respects (which also make unsatisfactory CPUs).

Sweetie

That AI has been given short shrift fits into why Cell was chosen — because developers decided to focus on pretty graphics instead.

Joel Hruska

Developers have always focused on pretty graphics. That didn’t change when Cell came along. If you want to see what AI and procedural gameplay can generate you play something like Dwarf Fortress. ;)

Even when developers put more into graphics than into gameplay it doesn’t always fool consumers.

I understand that FF7 was developed, to a significant degree, to run on Nintendo SNES-era hardware with a CD-ROM but an intelligent company would have realized that pretty graphics and vapid gameplay (the card minigame in FF8 is the best part of the entire game) is an inferior strategy after the success of 7.

Joel Hruska

I agree with you re: FF8 vs. FF7. FF8 looked amazing but generally failed to deliver.

Sweetie

Overstated. There are plenty of titles that were a success in the history of gaming because of the quality of their gameplay not their graphics — with plenty of titles on the same systems with better graphics that weren’t successful due to empty gameplay.

However, if one creates a system that simply can’t do decent AI then it’s easier to make sure pretty graphics are at the fore.

Joel Hruska

I’ve really never seen anyone argue that Cell couldn’t do AI, though. I’d like to know where you are pulling that from.

“There are plenty of titles that were a success in the history of gaming because of the quality of their gameplay not their graphics.”

Of course there were. There are games that are great fun to play because of their storyline, games that are great fun because of their gameplay, and games that are amazing to walk around in (even if the gameplay isn’t great) because of their graphics. If I had to choose between gameplay, story, and graphics, I’d rarely choose graphics.

But that doesn’t mean developers don’t focus on building successively better graphics engines, or use those engines to market titles. How much of this any specific title does is, of course, specific to that title.

Toad Wallop

That'd be nice. I'll take one! I seriously do want two of those new Titan Xs. Probably will, but not right away. I have two of the old ones (as in last year's ;), but I still get drops under 60fps in 4K. Not in every game, but some. I want 60fps in every game in 4K. We'll see if this new one is actually 60% faster.

Mirimon

I had high hopes for Nintendo, but even the best rumors for them are little better than a bad omen.
Well, here's hoping they decide to simply publish to other platforms in the future… hopefully before they get so far behind the times that their game developers won't be able to keep up with today's industry. (Actually, it seems they are behind the times on both hardware and software these days… they have more personality than MSFT does, but that could be lost, or their downfall, soon if they are not careful.)

Zunalter

Yea the Nintendo fanboys screech about how amazing that new Zelda game looks, and all I can think of is, “Yea, it does look good, but would look WAAAAY better on any of the other consoles”.

Mirimon

True, very true. And it seems when it comes to game development the only advancement they've made is to copy/paste their characters into popular titles. I want to play this new Elder Scrolls: The Adventures of Link, but I agree it's 2016 now and they could have done better than N64-grade graphics… it's not even a retro art style thing; they simply don't know how to make a modern game.

Several other upcoming titles do the same: Yoshi, Kirby, etc…
Splatoon was fresh, and Smash Bros. is now a staple title. It is smart to take what is working, and popular, and use it. That's one step in the right direction, now seemingly ruined by a giant leap backwards in gaming.

Zunalter

I assumed the art style was a “we have no horsepower, we have a large open world and several physical systems, what is the best art we can get out of what’s left?”

Mirimon

That may be, but with the NX's rumored specs it seems they don't intend to work harder either… perhaps it's both a hardware limitation and general ineptitude. (Heck, Pokemon Go was made by a mobile company that simply took their last game and dumbed it down with a Nintendo style; a whole other issue there…)

I think this is an awful decision. I don't see how big.LITTLE or Denver could properly feed Pascal for higher-res graphics. 900p is aiming way too low; this seems like a company killer.

Joel Hruska

900p on a small tablet screen is more than enough. And realistically, that’s probably a good target for them (remember, the original claim was 900p @ 60 FPS).

Tegra devices in a tablet probably operate on a 5W – 8W sustained TDP. That’s much lower than the Wii U, which is one reason I think Nintendo will wait for 14/16nm — they’ll need it. But even so, they’re going to be facing thermal constraints.

It’s also why I can easily imagine them using a base station with a secondary GPU in it for offload. While ordinarily I’d expect consoles to eschew this solution (and Nintendo may not use it either) the advantage of being able to link up to a base station for better graphics may be too good to pass up.

Dan

Well, is it a console or is it a tablet? Seems like more of the latter, which will not be able to match the Neo or Scorpio. Pascal is great and all, but a few hundred GFLOPS vs 6 TFLOPS? This seems too bulky for a tablet, too weak for a console. I hope I am wrong; I have owned every Nintendo console, og Gameboy, Gameboy Color, had the Power Glove and the Track and Field mat. I love Nintendo. I just see this as shooting themselves in the foot.

Joel Hruska

I’ve theorized that NV could put a GPU in the base station and a GPU in the tablet, then use SLI to bridge them. So you’d get both in the docked configuration. But that’s pure speculation on my part.

Either way, even a mobile Pascal would give far better graphics than any 3DS title currently packs. They should have no trouble exceeding the Wii U, which uses a GPU core that’s roughly seven years old and is built on 40nm tech.

Zunalter

I don’t know if this is an endorsement of mobile Pascal as much as a condemnation of Nintendo’s “current-gen” offering.

RV

In the absence of FACTS Hruska just makes shyte up.

Skywax9016

It would probably only be 900p/60fps while docked rather than while mobile. It would be hard to build a device that sustains such a demanding graphical output while remaining portable and, more importantly, affordable.

don681

$299 Mario Maker Bundle

don681

And you can lug it around

don681

All this talk about performance, I betcha everything’s gonna come down to pricing. Given whatever it can do, how much?

Joel Hruska

Nintendo’s last few generations have targeted the $249 – $299 price point, so that seems a good place to start.

don681

A $299 Mario Maker bundle for my kid — that frees up a TV and a PS4 in my house — hmmmm

Zunalter

I am more concerned about the long term costs of the software. Given that Nintendo rarely discounts their titles appreciably, I would prefer a higher up front hardware cost and better long term software costs.

don681

They rarely do that because for the past 10 years, they haven’t captured enough market share to cover the costs of game development if they underprice their titles.

Unlike, say, PS3 “classics”; those have sold enough that they can discount new runs.

Zunalter

What are the costs of game development? They essentially repackage the same titles they have been working on for 30 years. I mean, there are obviously some mechanical differences and level-design work, but their hardware is so underpowered it’s not like they are doing next-gen engine design or anything. Maybe you could argue that getting gimmicky tech like their tablet to work properly was a large R&D expenditure, but overall it seems they should be able to run fairly lean.

Also, didn’t the Wii outsell every other console of its generation? I assume they had a decent chunk of the market.

Jonathan Freeman

I just don’t understand the angle they’re trying to go for. It allows some mobile gameplay, it has stationary elements, and the brains are in the controllers… does this mean the system will be upgradeable? Will they release new controllers with beefier specs to keep the console competitive? Can you add multiple controllers to boost performance? I always thought of Nintendo as more oriented toward local multiplayer; I hope this isn’t lost in the NX.

Sweetie

I wonder how heavy it will be.

Orion4tech

It already sounds like a fail.
Congratulations, Nintendo, you did it again.

I was hyped, but then later found out that you cannot buy more games for the NX. So yeah, pass.

Joel Hruska

Are you thinking of the Nintendo console they announced a couple weeks ago? That’s not an NX.

✪ Shamz ✪

Nope, the NX. I was hyped for old-school console games! Then I found out that no new games could be added. Also, it’s selling for 110 bucks around these parts… expensive, to say the least. May as well just buy a friggin’ PS4 for that price.

Joel Hruska

I am badly confused.

The Nintendo NX is a next-generation Nintendo console due out in March 2017. It will have many new games from Nintendo itself. The list price is expected to be between $249 and $299 based on Nintendo’s current pricing.

The Classic NES that Nintendo will offer for the holidays this year is NOT the NX. It comes with 30 preloaded games and can’t load any others. It will cost $60 USD, which would seem to explain the $110 price you mentioned.

Sigh. I wish Nintendo would be bold and go for broke with a console that matches the performance of its competition. Take that, add some special Nintendo-flavored features no one else has, and they could have something that plays any game from current developers and entices them into making more innovative games based on whatever kooky hardware/software they come up with. Super Mario 256 VR!

Zunalter

Since when is “matching the competition” considered going for broke and bold? I thought that was just…you know…doing business.

Sir Williams

They’d surpass the competition if they made a strong console with no gimmicks. Neither Sony nor Microsoft can compete with their first-party lineup.

Zunalter

They cannot compete with the nostalgia of their first party lineup, to be sure. The games themselves are solid but unremarkable.

darkich

Joel, your premise is basically that an architecture three or four generations more advanced (Pascal vs. old, far outdated AMD technology), built on a process node twice as advanced, doesn’t matter(!)

This thing, if it’s packing Nvidia’s latest, SHOULD be able to rival the Xbox One in raw graphics power, even when restricted to tablet power consumption and thermals.
The SoC in Pixel C is rated at around 600 GFLOPS, and it is a Maxwell part built on planar 20nm.
Do the math.

Joel Hruska

Comparing GPUs based on GFLOPS is a really bad idea for all sorts of reasons. Chief among them: it tells you nothing about all of the other parts of the GPU.

The Xbox One has 768 cores, 48 TMUs, and 16 ROPs at 853MHz. So right away, you see enormous differences between the two. In terms of FLOPS, the Xbox One will demolish Tegra X1. Gigatexels per second will also be higher on the Xbox One; 48 TMUs is 3x what Tegra X1 has. But pixel fill rates will be higher for Tegra, thanks to its higher clocks.
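
For the curious, the raw numbers behind that comparison can be sketched in a few lines. The Tegra X1 figures (256 shader cores, 16 TMUs, 16 ROPs, roughly a 1GHz boost clock) come from public spec sheets; treat this as back-of-the-envelope arithmetic, not a benchmark:

```python
# Theoretical peak throughput from published specs.
# FP32 FLOPS = shaders * 2 ops/clock (fused multiply-add);
# texel and pixel rates scale with TMU and ROP counts.

def peak_throughput(shaders, tmus, rops, clock_ghz):
    return {
        "GFLOPS":   shaders * 2 * clock_ghz,
        "GTexel/s": tmus * clock_ghz,
        "GPixel/s": rops * clock_ghz,
    }

xbox_one = peak_throughput(768, 48, 16, 0.853)  # ~1310 GFLOPS, ~40.9 GT/s, ~13.6 GP/s
tegra_x1 = peak_throughput(256, 16, 16, 1.0)    # 512 GFLOPS, 16 GT/s, 16 GP/s
```

Note the split result: the Xbox One wins FLOPS and texel rate by wide margins, while Tegra’s higher clock gives it the edge in raw pixel fill, which is exactly the pattern described above.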

Unfortunately for any mobile hardware, this comes down to thermals, and the thermals are incomparable. The Xbox One has a measured power consumption of 110W. Tablets typically target 5W – 8W; they may burst higher, but they become too hot to hold if they stay above that target for long. The net result is that most tablets (save for very large ones) drop their GPU clocks over time. That 1GHz clock, for example, is the maximum boost clock, not the average.

Nvidia’s GTX 10xx series appears to draw about 65% the power of its 28nm Maxwell GPUs. That does not give a handheld solution, even in a 10W form factor, the oomph it would need to match the performance of a 110W solution in a console form factor. You’d need an order-of-magnitude improvement for that.

Pascal is a great graphics chip and I have no doubt Nvidia can build a great mobile processor for Nintendo. But it’s not going to be capable of matching console perf in a single generational leap.

darkich

But again, you are too reliant on comparing outdated AMD cores and shaders with the far more modern and efficient Nvidia setup. Also, remember, I wasn’t talking about the Maxwell-based X1 whose numbers you bring up, but its supposed Pascal successor (X2?).
And you seem to make the mistake of linearly scaling perf/watt down to arrive at a performance estimate for this SoC.

And I’m sure you know very well that it doesn’t work that way.
As power consumption increases, the relative performance gain shrinks.
For example, let’s say that this SoC gives level 1 performance for level 1 TDP.
Doubling the TDP would give around 1.75x performance, quadrupling would give 2.6x, and 8 times more power would be enough only for around 3.4x greater performance on a given architecture, all things equal.
Consider now that one architecture is far more advanced than the other here, on top of being 40-50% more efficient from the get-go by die size alone.
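
The specific multipliers above are illustrative, but the shape of the curve matches a common first-order DVFS rule of thumb: dynamic power scales roughly with frequency times voltage squared, and voltage must rise with frequency near the top of the range, so power grows roughly with the cube of clock speed. Inverting that gives a toy model (an assumption for illustration, not a measurement of any real chip):

```python
# Toy DVFS model: if power ~ f^3 (from P ~ C * f * V^2 with V rising
# alongside f), then performance (~f) grows only as the cube root of
# the power budget. The real exponent varies by architecture and node.

def relative_perf(power_multiple, exponent=3.0):
    """Relative performance for a given power-budget multiple."""
    return power_multiple ** (1.0 / exponent)

for p in (1, 2, 4, 8):
    print(f"{p}x power -> {relative_perf(p):.2f}x performance")
```

This toy curve is even less generous than the figures quoted above (8x power buys only ~2x performance), which reinforces the point: a handheld power budget forfeits performance far more slowly than a linear scale-down would suggest, but the gains from extra watts taper off fast.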

And say what you want about FLOPS; they remain the best measure of *RAW* performance potential.
You are right that they don’t take into account any other aspect of the GPU, but neither do they account for the software, architecture, and optimization differences between the two.

Finally, you seem to take it as a given that the gaming-specific NX will be just like a general-purpose Android tablet in terms of SoC integration and hardware management, and that most probably won’t be the case.

Joel Hruska

“But again you are too reliant on comparing outdated AMD cores and shaders with the far more modern and efficient Nvidia setup.”

Sure. But raw throughput still matters. I can’t tell you exactly how well NV and AMD will compare in terms of efficiency, but I can tell you that the faster chip is almost always the one with more cores, texture units, and ROPs. (More power-efficient is obviously an entirely different story).

“With the increase of power consumption, performance gain gets relatively smaller…Doubling the TDP would give around 1.75x performance, quadrupling would give 2.6x, and 8 times more power would be enough only for around 3.4x greater performance on a given architecture, all things equal.”

You actually raise an interesting question with this, but it’s really hard to tell. Chips tend to target sweet spots. AMD, for example, claims that the RX 470 will be 2.8x more power-efficient per watt than the R9 270X. The RX 480 was about 1.7x more power-efficient than the old 270X. We definitely see non-linear scaling, as per your point.

But knowing where that scaling stops or what the sweet spot *is?* I think that depends on the process node and the specifics of the architecture, right down to the transistor level. Certainly modern x86 CPUs don’t get anything *like* 2x performance from 2x TDP above a certain point.

“Finally, you seem to take it as a given that the gaming specific NX will be just like a general purpose Android tablet in terms of SoC integration and hardware management, and that most probably won’t be the case.”

I didn’t mean to imply that I thought Nintendo would build a stock Android console, no. They have a great deal of experience building great games that really leverage limited hardware capabilities. So there are two ways to look at this, I think.

On the one hand, there *is* a certain mathematical tyranny involved. A 1080p screen has 2.07 million pixels; a 4K screen has 8.29 million. Your peak pixel fill rate is ROPs * clock speed. You can pull all sorts of nifty tricks and buffering strategies, but you’ll never drive 4K on a 16-ROP partition (I’m not arguing that you said otherwise, just making a point).
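
That arithmetic is easy to check. The ROP count and clock speed below are illustrative assumptions (16 ROPs at 1GHz), and note that real games multiply the per-frame pixel cost many times over through overdraw and post-processing, which is where the raw headroom actually disappears:

```python
# Peak fill = ROPs * clock; per-frame demand = resolution * frame rate.
# 4K at 60fps needs exactly 4x the raw fill of 1080p60, before overdraw
# and post-processing multiply the cost several times further.

def pixels_per_second(width, height, fps):
    return width * height * fps

peak_fill = 16 * 1.0e9                       # 16 ROPs at an assumed 1 GHz
p1080 = pixels_per_second(1920, 1080, 60)    # ~124 million final pixels/s
p4k   = pixels_per_second(3840, 2160, 60)    # ~498 million final pixels/s

print(peak_fill / p1080)  # raw-fill multiple at 1080p60
print(peak_fill / p4k)    # exactly a quarter of that at 4K60
```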

I don’t know what the capabilities of the NX will be. I *do* know that working in a mobile TDP will impose constraints that wouldn’t have existed in a regular console form factor. That’s one reason I’ve wondered if NV might use a form of SLI to boost performance.

On the other hand, Nintendo has repeatedly demonstrated that it can build fabulous games for its own hardware. The Wii U might not have gotten very many games, but it had some titles that ranked among the best of the generation. Smash Bros., Splatoon, the usual top-notch entries in the Mario series: Nintendo continues to demonstrate that you can build a great system without leaning on giant hardware specs. I think this will continue with the NX.

Ultimately, great *games* are what people care about. I don’t think Nvidia can actually match the performance of the Xbox One or PS4, because the mobile comparisons that try to do this are almost always full of shit. I’ve written about this before, sometimes referencing NV, sometimes not.

But that doesn’t mean I don’t think the NX can deliver a meaningful improvement on the Wii U and some very impressive visuals at the same time.

darkich

Alright. Nice discussion, I appreciate the reply.

darkich

Just to add one strong example to illustrate how “more cores, more texture units, more ROPs” can be very relative:

Compare two Nvidia processors from just one generation apart: the GTX 1080 and the GTX Titan X.
The latter has significantly more of everything, yet the former easily trumps it performance-wise, while consuming less power!

So yeah, I have to reiterate my point: quantity does matter, but architecture and process node matter perhaps even more.

darkich

I’ll just add that, even if NX doesn’t offer oomph to rival the Xbox One, games should look pretty damn good.

This is a really good video, which adds some perspective. The best point, imo, is about the PS Vita, which was capable of some relatively incredible visuals with its tiny ARM chip on a tightly integrated platform.

And just for the heck of it, here’s how good a multi-platform (iOS and Android) game can look on a current 2W SoC, running at 1080p and 30fps (iPhone 6s Plus), AND at the full resolution of the iPad. It looks gorgeous.

Good luck, Nintendo; too bad you’re losing out on AMD’s Zen technology. Remember who made Intel more competitive: it was AMD. Not bad for a much younger and smaller company… AMD will rule once again… soon, just watch and see…

Mark Braga

AMD is working on 8K and 16K resolutions…

Lex Luger

Everyone is being taken for a ride, lmao. THE NINTENDO NX WILL NOT BE A PORTABLE CONSOLE WITH WEAK HARDWARE. Okay, stop listening to the lies and rumors. LISTEN, I’LL TELL YOU HOW IT IS, PEOPLE.

The Nintendo NX is a portable console that will use the Nvidia Tegra X1 chipset for the portable side. When you want to play at home, you connect the handheld to a supplemental computing device, which is basically a full-sized desktop GTX 980 graphics card, used to advance the home-console experience to somewhere around the PS4 Neo power level. You will have to purchase the supplemental computing device separately if you want the more advanced gaming experience and power.

The reason Nintendo is going this route is absolutely brilliant: they get the handheld aspect for all the kids who want to play handheld games with full graphics never seen before on a handheld device, and the hardcore gamers who want more power can purchase the supplemental computing device separately, hook it up to the NX tablet, and use it on their home television. You don’t have to purchase it unless you want the extra power and the extra-advanced level of graphics. This way, Nintendo caters to both the kids and the hardcore gamers at the same time: everyone can buy the regular Nintendo NX on the cheap and pick up the supplemental computing device later, since it’s not a necessity but an addition that brings the console to the same level of power and graphical fidelity as the PS4 Neo. It’s that simple, guys.

THE SUPPLEMENTAL COMPUTING DEVICE IS BASICALLY A DESKTOP GTX 980 GRAPHICS CARD, A STANDALONE CARD WITH ITS OWN POWER SUPPLY, AND IT HOOKS UP TO THE NX HANDHELD VIA NVIDIA’S VERSION OF AMD XCONNECT. People, I know 100% that this is exactly what the Nintendo NX is and how it’s going to work.

Mirimon

Well… history has shown us they will not only be the weakest of the group, but defy conventions of many types… lol. I mean, going back to cartridges on a home console????? At this point I wouldn’t be surprised to see a coax-to-HDMI converter dongle in the box…

Call it what you want, it’s a tablet in a box…

Patrick James Gilmour

I like Nintendo, and as awesome as that sounds, do you have anything to source those claims? Claiming the secondary unit will add additional GPU power is not 100% known yet, let alone whether it will be on the same level as a GTX 980.
