We usually have to wait for a new piece of hardware to actually be on the market before we can link to the obligatory teardown showing the internal components. But Nintendo has beaten the iFixits of the world to the punch this time around, hosting an official, picture-filled discussion of the Wii U's internal hardware as part of its regular "Iwata Asks" interview series more than a month before the system hits stores.

Nintendo's focus is the Wii U's multichip module (MCM), which contains both a multicore CPU and a GPU along with on-chip memory, all on a single substrate. Positioning these chips so close together reduces latency and power consumption, the Nintendo engineers explained. This in turn keeps the size of the hardware down, a priority for the team.

Despite the advantages of the MCM design, combining chips from Renesas (RAM), IBM (CPU), and AMD (GPU) into a single component was a challenge. When defects became apparent during the testing process, isolating which piece of the MCM was responsible proved tougher than with a more spread-out design. When the component manufacturers would insist that another company's chip was responsible for problems, Product Development Deputy General Manager Ko Shiota said he forced each company to design a robust testing regimen to "prove your own innocence."

While the integrated MCM design of the Wii U reduces the number of major heat source locations on the hardware, the more powerful chips on the system generate three times the heat of the original Wii, developer Yasuhisa Kitano said, necessitating a larger heatsink and fan assembly in the system. The engineering team performed over 2,000 tests to determine the perfect balance between size, heat dissipation efficiency, and noise generated by the fan system, finally settling on a design that draws air from an inlet on the side and pushes it out the back.

Other tidbits from the teardown and discussion:

The Wii U casing was designed to be set horizontally, though the system can be positioned vertically with the stand that comes with the Deluxe Set.

There is at least one version of the Wii U with a clear casing that lets you see the internal components. "You've got to sell this to me!" Nintendo President Satoru Iwata said jokingly when he saw the unit.

Promoted Comments

One way to interpret Nintendo's design choices is to look at what they were trying to accomplish. What I mean is, they wanted a system they could launch at $300 in 2012 without losing money, that could fit in an elegant package without the kind of heat problems that lead to RRoD- or YLoD-type failures, that was backwards compatible, and that could display 1080p HD output to a monitor while simultaneously streaming to a relatively high-def pad display.

Given all that, it's not surprising that they would go with a GPU that seems disproportionately more powerful than its CPU. They cut some corners with the CPU to save heat and money.

I agree. In fact, until Nintendo releases a console with an NES slot, SNES slot, Gamecube compatibility, controller ports for all of those things, and the ability to hook up to an original Donkey-Kong arcade cabinet and use its controller and monitor, Nintendo just won't get the valuable customer that is me.

I assume this is sarcasm. If it isn't: /rolleyes. If it is: /golfclap.

Just keep your Wii if you have one. There comes a point where supporting a console after 11 years is no longer required. Nintendo supported the GameCube from 2001 to 2007 (when it was discontinued), then further supported it through the Wii.

Besides, Nintendo could easily add GameCube support through the Virtual Console feature. This wasn't necessary with the Wii, since it could actually read GameCube discs.

At least Nintendo has been consistent about backwards compatibility on its disc-based systems, supporting the previous generation (so far). The 360 covered a good portion of games, but compatibility was never guaranteed. Sony flip-flopped to save costs; hopefully the savings (probably) outweighed the lost sales from people who wanted backwards compatibility or no sale at all, or else the move didn't make much sense.

Wow, the CPU is REALLY small! And we know it is built on 45nm, not any of the smaller, newer fab processes. So maybe developers were right in saying the GPU is decent but the CPU is a bottleneck. Size does not equal performance, but there's only so much you can do with so few transistors at 45nm density.

The heatsink and fan are similarly small, no surprise given the low power draw, but it means limited headroom. I wonder where this stands in comparison to the PS360; I really want to know more. A bit over double the RAM of the 360 (subtract some for the 360's OS) is good, and a somewhat modern unified-shader GPU architecture is good, but what about CPU performance, GPU fill rate, and so on?

It seems that the 1GB of memory reserved for the OS is on a separate, slower die, so there's no hope of it being freed up for games (or at least little hope), and probably a performance difference between the two 1GB RAM pools, since the 1GB for the OS sits lower in the hierarchy.

"Memory hierarchy: Structuring computer memory in layers. The human brain has short-term memory for remembering information related to a certain matter currently in progress and long-term memory for long-term storage of information unrelated to immediate circumstances. Likewise, a computer transfers and manages data by layering storage, with the CPU at the top, high-speed low-capacity cache memory serving as short-term memory underneath, followed by low-speed large capacity main storage for managing hardware, and auxiliary storage for managing the OS on the bottom."

I agree. In fact, until Nintendo releases a console with an NES slot, SNES slot, Gamecube compatibility, controller ports for all of those things, and the ability to hook up to an original Donkey-Kong arcade cabinet and use its controller and monitor, Nintendo just won't get the valuable customer that is me.

I assume this is sarcasm. If it isn't: /rolleyes. If it is: /golfclap.

Why should it be sarcasm? The GC was two generations ago. Get over it. Blame Nintendo for goofy controller designs that limit GC game replayability on their future consoles, but to say it's a no-sale because there are no GC controller ports? That's a bit dumb, frankly.

The "heatsink shield" (second page of the Nintendo link) is interesting. I wonder what makes their chips give off more EM (I'm not aware of any other systems that require a Faraday cage around their CPU/GPU, but I could be mistaken).

I doubt the GPU's on-board memory is 2GB. 2GB is too big to be embedded with a GPU (rumors say 768KB of embedded DRAM for the GPU). You can see four DRAM packages on the motherboard, located north and west of the MCM.

Eurogamer says that it looks like there's 32MB of eDRAM integrated in the graphics core.

That would make more sense, as 768KB would barely hold a single 640x480 image at 16bpp. And that's just the color back-buffer; that doesn't even count the depth buffer.

For comparison, the GC/Wii had about 3MB of eDRAM, which is sufficient for 2 640x480 images at 24bpp (back-buffer and z-buffer) + 1MB of texture swap-space/cache.

32MB is enough for a 1920x1080 backbuffer, frontbuffer, and depth buffer (or just the back and depth buffers with 2x multisampling, if you're into that sort of thing), all 32-bits per pixel, with room left over for texture swap-space.
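As a back-of-the-envelope check on the numbers above, here is a small sketch (my own arithmetic, using the resolutions and bit depths the comment discusses, not confirmed hardware specs):

```python
# Rough framebuffer-size arithmetic for the eDRAM figures discussed above.
# Resolutions and bit depths are the ones from the comment, not official specs.

def buffer_bytes(width, height, bits_per_pixel):
    """Size in bytes of one width x height buffer at the given bit depth."""
    return width * height * bits_per_pixel // 8

MB = 1024 * 1024

# 768KB vs a single 640x480 color buffer at 16bpp:
vga_16 = buffer_bytes(640, 480, 16)  # 614400 bytes = 600 KB, most of 768KB
print(f"640x480 @ 16bpp: {vga_16 / 1024:.0f} KB")

# GC/Wii: two 640x480 buffers at 24bpp (color + depth) out of ~3MB eDRAM:
gc = 2 * buffer_bytes(640, 480, 24)
print(f"GC back+depth: {gc / MB:.2f} MB, leaving ~{3 - gc / MB:.1f} MB for textures")

# Rumored Wii U: 32MB eDRAM vs three 1920x1080 buffers at 32bpp
# (front + back + depth):
hd = 3 * buffer_bytes(1920, 1080, 32)
print(f"1080p front+back+depth: {hd / MB:.1f} MB of 32 MB")
```

The totals line up with the comment: a 16bpp VGA color buffer alone nearly fills 768KB, while triple 32bpp 1080p buffers fit in 32MB with several megabytes to spare.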

The Wii U no longer has GameCube compatibility, so it makes sense to remove the ports. It's too bad for those Wii games that supported GC controllers, but it was bound to happen. Besides, the new "budget" edition of the Wii also has all GC support removed, so this removal isn't new to the Wii U.

2GB main memory, plus probably around 32MB eDRAM on the GPU package. The 2GB is separate.

Good for Nintendo for showing this beforehand. Should settle a lot of questions people have about the hardware before launch.

It settles a few questions, but most of mine remain. Is that 2GB one unified pool, or separate, so that the 1GB reserved for the OS can never be freed up? What about GPU and CPU performance? Core count, clock speed, etc.? Apart from the 3x higher thermals, which we could have guessed from the power draw, there aren't many tech specs here.

Consolidating the CPU, GPU, etc. into a single package is fairly standard for nearly every console; it just usually happens partway through the console cycle. I suspect if you pried open the new Super Slim (or even Slim) PS3, as well as the Slim 360, the arrangement wouldn't be all that different.

I do find it interesting that Nintendo chose to launch directly like that. In many cases with the other manufacturers, the technology hadn't matured enough for the newer chips to make that merge technically feasible pre-launch. Fortunately (for Nintendo), it does mean their bill of materials isn't going to be huge on that portion of the platform, so they should presumably still be able to make money off the system at launch.

Unfortunately, it also leaves less room for cost savings down the road, so I wonder if it might mean less potential for price drops as the system ages.

I doubt the GPU's on-board memory is 2GB. 2GB is too big to be embedded with a GPU (rumors say 768KB of embedded DRAM for the GPU). You can see four DRAM packages on the motherboard, located north and west of the MCM.

Have you seen 64 GB MicroSD cards?

That kind of memory is orders of magnitude slower than RAM, and it takes less space for equivalent capacity. You can see that the Wii U has off-die RAM packages, which are no doubt the 2GB main memory, while the GPU probably has something like 32MB of eDRAM on-die.

After seeing this, I think I see what Nintendo is thinking. Three or more years from now, when they get this stuff built on a 22nm process or smaller, Nintendo will release a Wii U DS of sorts, finally merging their portable and console divisions. I'm expecting them to keep the 6.5" screen for the bottom, use a 7+" 720p screen for the top, and remove the disc drive, making it digital-download only with 32GB or maybe 64GB of onboard storage plus support for some kind of SD card. I figure it'll still have a sensor bar and support for alternate controllers (because why not), be about as thick as a MacBook Air, with no grips, and without the extra-comfy shoulder buttons of the home console counterpart.

Air flows in from the side and ... hits the closed side-wall of the heatsink. What's that silver-coloured block in front of the heatsink? Is that supposed to channel the air (through a rather narrow side opening)?

The heatsink is slanted on one side to let air in from the side, seen here

So perhaps you are right; the white thing between the optical drive and the heatsink may be a shroud to channel air the right way (rendering the arrow in their diagram wrong?). I can't see how else that would work.

Edit: The heatsink shield also has holes in it, so maybe air just goes in through the slanted part, in which case I don't know what that other silver thing is.

The photos seem to be at the same scale, so to my eyes it looks like the silver block between the sink and drive starts at the point where the shorter side-wall on the left of the heatsink ends, which means it would cover the area with the slanting fins.

Obviously they aren't looking to dissipate a lot of heat here: the fan is pretty small and probably runs relatively slowly to keep down noise. I just wonder why they didn't orient the fins and airflow so it went straight through from side to side. They could still have kept the option for a vertical stand by putting a bit of clearance on the legs.

Kyle Orland / Kyle is the Senior Gaming Editor at Ars Technica, specializing in video game hardware and software. He has journalism and computer science degrees from University of Maryland. He is based in Pittsburgh, PA.