
132 Comments

Hey Anand, are you planning to tear down the gamepad itself? I've only been reading about it for the last 3 hours, so I'm wondering how big the battery is; I can't find that info anywhere and there's no teardown of the gamepad either. Thanks.

That's not a bad idea. I'm tickled that Mr. Shimpi went through the trouble of tearing the Wii U down just to take a peek at its processors, but I suppose now I wouldn't mind seeing the inside of the controller as well. You give us an inch and we want a mile.

Smaller battery capacity than many new smartphones, with a larger screen. That's where the battery life of 3-4 hours comes in. I expect it will be very little time before third party extended batteries come out.

They are not comparable in that regard. The Wii U controller is not running the games; it is streaming the video from the console. So it only has decode hardware for the stream, it powers the display itself, and of course the control surfaces.

It was $15 at launch; if you cared that much about a review, go buy it and review it yourself.

I'm pretty sure you can still get it for $15 (don't know when the promo for "buying a pc with windows 7" to upgrade to 8 stopped, but they did not require any proof of purchase or anything else).

Thank you for this teardown, Anand. Though I'm not going to be jumping on this bandwagon, it was a good, informative read--and one I'd prefer over a Windows 8 review (something anyone has had access to for over half a year with the release previews and, like I said, a $15 launch cost).

Ryan and Jarred are working on the Windows 8 Performance Guide; we already posted a take on the modern UI in our Windows RT Review, which hit shortly after the Surface review. Lots of stuff coming, it just all takes time.

I do like Windows 8 a lot. The major UI change we really covered in our Windows RT review (RT has the same UI as Windows 8). The rest is coming in our Windows 8 Performance Guide. I don't remember the last time I wrote an iOS review, and the same is true for a Windows review. Other folks are actively working on filling in the blanks on our Windows 8 coverage now :)

Isn't 6GB/s ridiculously low considering the PS3 and 360 are over 20GB/s each? I've read on other sites that the memory is closer to 17GB/s, also based on the markings.

And holy moly, that CPU die is small! Even smartphone SoCs are larger than that. With three cores seeming most likely, each core is pretty tiny, and that's with the eDRAM on die. I wonder if there is truth to the CPU not out-muscling the PS360 CPUs then, even with their age. The Cell on the same 45nm fab is still over 100mm^2 if I remember right, and that's only 200 million transistors.

By the end of next year, they might have that MCM fully integrated and shrunk to 32nm. They could be selling consoles for $139, and even turning a profit on each one. Meanwhile, the other consoles will be selling for $300-$400. What does that extra money really get you? Everything is still pretty much stuck at 1080p.

I totally agree; 1080p will be the standard for a long while. Beyond that, price is very important. It took me 5 years before I bought my first PS3, at $250 (a discounted price). I got my Wii during its first year of deployment.

Games could look massively better at 1080p. The next consoles from MS and Sony will have way more RAM and processing power. No more blurry textures, poor AA, crap physics, and low polygon counts like on all current consoles, including the Wii U.

We aren't even anywhere near photorealistic graphics at 1080p. PCs could get near it right now, but are completely held back by all the console ports. You're extremely narrow-minded and short-sighted.

That's a VERY crap comparison. They don't even say what other version is being shown, and the frames look messed up, as if it's interlaced video for whatever console they're comparing the Wii U against.

Wow, I wonder if even a large eDRAM cache can offset that much speed difference from the PS3 and 360. It's larger in capacity so it would be doing less loading/unloading than them, but some things still depend on memory streaming.

Judging from the pictures, the part number is H5TQ4G63MFA-12C correct?

From their part number and this PDF (http://www.skhynix.com/inc/pdfDownload.jsp?path=/u... it is indeed rated for 800 MHz. I believe that is the base clock, though, so the data rate is actually 1600 MT/s. Note that the GDDR5 speeds, which go up to 7 GHz effective, are not represented in that decoding table. Thus bandwidth for the Wii U would be 12.8 GB/s.
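That back-of-the-envelope figure is easy to sanity-check. A minimal sketch, assuming a 64-bit aggregate bus (four x16 chips, which is not confirmed) and treating the 800 MHz figure as the base clock:

```python
# DDR transfers on both clock edges, so an 800 MHz base clock
# gives 1600 MT/s per pin.
def ddr_bandwidth_gbs(base_clock_mhz, bus_width_bits):
    transfers_per_sec = base_clock_mhz * 1e6 * 2  # double data rate
    return transfers_per_sec * bus_width_bits / 8 / 1e9  # bytes/s -> GB/s

# Four x16 chips -> 64-bit aggregate bus (an assumption, not a confirmed spec)
print(ddr_bandwidth_gbs(800, 64))  # 12.8 GB/s
# If 800 were instead the effective rate (i.e. DDR3-800), it halves:
print(ddr_bandwidth_gbs(400, 64))  # 6.4 GB/s
```

This is where both the 6.4 and 12.8 GB/s figures argued about in this thread come from: the same bus, with the 800 MHz marking read as either the effective rate or the base clock.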

RV740 means a 4770, i.e. only 640 (VLIW5, DX10.1) Radeon shaders. If it were based on the 5770 it would have 800 (VLIW5, DX11) shaders, but then the die would be too large: it would have to be 166mm^2, and Anand only found about 156mm^2.

And a CPU that has such a small die size, only 32.76mm^2... I am pretty sure the Tegra 3 CPU die area is larger than this (once you remove the area dedicated to the GPU and the companion core).

He just said the size is a bit bigger than the RV740; that doesn't mean it's an RV740 in there. With the supposedly pretty large eDRAM throwing off estimates, the GPU core could be pretty customized.

Considering the Xbox 360 GPU is roughly on par with the Radeon 6450 (at least in compute power) this seems to be a pretty solid upgrade on the GPU side.

I do agree on the CPU though. I think the 3DS silicon may actually be larger than that CPU die. Given the use of gamepads, I'd have thought that the CPU would need to be pretty strong. Guess we'll have to see what happens later on.

The low-end graphics cards have always been so poor in gaming performance on the PC side; with every iteration they seem to keep the same core performance and only add some compatibility stuff. Always disappointing.

From my research, the Xenos (Xbox 360 GPU) puts out about 240 GFLOPS single-precision, and the Radeon 6450 tops out at 240 GFLOPS as well.

Of course that doesn't tell the entire story due to the extra eDRAM and a few other tidbits, including microcode optimizations, but yeah...
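For what it's worth, both 240 GFLOPS figures fall out of the same peak-throughput formula; the shader counts and clocks below are the commonly cited ones for these parts, not anything verified in this teardown:

```python
def peak_gflops(alus, flops_per_alu_per_clock, clock_mhz):
    # peak single-precision throughput: ALUs x flops/clock x clock
    return alus * flops_per_alu_per_clock * clock_mhz * 1e6 / 1e9

# Xenos: 48 VLIW5-style shaders (240 ALUs), MADD = 2 flops/clock, 500 MHz
print(peak_gflops(48 * 5, 2, 500))  # 240.0
# Radeon 6450: 160 stream processors, MADD, 750 MHz variant
print(peak_gflops(160, 2, 750))     # 240.0
```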

Low-end GPUs exist nowadays to upgrade older PCs for playing HD video well, or otherwise to upgrade from older (pre-Intel HD 3000) integrated graphics, mainly for HTPC use. Because its target market doesn't rely on performance, there's little point in making an entirely new core unless there's a fancy new video decoder or encoder.

The XBox 360 has 48 shaders that would best be described as pre-Radeon HD 2000 series class. They're not DirectX 10 compliant, at least with what shipped with Vista. The XBox 360 was originally supposed to be DX10 compliant hardware but Vista was delayed and the PC spec changed while the XBox 360 GPU hardware was already completed.

Anyway, this puts the GPU between a Radeon X1950GT and a Radeon HD 2900GT in terms of capabilities and performance. GPU efficiency has crept upward with AMD's VLIW5 designs over time, which would actually make the Radeon 6450 slightly faster than the Xbox 360's GPU.

It appears that the Wii U's GPU has 32 MB of eDRAM on die. More than likely this eDRAM takes up half the GPU die, but it should be well worth it in terms of performance. I suspect that there are 96 VLIW5 shaders (480 ALUs), 32 TMUs and 16 ROPs. While everyone is drawing comparisons with the RV740, chances are this design incorporates many of the efficiency improvements in AMD's VLIW5 architecture that made it into the Barts design (Radeon 6870).
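Purely speculative, but if that 480-ALU guess were right, peak shader throughput at a few plausible clocks would look like this (none of these clocks, nor the ALU count, are confirmed):

```python
def peak_gflops(alus, clock_mhz, flops_per_alu_per_clock=2):
    # MADD counts as 2 flops per ALU per clock
    return alus * flops_per_alu_per_clock * clock_mhz * 1e6 / 1e9

for clock in (400, 550, 800):  # hypothetical GPU clocks in MHz
    print(clock, "MHz ->", peak_gflops(480, clock), "GFLOPS")
```

Even the low end of that range would sit well above the ~240 GFLOPS figures cited for Xenos earlier in the thread.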

The PS3's RSX is a 7900GT with half the memory bandwidth and half the ROPs, so it's much slower than the desktop 7900GT.

The GPU in the 360 is probably at best as fast as an X1800XT despite its unified shader architecture, because it is also memory-bandwidth crippled.

If it has an HD 4770, that is already at least 2.5-3x faster than either the PS3's or 360's GPU. There is probably little doubt that the next PS4 and Xbox will be more powerful, but this console is definitely more powerful than the current generation.

The PowerPC is not competitive anymore. I doubt IBM has thrown any R&D at it since Apple jumped ship. And yes, it is a derivative of POWER, which IBM is still developing, but there never was that much synergy between POWER and PPC, probably less so these days.

It's pretty safe not to compare it with Sandy/Ivy Bridge. But I'm pretty sure even a dual-core Clover Trail or even Brazos would outperform this thing.

So, my question is, why is IBM still winning these contracts? Is it just ISA compatibility? I'm not sure that's such a big concern these days, considering how quickly even the PS3 got rid of its PS2 compatibility mode. I mean, does anyone actually want to play low-res Wii games on the Wii U? What for?

And if we are to put ISA compatibility aside, why is IBM still winning these contracts? Is it because Intel isn't even bidding for them, due to low margins?

It's because Intel won't license chips out. IBM wins because they sell the design of the chip, and then console maker X can modify it and shrink it on their own schedule. And also, there are whispers that the PS4 and Nextbox may be using AMD CPUs (or APUs), and AMD already has experience catering to consoles.

The weird thing about AMD's new strategy is that it doesn't necessarily mean an x86 chip inside an APU from them anymore. They already have an ARM license, and I'm sure a licensing deal could be worked out with IBM to include a PowerPC core if the console maker desires (note that the Xbox 360 now uses a triple-core PowerPC + Radeon-based GPU on one die).

Actually, the difference between POWER and PowerPC nowadays is purely marketing. The divergence pretty much ended with the POWER5 (though AltiVec SIMD wasn't added until the POWER6). The distinctions between POWER and PowerPC in terms of hardware support are in SoC features like on-die cryptography, TCP/IP offload and accelerated memory compression that fall outside the CPU core.

And IBM has been developing new PowerPC-based cores. The Wii U could use the embedded PowerPC 476 core or the PowerPC A2 core. These cores would be a minor step up from what is found in the Xbox 360 or the Cell PPE inside the PS3. Sandy Bridge is a far more aggressive core design than the PPE, PPC 476 or PPC A2, but it is also larger and consumes more power. The main reason IBM gets these contracts is their willingness to design and manufacture a custom chip.

As for POWER, the new POWER7+ takes the performance crown in the high-end server world. It manages to top the 10-core Westmere-EX and the 8-core Sandy Bridge-E.

You mention Hynix DDR3-800 devices, but I guess the 800 means 800MHz, which would translate into DDR3-1600. So that's 12.8GB/s instead of 6.4GB/s. Almost every laptop sold these days is equipped with DDR3-1600 and it's dirt cheap; I would even assume that DDR3-800 might be more expensive these days.

The AMD GPU also functions as the northbridge and southbridge (just as on the Wii), so don't forget to take that into account. The RV740 was 137mm². That doesn't leave much space for the northbridge, southbridge and eDRAM. The RV740 also had a power consumption of around 60-80W. Even with a decreased clock speed that wouldn't fit into the 33W of the Wii U. Either it's much less than an RV740, or it's not 40nm.

I see two possible candidates for the CPU. My first guess would be the PowerPC 470: multicore capable, very low power consumption, very small, customizable, but the speed is more in the range of an ARM core. It would make sense, I guess, since many developers mention the lack of CPU performance. My second guess would be the PowerPC A2: multicore capable, low power consumption, small... but not really meant for something like a console (though still possible).

Thanks for the info. I don't know where you're seeing the -12C speed grade in that PDF, however.

It didn't even cross my mind that the Wii U might have that much eDRAM.

From watching the teardown PCPer did, I can confirm that there are no further DRAM ICs on the underside of the PCB (I could identify what I believe to be the SLC NAND chip for the OS that the Wii U is supposed to have, which has been mentioned in this teardown, plus a bunch of smaller ICs whose role isn't obvious to me).

They are 256 MB x16 chips... so with 4 chips in the array we are looking at 1 GB total RAM. So unless 4 more are hiding underneath, the video RAM is somewhere else, and this RAM is the RAM being used by the OS, which is fine, seeing as most PCs today use DDR3-1600 for OS use...

What is on the underside? In any case, your estimate of 4 Gb (512 MB) per chip is flat-out wrong, as they are 256 MB per chip with that model number.

If there are none below, then this might mean that the video RAM is that small extra chip near the GPU/CPU, and that the system RAM is separate from video RAM, like on a proper gaming PC. Which would debunk the slow-RAM-bandwidth theory, because if there is only 1 GB of the DDR3 present, this is the RAM Nintendo said would be reserved for OS and system use only and not be available to developers.

I just think you are having a problem with basic math: 256 MB at x16 bus width per chip means that the entire RAM allocated is 1 GB total. There is no 4 GB on the Wii U to begin with, so touting that number makes you look insane. Nintendo themselves said that 1 GB would be dedicated to the OS (from what we can see, those Hynix chips) and 1 GB for developers; that is 2 GB.

But hey it's not the first time in this article alone you guys have posted wrong technical data.

It's as bad as NeoGAF originally claiming the Samsung chip was the system RAM; turns out that was the 8GB eMMC chip, aka the built-in hard drive...

If they are 512 MB each then yes, that's all 2 GB of RAM (video and OS), which paints a very, very bleak picture for overall memory speed and layout.

Looks like no amount of frame buffer RAM will fix it being 50% slower than even the 7-year-old Xbox 360...

To be honest this is disheartening, especially given the system price and hopes of seeing proper ports from this gen and next gen (720/PS4).

I hope Nintendo hasn't cheaped out to the point of no return.

Sorry for the hassle earlier guys, but a misread of those numbers was, quite honestly, from a hardware point of view the Wii U's last hope of being viable when the big 2 hit the arena. With that settled, and no other RAM on the device aside from eDRAM, there's not much else to contemplate.

Anand is spot on; it's actually written on the chips themselves (emphasis added): H5TQ ***4G*** 63MFR -12C. Hynix uses that in their code to differentiate their RAM capacities; the same code with "2G" or "1G" replacing the "4G" refers to the 2 Gb (gigabit, not byte) and 1 Gb models respectively. Evidence on the chip and from Hynix here:
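A quick sketch of that decoding; the field positions are my own reading of the Hynix nomenclature, so double-check them against the datasheet before relying on this:

```python
# Hypothetical decoder for the density/organization fields of the part number.
part = "H5TQ4G63MFR-12C"

density_gbit = {"1G": 1, "2G": 2, "4G": 4}[part[4:6]]  # "4G" -> 4 Gbit per chip
width_bits   = {"4": 4, "8": 8, "6": 16}[part[6]]      # "6"  -> x16 organization

chips = 4                                  # four packages visible on the board
total_gbytes = density_gbit * chips / 8    # 16 Gbit -> 2 GB
bus_bits     = width_bits * chips          # four x16 chips -> 64-bit bus
print(total_gbytes, bus_bits)              # 2.0 GB over a 64-bit aggregate bus
```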

I agree; the tech sheets conflict a lot, which is why it was confusing. That being said, I can't help but feel a bit sick about the Wii U's memory layout. They have shot themselves in the foot again...

Sorry again for the hassle; I just wanted 100% backing on the RAM because frankly I was hoping there would be more to it, but Nintendo clearly cheaped out, so not much to look forward to hardware-wise lol. Cheers

It's a shame Nintendo didn't choose to modify something in the mid-tier Radeon 5xxx series like the 5770. It might put them out of the running for keeping up with the next-gen MS and Sony consoles; if it doesn't support DirectX 11, that is a big problem for the future.

We don't know whether it does or doesn't yet, though; there are still no hard facts on the GPU family as far as I know. And it wouldn't use DX11 anyway, since that's a Microsoft API, but I get that you mean DX11-like physical GPU features used by Nintendo's own API. Off to Chipworks or someone to look at the GPU under a microscope!

Keep in mind that the 4770 was among the very first products produced on a very troubled TSMC 40nm process. Anything produced now (nearly 3.5 years later) is going to have the benefit of the process maturing and lots of design experience to fall back on for optimizing the layout and transistor leakage.

Do we even know that the GPU is still being manufactured on the original process node? Wouldn't it be possible that they've worked with AMD on a die shrink?

AMD has been using the 32nm process in mass production for their IGPs for well over a year, after all.

You have a point with the refining of the process, but the TDPs of the later cards manufactured on that node (i.e. HD 5000 and HD 6000) don't really support that, as power consumption seems to have stayed largely level on GPUs with similar transistor counts and similar raw GFLOPS performance.

The unknown is that many hint at it being DX10.1, or not even a DX feature set at all (as mentioned by some indie devs). Also, the inclusion of DDR3 with a 64-bit bus width puts it really far back in the 4000-series era, like the 4550 range of the R700 chips.

The E6760 would be epic, but it uses memory with a 128-bit bus width, which is why it's the rumored chip for either the PS4 or the 720.

The wattage is definitely massively lower, though; I mean, the launch PS3 pushed 180 watts, whereas the Wii U is hitting under 40 consistently.

I can't wait until it gets the X-ray done so we can see how much or how little is really in there.

I think what is throwing us off is that the RAM is shared. Is it RAM for general use while the discrete GPU has more of its own (making the GPU you suggested completely viable), or is it shared as in being used by a much older, inferior GPU?

I personally hope we learn of good news, like something along the lines of the E6760, rather than some horrible low-wattage E4xxx-series GPU with shared RAM.

You're forgetting the massive amount of eDRAM on the GPU die with regard to bandwidth. The bus there could easily be 1024 bits wide (or wider). Bandwidth for that 32 MB of eDRAM should not be an issue.

Case in point, the PS2's Emotion Engine had a 2560-bit wide bus to its 4 MB of eDRAM on die, and that was over 10 years ago.
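Wide on-die buses make such numbers unremarkable. A small sketch; the PS2 line uses the commonly cited 150 MHz eDRAM clock, and the Wii U line is pure assumption (neither the 1024-bit width nor the 550 MHz clock is confirmed):

```python
def edram_bandwidth_gbs(bus_width_bits, clock_mhz):
    # one transfer per clock across the full bus width
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

print(edram_bandwidth_gbs(2560, 150))  # PS2: 48.0 GB/s
# Hypothetical Wii U eDRAM: 1024-bit bus at an assumed 550 MHz GPU clock
print(edram_bandwidth_gbs(1024, 550))  # 70.4 GB/s
```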

Well, that's the first time I've heard about it (I did hear about the 2GB rumors beforehand, though... I also heard numerous other rumors regarding different memory sizes, so one of them was bound to be correct), and as far as I'm aware Nintendo hasn't made any comment about it, so I would be careful with the word "confirmed".

I know that wireless is great and all, but why not include at least a 10/100 if not 10/100/1000 Ethernet port? Wireless can be flaky at times, and when the Wii was first released it was awful for me. Is there really no room for them to add that? Is it really that costly?

In regards to the GPU: it looks like the die size estimate would put it closer to the HD 4670? That was a low-ball rumor floating around for a while before being replaced by the E6760. Any chance you can give some insight here?

Earlier this year, weren't devs saying the system is 4-6% more powerful than the current gen, and the next Xbox is about (rough estimate) 10-15% more powerful than the Wii U? It's not surprising really; did anyone expect Nintendo to release a really graphically powerful system, with a mini tablet to boot, at a reasonable price?

Nintendo claims they are losing money per console, because of the gamepad no doubt, but as someone posted earlier they'll be able to get out of the red much faster than what Sony and Microsoft have planned. Besides, what Nintendo is betting on in the long run is that, because all games will now be at equal resolution, the average console gamer is going to find it difficult to tell which game looks better than the other.

So the next Xbox will be 20% faster than the current one? That would be a huge letdown. Can you link to where you read that? I would expect at least a doubling of performance if they stay at the same price point.

I think he mistyped. The original rumor was that the Wii U was 5-6x more powerful than an Xbox 360, and that the 720 would be 20% faster than the Wii U. Too lazy to link, but there are tons of articles from 6 months ago that came from a Microsoft document that circulated, based on hardware projections.

That leaked hardware document said that the nextbox *should* (not would... it was a very early document apparently, even IF it wasn't fake, so likely it hadn't been determined yet) be 6-8x as powerful as the Xbox 360. There was no comparison to the Wii there.

I would be incredibly surprised if this was 5-6x more powerful than the xbox 360. The whole thing draws 30 watts, the memory bandwidth is half that of the 360, the CPU is smaller than Xenon or Cell on 45nm. None of those things point to total performance by themselves, but they do seem to me like Nintendo went for an econobox.

4-6%? Performance differences that small are hard to measure accurately even on PCs with dedicated benchmark tools and similar architectures. On consoles it will be next to impossible, with their very different hardware architectures.

The most accurate statements you will get are something like "1.5 times more powerful", like you got when the Wii was compared to the GameCube. It's not really anything to go by, but it gives a very, very rough estimate.

With a chip comparable to the RV740, the Wii U would be somewhere near 3-4 times as powerful as the last gen, but that's only counting the performance of the GPU. Comparing CPU performance is even harder, especially when factoring in the Cell CPU of the PS3.

It is weird that I get 1175.6 in the SunSpider test with my S3 (international), but here it's getting over 1.4k. Despite countless times I try to run it, I always get below 1.2k, even if I purposely try to do something on the device to slow it down.

Very interesting article - I also like to "see what's inside". I did chuckle a bit at how Anand wrote it as if it were an instruction manual for how YOU can tear your own Wii U console apart and most likely make it inoperable :D

I do like how you gave instructions on how to take the Wii U apart. I'm guessing not many people are going to go buy a $300+ Wii U and rip it open and void the warranty. I would have liked more info comparing the Wii U to the Xbox, PS and PC.

The Wii U CPU is about as big as a *single-core* Atom at 45nm, while packing three cores. Size isn't everything, but there is only so much you can do with a certain number of transistors. If three cores is true, each core has about a third of the transistor budget of one Atom core. That's crazy small. Even six years later, I don't find it likely something as big as one Atom core can do more work than the Cell or Xenon, as inefficient as those were.
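The area arithmetic behind that comment is just division; the 32.76mm^2 figure is from the teardown, the three-way split is assumed, and in reality the on-die eDRAM takes its share too:

```python
die_mm2 = 32.76   # measured CPU die area from the teardown
cores = 3         # assumed core count, not confirmed

# Even before subtracting the eDRAM, each core gets ~11 mm^2 at most
per_core = die_mm2 / cores
print(round(per_core, 2))  # 10.92
```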

the 3mb edram cache is the same size and heat as a 1mb sram cache and the 3x core powerpc 400 custom is about the same size as a broadway cpu in wii

the powerpc 476fp is 2x per clock the power of an ARM A9 and the wiiu cpu is again upgraded over that core with game-centric upgrades: real time decompression, data compression and graphics burst pipes/buffer AKA A NEW VERSION OF GEKKO/BROADWAY

the wiiu cpu core is the most powerful powerpc 32 bit core ever made so that makes it the most powerful 32 bit risc core on earth a standard powerpc 476fp is 2x the chip of an ARM A9 at the same speed and the wiiu espresso cores are a step up again

a tri core wiiu cpu at say 1.6ghz will eat a 2.0ghz arm a9 4x core for breakfast with ease

lol AnandTech and all other haters you lied about gamecube vs xbox all those years ago and now you're lying about wiiu

so the wiiu has a 16 bit to 32 bit bus and has bandwidth in main ram of 6.4 to 12.8 GB that is complete crap AnandTech....

powerpc 32 bit 400s @ 45nm have a 128 bit ring bus not a 64 bit fsb like wii and gamecube also the main ram bus was 64 bit in gamecube and wii and the secondary bus was 8 bit gamecube and 32 bit wii WHICH YOU ALREADY KNOW ANANDTECH

so why the anti nintendoism nonsense wiiu has 6.4gb then its 12.8gb WHEN YOU KNOW 100% GAMECUBE AND WII WERE 64 BIT BUS NOT 32 BIT LIKE YOU'RE NOW TRYING TO SAY

the 2gb ram is either of the 3 setups that follow and nothing like what you're saying OUT OF NINTENDO HATE I MAY ADD

2gb 1600 ddr3 800mhz bus 128 bit = 28gb not 12.8 or 6.4 it = 28gb

so 2x 64bit = 28gb and 1x 128 bit = 28gb SO WHERE IN HELL DID YOU GET 6.4 OR 12.8 GB

the ram is likely 1600mhz (800 dual channel) the bus is likely 128 bit or dual 64bit the powerpc 400 range runs on a 128 bit bus at 800mhz

so the information we ALL HAVE is 1600 mhz ram 800 bus 800 gpu and 1600 cpus

so at dual 64 bit or single 128 bit the bandwidth = 28gb not 6.4 or 12.8 or 17gb

17 was a lie 6.4 was a lie and 12.8 was a lie

powerpc 45nm runs at 1600mhz with 800mhz bus

the wii and gamecube was based around the bus speed so if that continues the LOGICAL conclusion at this point =

cpu 1600 customized powerpc 400 with ibm edram cache 3mb

gpu customized rv7 4670 with 32mb edram at 800mhz

ring bus 128 bit 800 mhz

ram ddr3 or gddr3 at 800mhz dual channel = 1600mhz and 28gb bandwidth

32mb edram buffer/cache to gpu will have a massive bandwidth and the 3mb cache to cpu will be 2x plus the bandwidth of sram under the same conditions

so high bandwidth low latency like gamecube all over again

oh that xbox vs gamecube you did years ago ALSO FULL OF SHIT

weak gpu fixed function PLEASE ANANDTECH STOP LYING

xbox gpu was 4 texture layers and 8 texture stages

gc flipper was 8 texture layers and 16 stages

xbox gpu was 8x4 real time lighting

flipper was 8x8 real time lighting

flipper had 2.5x the internal bandwidth of the xbox gpu

there's many more facts i can add

lol at your blatant anti nintendo wiiu tear down as if wiiu only has 6.4gb bandwidth or 12.8gb as wii had

28gb bandwidth edram and 4gb 1tsram and 4gb gddr3

the main ram of wii was 4gb x 2 = 8 so you're saying wiiu has less bandwidth than wii or only 50% more please stop AnandTech you're losing all credibility

No, the bit interface checks out. The model number on the chips (which you can see in the pictures; they're not made up) leads to DRAM chips by Hynix with a 16 bit wide interface. Just because the basic CPU model has a wider interface doesn't mean it's being used or even exists on the custom Wii U CPU; besides, wider interfaces require more space on the die, and as you can see space is already pretty rare.

Dual channel mode only works if there's actually any capacity left to run it with. If you have a PC with 2 DRAM modules for example, each module will have a width of 64 bits, so if you split the data evenly between the 2 modules you effectively get a 128 bit wide bus. This is not possible on the Wii U, since the DRAM chips with their 16 bit bus already run at their maximum capacity; it's effectively running quad channel with each channel being 16 bits wide.

Also, your maths are complete shite. 2x64bit = 28gb? What kind of screwed-up calculation is that? Even if that was the way you'd calculate bus transfer rates (which it isn't), it doesn't add up at all (if anything, 2x64 bit would be 16 byte, but that's as close to "28gb" as I can get).

800 MHz modules running in dual channel mode will achieve 1600*10^6 Hz (1600 MHz effective frequency at 800 MHz actual frequency, thus the name double data rate) times 128 bit (dual channel bus) = 204800000000 bit/s = 25600000000 byte/s = 25.6 GB/s (decimal). Half of that is, surprise surprise, exactly 12.8 GB/s, which is the speed you get when only having a 64 bit wide bus, which apparently is the case with the Wii U.
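That arithmetic can be reproduced line by line:

```python
# DDR3 at an 800 MHz base clock transfers on both edges: 1600 MT/s
effective_transfers = 800e6 * 2

dual_channel_bps = effective_transfers * 128  # bits/s over a 128-bit bus
single_64_bps    = effective_transfers * 64   # bits/s over a 64-bit bus

print(dual_channel_bps / 8 / 1e9)  # 25.6 GB/s
print(single_64_bps / 8 / 1e9)     # 12.8 GB/s
```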

The rest of your post is mostly a collection of speculation, completely off-topic stuff, misunderstood technical data and outright wrong information, all of it wrapped in a writing style that makes my toenails roll themselves into a sushi maki, so I'm not even going to go into that.

Please do yourself a favor and don't ever get a job that requires even basic mathematical knowledge.

why did you go for the secondary bus of wii as the main bus of wiiu!!! you deliberately ignored the main 64 bit powerpc bus that gamecube and wii had!!... and went for the 32 bit secondary bus that the secondary gddr3 ram was on in wii!!!! ALREADY KNOWING FULL WELL THAT WASN'T THE MAIN BUS so you deliberately went to the lowest bit bus to make it look like wiiu is weak DON'T DENY THIS ANANDTECH THAT'S WHAT YOU DID....

why would the secondary bus be the main bus next gen THAT'S TOTAL CRAP so gamecube and wii had a 64 bit main bus and all of a sudden the bus went 32 bit the 3rd time round IN YOUR DREAMS ANANDTECH

the powerpc 32 bit core that makes up the 3x core broadway 2 ESPRESSO cpu in wiiu is based on the powerpc 476fp all 32 bit powerpcs now run on a 128 bit ring bus the 64 bit fsb is of the past it's no longer used

so why ignore the 64 bit bus and the new 128 bit bus AND TELL LIES about a 32 bit bus you're exposed as anti nintendo i think (weak cpu lie i'll debunk that with ease too)

isn't it likely the wiiu with its mcm and a 45nm powerpc broadway-fied tri core WOULD ALSO BE USING THE SAME 128 BIT RING BUS AH-HHHMMMMmmmmmm wiiboy cannot be fooled like a ps3 silly fanboy

another point you fail to understand only powerpc 400s on this 128 bit ring bus SUPPORT MULTI CORE the powerpc 750 of wii and gamecube DOES NOT support multi core !!!!!!!!!!!!!<<<<<suggestion<<<<<

so common sense is the ddr3 (which is actually gddr3) samsung don't show the G in their specs so it's gddr3 obviously they market the ram as gDDR3 not GDDR3 they drop the G as it's meaningless they're the same ram

at 1600mhz is perfectly in line with 45nm powerpc thinking as the bus is 800mhz 128 bit and the recommended ram is ddr2 ddr3 1600mhz (you said yourselves the ram looks 1600)

so the bus is most likely 128 bit YOU SAID 32 BIT OUT OF ANTI NINTENDO SPITE DIDN'T YOU TO MAKE WIIU LOOK BAD lol i see thru this pc fanboy nonsense like superman looking thru clear glass lol

if the ram is 1600mhz then it's highly likely the cpu is 1600mhz (exactly lining up with the ibm powerpc 476fp) and the ring bus in the mcm is 128 bit 800mhz and as the wii and gamecube were both balanced to the bus speed exactly then no doubt the wiiu is also balanced to the bus IF SO

2TO1 BALANCE REPLACES 3TO1 BALANCE OF WII AND GAMECUBE,,,, REMINDER GAMECUBE WAS ORIGINALLY 2TO1 404 CPU CLOCK AND 202 GPU CLOCK IT WAS CHANGED TO 3TO1 WHEN IBM COULDN'T GET THE OLD 64 BIT G3 BUS TO GET TO 202MHZ SO THEY DROPPED IT TO A 3TO1 BALANCE INSTEAD OF 2TO1

SURELY YOU REMEMBER THIS GUYS I DO I'M A CORE GAMER IT'S OUR JOB TO REMEMBER THIS!!!!!!!

so has nintendo returned to the originally wanted tighter 2to1 balance now that ibm have a half speed bus on their powerpc 32 bit cpu I THINK THEY HAVE !!!!!

if the wiiu is still a 64 bit bus and not a 128 bit ring bus for whatever reason then is it not likely the 2gb ram is on a dual 64 bit bus still giving single 128 bit bus levels of bandwidth

and isn't it safe to say if they're not using a 128 bit ring bus then there's still dual memory buses just like wii and gamecube had so again it's still way higher than the 32 bit THAT YOU'RE SAYING

likely speed of ram is 800mhz x 2 = 1600 and as nintendo make clock balanced systems BECAUSE THEY'RE NOT STUPID ENOUGH TO TRASH AND WASTE CLOCKS COMBINED WITH LOW LATENCY FAST RAM = EFFICIENT

those speeds might not be exact but they're ballpark UNLIKE YOURS

likely edram to gpu = 512 bit or higher REMEMBER GUYS WII AND GAMECUBE HAD A 512 BIT TEXTURE CACHE AND A WHAT WAS IT 360 BIT FRAME Z BUFFER JESUS CHECK YOUR OWN ARTICLES GUYS BEFORE TEARING THE WIIU DOWN

so a minimum 512 bit for edram if not more and a 128 bit bus for main ram LOL 32 BIT ....

just to add that ibm edram as level 2 cache has more than 2x the bandwidth of sram so the 3mb cache of wiiu has twice if not more the bandwidth of the same cache made from sram

and the edram to gpu is over 10x wii and looks very alike so has wiiu got an edram texture/shader cache as well as a frame and z buffer

frame and z buffer = 22mb and texture shader cache = 10mb just guessing here it may not use a texture cache like wii and gamecube BUT IT WOULD MAKE SENSE the developers have said texture and shader data can be loaded into the gpu THAT SAYS BIG EDRAM CACHE TO ME LIKE GAMECUBE

Excellent article, but I was very disappointed that the engravings on the CPU and GPU weren't shown! Why didn't you guys clean off the thermal paste before posting pics? This is what a lot of people have been wishing to know, so I'll assume you didn't show it for legal reasons and not because you forgot.

People are still using this as some sort of fact sheet, so I figured I'd drop you guys a line.

You can't determine bus size by looking at a RAM chip's specification documentation. It's not in there, as the memory architecture is part of the machine, not the RAM product.

Also, while you got the Hynix RAM right, the Samsung RAM is the same exact RAM chip used in original 360s, down to the serial number/nomenclature. By your logic of chip type = bus by serial number, the Wii U would have a 256-bit bus, which also isn't true.

And finally, we have the die picture of the GPU, and we can clearly see the DDR3 I/O where the RAM bus plugs directly into the GPU.

If the Wii U only has a 64-bit bus, why are there 158 pins on the DDR3 I/O?

And if the Wii U only had half the main memory bandwidth of the PS3/360, how is Need for Speed possible?

Bigger texture assets and better framerate/performance at half the bandwidth? eDRAM can't fix that; every access to main memory is at main memory's bandwidth.
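For context, the bandwidth gap being argued about here is roughly this; the 360 figure is the widely cited 128-bit GDDR3 at a 700 MHz base clock, while the Wii U figure assumes the 64-bit DDR3-1600 setup discussed earlier in the thread:

```python
def main_memory_bw_gbs(effective_mts, bus_width_bits):
    # effective transfers/s (already doubled for DDR) times bus width in bytes
    return effective_mts * 1e6 * bus_width_bits / 8 / 1e9

print(main_memory_bw_gbs(1400, 128))  # Xbox 360 GDDR3: 22.4 GB/s
print(main_memory_bw_gbs(1600, 64))   # Wii U DDR3 (assumed): 12.8 GB/s
```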