
Nintendo Wii U Clock Speeds: Much Ado About Nothing?

Nintendo’s Wii U may not have the most impressive tech specs at first blush, but comparing clock speeds makes little sense.

Kyle Orland has a smart post over at Ars Technica pointing out that while the Wii U’s clock speeds may at first blush appear much slower than the current offerings from Microsoft and Sony, clock speed is hardly the only factor we should take into account.

Orland writes that “comparing two consoles with vastly different architectures and internal chips is not nearly as simple as just seeing which one has the higher clock speed. Things like the number of computer instructions per clock cycle, the bandwidth of the RAM bus, and the overall efficiency of the architecture are at least as important as the raw clock speed at which a processor runs.”

He points out that video game journalists are making much ado about nothing—and he even quotes your humble narrator to make his point:

“For the most part, this means that the Wii U is under-powered compared to the seven-year-old Xbox 360 and the six-year-old PS3,” VentureBeat wrote of the clock speed finding (with a few caveats). “One would have at least hoped for tech that surpassed current consoles, even if only by a small margin,” a Forbes writer declared. “I honestly can’t believe what I’m reading here,” a GamingBolt writer hyperventilated. Discussions on countless message boards and online forums are even more hyperbolic concerning the importance of the clock speed comparison.

I may be just “a Forbes writer” but I’d like to make one small and perhaps unimportant observation.

You see, even though Orland quotes my initial reaction (mild disappointment, based partly on the observation that the system is not as “snappy” as it might be with a faster processor), he either failed to keep reading or chose to ignore the actual thrust of my post, which is essentially the same thing Orland himself argues. Here’s me just a bit further down the page:

Of course, we should also be cognizant of the fact that clock speeds are a very poor measure of performance. In many modern processors, clock speeds have actually gone down, while overall performance has increased. This is why a new Intel chip clocked much lower than the old Pentium 4 chips can still run circles around them.

Even if the clock speeds on the Wii U do strike us as quite low, we should realize right away that they must be performing at a much higher efficiency than the chips in the Xbox 360 and PS3 to get performance levels that are basically on par with those systems.

Still, whether he quoted me out of context or my prose was simply too opaque, I agree with the thrust of Orland’s argument. Judging a new system on clock speed alone is unhelpful at best and misleading at worst.
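To see why, here’s a toy back-of-the-envelope comparison. The IPC (instructions per cycle) figures below are invented purely for illustration; they are not measurements of any real console CPU:

```python
# Toy model: effective throughput = clock speed x instructions per cycle (IPC).
# Both IPC values are made up to illustrate the point, not real chip data.
cpus = {
    "older CPU: high clock, in-order":    {"clock_ghz": 3.2, "ipc": 0.5},
    "newer CPU: low clock, out-of-order": {"clock_ghz": 1.2, "ipc": 1.6},
}

for name, spec in cpus.items():
    gips = spec["clock_ghz"] * spec["ipc"]  # billions of instructions per second
    print(f"{name}: {gips:.2f} billion instructions/sec")
```

On these made-up numbers the lower-clocked chip comes out ahead, which is exactly why a headline clock speed tells you so little on its own.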

But even in urging caution, he falls into something of the same performance-first trap so many others have (a trap with which I do sympathize, at least to some degree).

Orland notes that although we’ve seen mostly third-party ports on the system, “it doesn’t speak well of the Wii U’s processing power that we have yet to see a game that really performs markedly better than anything we’ve seen on other current consoles. If the system can’t manage that feat, it risks being quickly overshadowed by much more powerful hardware expected from Sony and Microsoft in the next year or so.”

I’d hesitate to draw that conclusion. The Wii U doesn’t need to perform at the same level as more powerful hardware that we may encounter in the next-gen PlayStation and Xbox. Nintendo’s audience isn’t concerned with being the most powerful, and it’s unlikely the Wii U will be overshadowed by specs alone.

So long as Nintendo can continue offering a wide variety of content, net a wider range of third-party titles, and give its fans unique gameplay, I think the Wii U will thrive. Not to mention, my bet is that developers have not yet fully grasped the Wii U’s new architecture, including the GPGPU and its capabilities. I suspect we’ll see Wii U graphics improve vastly over the course of the next year or two.

Comments

But there’s a very vocal contingent of people who say the Wii was a failure, even though it sold more than both of its competitors. The problem with any PR war is it can be won without ever having to utter a fact. It’s like Churchill said, “A lie gets halfway around the world before truth has a chance to get its pants on.” With the advent of the internet and social media I think you can remove ‘halfway’ from that quote.

I think there are enough vocal, developmentally stunted gamers out there to stigmatize everyone who owns a console in the eyes of those who don’t. The same general population that made the cast of ‘Jersey Shore’ millionaires sent Platinum Games death threats over the Bayonetta 2 announcement and argues all day, every day about how their chosen platform is ‘the best.’ The world of gaming is often criticized for being sexist and childish, and there are many gamers who are all too willing to help solidify that stigma.

One final point I want to make: with the success of tablet gaming, it’s amazing that this sort of preoccupation with processing power, and its implications for success, can gain any traction. Halo 4 isn’t inherently a better game than Angry Birds. It’s all subjective. There’s a false correlation people like to make wherein the more technically advanced a game is, the better it is than anything released before (or alongside) it. I just don’t get it. Why do people feel the need to gather into groups to approve or disapprove of what games other people enjoy? I think it’s a symptom of a much deeper problem that extends far beyond video games.

Yeah sorry, but I can’t agree. The weakest link in a hardware architecture is going to bottleneck the entire system for the remainder of its lifespan.

The other problem I see with people parroting sentiments like “graphics don’t matter” and “it’s all about the games!” again and again is that they don’t quite grasp the point. The specs aren’t just numbers people dump on because they want to be mean to Nintendo, and better hardware doesn’t just make games look somewhat shinier. It *is* about the games: updated hardware specs open up dozens of new possibilities in the base game design itself.

Ever wondered why there aren’t many enemies on a battlefield, or why a lot of shooters are narrow, corridor-like experiences? It’s mostly because of hardware capabilities. There’s a trade-off between making something look presentable by closing it off in a small environment (levels constrained to small sizes, with plenty of loading in between) and having more of an open world with worse visuals, which requires some coding-fu to stream objects in as they are needed; the more constrained the platform, the lower the ceiling on how much can be on screen at any one time. For instance, remember the old story about Mass Effect 3 having no holster animation? That was apparently because they ran out of memory and had to cut something: http://www.ign.com/boards/threads/mass-effect-3-no-holster-weapon-mechanic.250051328/
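To make that streaming trade-off concrete, here’s a minimal sketch of the kind of logic involved. The objects, sizes, and budget figures are all invented, and `pick_loaded` is a hypothetical helper, not anything from a real engine:

```python
# Toy asset streamer: keep the nearest objects resident until a memory
# budget (in MB) runs out. All names and numbers are invented.

def pick_loaded(objects, player_pos, budget_mb):
    """objects: list of (name, position on a 1-D line, size_mb)."""
    by_distance = sorted(objects, key=lambda o: abs(o[1] - player_pos))
    loaded, used = [], 0.0
    for name, _pos, size in by_distance:
        if used + size > budget_mb:
            break  # budget exhausted: everything farther away stays unloaded
        loaded.append(name)
        used += size
    return loaded

world = [("tree", 10, 4), ("house", 40, 24), ("bridge", 90, 32), ("castle", 300, 64)]
print(pick_loaded(world, player_pos=0, budget_mb=64))   # tight console-style budget
print(pick_loaded(world, player_pos=0, budget_mb=512))  # roomier PC-style budget
```

The larger budget keeps the whole world resident; the smaller one has to leave things out. That, in miniature, is the corridor-versus-open-world decision.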

There are a lot more things to consider, like complex AI, physics components and computations, the number of objects and entities displayed on screen at once, complex AI, the decreased scope of level design and features because of low memory, complex AI, an entirely different UI and control paradigm that has to follow certain rules and be adjusted for a specific controller nobody else is likely to use instead of plain controllers or mouse and keyboard, and lots of other platform limitations that come down, in the end, to those games it’s apparently all about. Did I already mention complex AI?

If Microsoft/SONY play this well they can have all sorts of new possibilities and franchises coming to their doorstep while Nintendo has more of the same old Mario and Zelda gameplay, but this time with a gimmicky tablet controller. We will see if that will be enough this time around.

Except you are (partly) wrong.

First, because there is empirical evidence of games running very well on rather slow CPUs, with what is regarded as some of the best gaming AI and a fairly decent number of enemies, such as F.E.A.R., which can run on an old Pentium 4. Or even a plainly ridiculous number of enemies in the Serious Sam series (mind you, with really basic AI), which happened to have a console version.

Second, because the Wii U CPU is actually more powerful for pretty much everything commonly regarded as CPU-dependent, thanks to its out-of-order execution (basically, it doesn’t have to wait for an instruction to finish before doing something that doesn’t use the same resources), which allows for a higher instructions-per-clock rate. Something like AI will be much better handled by the Wii U CPU than by the Xbox 360 or PS3 CPU.

Third, because the Wii U GPU is much more evolved than the one in the Xbox 360/PS3, and as such should support OpenCL. With that, you can pretty much blow any CPU out of the water for physics calculations. Nvidia PhysX? It’s exactly this kind of thing, and no CPU can handle it without issues.

Fourth, because memory and CPU are completely different things, and even if the Wii U appears lighter than I had hoped (2 GB, apparently with 1 GB for the OS), it’s still much more than the current generation. And I would bet that the holster mechanic being cut due to memory is bollocks.

Last but not least, because the market trend definitely doesn’t seem to favor “more power!” The failure of the PS Vita pretty much proved that people don’t care about graphics in a handheld console and would rather buy a cheaper one like the 3DS with actually good games. I wouldn’t be surprised if the new PlayStation and Xbox had similar specs to the Wii U.
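On the out-of-order point, a toy scheduler makes the idea visible. This is not a model of any real chip; the instruction latencies and the one-issue-per-cycle assumption are invented purely for illustration:

```python
# Toy single-issue pipeline. Each instruction is (name, latency in cycles,
# index of the instruction it depends on, or None). All values invented.
program = [
    ("load A",  4, None),
    ("add  A",  1, 0),     # must wait for "load A" to finish
    ("load B",  4, None),  # independent work
    ("add  B",  1, 2),
    ("mul A*B", 2, 1),
]

def run(out_of_order):
    done = {}                          # instruction index -> cycle result is ready
    pending = list(range(len(program)))
    cycle = 0
    while pending:
        ready = [i for i in pending
                 if program[i][2] is None or done.get(program[i][2], 10**9) <= cycle]
        if out_of_order and ready:
            i = ready[0]               # issue any instruction whose inputs are ready
        elif pending[0] in ready:
            i = pending[0]             # in-order: only the oldest instruction may issue
        else:
            cycle += 1                 # stall: nothing (allowed) is ready
            continue
        done[i] = cycle + program[i][1]
        pending.remove(i)
        cycle += 1
    return max(done.values())

print("in-order:    ", run(out_of_order=False), "cycles")
print("out-of-order:", run(out_of_order=True), "cycles")
```

The out-of-order run finishes in fewer cycles at the same clock because it fills stall cycles with independent work; that gap is exactly the higher instructions-per-clock rate being claimed.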

You seem to be making the mistake of comparing the Wii U with the X360/PS3, which is apt but doesn’t do it any favours. These systems were designed ~8+ years ago and are still largely constraining the rest of the industry.

Here, for instance, is a comparison of Crysis (a game from 2007, no less, with a lot of static objects and DX10) as it appeared on the PC versus the Xbox 360 version that eventually came out: http://www.youtube.com/watch?v=AcKLjgWl7tM&hd=1 While it focuses largely on graphics, you can see the very obvious differences in vegetation density and in the number of items/entities on screen at once, and what they had to take away to make the console version run at all. The game also had a rather open environment with “bubbles” of action, as Crytek called them, that you could engage in very different ways, and somewhat competent AI that would react to your approach, although its target detection was sometimes wonky. Crysis 2 subsequently became a corridor-based, on-rails shooter as it focused on consoles.

Another recent example is the Star Citizen Kickstarter. Roberts made a point of mentioning that it’s a PC game and “not possible on consoles,” and while a lot of people took “it’ll have better graphics” away from that, there are plenty of subtle and not-so-subtle things about the claim that ring true beyond graphics.

Take his GDC video for instance and go 24 minutes in http://www.youtube.com/watch?&v=7vhRQPhL1YU

You have a huge spaceship people can walk through as a single level (instead of the level loading required in other games); you can actually see what is happening outside of it; and, as he showed off, you can get in a ship, fly outside, look through the window, and see what NPCs and other players are doing inside.

This is all running on a current-gen PC. If this were a console game, you would likely have not only less detail but matte-painted windows on all the ships (and probably some sort of level transitions), because consoles simply don’t have the capability.

There wouldn’t be a dozen moving pieces realistically animated on your spaceship, and you couldn’t have hundreds of turrets and lasers all acting on their own, since it would simply not be possible (although I still doubt some of his ambitions in general). You wouldn’t see everything moving in the cockpit with your player character able to interact with it; you would largely get a static “image” of something that looks like a cockpit, or none at all, since that saves resources. Add physics simulation to that, for instance ships reacting to physical influences like impacts or changes to their damage model, which is only possible in the simplest manner on “current gen” consoles but was possible to a much larger extent on PCs years ago, and you get *glimmers* of what gaming as a medium is missing out on by still targeting roughly 8-10 year old hardware as the standard platform and having to mostly abide by those specifications today. And some of the AI he’s shown off already would likely be solved in a scripted way otherwise. It’s small things that add up to the whole.

Even if Microsoft/Sony tried to stay somewhat on the low-powered side (which I highly doubt), they’d be hard-pressed to do worse than Nintendo. Don’t forget that their current hardware is already stagnating and has been selling fewer and fewer units (and games) for a while now.

And you’re doing a bang-up job trying to defend Nintendo’s shortcomings in a lot of ways, from the capabilities of its CPU and the amount of usable RAM to its GPU (which is surely an improvement, but can’t do too much on its own, and is at most capable of DirectX 10.1/OpenGL 3.3 or equivalent instructions, if that, from 5 years ago). I wouldn’t want to make too many projections about its hardware capabilities without knowing exactly what we are talking about. But the proof is very much in the pudding, as they say, and at the moment the pudding (or your “empirical evidence”) says the Wii U is struggling even with games designed for the old platforms it is supposed to compete against, running at resolutions below 720p, like 880×720 for Black Ops 2. That’s not a very good start, or a sign that it is a technological powerhouse in any way.

As to what the PS Vita “tells us,” you seem to be jumping the gun in an analysis designed to tell us exactly that. I’d find it much more likely that dedicated handheld gaming systems have lost a lot of their appeal as platforms like Android/iOS have grown. Or maybe people want to play their Uncharted and Assassin’s Creed on the big screen, just the same way I want to watch The Hobbit soon (in 3D and at a high frame rate)?

I am comparing the CPUs because, first, that’s what everyone seems to be doing, so I need to adapt to a baseline, and second, because we don’t have much to work with anyway except the clock speed and the architecture. And heck, we have trouble comparing CPUs with similar instruction sets; I can’t imagine what a mess it would be to compare against an x86.

Now, for a direct comparison, I’ve looked up a French article that paired a GTX 680 with a 2006 CPU, the Athlon X2 5000+ ( http://www.comptoir-hardware.com/articles/cartes-graphiques/18613-test-la-gtx-680-sur-un-pc-gamer-de-2006.html?start=2 ), to see the impact of the CPU on performance. Of course, there is an obvious difference, except in a mediocre shooter (Sniper Elite V2). However, the Crysis Warhead benchmark sings another tune compared to what you’re saying: the CPU isn’t that limiting. We’re seeing a 30 FPS average on a CPU that was fine, but merely average, 6 years ago. The test was run at the highest graphical settings, which shows one thing: the differences between the Xbox 360/PS3 versions (both consoles have a better CPU than this Athlon) and the PC version are far from CPU-dependent. On the GPU side, though, there is a massive difference between the older consoles and newer hardware. A game like Crysis could certainly be done on the Wii U. And overall, there is a massive difference between games designed for the PC (Crysis, STALKER, and possibly Sniper V2) and the rest, which were designed to run on consoles. That may very well be because those engines were designed to offload GPU work onto the CPU.
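The underlying logic is that a frame can only be delivered as fast as its slowest stage. A crude toy model, with invented timings (real frame pipelines overlap work in far messier ways):

```python
# Toy bottleneck model: the slower of the two stages sets the frame time.
# All millisecond figures are invented for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound: a much faster CPU barely moves the needle.
print(fps(cpu_ms=10, gpu_ms=33))  # ~30 FPS, limited by the GPU
print(fps(cpu_ms=5,  gpu_ms=33))  # still ~30 FPS with a 2x faster CPU

# CPU-bound: now the CPU is the wall and the GPU sits idle.
print(fps(cpu_ms=33, gpu_ms=10))  # ~30 FPS, limited by the CPU
```

That is consistent with the Athlon X2 result above: at maximum graphical settings the game is GPU-bound, so even a six-year-old CPU isn’t the limiting factor.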

Star Citizen is interesting. I’m following it very eagerly, not only because I love space sims but also because the videos were stunning. However, he’s overblowing a lot of things. A game like Freespace 2, even if it didn’t render the inside of ships, allowed for ships with lots of turrets and subsystems. More recently, a game like X3: Terran Conflict allowed for pretty big ships and obviously huge maps, and that ran on a Core 2 Duo. Rendering the inside of ships requires more memory and more graphical power, but certainly not much in the way of CPU. And since I like repeating myself: game AI isn’t that computationally demanding (games like F.E.A.R. managed it on a Pentium 4); good AI is rare mostly because it is hard to create.

I wouldn’t get too far ahead of myself on the GPU. It may be an E6760 (from what I remember, and it should fit clock-speed-wise), and that is a DirectX 11 GPU. It doesn’t matter much until we actually know more about it. Still, it’s certainly a lot more powerful than the Xbox 360 and PS3 GPUs, and should allow a lot of calculations to be done “properly” on the GPU rather than offloaded to the CPU. Which brings me to your next point. A game engine is something huge. As in, hugely complicated. Completely reworking one to move work from the CPU onto the GPU would require a lot of time and money. As such, it’s obvious why a multiplatform port would run like crap: it’s not designed the same way AT ALL. A CPU like the Wii U’s isn’t designed for that kind of task (as shown by its weak SIMD), contrary to the PS3’s and the Xbox 360’s. A game like Nintendo Land, even if far from a huge graphical milestone (I’m guessing we’ll have to wait a bit longer for those), runs at above 50 FPS according to Eurogamer, and at 60 FPS in actual gameplay.
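As for why physics-style work moves to the GPU so well: it is usually one small, independent update applied to thousands of bodies at once. A rough sketch of that shape of computation; this is not actual OpenCL, the numpy expressions merely stand in for a data-parallel kernel (one work-item per body), and every number is invented:

```python
import numpy as np

# 100,000 rigid bodies, one naive Euler integration step at 60 Hz.
n, dt = 100_000, 1.0 / 60.0
pos = np.zeros((n, 3))          # positions
vel = np.random.randn(n, 3)     # velocities
gravity = np.array([0.0, -9.81, 0.0])

# Data-parallel form: the same independent update for every body.
# This is the shape a GPU kernel wants: no body depends on any other.
vel += gravity * dt
pos += vel * dt
```

A single CPU core has to walk those 100,000 bodies in sequence (or in small SIMD batches); a GPU runs them essentially all at once, which is the gap PhysX-style middleware exploits.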

And for the part about the Vita, I was tired and completely forgot one thing about the 3DS: it actually didn’t sell at its launch price. However, at a lower price it sold quite well, and some games are pretty big successes. I don’t think people who want Android/iOS games are the same audience as those who want a handheld console.

And the final thing I want to mention: I’m not in any way, shape, or form trying to defend Nintendo as a company. I’m a PC gamer myself. However, reading everywhere that the console isn’t powerful compared to the current generation is making me mad.

I don’t really want to go into all of that again; suffice it to say that the claim that Crysis, as released on PC, could run on the Wii U is very likely false. The amount of memory games are allotted would alone prove an almost insurmountable hindrance, and the result would more closely resemble the Xbox 360 port of the game.

The only information I can find of people saying the GPU might be capable of DirectX 11 or equivalent dates from 6+ months before the console was released, plus some very vague statements made by the Unity CEO a little while back. Everything else says it is capable of DirectX 10/10.1 or OpenGL 3.0/3.3 or equivalent, the second option being if one feels optimistic about it.

We will likely not know exactly any time soon, because Nintendo is (understandably) tight-lipped about it. Suffice it to say that the released specs put the maximum texture resolution at 8192×8192, while all DirectX 11-capable chips support up to 16384×16384. Every ATI Evergreen-based GPU supports the latter, including the 5000 series (5450 up to 5970), and the 6000 series even more so.
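For a sense of scale, those texture ceilings translate into the following worst-case footprints, assuming uncompressed RGBA8 at 4 bytes per texel (real games use compressed formats, so treat these as upper bounds):

```python
# Uncompressed memory footprint of a square RGBA8 texture (4 bytes/texel).
def texture_mb(side, bytes_per_texel=4):
    return side * side * bytes_per_texel / (1024 ** 2)

print(texture_mb(8192))   # 256.0 MB  - the reported Wii U ceiling
print(texture_mb(16384))  # 1024.0 MB - the DirectX 11 ceiling
```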

Also, if you think that companies like Crytek/Unreal and similar are going to cater to Nintendo and jump on its bandwagon before it has proven it can actually sell games, or that complex engines will be developed solely to complement the Wii U architecture by anyone other than first-party studios, you suffer under delusions :P All Nintendo is going to get in the near future is the kind of ports it has gotten so far, and we will see how the matter develops once the competitors are out sometime next year.