One model of Apple getting into gaming would be that the Apple TV is the console, and you play games with dumb controllers, whether D-pad, Wii-style, or even Kinect-style. But that's an old model, and why reinvent the wheel?

Another model is that the AppleTV is essentially a co-ordinator/server/display controller, but the bulk of the computation is happening on each user's controller (which is of course an iPad or iPhone). In such a model, there is less need for ultimate CPU/GPU performance in the Apple TV box because most of the work is done in the iPad/iPhone.

Obviously this second model requires all users to own an iPad/iPhone/iPod Touch (which makes Apple happy), but it also allows for much more total compute power to be thrown at the problem. On the other hand, it's harder to program for (though this type of programming has become pretty mainstream in gaming, I'd imagine) and, a bigger problem perhaps, it doesn't immediately lend itself to unusual game-specific peripherals (rifles, steering wheels, Rock Band instruments, etc.). However, it would not be hard to have those peripherals connect to the "owning" iOS device over Bluetooth.
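The split described above can be sketched in a few lines of Python. This is purely illustrative (no real Apple API is involved): each `Controller` object stands in for an iOS device running the heavy simulation locally, while the `Hub` object stands in for the Apple TV, which only merges per-player results for display.

```python
# Hypothetical sketch of the "thin hub" model: controllers do the heavy
# computation; the hub just coordinates and holds the merged world state.
from dataclasses import dataclass

@dataclass
class ControllerUpdate:
    player_id: int
    x: float
    y: float

class Hub:
    """Coordinator/display controller (the Apple TV in this model)."""
    def __init__(self):
        self.world = {}

    def apply(self, update: ControllerUpdate):
        # The heavy simulation already ran on the controller; the hub
        # merely records each player's authoritative result.
        self.world[update.player_id] = (update.x, update.y)

class Controller:
    """Stands in for an iPhone/iPad doing the bulk of the computation."""
    def __init__(self, player_id: int):
        self.player_id = player_id
        self.x = self.y = 0.0

    def tick(self, dx: float, dy: float) -> ControllerUpdate:
        # Physics/AI/rendering prep would run here, on the device itself.
        self.x += dx
        self.y += dy
        return ControllerUpdate(self.player_id, self.x, self.y)

hub = Hub()
players = [Controller(i) for i in range(2)]
for p in players:
    hub.apply(p.tick(1.0, 0.5))
print(hub.world)  # {0: (1.0, 0.5), 1: (1.0, 0.5)}
```

Note the design consequence: the hub is cheap (a dictionary merge per frame), but every real implementation would need a network layer between `tick` and `apply`, which is exactly the difficulty the next comment raises.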

Point is, to the extent that Apple wants to go down this path (of being a living room game hub), I suspect they will do it by requiring each user to own their own iOS device, so they don't need as much computational capability in the Apple TV as a traditional console.

Such a programming model is not mainstream, and would be very difficult. Having each controller be the powerhouse would effectively require the main console to be a server with all of the players as network clients. Hardly an easy feat for indie developers. Sure, your big-name game companies won't have much of a problem making their engines work, but the hassle of ensuring a solid network layer and bug-free gameplay is far from ideal.

I wonder if this is Apple's 28nm HPM TSMC test chip? Assuming the iPhone 5S launches at a June WWDC and uses a 28nm HPM TSMC A7 SoC, the A6/A6X design might have been finished too late to be used as a test vehicle and have the results fed back into A7 development. The A5 wouldn't be a very good test since it'd be a very small chip at 28nm. The A5X is a good choice since it's a known design while also being a big-die chip, which makes for a harder test.

On a side note, I think the realignment of the iPhone and iPad launches is largely driven by SoC development. Unlike the iPad 2/iPhone 4S A5 SoC days, the resolution difference between the devices means they can no longer share a chip. It makes sense that the iPad uses an enlarged derivative of the iPhone's SoC. That necessitates the iPhone launching first each calendar year and the iPad following. So I think the iPhone launching mid-year with an A_ SoC, followed by the iPad launching September/October with a derivative A_X SoC, will be the pattern going forward.

An acorn hits you on the head, and the sky is falling? I've seen other statements like this, and they puzzle me. Let me see if I understand this correctly: Apple will never release a new, innovative, or competitive product again.

Great. Can I buy your AAPL stock now?

But seriously, Apple will remain competitive for years and years. And you should thank them for initiating this wonderful technology horse race: Apple vs. Google vs. Microsoft (marginally). Apple is doing just fine.

Right now the Apple TV 3 has its limits on the maximum video coding layer bit rate. I've never had a problem with the Apple TV 3 preset in HandBrake, but I've experimented with higher bit rates and two things happen:

1) There isn't enough wireless bandwidth to stream it, which necessitates a move to a wired connection.
2) The Apple TV 3 "plays" the file with the spinner spinning away, but without any audio or video on the screen besides the scrubbing bar.
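A back-of-envelope check, with assumed numbers (40 Mbps for a high-bitrate encode, and a rough 50% usable-throughput figure for Wi-Fi, both my own guesses), shows why symptom 1 appears on wireless but not on a wire:

```python
# Why a high-bitrate encode stalls over Wi-Fi but plays over Ethernet.
# All figures below are illustrative assumptions, not measurements.
def fits(stream_mbps: float, link_mbps: float, efficiency: float = 0.5) -> bool:
    # Real-world Wi-Fi throughput is far below the nominal link rate;
    # 50% is a rough assumed efficiency figure.
    return stream_mbps <= link_mbps * efficiency

high_bitrate_encode = 40.0  # Mbps (assumed)
wifi_g = 54.0               # 802.11g nominal link rate, Mbps
ethernet = 100.0            # Fast Ethernet, Mbps

print(fits(high_bitrate_encode, wifi_g))    # False: exceeds usable Wi-Fi throughput
print(fits(high_bitrate_encode, ethernet))  # True: fits comfortably on a wire
```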

So I'd be interested to see AnandTech test the new Apple TV's ability to decode varying bit rate files versus older hardware to see if we have any new capabilities. If so, then I might have to re-encode my Blu-Rays for better quality output.

Beyond that, I don't see 4K being included any time soon. HDMI 1.4a supports 4096×2160 (4K), but only at 24Hz (not 30Hz or 60Hz) at a maximum of 36-bit/px color. 3D support is limited to 1080p24, and HDMI 1.4b allows for 1080p120. We'll need HDMI 2.0 to get 4096×2160p60, which is a ways away.
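The arithmetic behind those limits: HDMI 1.4 tops out around 10.2 Gbit/s of raw TMDS bandwidth, and after 8b/10b coding roughly 8.16 Gbit/s is left for pixel data, which is why 4K at 36-bit color squeezes through at 24Hz but not at 60Hz.

```python
# Rough HDMI bandwidth arithmetic (8b/10b overhead assumed; audio,
# blanking intervals, etc. ignored for simplicity).
def video_gbps(width: int, height: int, fps: int, bits_per_px: int) -> float:
    """Uncompressed pixel data rate in Gbit/s."""
    return width * height * fps * bits_per_px / 1e9

hdmi_14_payload = 10.2 * 8 / 10  # 10.2 Gbit/s raw, 8b/10b coding -> ~8.16

print(round(video_gbps(4096, 2160, 24, 36), 2))  # 7.64 -> fits under ~8.16
print(round(video_gbps(4096, 2160, 60, 36), 2))  # 19.11 -> needs HDMI 2.0
```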

As for 3D, I don't see the Apple TV supporting frame packing any time soon, so at the moment we're stuck with doing side-by-side (SBS) or over-under (OU) split frames and then relying on the TV to put them together in 3D. This effectively gives us "half" the resolution (I cringe when I describe it like that). That said, I'd love a free frame-packing encoder so we could re-encode 3D like the big studios, but I doubt we'll see one any time soon. Let's not forget we only just got Stereoscopic Player to let us play back MVC on a PC (and hardware players from companies like Dune are just emerging).

To be completely honest, my guess is that this is a way to use up all the leftover A5Xs, and the capabilities won't be revolutionary, only evolutionary. We'll have to wait a bit longer to see 3D and 4K content.

Seeing as how this would be the only product Apple currently sells that uses the A5X, perhaps this is a sign that we'll see Apple implement the A5X in another product soon. If it is a die shrink, as smalM notes above that it could be, maybe they're testing the process the way they did with the A5 die shrink in the Apple TV v3, which later showed up in the iPod Touch, iPad mini, and iPad 2 r2.

If so, I'm a little surprised they wouldn't jump straight to some sort of A6 swift-core variant.

Then again, maybe they just wanted to beef up the GPU of the Apple TV and this was the most cost-effective way, which might mean volume production on the A6 isn't quite as cheap as they could get on the A5X, despite the cheaper ARM license.

The reason Apple originally moved from the A5 to the A5X was to address higher-resolution output for the iPad's Retina display, not to add horsepower. If they were after the gaming market, surely the A6 would be the way to go. How do the A5 and A5X rate in terms of HEVC compliance? That may provide more of a clue.

I was also hoping they could fuse the Apple TV and AirPort Express products to bring media streaming under better control with minimal configuration, but I guess they would have chosen higher-throughput 802.11n or even 802.11ac if that were the case. I also think Bluetooth would work well for a separate, low-latency, easy-pairing controller, but I'd hoped they'd avoid the dumb controller route.

Not entirely the upgrade I was hoping for but hopefully it means they won't be going too far down the future-of-TV-is-a-bunch-of-disparate-apps path.

* Apple TV is going to ramp up in functionality and will need more horsepower.

* Apple TV is a testbed for a new chip(process node) that will eventually make its way into a new product.

* There are a bunch of leftover A5Xs and the new part number just means they've been binned/have a disabled core.

I don't know why more people don't mention the leftovers theory. That's the only reason the Apple TV got the A5.

I think the testbed and leftovers theories are most likely--or some combination of them. (For example, the new Apple TV may just be the scrapings of a new smaller process A5X.) Apple may eventually add more functionality to the Apple TV but it will probably be done on the cheap.

It is still weird to me that a TV is lower resolution than a phone.