We reported over the weekend that there was some confusion over exactly how Apple's new Lightning Digital AV Adapter works and why it lacks the ability to carry a native 1080p signal. One theory was that Apple was using an AirPlay wireless streaming protocol, but we've since learned that is not the case. According to a post that purports to be from an anonymous Apple engineer explaining how the cables function, Apple does not use the AirPlay network protocol. It instead uses the same H.264 encoding hardware that AirPlay relies on to encode the output stream, which is then sent down the Lightning cable to the ARM SoC inside the adapter. From there, the data is decoded and sent over HDMI:

It’s vastly the same thing with the HDMI adapter. Lightning doesn’t have anything to do with HDMI at all. Again, it’s just a high speed serial interface. Airplay uses a bunch of hardware h264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.
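The flow the engineer describes — encode on the device, ship the stream as packets across the Lightning bus, then reassemble and decode on the adapter's SoC — can be sketched roughly as follows. This is a minimal illustration only; the packet size and function names are assumptions, not Apple's actual implementation:

```python
# Rough model of the described pipeline (all names and sizes are
# illustrative assumptions, not Apple's implementation):
# an already-encoded H.264 stream is split into fixed-size packets
# for the serial Lightning bus, then reassembled on the adapter's
# ARM SoC before being decoded and pushed out over HDMI.

PACKET_PAYLOAD_SIZE = 1024  # assumed payload size per bus packet


def packetize(encoded_stream: bytes, size: int = PACKET_PAYLOAD_SIZE) -> list[bytes]:
    """Split an encoded video stream into fixed-size bus packets."""
    return [encoded_stream[i:i + size] for i in range(0, len(encoded_stream), size)]


def reassemble(packets: list[bytes]) -> bytes:
    """Rejoin packets on the receiving SoC prior to decoding."""
    return b"".join(packets)


if __name__ == "__main__":
    stream = bytes(range(256)) * 10  # stand-in for H.264 encoder output
    packets = packetize(stream)
    # The receiving side gets back exactly the bytes that were sent.
    assert reassemble(packets) == stream
```

The key point the sketch captures is that Lightning itself carries opaque packetized data: the HDMI signal only exists after the SoC on the far end decodes the stream.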

Perhaps even more interesting is that Apple could improve the quality with future software updates, since the firmware is stored in RAM as opposed to ROM. The poster noted that Apple deemed the quality “suitably acceptable” but *will* make improvements with future iOS updates.