Apple's Lightning-to-HDMI adapter and the product page for the iPad mini claim that the device can output 1080p video to televisions. But that may not be strictly true, according to research from developers over at Panic. They've discovered two separate but related issues with the claim: first, the Lightning version of the adapter only supports an output resolution of 1600×900 from the iPad mini, not the advertised 1920×1080. Second, the output is full of visual artifacts consistent with a signal that has been compressed and sent to a TV through Apple's AirPlay feature.

The Panic Blog proceeded to tear one of Apple's adapters apart and, with the help of an anonymous commenter purporting to be an Apple employee, figured out what was going on: the iDevice is actually sending an H.264-encoded image through the Lightning interface to the adapter, where a small ARM chip decodes that information and outputs it over HDMI.

A small ARM chip decodes an H.264 stream sent from the iOS device and outputs it over HDMI.

"The reason why this adapter exists is because Lightning is simply not capable of streaming a 'raw' HDMI signal across the cable," said the supposed Apple employee. "Lightning is a serial bus. There is no clever wire multiplexing involved."

The upside of this approach is that it shifts the complexities of supporting new output interfaces away from the device and to the cable itself—if something comes along and replaces HDMI (or if you want to connect to another existing interface, like DVI or even GPIB), you won't need to throw out your expensive device and get a new one to get support. Apple just has to update its software to support the interface and make an appropriate adapter to supply the hardware, since the adapter is the piece that's actually responsible for handling output.

One downside of this approach is that it's going to result in more costly adapters for consumers: the Lightning-to-HDMI adapter in question costs $49 in the Apple Store, while the older 30-pin-to-HDMI adapter costs $39. Another is that, at least for now, the image quality and resolution delivered by the Lightning version of the adapter are worse than those of the older 30-pin version of the cable. However, future iOS updates should be able to add true 1080p resolution support to the adapter as well as improved streaming quality.

"Certain people are aware that the quality could be better and others are working on it," said the Panic Blog's anonymous commenter. "For the time being, the quality was deemed to be suitably acceptable."

Andrew Cunningham
Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue. Twitter: @AndrewWrites

122 Reader Comments

Yes, because it's far more likely that HDMI will suddenly become obsolete than I will have to replace my iDevice and/or lose the $50 adapter repeatedly. I'm so glad Apple has future-proofed me away from the ravages of HDMI standardization.

The upside of this approach is that it shifts the complexities of supporting new output interfaces away from the device and to the cable itself—if something comes along and replaces HDMI (or if you want to connect to another existing interface, like DVI or even GPIB), you won't need to throw out your expensive device and get a new one to get support. Apple just has to update its software to support the interface and make an appropriate adapter to supply the hardware, since the adapter is the piece that's actually responsible for handling output.

I'm not getting this upside. Why can't software in the main device switch the protocol? It would seem to be easier to manage software updates on the main device than on a cable. And if the cord and main device are dumb, why couldn't one stick an optional, smart adapter on the cord? The method Apple's using seems to be a hack around poor planning when designing Lightning or the OS that uses it.

I thought Lightning had a DisplayPort stream embedded in it or was using it as a transport? DP provides a full-fat signal that is converted by adaptors in a straightforward fashion without producing a 'livestream' of your device. Apple even pioneered DP, so what is going on here?

The upside of this approach is that it shifts the complexities of supporting new output interfaces away from the device and to the cable itself—if something comes along and replaces HDMI (or if you want to connect to another existing interface, like DVI or even GPIB), you won't need to throw out your expensive device and get a new one to get support. Apple just has to update its software to support the interface and make an appropriate adapter to supply the hardware, since the adapter is the piece that's actually responsible for handling output.

I'm not getting this upside. Why can't software in the main device switch the protocol? It would seem to be easier to manage software updates on the main device than on a cable. And if the cord and main device are dumb, why couldn't one stick an optional, smart adapter on the cord? The method Apple's using seems to be a hack around poor planning when designing Lightning or the OS that uses it.

Well, keep in mind that the old 30-pin adapter has been around for 10+ years at this point. If you suppose that Lightning will be around at least as long, it makes more sense (though TBH it seems like we're moving away from wires altogether as time goes on).

the Lightning version of the adapter only supports an output resolution of 1600×900, not 1920×1080

Then the upside is supposed to be for new formats...like what? 4K? If they can't even manage proper HDMI, which has been around for a while, how will they even support newer formats?

And don't get me started about OS X's lack of support for HD audio formats. When I was interested in a Mac Mini recently to use as an HTPC, the best answer I got was "Boot Camp into Windows," which supports DTS-HD/Dolby TrueHD.

I love you sometimes, Apple, but seriously. WTF? Why the resistance sometimes? Why you hate HDMI?

I thought Lightning had a DisplayPort stream embedded in it or was using it as a transport? DP provides a full-fat signal that is converted by adaptors in a straightforward fashion without producing a 'livestream' of your device. Apple even pioneered DP, so what is going on here?

That's Thunderbolt.

And people actually hook their phones or iPads up to their TVs? That's craaaazy.

The price for a much smaller interface is that Lightning simply can't support the HDMI bandwidth. Given that, compression and other games (have a cable with 8 GB of memory and store the whole movie in the cable before playing!) are the only choices available.

There is no magic button to allow a hardware interface to increase its bandwidth.

He is saying to keep it all in software and not include extra hardware, except possibly at an endpoint. That way you buy only what you need: if you don't need the HDMI output, you don't have to pay for it.

They are doing what you suggested.

No, that isn't what I suggested. There's already a computer at one end: the device you connect the cable to. It'd be simpler for that device to handle all the smartness instead of introducing a new device in the cable. That is, there should be no need to introduce any new hardware. Of course if the cable can't handle the bandwidth, it might require a hack like what Apple did, but as I said, if a just-introduced cable can't handle current use cases, that indicates poor planning.

And people actually hook their phones or iPads up to their TVs? That's craaaazy.

Now, for a moment, think about a family traveling with kids, hooking something up to the hotel TV upon arrival. Any less crazy?

For a business traveler, imagine you're at a conference, and you get some potential customers/partners to chat. Again, connect iPad to TV, but this time, run Keynote (or DocumentsToGo or whatever) instead of some SpongeBob videos. Or maybe just give a demo of your actual product.

This is very useful stuff. Makes me glad I've got the last generation that uses the 30-pin port. Here's hoping that, before my hardware gives out, iOS switches to Thunderbolt or somesuch, or high-end devices start including HDMI connectors.

Or alternately, that someone makes an Android (or Ubuntu or whatever, maybe even Windows 8/9/X) tablet that I actually find usable. It's gotten to the point where I'll seriously consider it!

I will say this: for the business traveler case (presentations and demos), the higher price point coupled with the identical quality and more convenience makes traveling with an AppleTV start to sound a little better.

When I knew for sure I could get better results with less impact on device resources with an adapter cable, that was one thing. But if it's all being compressed into AirPlay regardless, why not spend slightly more and have the untethered video option?

(I used to travel with my own AirPort Express base station, to make hotel wireless suck less. Long before that, I used to travel with one of the old "graphite" base stations, turning dialup into wireless. I really could see doing this, depending on how things go.)

Apple without Jobs is a failure. These halfa$$ed products without proper planning are reminiscent of the 90's Apple.

At first I wouldn't agree with you. I'd lean back on the "Jobs had a roadmap planned". But now, it's not like it's failure after failure, but it's just half-assed or half-baked now. Seems like it's all the time.

I doubt I'll switch from the Apple products I have just yet (I love my iP5; AirPlay, iTMS Match, and other benefits are too good), but I am disappointed.

Apple's now in the business of half-assing things? That's a good way to lose that luster.

No, they're in the business of convincing people to buy things without looking at features, performance, or price. As such, as long as it has an Apple logo on it, the quality of the output is neither here nor there: "suitably acceptable".

Does anybody know exactly how much bandwidth the Lightning cable has versus how much is required to support a "raw" HDMI signal? I have yet to see anything that spells out exactly how fast Lightning is supposed to be and it's now been out for a half a year already.
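For a rough sense of the numbers, here's a back-of-the-envelope sketch. The raw-video and HDMI wire-rate figures follow from the 1080p60 timing standard; the ~480 Mbit/s Lightning figure is an assumption (USB 2.0-class throughput), since Apple hasn't published a spec:

```python
# Back-of-the-envelope bandwidth comparison for raw 1080p60 vs. a serial link.
ACTIVE_W, ACTIVE_H, FPS, BPP = 1920, 1080, 60, 24

# Uncompressed active video, ignoring blanking intervals.
raw_bps = ACTIVE_W * ACTIVE_H * FPS * BPP
print(f"Raw 1080p60 video: {raw_bps / 1e9:.2f} Gbit/s")

# On the wire, HDMI's TMDS encoding sends 10 bits per 8-bit symbol across
# three data channels, clocked at 148.5 MHz for 1080p60 (blanking included).
tmds_bps = 148.5e6 * 3 * 10
print(f"HDMI wire rate (3 TMDS channels): {tmds_bps / 1e9:.2f} Gbit/s")

# ASSUMPTION: Lightning's usable throughput is USB 2.0-class (~480 Mbit/s).
lightning_bps = 480e6
print(f"Shortfall vs. raw video: {raw_bps / lightning_bps:.1f}x")
```

Under those assumptions, raw 1080p60 needs roughly 3 Gbit/s of payload (about 4.5 Gbit/s on the HDMI wire), more than six times what a USB 2.0-class link can carry, which is why a compressed H.264 stream plus a decoder in the adapter is plausible.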

And people actually hook their phones or iPads up to their TVs? That's craaaazy.

Now, for a moment, think about a family traveling with kids, hooking something up to the hotel TV upon arrival. Any less crazy?

For a business traveler, imagine you're at a conference, and you get some potential customers/partners to chat. Again, connect iPad to TV, but this time, run Keynote (or DocumentsToGo or whatever) instead of some SpongeBob videos. Or maybe just give a demo of your actual product.

This is very useful stuff. Makes me glad I've got the last generation that uses the 30-pin port. Here's hoping that, before my hardware gives out, iOS switches to Thunderbolt or somesuch, or high-end devices start including HDMI connectors.

Or alternately, that someone makes an Android (or Ubuntu or whatever, maybe even Windows 8/9/X) tablet that I actually find usable. It's gotten to the point where I'll seriously consider it!

What are you looking for to make it "usable"? I find both the Nexus 10 and Asus Transformer to be extremely usable, and both have micro-HDMI out. The Samsung has an S Pen with the ability to use more than one app on the screen at a time. Haven't tried a Windows 8 tablet yet (would like to), but they seem usable.

Consider me someone surprised and disappointed by this news.. seems like a compromise/oversight that shouldn't have been made..

It seems to me they designed the connector with accessories in mind, and when it came to video output, they just had to hack something together as best they could.

With a raw stream being provided in hardware, and the actual smarts being in the hardware and software for the adapter, that means accessory developers will theoretically have a lot more flexibility than was offered by the old 30-pin adapter.

With the old adapter, the pinout was fixed, offering composite video, S-Video, audio, USB and (originally) FireWire. It could detect when you plugged in an HDMI adapter, and support that instead... but that was an exception fixed in hardware.

The new adapter offers a plain stream of data. Which means it does some things, like HDMI output, much worse, but it has the potential to support accessories in a much more flexible way.

Given I use iPad/iPhone accessories regularly, and use Airplay for streaming video, I think it's a fair trade-off. If I used HDMI with my iPad extensively, I'd probably think otherwise.

Apple without Jobs is a failure. These halfa$$ed products without proper planning are reminiscent of the 90's Apple.

The iPhone 5 was released just over a year after Steve Jobs resigned from Apple's board. Clearly he would've had no involvement in its design, as Apple doesn't even think about future phones until the current one is released. New connector, new adapters, new manufacturing process for the aluminum case, new OS. Jobs died, all Apple employees turned to shit and they started on a new phone.

So, to solve a minor future-standards-compatibility concern, they came up with an expensive implementation that cannot properly handle HDMI 1080p (a circa 2002/2003 standard)? Even if it were the correct 1920x1080 at 60 Hz progressive scan, H.264 will still always be lossy and produce artifacts. That's seen as an acceptable compromise over a network, but there's no excuse for such lazy engineering from the PC to the display.

If the adapter only supports 1600x900, how is an iOS update supposed to provide 1080p support?

That's somewhat vaguely covered by the article:

The Article wrote:

The Panic Blog proceeded to tear one of Apple's adapters apart and, with the help of an anonymous commenter purporting to be an Apple employee, figured out what was going on: the iDevice is actually sending an H.264-encoded image through the Lightning interface to the adapter, where a small ARM chip decodes that information and outputs it over HDMI.

The limitation being spoken about is that iOS is only sending the stream at 1600x900. Presumably -- and that's a large assumption -- iOS could be updated to send it at 1080p. The open question is whether the adapter can handle that stream, or whether the stream can be compressed enough -- without further artifacting -- for decoding to fit within the adapter's hardware constraints.

I would assume that there's a reason it's locked in at 1600x900, and that is entirely due to the adapter's hardware limitations. So the improvement will have to come in how the stream is sent to the adapter.
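To see the tradeoff concretely, here's a toy calculation. The 30 Mbit/s link budget is purely illustrative (the real stream bitrate is unpublished); the point is that at a fixed bitrate, each pixel of a 1080p60 stream gets fewer bits than a 1600x900 one, so pushing the resolution up without more bandwidth means heavier compression:

```python
# Bits available per pixel at a fixed stream bitrate, for two resolutions.
LINK_MBPS = 30.0  # ASSUMPTION: illustrative H.264 stream bitrate
FPS = 60

for w, h in [(1600, 900), (1920, 1080)]:
    bpp = LINK_MBPS * 1e6 / (w * h * FPS)
    print(f"{w}x{h}: {bpp:.3f} bits/pixel at {LINK_MBPS:.0f} Mbit/s")
```

At the same bitrate, 1080p60 gets roughly 30 percent fewer bits per pixel than 1600x900 at 60 fps. So either the encoder gets smarter, the bitrate goes up, or the artifacts get worse.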

"Another is that, at least for now, the image quality and resolution delivered by the Lightning version of the adapter is lesser than that provided by the older 30-pin version of the cable. However, future iOS updates should be able to add true 1080p resolution support to the adapter as well as improved streaming quality."

Why does the story author assume Apple plans to fix this issue, or that they even can?

Yeah, if this were that forward thinking, could they not have just used a Thunderbolt interface+connector?

The form-factor for one.

Then there's the fact that Thunderbolt combines PCI Express and Displayport.

Sure, they could support Displayport, but PCI Express on a mobile device?

Well, they could just have used a different connector if that were the problem.

The PCIe would let them use off-the-shelf chips for USB 3 or SATA interop as needed, and I don't think, at this point in time, that it would imply overhead that is out of reach for mobile platforms.

"Another is that, at least for now, the image quality and resolution delivered by the Lightning version of the adapter is lesser than that provided by the older 30-pin version of the cable. However, future iOS updates should be able to add true 1080p resolution support to the adapter as well as improved streaming quality."

Why does the story author assume Apple plans to fix this issue, or that they even can?

Because the story isn't about the issue specifically, but rather that Lightning was designed with this flexibility in mind. Whether or not this specific issue *can* be fixed with the current adapter is not the focal point of the article -- the focal point of the article is illustrating how much of the processing has been moved to the cable/adapter to meet Lightning's design requirements.

In theory, this would allow more efficient streaming mechanisms to be utilized with the current adapter to produce 1080p.

Why does the story author assume Apple plans to fix this issue, or that they even can?

The limitation seems to be in the adapter itself.

I'm sure they can fix it... but not necessarily with the current adapter. By the time the next iPad rolls around, they will be able to put a faster ARM chip in the video adapter, and market it as the "New New iPad Video Adapter With Crisper Video".