
Apple's Lightning-to-HDMI adapter and the product page for the iPad mini claim that the device can output 1080p images to televisions. But that may not be strictly true, according to some research from developers over at Panic. They've discovered two separate but related issues with the claim: first, the Lightning version of the adapter only supports an output resolution of 1600×900 from the iPad mini, not the advertised 1920×1080. Second, the output is full of visual artifacts consistent with signals that have been compressed and sent to a TV through Apple's AirPlay feature.

The Panic Blog proceeded to tear one of Apple's adapters apart and, with the help of an anonymous commenter purporting to be an Apple employee, figured out what was going on: the iDevice is actually sending an H.264-encoded image through the Lightning interface to the adapter, where a small ARM chip decodes that information and outputs it over HDMI.

A small ARM chip decodes an H.264 stream sent from the iOS device and outputs it over HDMI.

"The reason why this adapter exists is because Lightning is simply not capable of streaming a 'raw' HDMI signal across the cable," said the supposed Apple employee. "Lightning is a serial bus. There is no clever wire multiplexing involved."

The upside of this approach is that it shifts the complexities of supporting new output interfaces away from the device and to the cable itself—if something comes along and replaces HDMI (or if you want to connect to another existing interface, like DVI or even GPIB), you won't need to throw out your expensive device and get a new one to get support. Apple just has to update its software to support the interface and make an appropriate adapter to supply the hardware, since the adapter is the piece that's actually responsible for handling output.

One downside of this approach is that it makes adapters more costly for consumers: the Lightning-to-HDMI adapter in question costs $49 on the Apple store, while the older 30-pin-to-HDMI adapter costs $39. Another is that, at least for now, the image quality and resolution delivered by the Lightning version of the adapter are lower than those provided by the older 30-pin version of the cable. However, future iOS updates should be able to add true 1080p resolution support to the adapter as well as improved streaming quality.

"Certain people are aware that the quality could be better and others are working on it," said the Panic Blog's anonymous commenter. "For the time being, the quality was deemed to be suitably acceptable."


Andrew Cunningham
Andrew wrote and edited tech news and reviews at Ars Technica from 2012 to 2017, where he still occasionally freelances; he is currently a lead editor at Wirecutter. He also records a weekly book podcast called Overdue. Twitter: @AndrewWrites

120 Reader Comments

The lightning connector can push uncompressed 1080p. The A6 cannot encode the screen output into 1080p h.264 in real time, so it pushes a lower-res stream to the adapter, which then turns it into raw, uncompressed HDMI bitstream. When the A7 or whatever is released, using the same adapter, the quality will go up as the newer chip will be able to encode h.264 at a higher res in real time. In 5 years, the same adapter, in theory, should be able to output 4k over HDMI, as long as the hardware on the device side can keep up. That's the cool part about isolating the HDMI conversion from the device.

These two sentences contradict one another; if it's encoded, it's not uncompressed. If Lightning could do uncompressed HDMI, then the speed of the A6 would be irrelevant: it would just dump the stream to the wire and the wire would dump it to the display. All the complexity is there to reduce the bandwidth to what fits over 'Lightning' connections. Thus far that would appear to be 1600x900; if they start to encode at 1080p, they would necessarily have to use a higher level of compression to keep the overall bandwidth the same.

Yeah, I think it would make sense if it said "can't push uncompressed 1080p" - I'd guess that's what was intended?

I'm not sure what customer today will be impressed that their $50 HDMI cable might work better in 5 years. Cool, sure. The A7 or some later chip may even do 4K well, which makes this solution technically really elegant, but the issue is that the current chips can't even do 1080p well, so it's a really terrible solution to roll out the door. Hello Theory, I'd like to introduce you to my friend Blurry Stuttery Laggy Artifacted Video. (Yes, he has a long name...)

Yeah, if this were that forward thinking, could they not have just used a Thunderbolt interface+connector?

The form-factor for one.

Then there's the fact that Thunderbolt combines PCI Express and Displayport.

Sure, they could support Displayport, but PCI Express on a mobile device?

Well, they could just have used a different connector if that were the problem.

PCIe would let them use off-the-shelf chips for USB 3 or SATA interop as needed, and I don't think, at this point in time, that it would imply overhead that is out of reach for mobile platforms.

There is a lot of engineering that can be thrown at this discussion, but that doesn't seem to fit with what Ars is these days. I personally like PCIe; I have a fair amount of experience with it. But it doesn't really make sense for an ARM-based mobile platform. The Lightning connector does make a lot of sense though; from what I have been able to learn, it's a bit like "Thunderbolt lite". With the number of processors doing work rather than ASICs, I will bet there is the opportunity for things to get updated via software. This connector may have been about meeting the hardware engineering challenges first, then following along with software to complete the solution. At least, that's the approach I would take if running a project like this.

If you don't like the way Apple has decided to provide HDMI, then don't buy it. (I don't like it, being a video producer.)

But let's not lose touch with reality: if the 0.0002% of people who can even notice the lack of quality stop buying Apple stuff and go buy an HTC Galaxy Droid, it won't change the fact that Apple sold more iPhone 4Ses than Samsung sold Galaxy S3s, and Apple sold more iPhone 5s than Samsung sold Galaxy S3s.

That is to say: in a world where good enough is good enough (MP3s instead of lossless, digital HD downloads at 4 Mbps instead of Blu-rays at 35 Mbps), Apple's plan will, as always, probably thrive despite the outcry of the nerds, because of the convenience, ease of use, and reduction in the size and cost of the phone. (Seriously, how many people do you think ever plug an HDMI cable into their phones who don't read Ars/Gizmodo/TheVerge?)

Lol.... Samsung sells more than just 1 phone. I believe their market share is significantly bigger than Applesauces.

If the limitation really only applies to display mirroring as some comments suggest, given that the iPad mini (which is apparently what was tested) only has XGA resolution (1024x768) to begin with, why would anybody even expect the output to be 1080p for this particular use case?

So, if it's similar to the way the video cable works on the 30-pin iOS devices, "display mirroring" isn't quite right.

When you plug a video cable of any kind into a pre-Lightning iOS device, the software can detect this. You can discover that there's an external display, and you can configure it somewhat via the frameworks. You can ask for a list of supported resolutions, and you can force the display to switch to a different resolution. You can actually use it for stuff other than mirroring or media -- for example, presentation software can show slides on the external display and notes on the built-in one. If used in this mode, the resolution (and orientation, portrait vs. landscape) supported externally is in *no* way constrained by the built-in display.

On sufficiently recent devices, when you're running software that doesn't "know about" multiple displays, it just mirrors to them. But that's actually a fallback (and it was added later), not the most basic functionality.

I'll go dig out my old source code for a trivial iOS app that probed all the displays and reported on all the resolutions each of them could support, and see if I can get a buddy with a Lightning-based iOS device to run my test code.

Apple's 30-pin connector was far more stable than the other "standardized" connectors, especially USB.

So, WTF did they ditch it for something inferior?

Physics. They were trying to shrink the amount of space the connector took on the board and all, so they could make the phone thinner and lighter and stuff. They've actually been pretty clear about that.

I really wish they'd stuck with the 30-pin connector, but it was preventing them from moving in the miniaturization directions they wanted.

(Myself, I'd almost certainly buy a thicker iPhone with better battery life if they offered one. It's already more than thin/light enough. Increasingly, I'm just not their target consumer.)

Yes, because it's far more likely that HDMI will suddenly become obsolete than I will have to replace my iDevice and/or lose the $50 adapter repeatedly. I'm so glad Apple has future-proofed me away from the ravages of HDMI standardization.

I thought Lightning had a DisplayPort stream embedded in it or was using it as a transport? DP provides a full-fat signal that is converted by adaptors in a straightforward fashion without producing a 'livestream' of your device. Apple even pioneered DP; what is going on here?

That's Thunderbolt.

And people actually hook their phones or iPads up to their TVs? That's craaaazy.

I plug my LG Optimus G straight into my TV with an MHL cable. The phone supports it and I can watch movies I rent from the play store on my TV. And my phone supports miracast and wifi-direct.

You mean in lossy H.264, just like the case here? Except now encoding a larger surface to the same bandwidth. Congratulations, but can it send 1080p without artifacts, or even just mirror its native resolution?

Some of us were telling you from the very beginning that Lightning was likely nothing more than a non-standard USB port. "Oh no," you replied, "that's not clever enough for Apple!" Well, apparently Apple is so unclever that they actually did repurpose USB for Lightning, and not only that, they apparently also weren't clever enough to include MHL either.

Reading all the comments, I think there's a lot of speculation and misunderstanding flying around. Ars should have investigated further on this instead of basically repeating what that anonymous guy at Panic blog said.

I do have an iPad mini, but not the adapter, so I can't test the claims right now. But there are people saying that playing 1080p video on the device will make the adapter output 1080p. This would be without adding more compression artifacts, because the H.264 stream is passed directly to the adapter. This should be tested.

Another thing one might take a look at is whether alternate media players available on the App Store, such as AVPlayer, can output similarly with a non-H.264 stream.

Next, there's the issue of the AirPlay Mirroring getting 1600x900 resolution. One problem, as a few people noticed, is that iPad mini's resolution is only 1024x768. This means the screen is actually upscaled. I'm not sure why it's not just outputting at 1024x768, or heck, if the screen ratio is to be kept at 16:9, go with 1366x768... But in any case, the adapter doesn't actually NEED to go 1920x1080.

Now, that means one thing we can check is to plug the adapter into 4th gen. iPad, which has 2048x1536 resolution, and see if the mirrored screen comes out at (downscaled) 1920x1080. This is also testable.

These are the minimum that Ars, or anyone else actually willing to go beyond the fixation of the supposed 1600x900 limitation, should test. I hope for a follow-up.

Here comes the class action lawsuit for false advertising. But at least false advertising is a good reason to take legal action against a company, although it would probably be better to file the complaint with the FTC instead of a civil lawsuit.

Yes, because it's far more likely that HDMI will suddenly become obsolete than I will have to replace my iDevice and/or lose the $50 adapter repeatedly. I'm so glad Apple has future-proofed me away from the ravages of HDMI standardization.

I vaguely recall comments on why Apple was right to refuse USB3 standardization because they were able to push uncompressed 1080p video.

The lightning connector can push uncompressed 1080p. The A6 cannot encode the screen output into 1080p h.264 in real time, so it pushes a lower-res stream to the adapter, which then turns it into raw, uncompressed HDMI bitstream. When the A7 or whatever is released, using the same adapter, the quality will go up as the newer chip will be able to encode h.264 at a higher res in real time. In 5 years, the same adapter, in theory, should be able to output 4k over HDMI, as long as the hardware on the device side can keep up. That's the cool part about isolating the HDMI conversion from the device.

Actually, the devs at Panic used an iPad mini for this test, which has an old A5 chip in it. Would an A6 device be able to encode 1080p...?

If the adapter only supports 1600x900, how is an iOS update supposed to provide 1080p support?

That's somewhat vaguely covered by the article:

The Article wrote:

The Panic Blog proceeded to tear one of Apple's adapters apart and, with the help of an anonymous commenter purporting to be an Apple employee, figured out what was going on: the iDevice is actually sending an H.264-encoded image through the Lightning interface to the adapter, where a small ARM chip decodes that information and outputs it over HDMI.

The limitation being spoken about is that iOS is only sending the stream at 1600x900. Presumably -- and that's a large assumption -- they could increase it to 1080p from iOS on any given device. The assumption lies in whether the adapter can handle that stream, or whether the stream itself can be compressed enough -- without further artifacting -- for decoding to work within the adapter's hardware constraints.

I would assume that there's a reason it's locked in at 1600x900, and that is entirely due to the adapter's hardware limitations. So the improvement will have to come in how the stream is sent to the adapter.

Could it be that the limitation is that current iDevices cannot (will not?) realtime-encode 1080p in order to generate the bitstream to send out the Lightning connector?

The original article had folks asking, "If you tried to play a pre-recorded 1080p MP4, so as to bypass encoding of the framebuffer, is the output native 1080p then?" Last I read through, there didn't seem to be a conclusive answer.

Why they encode first and don't just serialize and stream the raw framebuffer is beyond me; I guess they hit some hard bandwidth limitations against other design tradeoffs?

Does anybody know exactly how much bandwidth the Lightning cable has versus how much is required to support a "raw" HDMI signal? I have yet to see anything that spells out exactly how fast Lightning is supposed to be and it's now been out for a half a year already.

I too would like to know this. With the capacity of iDevices hitting 128 GB (and only going to increase beyond that), it's astonishing to me if they designed an interface that can't at least match USB 3 transfer rates, if not exceed them. I know there's more to it than that, and I'm no expert, but I expect that if I plug a brand-new/latest-tech iDevice with a Lightning connector into my Mac with USB 3, I'll get very fast transfer rates at least equivalent to USB 3. I haven't seen (I have neither an iPhone 5 nor a USB 3-capable Mac) or read any evidence that this is the case, and if not, why not?

The thing is, they need only have added a few more pins, which would've made hardly any difference to the overall size of the adapter, and it might've solved this and many other issues. This adapter is not an elegant solution at all.

This, Maps, and the insufficient battery in the iPhone 5 make me seriously question what has happened to the talent at Apple.

So, to solve a minor future-standards-compatibility concern, they came up with an expensive implementation that cannot properly handle HDMI 1080p (a circa 2002/2003 standard)? Even if it were the correct 1920x1080 60 Hz progressive-scan resolution, H.264 will still always be lossy and prone to artifacts. That's seen as an acceptable compromise over a network, but there's no excuse for such lazy engineering from the PC to the display.

I must be missing some crucial application people are using their iPad minis for when they plug them into a TV. But if they are playing films, then those films will be H.264-compressed anyway, so even if it were doing uncompressed HDMI it would still be decompressing an H.264-compressed file: *zero* gain in quality, particularly since it can likely stream the file straight to the connector, meaning not even any transcoding during the transition.

The only thing I can think of is games? Other apps? I suppose a painting app that connected to the TV, if such a thing existed, might be nice to have pixel-perfect, but games mean movement (video), for which people are fairly robust against compression artefacts anyway (they notice them much more in static pictures). Again, I suppose displaying pictures, but is that a common use case?

I suppose I would rather have pixel-perfect output if it had no cost, but I would take the Lightning connector over the old crappy one any day: it's smaller and it doesn't matter which way up it goes (micro USB etc. can be quite annoying to get oriented and into place in comparison), and it also comes out a lot less. And I use it every day, unlike connecting it to a TV to do stuff other than watch H.264-encoded films.

I thought Lightning had a DisplayPort stream embedded in it or was using it as a transport? DP provides a full-fat signal that is converted by adaptors in a straightforward fashion without producing a 'livestream' of your device. Apple even pioneered DP; what is going on here?

You're thinking of Thunderbolt. Lightning is more like a souped up USB (not really, but useful for comparison)

There seems to be a lot of confusion about the capabilities of the Lightning interface, and I think it would be helpful to stop and consider that this interface essentially has no host controller whatsoever. Take a look at any of the tear-downs of the new iDevices and you'll not find a discrete high speed serial controller chip anywhere between the SoC and the Lightning receptacle. Furthermore, nobody has spotted any new functional blocks in the die shots of the 32 nm revision of the A5 used in the iPad mini and iPod touch (5th Gen).

Lightning is just a 9-pin plug and receptacle that supports two differential signaling pairs for serial communication. The physical layer protocols used on those pairs have to be protocols supported by the SoCs in the devices. In the case of the A5, the only serial protocols available that can be implemented on just 1 or 2 signaling pairs without an additional SerDes or some form of proprietary multiplexing are USB 2.0 and basic serial (UART). It appears that the Swift based SoCs may be in the same boat.

So even though the A5 can output uncompressed digital video at 1080p30, it can't do so without using more than 5 pins. Apple's transitional solution was apparently to use the H.264 encoding and streaming framework that was already in place for AirPlay to encode the frame buffer so that it could be sent over Lightning using the fastest available protocol (ostensibly USB 2.0).

While this does make Lightning appear to be bunk compared to other interfaces, bear in mind that it is only limited by the current SoCs. Envision Lightning once those SoCs have MIPI M-PHY capability and things start to look much better. Lightning can then handle 5-6 Gbit/s, SuperSpeed USB, uncompressed digital display output and even PCIe (PCI SIG is actually already working with MIPI on this). So suddenly in that context the similarities between Thunderbolt and Lightning run deeper than just the names bestowed on them by the marketing department.

I thought Lightning had a DisplayPort stream embedded in it or was using it as a transport? DP provides a full-fat signal that is converted by adaptors in a straightforward fashion without producing a 'livestream' of your device. Apple even pioneered DP; what is going on here?

That's Thunderbolt.

And people actually hook their phones or iPads up to their TVs? That's craaaazy.

Well, I've actually noticed this issue recently (with a MONITOR) as I'm trying to use the iPad mini as a PCoIP client, with a wireless keyboard and Lightning-VGA adapter on my desk.

If I'm able to stream a movie on the iPad, why not watch it on a TV if that is the most convenient option available?

There seems to be a lot of confusion about this HDMI-over-Lightning issue because only Panic has done testing, and Panic used an iPad mini, which has CPU/video limitations. Would the same limitation also happen with an iPad 4? Unknown.

I've seen the new Lightning connector and physically it's a nice design - small yet solid and secure. But overall it's a bit of a ripoff. Apple did not need to put processors and authentication chips into it. It's a money grab, because they know people will pay it.

The info on the adaptor and iPad Mini claim it outputs 1080p and it does not. That's major. Compression and artifacts pile on. So people are paying more money for a worse connection, backed up by false claims.

According to the "Apple employee" the cable cannot stream uncompressed 1080p. That they send h.264 instead of raw digital over what is basically a video cable backs that up. It would be easier, simpler, cheaper, and consume less battery life to pipe raw video (or even just multiplex the stream) than do realtime video encoding in the iPad and decoding in the cable.

They can't just stream compressed video from a file directly to the chip either because it's not a reliable solution. That tiny ARM chip would need to support (including licensing costs!) every codec the iPad Mini plays back, or it would only work part of the time and Apple would still have to do all the work to support the rest of the formats. Not to mention you'd still get artifacts when displaying images generated by the mini (like when using the TV as a second or mirrored display).

Does anybody know exactly how much bandwidth the Lightning cable has versus how much is required to support a "raw" HDMI signal? I have yet to see anything that spells out exactly how fast Lightning is supposed to be and it's now been out for a half a year already.

I too would like to know this. With the capacity of iDevices hitting 128 GB (and only going to increase beyond that), it's astonishing to me if they designed an interface that can't at least match USB 3 transfer rates, if not exceed them. I know there's more to it than that, and I'm no expert, but I expect that if I plug a brand-new/latest-tech iDevice with a Lightning connector into my Mac with USB 3, I'll get very fast transfer rates at least equivalent to USB 3. I haven't seen (I have neither an iPhone 5 nor a USB 3-capable Mac) or read any evidence that this is the case, and if not, why not?

The thing is, they need only have added a few more pins, which would've made hardly any difference to the overall size of the adapter, and it might've solved this and many other issues. This adapter is not an elegant solution at all.

This, Maps, and the insufficient battery in the iPhone 5 make me seriously question what has happened to the talent at Apple.

Want fairy dust on that order? Show me a connector that has what you ask for and is still small. Thanks for coming out and telling us your fantasies. Your expectations are far higher than your intelligence.

I've seen the new Lightning connector and physically it's a nice design - small yet solid and secure. But overall it's a bit of a ripoff. Apple did not need to put processors and authentication chips into it. It's a money grab, because they know people will pay it.

The info on the adaptor and iPad Mini claim it outputs 1080p and it does not. That's major. Compression and artifacts pile on. So people are paying more money for a worse connection, backed up by false claims.

According to the "Apple employee" the cable cannot stream uncompressed 1080p. That they send h.264 instead of raw digital over what is basically a video cable backs that up. It would be easier, simpler, cheaper, and consume less battery life to pipe raw video (or even just multiplex the stream) than do realtime video encoding in the iPad and decoding in the cable.

They can't just stream compressed video from a file directly to the chip either because it's not a reliable solution. That tiny ARM chip would need to support (including licensing costs!) every codec the iPad Mini plays back, or it would only work part of the time and Apple would still have to do all the work to support the rest of the formats. Not to mention you'd still get artifacts when displaying images generated by the mini (like when using the TV as a second or mirrored display).

I think you're the one making some false claims. First of all, take a look at Apple's financial statement from the past quarter where they break out the numbers for accessories. If Lightning was a money grab, it failed miserably, because there was no noticeable increase in the amount of revenue reported by Apple.

As for authentication, even the 30-pin connectors contained EPROMs to store digital certificates so that MFi developers could write apps leveraging the iAP protocol (i.e. apps can determine whether a device with certain capabilities or produced by a certain manufacturer is present.)

And I'm fairly certain that video mode does output 1080p and does not transcode.

Counterintuitively, it would also appear that proprietary multiplexing solutions such as MHL are not as power efficient as Apple's method, and they place the complexity and cost in the device as opposed to the adapter.

As I pointed out in my previous post, using H.264 for display output is simply a result of using pre-Lightning SoCs in conjunction with the new interface.

Counterintuitively, it would also appear that proprietary multiplexing solutions such as MHL are not as power efficient as Apple's method, and they place the complexity and cost in the device as opposed to the adapter.

Why would you want the complexity to be in the adapter? All that would do is drive up the cost of adapters, rather than using the power of the device you've already got.

Apple have this massive piggybank full of billions of dollars, it's obvious they're making money hand over fist, and you guys are saying it's a good thing they don't pile these features into the device because it drives up the price? iPhones are already 30-100% more expensive than their competitors. Are you seriously telling me they *need* to pass the extra cost of building things into the device on to the consumer?

I feel I have to agree that Apple seems to have lost its sheen and perhaps run out of ideas. They haven't innovated anything special since the iPhone 3G. And yet people swallow up their advertisements like propaganda.

For example: The iPhone 4/4S is quite a heavy phone, and at its release it was nowhere near the lightest smartphone, and did anybody care about the extra weight, really? I mean, there were probably grumbles here and there, but that's it. The iPhone 5 comes out advertising how light it is, and suddenly everyone and their dog is bleating on about how amazingly light it is and how it's so cool and great. Or let's go back even further, to when Apple released Siri with the 4S, which is nothing new; it's just speech recognition with some AI to process sentence structure. It had been around for a while, but when Apple do it, suddenly it's a must-have feature. I had an iPhone 4S; do you know how often I used Siri? I could count it on my digits, and most of those were in the first week before I got bored. And then when you consider that it needed a 3G or Wi-Fi connection to work and didn't have the best accent detection, is it really that big a feature? Oh, and I almost forgot: most of its features don't work outside the US.

I digress, I don't mean to turn this into an Apple-bash. My point being that it isn't the first time Apple have used the powers of advertising to make people think they're getting a great feature or deal, when in fact it is just par for the course.

EDIT:

Also, if the cost of the adapter is so little compared to the phone, I don't understand why the feature isn't built in. Do you really go look at products and think "Ah yes, this is good, the adapter will work with my next phone and save me a (relatively) small amount of money!"

Apple have this massive piggybank full of billions of dollars, it's obvious they're making money hand over fist, and you guys are saying it's a good thing they don't pile these features into the device because it drives up the price? iPhones are already 30-100% more expensive than their competitors. Are you seriously telling me they *need* to pass the extra cost of building things into the device on to the consumer?

I feel I have to agree that Apple seems to have lost its sheen and perhaps run out of ideas. They haven't innovated anything special since the iPhone 3G. And yet people swallow up their advertisements like propaganda.

For example: The iPhone 4/4S is quite a heavy phone, and at its release it was nowhere near the lightest smartphone, and did anybody care about the extra weight, really? I mean, there were probably grumbles here and there, but that's it. The iPhone 5 comes out advertising how light it is, and suddenly everyone and their dog is bleating on about how amazingly light it is and how it's so cool and great. Or let's go back even further, to when Apple released Siri with the 4S, which is nothing new; it's just speech recognition with some AI to process sentence structure. It had been around for a while, but when Apple do it, suddenly it's a must-have feature. I had an iPhone 4S; do you know how often I used Siri? I could count it on my digits, and most of those were in the first week before I got bored. And then when you consider that it needed a 3G or Wi-Fi connection to work and didn't have the best accent detection, is it really that big a feature? Oh, and I almost forgot: most of its features don't work outside the US.

I digress, I don't mean to turn this into an Apple-bash. My point being that it isn't the first time Apple have used the powers of advertising to make people think they're getting a great feature or deal, when in fact it is just par for the course.

EDIT:

Also, if the cost of the adapter is so little compared to the phone, I don't understand why the feature isn't built in. Do you really go look at products and think "Ah yes, this is good, the adapter will work with my next phone and save me a (relatively) small amount of money!"

Just my two pence.

Lack of innovation isn't something that can credibly be said of Apple.

The 4 had the revolutionary screen.

The 4S was a revolution in phone photo quality.

The iPhone 5 has a 1420 mAh battery. The Galaxy S III has a 2100 mAh battery, the Droid 2200 mAh, and the Lumia 920 2000 mAh. For the roughly one-third reduction in battery, there is a revolutionary change in the efficiency of the platform to get its 8 hours of talk time. It is incredibly light, remarkably so; it really is, and you have to try it to understand what they did.

The iPhone 5 also has a revolutionary new cord that is simple and easy and secure, above and beyond anything else that is out there.

Maps and Siri are an ongoing project with small improvements as time goes on and a development for the platform as time goes on.

The screen on the 4 was not revolutionary. Maybe it was high-resolution for a smartphone at the time, but it was already a natural evolutionary step, and one that was going to come with the next Samsung model and its AMOLED anyway.

The camera, again, is just progression, not innovation.

I have used the iPhone 5, and I admit it is very light, but clearly it has some drawbacks. The screen isn't wide enough compared to the competition, so you have less screen real estate. What use is the high DPI if you can't fit much on the screen?

A revolutionary new cord? It's a power lead; it plugs in and charges the phone. There's nothing special here.

Maps was an utter failure, and although Siri is a nice feature to have, it's rarely used by the majority of people.

Why would you want the complexity to be in the adapter? All that would do is drive up the cost of adapters, rather than using the power of the device you've already got.

Ah, see part of the answer lies in your question. The power of Lightning is equivalent to the power of the SoC in the device you've already got. At the moment, those SoCs have limited serial capabilities and require expensive adapters to funnel functions available via the much wider 30-pin dock connector through the new 9-pin interface. Once we start seeing SoCs with integrated host controllers specifically designed to leverage Lightning, the need for expensive adapters will be reduced significantly.

Apple knows exactly what the sales volume of their 30-pin AV adapters was like compared to overall device sales. Clearly they decided that it would be better if the limited number of folks buying these adapters carried the burden of the additional cost during this transitional period, rather than saddling everyone who buys a new Lightning device with paying for a function they will probably never use. And the costs go beyond just the BOM. Adding a discrete chipset solely to handle digital display output would have also taken up valuable PCB real estate, interior volume, and affected the power and thermal budgets. Apple can't just add hardware features to these devices "for free" at this point, and they can't risk negatively impacting the user experience for the majority while attempting to provide the seamless continuation of a seldom used feature.

arachnae wrote:

I don't get it.. I mean literally, I just don't understand...

A lot of what I already said applies to what's causing your bewilderment as well. But to specifically address why Apple's billions in the bank doesn't translate into cool free features in all new Apple gear, you might have noticed that their stock price has gotten absolutely caned since September, and some of that has to do with the headwinds their gross margins are facing. Apple is not currently in the midst of creating a new market where there are few true competitors. Their primary markets are maturing and the competition is quite real. In order to continue to expand marketshare they are introducing products such as the iPad mini, with lower ASPs and lower margins. Having a unified accessory market for so many years was clearly beneficial to Apple, and I'm sure they would like to continue that with Lightning. If you're designing an interface that you know will be used for devices that start at $149 or less, you'll want to engineer it such that the complexity required on the device side is at a bare minimum.

Does anybody know exactly how much bandwidth the Lightning cable has versus how much is required to support a "raw" HDMI signal? I have yet to see anything that spells out exactly how fast Lightning is supposed to be and it's now been out for a half a year already.

Well, uncompressed 1080p HDMI is around 3-4 gbit/sec, so I think it is safe to say that Lightning isn't able to reach that rate.

Based on this revelation, it looks to me like Lightning is an all-digital interface; that is to say, it doesn't 'reassign pins' or the like. It is simply 2 or 3 differential pairs of all-digital data transfer. It also makes me wonder if the so-called 'authentication' chip isn't really just a digital Lightning-to-USB converter... USB requires a fair amount of power circuitry to control the power flowing over the bus, which would explain the design of the chips in the USB Lightning cable quite well.

It would be shocking to me if they had a design capability of less than 3gbit a pair. That is really old and well understood technology at this point. But who knows, it could be standard high speed USB2 for all we know <shrug>

The lightning connector can push uncompressed 1080p. The A6 cannot encode the screen output into 1080p h.264 in real time, so it pushes a lower-res stream to the adapter, which then turns it into raw, uncompressed HDMI bitstream. When the A7 or whatever is released, using the same adapter, the quality will go up as the newer chip will be able to encode h.264 at a higher res in real time. In 5 years, the same adapter, in theory, should be able to output 4k over HDMI, as long as the hardware on the device side can keep up. That's the cool part about isolating the HDMI conversion from the device.

Lightning connector most likely can't do uncompressed 1080p. Or maybe you meant adapter instead of connector?

PS: It's not really "cool", because they most definitely have an HDMI interface on the A6 silicon (probably not the chip that handles the electrical connection, though).

Does anybody know exactly how much bandwidth the Lightning cable has versus how much is required to support a "raw" HDMI signal? I have yet to see anything that spells out exactly how fast Lightning is supposed to be and it's now been out for a half a year already.

Well, uncompressed 1080p HDMI is around 3-4 gbit/sec, so I think it is safe to say that Lightning isn't able to reach that rate.

Based on this revelation, it looks to me like Lightning is an all-digital interface; that is to say, it doesn't 'reassign pins' or the like. It is simply 2 or 3 differential pairs of all-digital data transfer. It also makes me wonder if the so-called 'authentication' chip isn't really just a digital Lightning-to-USB converter... USB requires a fair amount of power circuitry to control the power flowing over the bus, which would explain the design of the chips in the USB Lightning cable quite well.

It would be shocking to me if they had a design capability of less than 3gbit a pair. That is really old and well understood technology at this point. But who knows, it could be standard high speed USB2 for all we know <shrug>

That's 1080p @ 60Hz. 30Hz is about 1.5 Gbps.

CEA-861 1080p30 (24-bit RGB or YCbCr 4:4:4) is 1.782 Gbit/s, but if 8b/10b encoding is used the physical layer gross bitrate would need to be 2.25 Gbit/s (which happens to be the bandwidth provided by MHL 1.0).

Pins 6 and 7 are clearly connected to a second differential signaling pair if you look at any of the device PCBs or cable assemblies. I was unable to determine from the teardown photos of the Digital AV Adapter if they were being used or not.

So to do uncompressed digital display output over those two pairs, the SoC would need a serial interface capable of either 2.25 Gbit/s full-duplex, or 1.125 Gbit/s over two uni-directional channels. There are not a lot of mobile SoCs in shipping devices with serial interfaces that can do more than 1 Gbit/s. In fact I can't think of any.

The point being that you could probably push 6 Gbit/s full-duplex through the Lightning receptacle and plug just fine, but you need to have some sort of controller capable of that in the device, which we just don't have yet.

jalexoid wrote:

selectodude wrote:

The lightning connector can push uncompressed 1080p. The A6 cannot encode the screen output into 1080p h.264 in real time, so it pushes a lower-res stream to the adapter, which then turns it into raw, uncompressed HDMI bitstream. When the A7 or whatever is released, using the same adapter, the quality will go up as the newer chip will be able to encode h.264 at a higher res in real time. In 5 years, the same adapter, in theory, should be able to output 4k over HDMI, as long as the hardware on the device side can keep up. That's the cool part about isolating the HDMI conversion from the device.

Lightning connector most likely can't do uncompressed 1080p. Or maybe you meant adapter instead of connector?

PS: It's not really "cool", because they most definitely have an HDMI interface on the A6 silicon (probably not the chip that handles the electrical connection, though).

Are you sure about the A6/A6X having an HDMI interface? I mean, a lot of ARM SoCs do, but why would Apple include one if they knew they were never going to use it?

And I'm pretty sure encoding 1080p30 is not an issue, even for the A5, since the camera ISP can handle it just fine. If you look at the resolutions of the various Lightning devices, you've got 1136x640 (iPhone 5 and iPod touch 5th gen), 1024x768 (iPad mini), and 2048x1536 (iPad 4).

None of those are 1920x1080, so mirroring 1:1 pixel output would result in serious letterboxing or cropping. 1600x900 is the obvious target that falls right in the middle of those three resolutions, and most likely why Apple settled on it.

The statement that "Lightning is a serial bus... not capable of streaming true HDMI over this bus" is technically wrong.

It's a data connection. ANY digital signal could, if sent correctly, be sent over that bus. The limitation isn't the bus; the device is obviously either not able to properly decode, or decode quickly enough, the data stream it's being sent. Either that, or the data stream being sent is somehow corrupt.

Even though there's obviously something wrong, or something not enabled, that prevents 1080p, the artifacting of the output image points to a decompression issue; it should at least be able to output the 1600x900 image correctly. This is likely due to either improper signals for the ARM chip to decode, a lack of buffer memory resulting in drops, or a hardware/software configuration issue within the iPad itself.

If the device is properly transmitting a signal that starts and leaves the device as 1080p, then there's probably an issue with buffering within the adapter.

My last guess is that the refresh rate isn't being set correctly for the receiving device. I'm not sure what the iPad's refresh rate is, but if it's not matching the TV or monitor, this could result in artifacts, but more likely a buffer/memory limitation.

I doubt that this person was an Apple employee, or at least not an EE for Apple who had a hand in designing this adapter.