That's the whole point - Apple can't be that dumb. I want to believe that there has to be some deeper reason behind all this. I am no tech expert, so I can't know if all this is possible or not, but what if the circuitry was to future-proof the cable for the next 10 years?

I hope you're right. I really do. Because if it's more future-proof, then it needs to be technically better as a whole, not just thinner and more flexible; it needs to retain the quality we had before and not make it worse, at least for the 4th/5th generation of mobile devices. We don't know what Apple will do in the future and I am no prophet, but currently you are pretty much forced to buy it when you're on a Lightning mobile device. Maybe they are going to change something in the 6th or a coming generation again, but it seems very plausible that it's Lightning connector, bandwidth or DRM related.

Your point falls apart once you acknowledge that you honestly don't know which chips did what. Can you assume? Sure. But that doesn't mean you know. Nor does Panic. That's why I asked you to define conjecture for me.

Even if you refuse to believe the claim that this is a decoding chip, which is actually based on good evidence (shocker!), we know there's 256MB of RAM because it's clearly labelled. In other words, it's 100% certain that this whole thing isn't just a part that was lifted from an earlier iPad, which means your assertion was wrong no matter what. Plus, why the heck would an earlier iPad have a decoding chip that decodes its own video stream? That's completely nonsensical, but you didn't read the article.

Quote:

Originally Posted by samcraig

You could have just written your post arguing my assertion. But that wasn't enough. You took it a step further by trying to make insinuations (which were wrong) about my motivation.

Sorry, but that's what you do. I don't know if you do it without realizing it, but it's easy to go back through your posting history and show that you almost always lean on the argument against Apple no matter what the story is.

Quote:

Originally Posted by samcraig

I understood your point crystal clear. I can't say I agree or disagree with it since it's not based on facts that are known.

No it's simpler than that: you didn't read the article but when someone said something positive about Apple being generous with profit margin on the adapter, you tried to spin it into Apple being greedy.

I see that you're trying to cling to an "if you cannot prove it 100% you cannot say I'm wrong" argument. But there is a very clear label on the RAM showing this isn't something that was taken off the iPad. Or are you going to say I'm still wrong because I cannot 100% prove the iPad 1, 2, 3 didn't have a hidden 256MB of RAM somewhere?

My other question is... can USB 2 carry enough bandwidth to handle video? I'd wager a guess of no, not without the same technique. So yes, the Lightning connector may top out at a specific configurable digital signal and have to send this data in a compressed form, but I also wager a guess that if there is something wrong with the MPEG decoding, Apple will be able to fix it with an update.
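The USB 2 question above can be answered with a quick back-of-the-envelope check (the numbers below are my own, not from the article):

```python
# Can USB 2.0's 480 Mbit/s signalling rate carry uncompressed 1080p video?
# (Raw pixel data only; blanking and protocol overhead would add even more.)

USB2_MBIT = 480  # USB 2.0 "Hi-Speed" signalling rate; usable payload is lower

def raw_mbit(width, height, fps, bits_per_pixel=24):
    """Raw video bandwidth in Mbit/s, ignoring all overhead."""
    return width * height * fps * bits_per_pixel / 1e6

need = raw_mbit(1920, 1080, 60)
print(f"1080p60 needs ~{need:.0f} Mbit/s raw; USB 2.0 offers {USB2_MBIT} Mbit/s")
print("fits uncompressed:", need <= USB2_MBIT)  # False: compression is mandatory
```

So uncompressed 1080p60 needs roughly six times what USB 2.0 can even signal, which is why every USB 2 video solution compresses.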

Well, they could have used the same technology as in Thunderbolt: 10 Gbit/s per device (20 total), which would allow for even HDMI 1.3 (allowing for even WQXGA (2560×1600) at 60 Hz and requiring 10.5 Gbit/s), let alone older / less capable HDMI versions (for example, the less than 5 Gbit/s, 165 MHz HDMI 1.0, which allows for up to 1920×1200 (that is, vertically more than Full HD) at 60 Hz). Then, they wouldn't need to compress / decompress the video signal and could use a far simpler (and cheaper) adapter.

Of course, this may have cost them a lot more (dunno about the actual prices for Thunderbolt vs. Lightning circuitry; I only assume the latter costs less than the former because it's far slower) - maybe this is why they went with the far inferior solution, quality-wise.

----------

Quote:

Originally Posted by iSunrise

I hope you're right. I really do. Because if it's more future-proof, then it needs to be technically better, not just because it's thinner and more flexible; it needs to retain the quality we had before and not make it worse, at least for the 4th/5th generation of mobile devices. We don't know what Apple will do in the future and I am no prophet, but currently you are pretty much forced to buy it when you're on a Lightning mobile device. Maybe they are going to change that to native processing in the 6th or a coming generation again

I'm afraid they won't be able to do this. Even if you subsample colors with 4:2:0 (which halves the bandwidth needed; see my prev. comment above: http://forums.macrumors.com/showpost...&postcount=172 ) and "only" use 1920*1080 at 60p (which the iDevice hardware H.264 decompressor has always been capable of decoding, starting with the A4 CPU), you need to transfer around 2 Gbit/s. (With 30p, half of it.) If the hardware just can't sustain that bitrate, then you will never be able to transfer uncompressed signals, no matter how hard you try. It's simple physics.

All in all, Apple should have used the same chipset as Thunderbolt. Then it'd be easily possible to drive external HDMI 1.3-capable monitors at even native Retina (!) iPad resolutions at a native 60p framerate.
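The ~2 Gbit/s figure above checks out roughly (a sketch assuming 8-bit 4:2:0 and ignoring blanking/protocol overhead, which pushes the real number higher):

```python
# Raw bandwidth for 1080p at 60 fps, with and without 4:2:0 chroma subsampling.

def gbit_per_s(width, height, fps, bits_per_pixel):
    """Raw video bandwidth in Gbit/s, pixel data only."""
    return width * height * fps * bits_per_pixel / 1e9

full = gbit_per_s(1920, 1080, 60, 24)  # 4:4:4, 8 bits per channel
sub  = gbit_per_s(1920, 1080, 60, 12)  # 4:2:0 carries half the bits per pixel

print(f"4:4:4: {full:.2f} Gbit/s")  # ~2.99 Gbit/s
print(f"4:2:0: {sub:.2f} Gbit/s")   # ~1.49 Gbit/s; ~2 once overhead is added
```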

Agreed. Lightning is a horrible adapter. It's not meant to benefit the consumer, but to shift extra cost onto the consumer. In order to make the iPhone lighter and thinner (something not demanded by the customer, but by marketing), they took out most of the onboard processing of video, audio, etc. But they didn't cut the cost of the unit.

The Lightning connector could easily have been twice as wide with the same utility, the same ability to flip it, etc., and then the phone could be doing the processing, we would still have analog audio and video out, and HDMI out. But Apple wanted to cut corners and screw over customers in the process.

Lightning is just one of the negatives of the iPhone 5, but I'm stuck with mine.

so, where'd you get your electrical engineering degree from? I'd really like to see your product portfolio so I can better weigh your technical opinion. have a link to share?

Eventually all mobile devices and monitors will use AirPlay or similar for this purpose. Nobody will be using cables.

See my posts above: AirPlay just can't be as effective as (real, not Lightning's) cabled connections, unless all you do is play back iOS-native video files. It's only in the latter case that AirPlay can deliver the same quality as (old) cabled solutions. This is all down to the laws of physics (en/decoding lag and resolution restrictions, Wi-Fi lag).

I'm afraid they won't be able to do this. Even if you subsample colors with 4:2:0 (which halves the bandwidth needed; see my prev. comment above: http://forums.macrumors.com/showpost...&postcount=172 ) and "only" use 1920*1080 at 60p (which the iDevice hardware H.264 decompressor has always been capable of decoding, starting with the A4 CPU), you need to transfer around 2 Gbit/s. (With 30p, half of it.) If the hardware just can't sustain that bitrate, then you will never be able to transfer uncompressed signals, no matter how hard you try. It's simple physics.

All in all, Apple should have used the same chipset as Thunderbolt. Then it'd be easily possible to drive external HDMI 1.3-capable monitors at even native Retina (!) iPad resolutions at a native 60p framerate.

Yes, that's what I fear as well.

Unfortunately I could not find any document on the web that says anything about the bandwidth of the new Lightning connector. What would theoretically be possible is that the connector is just a 1st generation, and that leaves a door open for future faster chipsets that are also compatible with older Lightning connectors (like FireWire did), and then we would have enough bandwidth to at least get 1080p@23.976/24Hz/30Hz uncompressed. Or Apple simply doesn't care and we need to wait for faster hardware/encoders or better codecs like H.265/HEVC to squeeze about 50% more data in. Forcing AirPlay onto people must make a lot of sense for Apple, because licensing should also get a bit easier and it's a closed system.

The Lightning connector may be flexible (and thus future-proof), but it looks more and more like they had to make a lot of sacrifices, because their major selling point, "thinner and lighter", is marketable for the masses and that leads to buyer interest.

so, where'd you get your electrical engineering degree from? I'd really like to see your product portfolio so I can better weigh your technical opinion. have a link to share?

He is at least partly right - see my Lightning vs. Thunderbolt post at http://forums.macrumors.com/showpost...&postcount=179 . Apple does have the technology that would make it possible to easily transfer even the highest-resolution uncompressed HDMI signals, let alone "simple" Full HD ones.

Of course, as we don't know how big / power hungry / costly Thunderbolt chipsets are, it's not possible to tell whether they can be used in as small / constrained a device as an iPhone.

Reading through this blog, the author makes several assumptions he calls "theories", even going as far as saying "Are we off base? Let us know".

He appears to be a programmer and not an electronics engineer.

Yet he provided images and comparisons, along with a maximum output resolution of 900p with the Lightning adapter. He doesn't need a Ph.D. in physics or engineering, because he provided evidence for his claims.

To fake something like that would be possible of course. I highly doubt that, though.

That's the whole point - Apple can't be that dumb. I want to believe that there has to be some deeper reason behind all this. I am no tech expert, so I can't know if all this is possible or not, but what if the circuitry was to future-proof the cable for the next 10 years?

This probably seems redundant, even wasteful now, but I daresay that years down the road, we will be praising Apple for their foresight when their future products start tapping on features in the lightning cable.

You're mixing the Lightning cable and this adaptor. Yes, the Lightning cable is built very well for the future. The adaptor is not. Sure you can update the software in the adaptor, but you can't update the insufficient hardware it uses. It already can't output 1080p, according to the article (and conflicting with store.apple.com's claims).

I think Apple CAN be dumb sometimes. Ping? Game Center? The buttonless iPod shuffle? I think there are just a few idiots responsible for these screwups in every company... like whoever made the Google Pixel.

See my posts above: AirPlay just can't be as effective as (real; not that of Lightning) cabled connections, unless all you do is playing back iOS-native video files. It's only with the latter case that AirPlay can deliver the same quality as (old) cabled solutions. This is all laws of physics (en/decoding lag and resolution restrictions, Wi-Fi lag).

The laws of physics? Hardly. It's simply a limitation of our current technology. After all, a radio wave, like all forms of light, travels at almost three hundred million meters per second. That's safely far beyond human perception.


Yet he provided images and comparisons, along with a maximum output resolution of 900p with the Lightning adapter. He doesn't need a Ph.D. in physics or engineering, because he provided evidence for his claims.

To fake something like that would be possible of course. I highly doubt that, though.

Theories, as he calls them, are not proof; even he is not sure. Just because you read it off the internet does not make it true or factual.

If what Panic says is true, yes. But Apple's site claims otherwise. Why am I the only person noticing this besides one other guy? It's an important mystery.

The Apple TV is an amazing device. For $100, you get a wireless video receiver at least, which alone would normally cost more, and at most a great movie-watching and music-playing box. All they have to do is release a little software update, and it can have apps... GAMES.

I agree with you, but it's as close to proof as he can get with an article published on the web. I like his honest approach, in that he openly admits it's possible he made a stupid mistake somewhere and that he may be wrong.

Since we're "back in my day"-ing, it's 12 times the hard drive space of my first Mac... and 2048 times the RAM in my first Apple device (a //c).

It's a little silly to say that Lightning can't output raw HDMI--of course it can't. Raw HDMI requires nineteen wires, which Lightning has nowhere near. The legitimate question would be whether an iOS device can output uncompressed digital 1080p video, which the existence of a mini computer in the adapter in no way answers.

It's unlikely it does, [corrected my statement after reading the linked article] given the compression artifacts noticed, but it's not impossible that higher-throughput less- or un-compressed video could be output via a later adapter, or maybe even a firmware modified version of this one (given that it is, after all, a full computer--depends on the max bandwidth of its lightning connection to the host iOS device).

Personally, I think the Lightning connector is awesome, because it's massively future-proofed; since the pins are fully reconfigurable, you can add just about anything you want to it down the line. If/when MHL (a phone-centric HDMI replacement that uses fewer wires) hits the market, you just have a new adapter with a different internal processor to handle the transcoding. Or you produce a new 4K adapter that takes advantage of a more powerful CPU in future iDevices to stream more data through the same connector and drive a higher-res display. Or whatever Apple decides to do with it down the road.

Yes, geeks can whine about expensive cables, but the fact is that the vast majority of consumers just buy whatever overpriced cable is on the shelf at RadioShack anyway, so they're highly unlikely to notice or care. At least in this case they're actually getting some technology for their buck, rather than a gold-plated placebo with an 800% markup.

Quote:

So the adapter is sort of a hack from Apple to emulate AirPlay.
Because the Lightning connector cannot output a raw HDMI signal, damn that is awful. Something always gotta give with Apple.

Well, there have been numerous cases where Apple was caught blatantly lying. (Just one example: "You're holding it wrong", while at the same time posting job ads for antenna / wireless engineers and posting (and then silently removing) videos "proving" that other manufacturers' phones also have a "death grip" problem.) I wouldn't trust them when the Panic guys have posted at least one screenshot clearly showing the video stream is recompressed.

All in all, Apple should have used the same chipset as Thunderbolt. Then it'd be easily possible to drive external HDMI 1.3-capable monitors at even native Retina (!) iPad resolutions at a native 60p framerate.

Then again Thunderbolt is not Apple's technology. It's Intel's. Would Intel implement Thunderbolt on an ARM device? Don't think so. I don't even think it's as easy as putting a chipset on the logic board.

Well, they could have used the same technology as in Thunderbolt: 10 Gbit/s per device (20 total), which would allow for even HDMI 1.3 (allowing for even WQXGA (2560×1600) at 60 Hz and requiring 10.5 Gbit/s), let alone older / less capable HDMI versions (for example, the less than 5 Gbit/s, 165 MHz HDMI 1.0, which allows for up to 1920×1200 (that is, vertically more than Full HD) at 60 Hz). Then, they wouldn't need to compress / decompress the video signal and could use a far simpler (and cheaper) adapter.

Because Thunderbolt requires an on-board controller chip. One that is not only large, but also extremely power hungry.
(It requires far more power than any MacBook SSD, and even many HDDs.)

You really don't want that on a Mobile Phone.

In addition, Intel sells the controllers (in bulk) for 40-50 USD, if I recall correctly.
It's for this reason that Thunderbolt also isn't appearing in any devices but RAID enclosures, monitors, and occasionally connectivity hubs.

The price of a hard drive would jump by a huge amount if they used it, and power consumption would basically double. For obvious reasons, that makes it even less viable for mass storage devices like USB sticks. Forget about using it in keyboards and mice, or most other cheaper USB devices. It could be implemented in webcams to allow them to transmit an uncompressed feed, but the question is whether that matters when the stream is then compressed down to less than a megabyte when transmitted over the Internet. Particularly now, when USB 3 already has enough bandwidth to transfer an uncompressed 1920x1080x24-bit x 60 Hz stream.

Future revisions of Thunderbolt will reduce these issues - but that future was not 6 months ago, and it's not 6 months from now either.
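The USB 3 claim above is easy to verify (a sketch; the 8b/10b line coding is a known property of USB 3.0's 5 Gbit/s link, the rest of the numbers are my own):

```python
# USB 3.0 signals at 5 Gbit/s but uses 8b/10b line coding, so effective
# payload bandwidth is at most ~4 Gbit/s. Does uncompressed 1080p60 fit?

USB3_RAW_GBIT = 5.0
USB3_EFFECTIVE_GBIT = USB3_RAW_GBIT * 8 / 10  # 8b/10b coding overhead

stream = 1920 * 1080 * 24 * 60 / 1e9  # ~2.99 Gbit/s uncompressed 1080p60
print(f"stream: {stream:.2f} Gbit/s, budget: {USB3_EFFECTIVE_GBIT:.1f} Gbit/s")
print("fits:", stream < USB3_EFFECTIVE_GBIT)  # True, with room to spare
```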

Then again Thunderbolt is not Apple's technology. It's Intel's. Would Intel implement Thunderbolt on an ARM device? Don't think so. I don't even think it's as easy as putting a chipset on the logic board.

No it's simpler than that: you didn't read the article but when someone said something positive about Apple being generous with profit margin on the adapter, you tried to spin it into Apple being greedy.

I see that you're trying to cling to an "if you cannot prove it 100% you cannot say I'm wrong" argument. But there is a very clear label on the RAM showing this isn't something that was taken off the iPad. Or are you going to say I'm still wrong because I cannot 100% prove the iPad 1, 2, 3 didn't have a hidden 256MB of RAM somewhere?

No. You interpreted my comments as Apple being "greedy". Let's be clear about that. It's your assumption that that's what I meant. However, you'll never know what I meant for sure because you aren't me at the keyboard. But I do know what I meant - so it's all good.

I never said I was right. In fact - I'm most likely not. However just because I'm wrong doesn't make your opinions correct or factual.

Because Thunderbolt requires an on-board controller chip. One that is not only large, but also extremely power hungry.
(It requires far more power than any MacBook SSD, and even many HDDs.)

You really don't want that on a Mobile Phone.

In addition, Intel sells the controllers (in bulk) for 40-50 USD, if I recall correctly.
It's for this reason that Thunderbolt also isn't appearing in any devices but RAID enclosures, monitors, and occasionally connectivity hubs.

The price of a hard drive would jump by a huge amount if they used it, and power consumption would basically double. For obvious reasons, that makes it even less viable for mass storage devices like USB sticks. Forget about using it in keyboards and mice, or most other cheaper USB devices. It could be implemented in webcams to allow them to transmit an uncompressed feed, but the question is whether that matters when the stream is then compressed down to less than a megabyte when transmitted over the Internet.

Future revisions of Thunderbolt will reduce these issues - but that future was not 6 months ago, and it's not 6 months from now either.

I wouldn't trust them when the Panic guys have posted at least one screenshot clearly showing the video stream is recompressed.

Depends on how he did his testing. 1600×900 is the native resolution of the iPhone 5. So did he do mirroring from the screen, or did he play a 1080p video where the video would not have to be upscaled to 1080p? Not sure if that has anything to do with it or not.

Having a screenshot of some artifacts does not tell us how he did his testing.