
Lightning to Digital AV adapter could be pushing lower-res H.264 because it's not yet capable of streaming raw HDMI

Earlier today we linked to a post by Cabel Sasser on the Panic Blog about Apple's Lightning Digital AV adapter, which is basically their HDMI adapter for Lightning devices. Sasser had discovered that the adapter provided a less-than-1080p signal, broke it open, and found a tiny computer inside. Lots of speculation followed as to why that was, and what might be going on. A comment left on the Panic Blog by "Anonymous Coward", however, implies internal Apple knowledge of the matter, and purports to have the answer. In part, the comment says:

The reason why this adapter exists is because Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. [...] Airplay uses a bunch of hardware h264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.
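If the commenter's description is accurate, the data path is roughly: hardware H.264 encode on the device, packetized transfer over the Lightning serial bus, then reassembly and decode on the adapter's ARM SoC before HDMI output. Here's a minimal sketch of just the packetize/reassemble step; the 4-byte length-prefix framing is invented for illustration, since the actual Lightning wire format isn't public:

```python
def packetize(stream: bytes, max_payload: int = 1024) -> list[bytes]:
    """Split an encoded bitstream into packets sized for a serial bus,
    each prefixed with a 4-byte big-endian payload length.
    (Hypothetical framing -- the real Lightning protocol is undocumented.)"""
    packets = []
    for i in range(0, len(stream), max_payload):
        chunk = stream[i:i + max_payload]
        packets.append(len(chunk).to_bytes(4, "big") + chunk)
    return packets

def reassemble(packets: list[bytes]) -> bytes:
    """What the adapter-side SoC would do before handing the stream to
    its H.264 decoder: strip headers and splice payloads back together."""
    out = bytearray()
    for pkt in packets:
        length = int.from_bytes(pkt[:4], "big")
        out += pkt[4:4 + length]
    return bytes(out)

# A stand-in for a hardware-encoded H.264 bitstream.
encoded_frame = bytes(range(256)) * 10  # 2560 bytes
packets = packetize(encoded_frame)
assert reassemble(packets) == encoded_frame
```

The point of the sketch: nothing about a serial bus prevents carrying video, it just forces the video to travel as compressed, packetized data rather than as raw TMDS-style HDMI signaling, which is why an encoder on one end and a decoder on the other become necessary.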

If the comment is legitimate -- and there's no way to know at this point whether it is or not -- Apple is basically hanging all the electronics outside the device, rather than cluttering Lightning up with signals that may fall into disuse over time. There are trade-offs, to be sure, but given how the old 30-pin Dock connector abandoned FireWire over time, added HDMI, and jumbled in everything from line-out to serial, this could also be a better, more future-proof solution, if accurate.

And, since the on-device side is software-centric, an update could improve the HDMI out to true 1080p. Fingers crossed.

If anyone has any specific knowledge of how this kind of stuff works, give the full comment a read via the link below and then please weigh in and let me know how likely (or unlikely) it all sounds.

Source: Panic Blog comment

Rene Ritchie

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.

  • Huh?!
  • Why are we crossing our fingers for an update, why isn't the functionality there already? This is more B.S.!
  • Seems like a legitimate reason to me.
  • As much as I dislike Apple's bullheadedness when it comes to proprietary connectors, they seem to have a great little "adaptable" solution having an in-line computer. However, with Ethernet, 3D and 4K compatibility, I don't see HDMI going anywhere soon - and by that time we may say to hell with wires altogether.
  • Granting this explanation as true, this is the sort of approach some iOS fans sneer at Android about -- technically interesting, engineered flexibly for the future -- and completely unable to meet the existing consumer desire. I'll change my opinion if and when this approach bears other fruit, but, as it stands now, it is a visible step back without compensating payback. (Also, if that comment is accurate, a software update might still just send a higher resolution but still MPEG-encoded signal, instead of the raw stream, which would still be lesser quality than what we had before.)
  • It does output 1080p as I do video capture of games and apps with my iPhone 5 and the adapter.
  • but what this source is saying is that it is not native 1080p, that it is up-scaled from 1600x900.
  • If you read the Panic blog, specifically "Mystery #1", you will see they also tried the old dock connector, which did hit 1080p. Since the iPad mini does not have a dock connector, it means they used the adapter, but the point remains: the Lightning connector can easily transmit 1080p over HDMI, as it does so using the older adapter. It seems like you guys are trying real hard to justify Apple's bad engineering on this.
  • It's not bad engineering, it's bad design. They are doing the best they can with a limited design. The number of pins on the Lightning cable cannot support 1080p at all. I don't understand how this could have gone through when they were designing the dang cable.
  • Technically the Lightning cable is capable of pushing HDMI encoded as digital data and reassembling it on the other side, which is what it sounds like this cable is doing, albeit at a lower quality. This was my theory in the beginning when I heard the SoC business. If this is the case, maybe the cord isn't capable of reassembling the AirPlay-compressed video in a higher-end format because it (the cord) is not powerful enough, yet.
  • ...which brings us back to the original question: why design a system that requires a powerful, complex cable when simpler, cheaper existing systems do the job? Especially when those simpler, cheaper systems do the job *better* than your new more expensive design? Either Apple is incompetent, greedy for royalties, or has/had something else in mind that justifies this trade off and regression. The first is not true, and I like to think the second is not, as well. But I would like to see some more examples of the third.
  • There are real positives to the move to Lightning.
    1. All digital means that pins won't become useless after technology or component shifts; plus, the data channels are dynamic, so they can be reconfigured during transfer.
    2. This pulls hardware out of the device and into the cord, which does the work of converting the Lightning digital signal to whatever is on the other end. The iDevice then doesn't need hardware to interface with the peripheral and can depend solely on Lightning.
    3. An example of the above: you can upgrade Lightning peripherals and get hardware-only improvements without having to upgrade your device (like the Digital AV cord being replaced with a better one as soon as it's available, without worrying about the iDevice on the other end).
    I'm sure there are more reasons, but these are a few that come to mind or that I can logically think of from what we know.
  • All true, but where is the benefit to the customer for any of the three? #1 is in a theoretical future, and means nothing to a typical end user, little different than slapping NFC on an Android phone "just in case" when there are few practical avenues for use. #2 presumes that there will be massive shifts in signal requirements during the lifespan of a typical iPhone, *and* that people will be willing to buy (and keep track of) one specific cable per accessory, or at least per use case. #3 is another theoretical exercise, whose first concrete example will be in requiring customers to buy another expensive cable to "upgrade" a Lightning device to the same capabilities a dumber, cheaper device had beforehand. I'm not saying there are no benefits to attempting a future-proof design, but, if you are going to introduce a regression from what you currently have, you had better make sure you introduce something *NOW* that your consumers will benefit from.