Fortunately, we were able to get things moving quite fast on mobile. Uploading videos to Buffer on iOS worked great—but then one day Fabric let us know that we had likely missed an edge case.

We weren’t immediately sure of the cause. When I began to dig deeper, though, the culprit was clear: slow motion videos on iOS.

Here, I’d love to take you through the process of how I discovered a strategy for dealing with slow motion videos on iOS!

Walking up the call stack

As with any issue that I tackle from a bug report, my first step was to reproduce the problem. This part was a bit challenging, because even though I had the exact line where the exception occurred, I didn’t have any information about the video causing it.

By looking at the call stack, I could see that the heart of the issue was that AVComposition didn’t respond to a URL message. In fact, it doesn’t have any URL data directly associated with it.

The message invoked right before this happened was [PHImageManager requestAVAssetForVideo:options:resultHandler:].

That particular method returns an instance of AVAsset in the completion handler, and for quite some time the way we handled it had worked crash-free in production:

// This cast assumes the asset is always an AVURLAsset (the root of the crash)
NSURL *URL = [(AVURLAsset *)asset URL];
NSData *videoData = [NSData dataWithContentsOfURL:URL];
[self uploadSelectedVideo:video data:videoData];

From looking at this, our first issue was clear—we never expected an instance of AVComposition to be returned! It was also obvious from our crash reports that for a long time, this hadn’t been an issue.

Hmmm…

Understanding AVAsset

In my years of iOS development, I haven’t really had to venture too deep into AVFoundation. It’s a big world, and it can be a bit intimidating if you’re just starting out. Thankfully, AVAsset is fairly easy to grasp.

Essentially, AVAsset represents timed audiovisual media. The reason why it didn’t immediately seem familiar to me is that it acts as an abstract class. Since Objective-C has no notion of abstract classes the way C# or Java do, this can be a “gotcha,” as Clang will not show any warnings or compiler errors for code like this:

// Compiles cleanly, even though AVAsset is effectively abstract
AVAsset *anAsset = [AVAsset new];

After a trip to Apple’s documentation, I learned that AVAsset has two concrete subclasses that are used. You can probably guess both of them at this point, but they are:

AVURLAsset

AVComposition

AVURLAsset is fairly straightforward, but I knew I had more to learn about AVComposition. Apple’s documentation sums up AVURLAsset in a single line: it’s a concrete subclass of AVAsset for assets initialized from a local or remote URL.

Certainly easy enough, right? For comparison, AVComposition’s summary describes a class that “combines media data from multiple file-based sources in a custom temporal arrangement, in order to present or process media data from multiple sources together.”

Obviously, there is just a tad more going on with AVComposition. Still, I was eager to wrap my head around it!

Reproducing the bug

Now that I had a bit more knowledge of the code and classes involved with the crash, I was in a much better position to reproduce it. So, naturally, I started making several test posts that had cute videos of my son attached to them.

After still not having any luck, I reread the class summary for AVComposition. A particular line stuck out:

…combines media data from multiple file-based sources in a custom temporal arrangement, in order to present or process media data from multiple sources together

This was where my “AHA!” moment occurred. AVComposition represents media from multiple sources, and to my knowledge there was really only one kind of video on iOS that combines media from multiple sources: slow motion videos.

Sure enough, as soon as I tried to upload a slow motion video to a post, Buffer crashed!

The fix!

At this point, I was in a good position to code up a fix. The questions I had now were:

How do I retrieve a video from an AVComposition?

How do I ensure it’s below our 30-second limit?

For starters, I needed to tweak the logic in our completion handler. Now, I knew that we should expect either an instance of AVURLAsset or AVComposition.

This approach may not be bulletproof, but it has worked like a charm on every slow motion video I’ve tested.
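Here’s a minimal sketch of that branch, assuming phAsset, options, and video are the PHAsset, the PHVideoRequestOptions, and the selected video model already in scope; exportSlowMotionVideo: is a placeholder name for the export step covered later in this post:

// Requires the Photos and AVFoundation frameworks
[[PHImageManager defaultManager] requestAVAssetForVideo:phAsset
                                                options:options
                                          resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        // A regular video: the URL is right there on the asset
        NSURL *URL = [(AVURLAsset *)asset URL];
        NSData *videoData = [NSData dataWithContentsOfURL:URL];
        [self uploadSelectedVideo:video data:videoData];
    } else if ([asset isKindOfClass:[AVComposition class]]) {
        // A slow motion video: there is no URL, so it has to be exported first
        [self exportSlowMotionVideo:(AVComposition *)asset];
    }
}];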

The tracks property on AVComposition is an array of AVCompositionTrack instances. These are super helpful, as they contain information about all sorts of things, like the track identifier, media type, segments and more.

Getting the video length

Now that I knew when I’d be dealing with slow motion videos, my next task was discovering the video’s length. Apple has provided a lot of useful struct types just for this purpose!

In the tracks array, the last track comes back as the video while the first represents the audio. Using AVCompositionTrack, I was able to grab the video track and calculate the length using CMTimeMapping.
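As a rough sketch, assuming asset is the AVComposition from the completion handler (the 30-second cap is our own product constraint, not anything from AVFoundation):

// Per the layout above, the video track is the last one in the composition
AVComposition *composition = (AVComposition *)asset;
AVCompositionTrack *videoTrack = composition.tracks.lastObject;

// Each segment's timeMapping.target is its time range within the composition,
// which already reflects the slowed-down sections
CMTime duration = kCMTimeZero;
for (AVCompositionTrackSegment *segment in videoTrack.segments) {
    duration = CMTimeAdd(duration, segment.timeMapping.target.duration);
}

BOOL withinLimit = CMTimeGetSeconds(duration) <= 30.0;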

Exporting a slow motion video

The last piece of the puzzle was getting a hold of the NSURL for the video. To upload videos in our codebase, we need an NSData representation of the video. Once we have a URL, we can get that data using [NSData dataWithContentsOfURL:].

To achieve this, I basically went through a three-step process:

Create an output URL for the video

Configure an export session

Export the video and grab the URL!

After it was all said and done, the implementation of this step looked like this:
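(A sketch of those three steps, assuming composition is the AVComposition from the completion handler and video is the same selected-video object as before; the temporary-directory filename, the highest-quality preset, and the QuickTime file type are choices rather than requirements.)

// 1. Create an output URL for the video
NSString *filename = [NSString stringWithFormat:@"%@.mov", [[NSUUID UUID] UUIDString]];
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];

// 2. Configure an export session
AVAssetExportSession *session =
    [AVAssetExportSession exportSessionWithAsset:composition
                                      presetName:AVAssetExportPresetHighestQuality];
session.outputURL = outputURL;
session.outputFileType = AVFileTypeQuickTimeMovie;

// 3. Export the video and grab the URL
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSData *videoData = [NSData dataWithContentsOfURL:session.outputURL];
        [self uploadSelectedVideo:video data:videoData];
    }
}];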

Going back a little: you don’t have to worry about any of this if an AVURLAsset comes back from the completion handler. In that case, all that’s required is the code I shared above, which we had been using previously.

Wrapping up

This was certainly one of the more rewarding bugs I have come across and fixed so far at Buffer. It didn’t seem that there was much talk about it among the community, so diving in and relying solely on Apple’s docs was a good reminder that they are always a great place to start!

If you’ve run into a snag with slow motion videos on iOS, I certainly hope this helps you out! It was my first pass at it, so I’d love to know if you have any suggestions on how to improve it.

Are there some obvious ways to optimize it? I’d love to learn from you!

You’re a genius, I have exactly the same problem. Yesterday I spent hours wondering how I could get the URL for slow motion videos. I read lots of Apple documentation, but I gave up in the end. I just converted your code to Swift and it works like a charm.
Thank you.

You’re totally right here – I discovered those PHAssetMediaSubtype enums a little while ago, and they are super helpful! I use them for detecting slow-mo/live photos/timelapses now, thanks for the tip :)!
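For anyone curious, that check is just a bitmask test on PHAsset. A quick sketch, assuming asset is the PHAsset in question:

if (asset.mediaSubtypes & PHAssetMediaSubtypeVideoHighFrameRate) {
    // Slow motion video
} else if (asset.mediaSubtypes & PHAssetMediaSubtypePhotoLive) {
    // Live Photo
} else if (asset.mediaSubtypes & PHAssetMediaSubtypeVideoTimelapse) {
    // Time-lapse video
}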

martsen

Thanks. Very helpful.

Pradeep Bisht

Hi.

Thank you for the post. I have a similar issue with slomo videos.

I’m able to record slow motion videos at 120fps using AVFoundation. These videos play as expected on iPhone and in QuickTime Player on the Mac. But other players like videojs and my own player [AVPlayer] are not able to play them in slow motion. But if I create a slow motion video using the iPhone’s native camera app, then these same players are able to play it as expected.

I compared the two videos using mediainfo and two differences came up:

1. Format Profile: High@L5.1 vs High@L4.1. I used ffmpeg to change my video’s profile to High@L5.1 but it didn’t make any difference.

2. Format GOP: M=1, N=120 vs M=1, N=30. Again I used ffmpeg to set N=120 on my video but it didn’t make any difference.

I also read online that exporting using PHAsset may help, but my video files are created inside my Documents directory and will never go to the Photo Album/Camera Roll. I can probably try exporting to the Camera Roll using PHAsset and then moving it to my Documents directory as a hack (if it works at all). But I really need to know the underlying reason.

Any ideas what is causing this issue? Thanks

PS: The condition that you use to check for slo-mo videos fails for slow motion videos created using the native camera app. It has only one track.

nipun rajput

Hi Pradeep, did you manage to solve this?

Utsav Dusad

I have an issue with this asynchronous method. It gives the runtime exception below in the debug console. How is a Calendar related to a slow motion video?
[_NSCFCalendar components:fromDate:]: date cannot be nil.

Future exception.
A few of these errors are going to be reported with this complaint, then further violations will simply be ignored.

Andy

In my experience, the exported files for slo-mo videos can be non-deterministic, e.g. the file sizes differ and the encoded files are binary-different too (by a lot). I’m wondering, have you seen this case? Basically, run the same export code multiple times on a slo-mo video.

nipun rajput

Hi. Great article. One issue I have is, I am recording a 240 fps video via my app, and when I try to play it in an AVPlayer, it plays like a normal video. If I save it to the Library, the slow motion works. Do you have any idea about the discrepancies in AVPlayer with respect to high frame rates?