Animations are assets: using Core Animation archives on iOS

Nov 11, 2018

I should start by explaining what I mean by “animations are assets”. I don’t mean that every single animation in an app must be represented by an asset and can’t be done programmatically, since that would be dumb. What I do believe in is that complex animations, especially ones that are not very dependent on dynamic data that’s only known at runtime, should be assets.

I’ve always been in favor of letting assets be assets. If you have a button and that button has an icon, that icon should be an asset, either a set of PNGs or a PDF if you’d like to keep the vector data. Some people like to draw everything in code, even using apps such as PaintCode to generate it for them. I’m not a fan of that approach, and the same thing goes for animations.

One way to represent animations as assets is to encode them as video files and play them back using something like AVFoundation. That’s a valid approach depending on what you’re doing. If you don’t have to support a very large variety of screen shapes and dimensions, a simple video should work. If you need animated vector graphics that can be scaled and possibly transformed in other ways at runtime, you’re better off using another technique.

The one I’m going to propose today is not used a lot by third-party apps, but it is used a lot by Apple’s apps and system components. There’s this thing called a “Core Animation Archive”; you can find them if you look inside Apple’s apps and frameworks, usually as files with the extension .caar. These files are actually fairly straightforward: they consist of a Core Animation layer tree which is archived using NSKeyedArchiver, resulting in a “frozen” layer tree you can store on disk and then load again at runtime.

If you’re not familiar with NSKeyedArchiver, all you need to know is that it’s a very old API (being old doesn’t mean bad or deprecated, it’s just old) that takes objects from memory and encodes them in a way that can be stored on disk and then transformed back into objects later. Storyboards and XIBs work this way at runtime: they’re kinda like freeze-dried objects.

So all you need to know to create yourself a Core Animation Archive is how to use NSKeyedArchiver and which format the archive should be in. CAAR files usually have a dictionary as the root object; this dictionary has a key called rootLayer whose value is, you guessed it, the root layer of the archive that should be read by the application and drawn on screen.

Here’s a simple way to create a Core Animation Archive programmatically:
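A sketch along these lines, covering both writing and reading an archive (the initializer signature and error handling are my assumptions; the archive layout follows the rootLayer dictionary format described above, and NSKeyedUnarchiver.unarchiveObject(with:) is the simple, since-deprecated API of the era):

```swift
import UIKit

final class AnimationArchive {

    /// The root layer decoded from the archive, ready to be drawn on screen.
    let rootLayer: CALayer

    /// Loads a .caar file, first as a data asset from an asset catalog,
    /// then falling back to the bundle's resources folder.
    init?(named name: String, in bundle: Bundle = .main) {
        let data: Data

        if let asset = NSDataAsset(name: name, bundle: bundle) {
            data = asset.data
        } else if let url = bundle.url(forResource: name, withExtension: "caar"),
                  let fileData = try? Data(contentsOf: url) {
            data = fileData
        } else {
            return nil
        }

        // A CAAR archive's root object is a dictionary whose "rootLayer"
        // key holds the root of the frozen layer tree.
        guard let root = NSKeyedUnarchiver.unarchiveObject(with: data) as? [String: Any],
              let layer = root["rootLayer"] as? CALayer else {
            return nil
        }

        rootLayer = layer
    }

    /// Creates CAAR-compatible data from a layer tree.
    static func data(archiving rootLayer: CALayer) throws -> Data {
        return try NSKeyedArchiver.archivedData(withRootObject: ["rootLayer": rootLayer],
                                                requiringSecureCoding: false)
    }
}
```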

Notice the code will first try to find the asset in an asset catalog. That’s the way I prefer to ship assets with my apps, including animation assets. You can read more about asset catalogs and how to use data assets in this article. If it can’t find the asset in a catalog, it will try to load it from the bundle’s resources folder, assuming the caar extension.

Using Kite

The example above was used to illustrate how simple it is to create and read Core Animation Archives. It’s not always practical to create the animation in code; in fact, if we were always creating the animation in code, there would be no need to archive it to disk and load it later (we could just use the code directly).

But there’s a tool that makes creating animations with Core Animation much easier: Kite. I think of Kite as “Sketch, but for Core Animation”. My workflow usually goes like this: create animation assets in Sketch, import with Kite, animate and export to CAAR.

Let’s say there’s a flow in my app that the user must complete, and I want to reward them at the end with some nice haptic feedback and a custom, animated checkmark.

I start by creating a simple checkmark in Sketch, which is then imported into Kite. In Kite, I animate the strokeEnd property for both the checkmark and the circle around it, creating a nice little animation. Then I choose the Export > Core Animation Archive option, save the caar file and add it to my asset catalog.

I then create a simple UIView subclass that can be initialized with an instance of that AnimationArchive class I showed earlier:
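A sketch of that view (the layoutAnimationLayer shown here is a simplified aspect-fit stand-in for the version in the sample project):

```swift
import UIKit

final class AnimationView: UIView {

    let animationLayer: CALayer

    init(archive: AnimationArchive) {
        self.animationLayer = archive.rootLayer
        super.init(frame: .zero)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()

        // Disable implicit animations while we (re)configure the layer.
        CATransaction.begin()
        CATransaction.setDisableActions(true)

        if animationLayer.superlayer == nil {
            // The archive was exported from macOS, whose coordinate system
            // is flipped relative to iOS.
            animationLayer.isGeometryFlipped = true
            layer.addSublayer(animationLayer)
        }

        layoutAnimationLayer()

        CATransaction.commit()
    }

    private func layoutAnimationLayer() {
        // Aspect-fit: scale the animation layer so it fills the view's
        // bounds without distorting its contents.
        let layerSize = animationLayer.bounds.size
        guard layerSize.width > 0, layerSize.height > 0 else { return }

        let scale = min(bounds.width / layerSize.width,
                        bounds.height / layerSize.height)
        animationLayer.transform = CATransform3DMakeScale(scale, scale, 1)
        animationLayer.position = CGPoint(x: bounds.midX, y: bounds.midY)
    }
}
```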

The code above is fairly straightforward. The AnimationView class is initialized with an AnimationArchive, from which it gets its animationLayer. I use layoutSubviews as the signal to install the animation layer into the view’s layer tree and to lay it out according to the view’s bounds. The CATransaction calls are required to prevent Core Animation from implicitly animating the changes we make to the layer in those methods.

Setting isGeometryFlipped on the animation layer is necessary because we exported it from macOS, but are using it on iOS, which has a different coordinate system.

The method layoutAnimationLayer is not shown, but you can find it in the sample project. It does some math to transform the animation layer so it fits the view’s bounds, without distorting its contents.

To control the playback of the animation, I implemented stop() and play() methods:
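Something along these lines (a sketch; resetting timeOffset to zero is my reading of “rewinds the layer” below):

```swift
import UIKit

extension AnimationView {

    /// Rewinds the animation layer to the beginning of its timeline and
    /// freezes it there by zeroing its speed.
    func stop() {
        animationLayer.beginTime = 0
        animationLayer.timeOffset = 0
        animationLayer.speed = 0
    }

    /// Resumes playback from the beginning, starting right now.
    func play() {
        animationLayer.speed = 1
        animationLayer.beginTime = CACurrentMediaTime()
    }
}
```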

Those are very straightforward: stop() rewinds the layer so it goes back to the beginning of its timeline, then sets its speed to zero, preventing any animations from playing. The play() method sets the speed of the animation layer to 1 and its beginTime to CACurrentMediaTime() to make sure the animation starts playing from the beginning immediately after it’s called.

That’s it! There are other things you can implement by messing around with the timeOffset and speed properties of the animation layer, such as reversing the animation or driving it with a gesture recognizer, which is what I did for the onboarding shown in my previous article.
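For instance, scrubbing a paused animation from a gesture can be sketched like this (a free function for illustration; duration is assumed to be the archived animation’s total duration, which you’d know from Kite):

```swift
import QuartzCore

/// Drives a paused animation layer to a point in its timeline.
/// `progress` runs from 0 (start) to 1 (end).
func scrub(layer: CALayer, to progress: CGFloat, duration: CFTimeInterval) {
    layer.speed = 0      // pause the layer's local timeline
    layer.beginTime = 0
    layer.timeOffset = duration * CFTimeInterval(min(max(progress, 0), 1))
}
```

In a UIPanGestureRecognizer handler, progress could simply be the gesture’s horizontal translation divided by the view’s width.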

Background

So now that I’ve shown how it’s done, maybe I should explain why I prefer to treat animations as assets instead of coding them by hand or using third-party animation frameworks.

The use case that made me adopt Core Animation Archives wasn’t directly related to animations. When I was making the first version of my app ChibiStudio, I wanted a way to store vector data that could be manipulated at runtime (such as changing the fill color of a layer) for the items a user can pick to create their character.

I thought about using SVGs, but there’s no native way to turn an SVG into Core Animation layers on iOS, which means I would need to ship a large dependency such as SVGKit with my app to do it at runtime. Shipping SVGs with the app would also make it a lot larger and hurt performance, because those SVGs would have to be parsed and turned into Core Animation layers at runtime.

Then I learned about Core Animation Archives while doing some reverse engineering of iOS and decided to use them instead. The app has been up since 2016 and this technique has proven to work very well for its needs.

I’ve been asked before if this is using a private API, to which the answer is: definitely not. The CALayer (public) class adopts the NSCoding (public) protocol and we use NSKeyedArchiver and NSKeyedUnarchiver (both public) to save/read the archive. There’s no private API involved, we’re just using NSCoding for CALayer like we would for any other object such as NSString, NSNumber, etc. CALayer’s conformance to NSCoding (more specifically NSSecureCoding) is even documented.

So no, this is not using a private API and it’s not likely to break any time soon. I wish this technique was more widely known and documented by Apple because I think many apps could benefit from it.

Main advantages

These are, in my opinion, the main advantages of using this technique instead of code generation or a third-party animation framework:

You get to use Kite (a visual editor) but without having to use its generated code, which can be big and not necessarily pretty

It avoids adding another dependency to the app, which for me is always a win

Since the output is a Core Animation layer tree, it can be manipulated at runtime to change colors, transform layers or change the animation’s behavior

Being an asset means that it can be added to an asset catalog, directly to a bundle or even downloaded from a server

I hope this article has inspired you to try out this technique. As always, you can reach me on Twitter.