Speed up deployments using gulp-changed

A full post on our deployment method is in the works, but in the meantime I wanted to share a little tidbit of knowledge that just saved us a bunch of time on our deployments:

Growella uses DeployBot to handle deployments, but we’re also using Gulp to run webpack, Sass, and other compilation tasks, then building a dist/ directory that acts as the root of our deployment. This ensures we’re only deploying the files we need on production, while leaving out development assets.

We’re also building Growella as a twelve-factor application, so we’re using Composer and WPackagist to pull in our dependencies, which is super easy to do with DeployBot. We’re even able to cache the composer install, ensuring it’s only run when something has changed in composer.json (for instance, installing/upgrading a plugin).
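For anyone unfamiliar with the pattern, pulling WordPress plugins in through Composer and WPackagist looks roughly like this — a minimal sketch of a composer.json, where the plugin name and installer paths are illustrative, not Growella's actual manifest:

```json
{
  "repositories": [
    {
      "type": "composer",
      "url": "https://wpackagist.org"
    }
  ],
  "require": {
    "wpackagist-plugin/akismet": "^4.0"
  },
  "extra": {
    "installer-paths": {
      "wp-content/plugins/{$name}/": ["type:wordpress-plugin"]
    }
  }
}
```

With a manifest like this, DeployBot can run `composer install` on the build server and cache the result until the manifest changes.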

On paper, this looks great: we’re only running Composer when we need to, and we’re able to prepare a nice, packaged version of the site for delivery to the target server, taking advantage of DeployBot’s atomic deployment pattern.

Here’s the rub: it was slow. Unacceptably slow, taking 10 to 15 minutes to deploy a WordPress site.

Speeding up deployments

As it turns out, DeployBot uses modification timestamps to determine if a file has changed locally compared to what’s on the target server. In the first iteration of our gulp copy task, we were completely wiping out dist/ with each build, resulting in the entire directory appearing to have been changed (at least, as far as DeployBot was concerned). This meant that every deploy was a completely new copy of the codebase, and all the nice caching DeployBot puts in place was being completely bypassed.

Once we stopped wiping out the entire dist/ directory on each build, we expected things to improve, but our deployments were still taking just as long. As it turns out, Gulp doesn’t check whether a file has been modified before overwriting it, so every file was still being touched (and getting a fresh modification timestamp) on every build.

gulp-changed works by filtering the files passed into it and removing anything that hasn’t been changed in the target directory (in our case, dist/). This prevents our copy task from overwriting everything and limits the changes to those files that have actually been touched.
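Dropping gulp-changed into the copy task is a one-line change. Here’s a minimal sketch — the glob patterns and task name are illustrative, not Growella’s actual gulpfile:

```javascript
// gulpfile.js — a sketch of a copy task filtered through gulp-changed.
var gulp = require('gulp');
var changed = require('gulp-changed');

var DEST = 'dist/';

gulp.task('copy', function () {
  return gulp.src(['**/*', '!node_modules/**'])
    // Drop any file whose copy in dist/ is already up to date.
    // By default, gulp-changed compares modification timestamps.
    .pipe(changed(DEST))
    .pipe(gulp.dest(DEST));
});
```

Only the files that survive the `changed(DEST)` filter reach `gulp.dest()`, so unchanged files in dist/ keep their original timestamps and DeployBot skips them.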

Configuring gulp-changed to use a file hash instead of timestamps

There’s another step we can take to make our deployments faster, but much to my chagrin, it doesn’t work on DeployBot. If you’re on a deployment service that allows persistent data, read on!

By default, gulp-changed looks at file timestamps, which is a very inexpensive operation that lets it quickly determine which files have changed. However, this approach can have unforeseen impacts on your deployment builds: remember when I said that DeployBot caches our composer install results until composer.json changes?

Let’s say you’re using Composer to manage WordPress dependencies, and a new version of one of your plugins is released. When you run composer update locally to update that dependency, your composer.json (or at least your composer.lock) file will be updated.

When you push this change, the build server will recognize that it has to throw away its cache of Composer dependencies and pull anew — resulting in every file getting a new modification timestamp. Updating a single plugin will make the service re-deploy every dependency!

To get around this, we can take advantage of the hasChanged option for gulp-changed, and have it create SHA1 digests (a “fingerprint” of each file’s contents) to use for comparison rather than modification times:
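Here’s the same copy task, switched to content-based comparison. This sketch assumes a gulp-changed release that exposes `changed.compareSha1Digest` (later versions renamed this comparator `compareContents`); as before, the globs are illustrative:

```javascript
// gulpfile.js — copy task comparing SHA1 digests instead of mtimes.
var gulp = require('gulp');
var changed = require('gulp-changed');

var DEST = 'dist/';

gulp.task('copy', function () {
  return gulp.src(['**/*', '!node_modules/**'])
    // Hash each file's contents and compare against the copy in dist/,
    // so a fresh checkout with brand-new timestamps but identical
    // contents doesn't count as "changed".
    .pipe(changed(DEST, { hasChanged: changed.compareSha1Digest }))
    .pipe(gulp.dest(DEST));
});
```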

While the SHA1 fingerprints take a bit longer to generate (thus making our copy task take a few seconds longer), the time you save by not re-deploying WordPress and every installed plugin makes it well worth it!

Wondering why this doesn’t work on DeployBot?

When DeployBot clears its cached commands, it completely rebuilds the deployment container, meaning we don’t have anything for gulp-changed to compare against!

It could be worth investigating mounting a separate volume, compressing the dist/ directory and offloading it to a remote server, or otherwise saving the SHA1 digests for built files, but that’s a thread for another day.