Automating App Deployments with User-Data

Automating common development tasks such as building, testing, and deploying your application has many benefits, including increased repeatability and consistency by removing the potential for interference from "the human element." Deploying your applications by running a single command from the command line means that your team can spend their time working on the app rather than on the care and feeding of installations.

There are some very convenient use cases for creating new Droplets and automatically running applications on them. Your team may want to deploy a feature branch containing new customer- or user-facing code to get feedback, or stand up a demo instance of your product for a customer at the touch of a button. This blog post will cover how you can accomplish these and other use cases with the DigitalOcean API.

Mitchell Anicas has written about using Metadata via the API in the DigitalOcean Community. With that as a starting point, we can create workflows that automatically deploy applications to Droplets. With the DigitalOcean API and cloud-init, accessed via User-Data, we can:

- Get an application or source code onto a Droplet

- Run an application in a Docker container so that it "just works" with a single API call

- Set up configuration management tools automatically

Getting your application code to the Droplet

Before we can run our application, its source code or binary needs to be on a Droplet. As Mitchell described, spinning up a new Droplet via the API is very simple, so our only modification will be setting up an application stored in public version control, specifically on GitHub. If your project happens to live on another hosted version-control service such as Bitbucket, the appropriate changes should be straightforward.

Suppose I have a public GitHub repository housing a Rails application that I would like to deploy to a Droplet via the API. Using the User-Data functionality, I can simply install Git and clone the repository in the runcmd block of the cloud-config:
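The original snippet is not reproduced here, so the following is a minimal sketch of what that user-data might look like; the repository URL and target directory are illustrative placeholders:

```yaml
#cloud-config
runcmd:
  # Install Git, then pull the application source onto the Droplet
  - apt-get -y install git
  - git clone https://github.com/example/my-rails-app.git /opt/my-rails-app
```

Everything under runcmd is executed once, on first boot, so the source code is in place by the time the Droplet comes online.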

Getting your application running!

Simply cloning your code onto a fresh Droplet is nice, but it is not nearly as useful as having your application "just work" on that Droplet. We've written fairly extensively about Docker before, including a Getting Started guide to using it on DigitalOcean. Not every image at DigitalOcean supports User-Data, but conveniently our Docker Application Image does, allowing you to deploy a running instance of your application on it.

We are going to work through the process of getting an example Rails 4 application up and running on a new Droplet using User-Data and Docker.

I have forked the Sample Rails 4 application from railstutorial to my personal GitHub account in the sample_app_rails_4 repository. In my fork, I included a Dockerfile that configures a Docker container with all of the application's dependencies, sets up its database, and finally runs the application.
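The Dockerfile in the fork is not shown here; a minimal sketch for a Rails 4 application of that era might look like the following (the base image, Ruby version, and setup task are assumptions, not the exact contents of the fork):

```dockerfile
# Hypothetical Dockerfile sketch for the sample Rails 4 app
FROM ruby:2.0
WORKDIR /app

# Install gem dependencies first so this layer is cached between builds
COPY Gemfile Gemfile.lock ./
RUN bundle install

# Copy the application source and prepare its (SQLite) database
COPY . .
RUN bundle exec rake db:setup

# Serve the app on port 3000, listening on all interfaces
EXPOSE 3000
CMD ["bundle", "exec", "rails", "server", "-b", "0.0.0.0"]
```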

With that file in the repository, modifying our User-Data to run the application is very simple. First, change the image from "ubuntu-14-04-x64" to an image that ships with Docker (to find those, use our /v2/images API endpoint with the application image filter). In this case we will use Docker 1.4.1 on Ubuntu 14.04, whose slug is docker. By changing the user_data field in our JSON body, we can instruct Docker to build and run our container while mapping ports 80 and 443 to the application's HTTP(S) server port (in this case, 3000). Walking through the commands: we first install Git and use it to clone our sample application. We then instruct Docker to build a container from the application, run it, and bind ports 80 and 443 to the Rails server running on port 3000.
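The exact request body is not reproduced above, so here is a hedged sketch of it (the Droplet name, region, size, and repository URL are illustrative). Note that the user_data value uses \n escapes, since JSON strings cannot contain literal newlines:

```shell
#!/bin/sh
# Hypothetical JSON body for POST /v2/droplets. Send it with, e.g.:
#   curl -X POST "https://api.digitalocean.com/v2/droplets" \
#     -H "Authorization: Bearer $DIGITALOCEAN_TOKEN" \
#     -H "Content-Type: application/json" -d "$PAYLOAD"
PAYLOAD='{
  "name": "rails-demo",
  "region": "nyc3",
  "size": "512mb",
  "image": "docker",
  "user_data": "#cloud-config\nruncmd:\n  - apt-get -y install git\n  - git clone https://github.com/example/sample_app_rails_4.git /opt/sample_app\n  - docker build -t sample_app /opt/sample_app\n  - docker run -d -p 80:3000 -p 443:3000 sample_app"
}'

# Print the body for inspection before sending it
printf '%s\n' "$PAYLOAD"
```

Because the image slug is docker, the Droplet boots with the Docker daemon already running, and cloud-init's runcmd handles the clone, build, and run steps.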

For the sake of simplicity and brevity in this post, the deployed application uses SQLite3 in production. If you have a more realistic infrastructure including relational databases, key-value stores, full-text search engines, etc., you will need to build separate Docker containers for each and link them together. The dockerfile project on GitHub has Dockerfiles for many of your favorite projects to help you on your way.
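As a rough sketch of what linking might look like, the runcmd block could start a separate database container first and link the application container to it (the container names and the postgres image are illustrative, not part of the sample app):

```yaml
#cloud-config
runcmd:
  # Start a standalone database container, then link the app to it
  - docker run -d --name db postgres
  - docker build -t sample_app /opt/sample_app
  - docker run -d -p 80:3000 -p 443:3000 --link db:db sample_app
```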

Building new Droplets using Configuration Management

For larger and more complicated infrastructure, many teams lean on sophisticated configuration management tools to automate everything and refocus their attention on problems more challenging than installing dependencies. The DigitalOcean community has covered several options in its tutorials: Puppet, Ansible, and Chef. Many of these tools already include modules for interacting with DigitalOcean, such as Knife's DigitalOcean plugin and Ansible's DigitalOcean module, but at the time of this writing they do not include User-Data support. Much of the functionality from our previous User-Data example can be replicated in a configuration management system such as Puppet, Chef, or Ansible. As the complexity of your configuration grows, User-Data alone can become unwieldy; configuration management tools allow you to break your configuration into more manageable units.

We can use User-Data to install and configure our configuration management tools, which can, in turn, configure the application. Using the previous User-Data techniques, we can install Puppet, fetch our manifests, and configure the Droplet. Here we fetch Puppet Labs' release package and install it (per their instructions). We then update Apt and install both puppet and git. With those packages installed, we clone our Puppet manifests and apply them. After that, we are free to do whatever we like with our newly configured Droplet.
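The steps above can be sketched as the following cloud-config; the Puppet Labs release package shown is the one for Ubuntu 14.04 ("trusty"), and the manifests repository and entry-point manifest are illustrative placeholders:

```yaml
#cloud-config
runcmd:
  # Add the Puppet Labs Apt repository (Ubuntu 14.04 release package)
  - wget https://apt.puppetlabs.com/puppetlabs-release-trusty.deb
  - dpkg -i puppetlabs-release-trusty.deb
  # Update Apt and install both puppet and git
  - apt-get update
  - apt-get -y install puppet git
  # Clone our Puppet manifests and apply them
  - git clone https://github.com/example/my-manifests.git /opt/manifests
  - puppet apply /opt/manifests/site.pp
```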

Conclusion

The User-Data support in DigitalOcean's API allows you and your team to automatically run your code on Droplets. By automating the deployment process, your team will be able to spin up new instances of your application on Droplets as quickly as running any other command. From there, testing new features or letting prospective clients use their own demo instance is one command away!

Have any questions about automating your infrastructure using User-Data? Found any exciting use cases? Let us know in the comment section!