Almost every company in this day and age is trying to figure out how to become more agile. Now that infrastructure can be delivered as software, I am seeing a lot of innovation around the automation of environments. One of the most interesting innovations that I have seen is Docker. Docker is a Linux container engine that is simplifying the process of delivering environments.

Problem Statement

As technology evolves, infrastructure and software stacks are becoming larger and more complex. Envision a spreadsheet where the rows are the different hardware options and the columns are the different software options, and you have a complex compatibility matrix that needs to be solved. One way of simplifying the problem is to limit the rows and columns, which means taking hardware and software options off the table for developers. This is the equivalent of taking tools out of a construction worker's toolbox and expecting them not to lose any productivity. Another way to handle this is to write a whole bunch of Chef or Puppet code for each combination of hardware and software, creating a dependency where developers can only run on machines that have Chef or Puppet installed. A better way to handle this is by leveraging containers, which is what Docker does.

The Shipping Analogy

The shipping analogy is a great way to describe what Docker does. The goal of shipping companies is to get products from point A to point B as fast as possible. The problem is that one customer may want to ship coffee beans while another may want to ship chemicals. The shipping company would have to schedule two different trips because they could not put the chemicals near the coffee beans. So they invented containers. These containers could keep products cooled, isolate products from other products, and make it so the shipping company could fill the boat with containers without worrying about any dependencies between the various products. What the shipping companies did was create a separation of concerns. The customer worries about putting goods in containers and the shipping company worries about shipping containers from point A to point B. This made shipping companies more agile because they could ship products more efficiently.

Docker takes this same approach for infrastructure and software. With Docker, an entire environment can be delivered as a container and run on any infrastructure: public clouds, private clouds, virtual machines, bare-metal machines, laptops, and so on. The compatibility problem is solved by separating the concerns: the complexity of integrating the infrastructure and software stack is hidden inside the container and is not a concern for the users of that container.

How Does It Work?

Docker is made up of two layers: the container engine and a library. The library is a starter kit of prebuilt components such as the dotCloud PaaS software, MySQL, a base Ubuntu image, and a base Rails image. Developers can use any of the existing components or add others, such as Redis or MongoDB. Once all of the desired components are in place, a portable container is created that can be deployed anywhere and run consistently in any environment. Anything that can be packaged into a container can now be part of your continuous integration process.
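As a concrete sketch of what packaging an environment looks like, here is a minimal container description. The base image, package names, and port are illustrative assumptions on my part, not details from Docker's own library:

```dockerfile
# Illustrative only: start from a base Ubuntu image and layer Redis on top.
FROM ubuntu

# Install the Redis server package (package name assumed from Ubuntu's repos).
RUN apt-get update && apt-get install -y redis-server

# Expose Redis's default port so other containers or hosts can reach it.
EXPOSE 6379

# Run the Redis server when the container starts.
CMD ["redis-server"]
```

Once built with `docker build`, the resulting container should run the same way on a laptop, a virtual machine, or a bare-metal server, which is the separation of concerns described above.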

True Hybrid PaaS

DotCloud started out as a public PaaS. Docker can now be used to deliver the dotCloud PaaS on any infrastructure, so for companies looking for hybrid PaaS solutions, dotCloud can now fit the bill: the dotCloud PaaS can be delivered in a container to both private and public infrastructure. What is even more compelling is that traditional private PaaS solutions have to make assumptions about the underlying infrastructure and the application stack. In that model, there will never be a perfect PaaS, because every customer has a different compatibility matrix. With Docker, each customer can customize a container to meet their needs, eliminating the restrictions forced upon them by other PaaS solutions.

Strong Open Source Community

Docker is open source and has an extremely active community. I attended the first-ever Docker Demo Day back in February and was one of six people in attendance; a month later, 60 people showed up. In March, dotCloud officially announced that Docker was open source, and a huge community has sprung up worldwide. Community members are creating all kinds of containers, such as a Redis container and a Postgres container. Companies like eBay, Mozilla, and MailGun are already using Docker.

All in the Name of Agility

Why would I use Docker? Docker helps simplify the creation of environments so that the developer’s laptop has the exact same environment as the QA environment, the staging environment, the production private cloud environment, and the production public cloud environment. By having consistent and portable environments early in the SDLC, we increase quality, decrease development time, and get products to market faster. Gone are the days of wasting precious time mucking with environment configurations and creating bugs due to inconsistent environments.

Docker can be found on GitHub. Check it out and let us know what you think.


Mike is a VP/Principal Architect for Cloud Technology Partners. Mike has served in numerous technical roles such as CTO, Chief Architect, and VP positions with over 25 years of experience in software development and architecture. A pioneer in cloud computing, Mike led a team that built the world's first high speed transaction network in Amazon's public cloud and won the 2010 AWS Global Startup Challenge. An expert in cloud security, Mike is currently writing a book for Wiley Publishing called "Architecting the Cloud: Design Decisions for Cloud Computing Service Models (IaaS, PaaS, SaaS)" which is expected to be released in late 2013.