Docker is a powerful tool for creating distributed application and service environments, which can make it very useful for dedicated server hosting clients.

Why Use Docker On A Dedicated Server?

Unless you’ve been living under a rock for the last few months, you’ll already have at least an inkling of what Docker is and what it can do for you. The container management technology has taken off in a big way in 2014. As a lightweight alternative to virtualization and an easy way to create portable development and production environments, Docker is hard to beat. But in spite of its rapidly growing popularity, I frequently speak to people who have a vague idea of what Docker is for but can’t really see why it would be useful to them. In this article, I want to discuss why dedicated and cloud server hosting clients might want to take a closer look at Docker.

What is Docker?

Docker is a system for creating and managing Linux Containers. You can think of a Linux Container as a self-contained environment that holds everything necessary to run an application. So far, so similar to a traditional virtual machine, but Linux Containers differ in that they don’t use a hypervisor and don’t include their own operating system kernel.

Containers use and are managed by the host kernel. That makes containers very lightweight: they can be started in fractions of a second. All that’s needed to run Docker containers on Linux is the Docker daemon and a Dockerfile: a configuration file that tells Docker how to build an image, which is then run as a container.
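As an illustration, a minimal Dockerfile for a simple web server might look something like this (the base image, package, and paths are assumptions for the sake of the example, not a prescription):

```dockerfile
# Start from an official base image pulled from the Docker registry
FROM ubuntu:14.04

# Install the software the service needs inside the image
RUN apt-get update && apt-get install -y nginx

# Copy the site content from the build directory into the image
COPY ./site /usr/share/nginx/html

# Document the port the service listens on
EXPOSE 80

# The command the container runs when it starts
CMD ["nginx", "-g", "daemon off;"]
```

Running `docker build -t mysite .` in the directory containing this Dockerfile produces an image, and `docker run -d -p 80:80 mysite` starts it as a container.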

What’s the point of Docker?

The reason Docker has so many people excited is that it allows for the creation of a strictly isolated environment that can contain everything that’s needed to develop, test, and deploy an application. The same container can run on a production server and the developer’s MacBook (with a lightweight virtual machine). Development environments can be distributed across multiple systems without anyone having to worry about having the right software versions. The containers can be pushed to dedicated servers, cloud servers, virtual servers, or any other instance of Linux.
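That workflow can be sketched in a handful of commands (the image and registry names here are hypothetical):

```shell
# On the developer's machine: build an image from the project's Dockerfile
docker build -t example/myapp .

# Push the image to a registry, such as the public Docker Hub
docker push example/myapp

# On the dedicated or cloud server: pull and run the identical image
docker pull example/myapp
docker run -d --name myapp example/myapp
```

The image that runs on the server is byte-for-byte the one that was built and tested on the developer’s machine.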

What does that mean for you?

Docker containers allow server owners to isolate services and applications. On a dedicated server it can be tricky to run different services and applications with divergent requirements, because managing dependencies in a mixed environment is complex. With Docker, the problem largely disappears: each service runs in its own container with everything it needs. Isolation also offers security benefits: by default, a container cannot see the processes and files of other containers or of the host operating system.
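For instance, two services that would conflict if installed directly on the host can run side by side, each in its own container (the image names and tags below are illustrative assumptions):

```shell
# An application that depends on an older database version
docker run -d --name db-old -p 5432:5432 postgres:9.3

# A newer service needing a different version, on the same host,
# published on a different host port to avoid a clash
docker run -d --name db-new -p 5433:5432 postgres:9.4
```

Neither installation touches the host system or the other container; removing one is as simple as `docker rm -f db-old`.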

Furthermore, because you develop in the same containerized environment as you deploy, if the application runs in the dev environment, it will run on the server. No more: “But it’s running fine for me; the problem must be on your end.”

Is Docker for everyone?

Docker has many uses, but it’s not a panacea for all system administration problems. As Matt Jaynes has explained, in some cases Docker can lead to increased complexity. It’s a relatively immature solution and still has some way to go before it’s universally useful.

That said, it’s well worth looking at if you run services and applications from a dedicated or virtual server.