Misconception 1: Docker is all-or-nothing

Some large companies view Docker as an all-or-nothing proposition, but that’s just not true. As we saw with ADP, an overnight refactor simply isn’t possible or practical.

Not only is it possible to dip your toes in the Docker pool before diving in headfirst, it’s encouraged. The beauty of Docker is that its containers are designed to be lightweight and opinionated. Even a simple LAMP application can be migrated piecemeal. Isolate the API first, then the web service, and then maybe the CRM.
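As a sketch of that piecemeal approach, a minimal `docker-compose.yml` might containerize only the API and its database while the web front end and CRM stay on the existing servers until their turn comes. (The service names and `./api` build directory here are illustrative, not from any real project.)

```yaml
# Hypothetical first step of a piecemeal LAMP migration: only the
# API and its database run in containers; the web tier and CRM
# remain on the current servers and talk to the API over the
# published port.
version: "3"
services:
  api:
    build: ./api          # assumed directory holding the API's Dockerfile
    ports:
      - "8080:8080"       # exposed so the un-migrated web tier can reach it
    environment:
      DB_HOST: db
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example
```

Once the API is stable in its container, the same pattern repeats for the next service, one `docker-compose.yml` entry at a time.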

Misconception 2: Docker means no more IT

Despite its popularity, “DevOps” is still a pretty nebulous term. A lot of tools and methodologies imply that developers have to take over the role of IT within an organization, but that simply isn’t the case.

It’s true that tools like Docker can better facilitate developer responsibility across the entire stack and bring a development mindset to service management. But there is absolutely no reason why the IT and development teams have to be merged into one hybrid department.

Misconception 3: Docker requires the cloud

While there are a number of amazing cloud providers out there that give great Docker support, there’s no rulebook that says you have to use Docker only in the cloud.

Due to the size and scope of their offerings, many enterprises manage a combination of bare-metal, virtualized, co-located, in-house, and cloud-based servers. Codeship is of course a big proponent of moving to the cloud, but Docker is platform-agnostic. The needs of your organization can come first. You can read more about how Docker helps with reliability, parity, and maintainability in your organization in Codeship’s “How Docker Streamlines Production Deployments” ebook.

Misconception 4: Docker is insecure

Security and privacy are major concerns with any service, especially in light of recent highly publicized denial-of-service attacks and security breaches. A common misconception about Docker is that it is inherently insecure. While this may have been true in the past, many of these concerns can be mitigated by taking the same due diligence one would take when securing bare-metal or virtualized servers.
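One concrete piece of that due diligence is refusing to run container processes as root, just as you would on a bare-metal host. A hedged sketch (the user name and binary path are illustrative):

```dockerfile
FROM alpine:3.6
# Create an unprivileged user instead of running as root, the same
# precaution you'd take when provisioning a bare-metal server.
RUN addgroup -S app && adduser -S -G app app
COPY ./server /usr/local/bin/server
# Drop root before the process starts.
USER app
ENTRYPOINT ["/usr/local/bin/server"]
```

At run time the container can be locked down further, for example with `docker run --read-only --cap-drop ALL`, which makes the filesystem immutable and removes all Linux capabilities the process does not need.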

Getting Started with Docker in the Enterprise

Despite the high-level benefits of containerization and rapid adoption within the tech industry, there is still a significant level of hesitancy to take advantage of containers in many enterprise-level organizations. This stance is understandable, as the cost of migrating to a new infrastructure is significantly higher for larger companies. But just because migration is difficult doesn’t mean there are no steps that can be taken to test the waters before committing to a drastic overhaul.

So, let’s take a look at how to overcome the (perceived or actual) challenges associated with adopting Docker.

Containerizing apps

If we take a page out of ADP’s book, getting started with Docker is as simple as isolating a service into its own container. This could be an API with few dependencies, an internal administrative interface, or even a simple content management system. Any feature that can be easily isolated from the rest of the application environment will do (although be wary of dependencies, as migrating even one virtual machine to a Docker container can present challenges).
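Containerizing a first candidate service can be as small as a single Dockerfile. The sketch below assumes a small PHP admin interface as the isolated service; the image tag and paths are stand-ins for whatever you pick first:

```dockerfile
# Hypothetical first candidate: a small internal admin interface
# with few dependencies, packaged without touching the rest of
# the application environment.
FROM php:7.0-apache
COPY ./admin/ /var/www/html/
# External dependencies (database host, credentials) are passed in
# at run time, so the container stays portable across environments.
ENV DB_HOST=db.internal.example
EXPOSE 80
```

Keeping run-time configuration out of the image is what makes the container easy to move between in-house and cloud hosts later.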

Hosting

Another big question mark when it comes to Docker is hosting. Many enterprise organizations do not take advantage of cloud-hosting services because a 1:1 migration of existing services could be significantly more expensive in the cloud. Docker helps mitigate this by isolating functionality into lightweight containers, which means that spinning up a new service using Docker is going to be significantly cheaper than spinning up an entire virtual server.

The bottom line: Migrating to Docker is much easier than you might think — especially if you run Docker in the cloud. This is true for large enterprises as well as small companies.

Conclusion

The beauty of Docker and the cloud is that they are utterly flexible. You can migrate your entire infrastructure over time or a small part of it overnight. You can host Docker in-house, in the cloud, or manage a hybrid of both. You can use it across your entire stack, or simply streamline your unit test pipeline.

In the end, how you use Docker is significantly less important than simply using it. Its best use case varies from organization to organization, but one thing about Docker is for sure: It makes life easier.


Join the Discussion


In my opinion, Docker containers don't add much value to small and mid-size production environments when compared with the cost of the skilled resources required. They definitely add a lot of value for large-scale production environments.

Christopher Langton

Docker allows me to hire into my small team and have the new recruit coding in the time it takes docker-compose up to finish, as opposed to installing and configuring databases today, an API tomorrow, and on another day setting up that piece of processing infrastructure you rarely change and that is notoriously difficult to run locally. With Docker, all of our production resources come up locally and the dev can jump right in. Day to day this is a powerful advantage for the team: any dev can swap between the moving parts and repos quickly and easily, knowing their environment mirrors the lead developer's on that part of the business.

All this for the cost of a few hours of setup and, in most cases, no maintenance (in terms of development; production likely requires the same level of attention it would without Docker).

Igor Cicimov

I think you are missing the main reasons why companies avoid using Docker in production:

1. Docker is all about moving forward and introducing new features without taking care of backwards compatibility, which makes upgrades a nightmare.
2. Because of 1, there are bugs in the OS kernels, especially around the Docker engine itself and the overlay FS drivers, that cause instability, crashes, and even data loss on the Docker host.

In short, if Docker wants to be an enterprise product, then it should start behaving like one; breaking changes are unacceptable in the enterprise world.

Christopher Langton

You assume that the OS in Docker needs to differ from what companies currently use. Why? Because "Docker"? That's preposterous. It doesn't. Your points apply just as much to the state of an organisation before "Docker" as after it: all they've done is isolate the same environment they previously used, unless they make a conscious decision to change it, which is essentially the same as making that same conscious decision without Docker.

It seems you have some misconceptions about Docker as well.

Igor Cicimov

Ha? Who said anything about the OS in Docker? I'm talking about the host that Docker is running on. Sorry, but a kernel upgrade of my OS does NOT break the VMs running on it. Good luck with that with Docker.

Yeah, it is all nice in theory, but it doesn't work in practice as easily as you'd like people to think. Unless you have an example of a large-scale, multi-host, multi-tier production environment (and I'm not talking about WordPress here) that has been working for the last two years without issues and has had regular HOST OS and kernel upgrades as well as Docker ones? They are all probably still stuck on Docker from two years ago, if they even exist.

Christopher Langton

@igorcicimov:disqus thanks for clearing up your meaning. Yes, that is a challenge, you're correct. Yet it is no more of a challenge than the alternatives; server and desktop environments all suffer degrees of what you describe, it cannot be denied. It is as fundamental to our profession as wood is to the carpenter. Arbitrary measurements like your two-year statement show that the opinion you express is purely pessimistic and likely unfounded on fact. The fact is that the problem exists in endeavours involving Docker or not, and it is how you prepare and execute that will define the difference in your enterprise's experience.

I've had my small team use Docker for 15 months. At enterprise level this is a small team that operates across all projects, Docker or otherwise. This project involves auto-scaled and load-balanced web apps, APIs consumed by customers and native apps, clustered data infrastructure, build servers, batch processing, and clustered Lambda data pipelines, all migrated to Docker containers. We have gone through several iterations: we recently started using ECS, and in other places Swarm mode via Compose v3, which is possible in the latest Docker release, and at the same time we moved to Alpine for all our containers. Yes, we experienced host issues (on both Amazon Linux and CentOS with SELinux) during the upgrade, but this didn't affect us or our customers any more than it did 2-5 years ago, when the same host issues used to arise. In fact, I believe it is simpler now: the host has far fewer services and networking quirks to concern itself with, since the Docker containers abstract that for us, so these are becoming more of a non-issue as time passes than if we had not started using Docker at all and still had to manage everything manually.

Igor Cicimov

Not pessimistic but realistic, I would say. The fact is that until recently (at least until 1.9 or 1.10, when Docker networking was introduced), interconnecting Docker hosts, containers, and services was so painful that it easily outweighed the benefits. Not to mention managing shared storage, clusters of database backends, high availability, autoscaling, and self-healing, which are all requirements for a production system. Then monitoring and log collection from the containers, etc., etc. Are you saying Docker was ready to handle all of this in the past?

Add on top of that the insecurity of future breaking changes that might arrive with any release, which does not help either. Do you want to be the one explaining to your bosses why your bank lost millions because of your decision to move all your services to a technology that is not mature enough? I don't …

I'm not against Docker; the question asked here was WHY is not EVERYONE using Docker? Well, here are some of my reasons. That said, with the new Docker 1.12 and 1.13, the future is bright. The past? Not really.

At the end of the day, it is all about the use case, isn't it? Think with your own head and decide based on the applications and services you have to deal with. Or you can blindly join the hype of using Docker everywhere, even for powering our coffee machines, and live with the consequences.

We agree on one thing, though: the ONLY way I would run Docker in production at the moment is on ECS or Google Cloud. Period.