Containers: A New Cure, or a New Management Challenge?

Why is agility important in IT? Without it, adjusting to changing requirements is very difficult: static, rigid designs limit your mobility, decrease efficiency and eventually impact your ability to deliver reliable service to your customers. We looked at some agility challenges around compute, storage and network fabrics. But the ultimate agility need is around applications, which deliver the useful service. How is application agility accomplished?

Applications: cutting the VM umbilical cord

This seems to be a no-brainer: virtual machines! You install an application into a VM once, and then you move it around between hosts, datastores and clouds. But what is the price? A VM is a legacy container which mimics the behavior of the physical box applications used to run on. It has all of the box's artifacts: memory, CPU, I/O, network, storage. It has all the necessary packages, patches, operating system and settings. Exactly like a physical box, right? But what was good for the physical world becomes an unnecessary burden in virtual and cloud environments. Why do we need to carry around all this weight: identical operating systems, packages, components? Install them, patch them, back them up? Just to move an application from one location to another where it can get more resources, you have to carry all that baggage and pay a huge price.

Moreover, you now need to size not only the physical infrastructure, which is understandable because it provides the real resource supply, but also the virtual machines, which have no resources of their own; they merely consume physical resources on behalf of the applications running inside.

Virtual machines were very useful in the early days of virtualization, when the biggest burden was migrating workloads off physical boxes, and the closer the virtual world was to the physical, the easier that migration was. But we are now moving into a new era, and carrying all that baggage is slowing us down. What is the cure? A new, rapidly moving wave offers a great way to capture the minimal things an application needs to run, package them into a container such as the popular Docker format, and then simply move containers around. Containers are not units of workload; they are just convenient packages that capture the necessary application components. They are also very efficient packages, using techniques such as union file systems, which layer storage translucently and share common blocks to minimize resource redundancy.
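As a sketch of what "capturing the minimal things" looks like, a Dockerfile declares only the application and its runtime, not a full guest OS; the file name and paths below are hypothetical, but the layered, shared-base-image mechanism is how Docker builds work:

```dockerfile
# Hypothetical example: package only the app and its runtime.
# The base layer is stored once on disk and shared by every image
# built from it -- this is the union-file-system block sharing at work.
FROM python:3.12-slim

# Only the application artifact itself travels with the image.
COPY app.py /app/app.py

CMD ["python", "/app/app.py"]
```

Compare this with a VM template, which would carry a complete operating system image, patches and settings just to run the same few lines of application code.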

But once containers are deployed, applications consume the resources of a single real OS, with no virtual machine layer in between. Yes, you can run containers inside a VM to simplify application deployment, but the truth is you don't have to; containers could be an entirely new way to share infrastructure resources.

It is a great invention, and it may change the world much as the original virtualization revolution did. But what challenges will it create? Actually, the challenges are very similar to the ones virtualization created. You still need to know where to run your applications and what resources they need to run reliably. You can now freely move containers between hosts and clouds, but where to? If an application needs more resources, what is better: moving it to another host, moving it to the cloud, or adding more memory or storage at its current location? You no longer have a VM to size up, but you still need to provide the needed resources. By cutting the VM umbilical cord you gave your applications more freedom, but you also created more choices and options, and new challenges in application management.
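"No VM to size up" does not mean sizing disappears; it moves to the container runtime. A minimal sketch with the Docker CLI (the image name is hypothetical; `--memory` and `--cpus` are standard `docker run` resource flags) shows how limits are set per container instead of per VM:

```shell
# Hypothetical app image: cap the container's share of host resources
# directly, rather than sizing a whole virtual machine around it.
docker run --memory=512m --cpus=1.5 myapp:latest

# Moving the app elsewhere means pulling the same image on another
# host or cloud -- no guest OS to migrate (registry URL is illustrative).
docker pull registry.example.com/myapp:latest
```

The trade-off the article describes lives in exactly these numbers: set the limits too low and the application starves; too high and you are back to wasted, over-provisioned capacity.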

As usual, agility comes at a price. It is very important to have a way to control the trade-off between performance, efficiency and agility regardless of where your applications run: inside a VM, in a Docker container, in a private or public cloud. Are you ready for the new container challenge?

Image Source: Neo as he cuts the cord, without realizing the management challenges he’s about to face, from The Matrix