How to containerize a Node.js Application using StrongLoop Process Manager

The rise of Node.js has been, quite simply, phenomenal! Thanks to its JavaScript-everywhere architecture, the line between backend and frontend development is no longer as sharp as it used to be. From emerging startups to large enterprises, businesses everywhere are leveraging the lightweight, lightning-quick Node.js runtime to build high-performance applications across numerous use cases.

Equally significant is the rise of the cloud! Most of these applications are built for cloud deployment, where minimizing dependencies on the OS and the surrounding environment is key to truly leveraging the cloud's power. And that's where containerization comes to the fore, helping developers like you and me package these apps with all their dependencies and deploy them on cloud clusters!

One of the most commonly used tools for containerization is Docker. In a nutshell, Docker is a containerization platform that provides OS-level virtualization. A container packages the software you build along with all of its dependencies, so it runs the same way in any environment while sharing the host machine's kernel. In this blog post, we'll explore how to containerize a Node.js application using Docker, with StrongLoop Process Manager handling deployment of the app into the container.
If you're a developer, get yourself a host machine with Docker Engine and Docker Compose installed. And it goes without saying that you need a Node.js application to containerize a Node.js app!

Before we dive in, let's run through some Dockerfile best practices. Trust me, although these look simple enough, one step amiss and you'll have quite a bit of running around to do!

Avoid installing unnecessary packages

Run only one process per container

Minimize the number of layers

Sort multi-line arguments
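Two of these practices can be shown in a single Dockerfile fragment: chaining commands into one RUN instruction keeps the layer count down, and sorting a multi-line package list alphabetically makes it easier to review. The package names here are purely illustrative:

```dockerfile
# One RUN instruction produces one layer; chaining with && avoids extra
# layers, and cleaning the apt cache in the same layer keeps the image small.
RUN apt-get update && apt-get install -y \
    build-essential \
    curl \
    git \
 && rm -rf /var/lib/apt/lists/*
```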

With the best practices covered, here’s a quick look at some of the Dockerfile instructions we’ll use:

FROM: Sets the base image for the subsequent instructions

MAINTAINER: Sets the author field of the generated image (now deprecated in favor of a LABEL)

RUN: Executes a command in a new layer on top of the current image

CMD: Provides defaults for an executing container, e.g. starting a service

LABEL: Adds metadata to the image

EXPOSE: Informs Docker that the container listens on the specified network ports at run time

ENV: Sets an environment variable in the image

ADD & COPY: Both copy files into the container. COPY supports only basic copying of local files, while ADD adds features like local-only tar extraction and remote URL support

ENTRYPOINT: Configures a container to run as an executable

VOLUME: Creates a mount point, exposing a data storage area in the container to the host or to other containers

WORKDIR: Sets the working directory for subsequent instructions; instead of proliferating instructions like RUN cd /path && some-command, define the working directory once with WORKDIR
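Putting several of these instructions together, a minimal Dockerfile for a Node.js app might look like the following sketch. The base image tag, file names, and port are assumptions here; adjust them to your own application:

```dockerfile
FROM node:16-alpine                 # base image for all subsequent instructions
LABEL maintainer="you@example.com"  # metadata, instead of the deprecated MAINTAINER
WORKDIR /usr/src/app                # working directory for the instructions below
COPY package*.json ./               # copy manifests first so npm install is cached
RUN npm install --production        # runs in a new layer on top of the current image
COPY . .                            # copy the rest of the application source
ENV NODE_ENV=production             # environment variable baked into the image
EXPOSE 3000                         # document the port the app listens on
CMD ["node", "server.js"]           # default command: start the service
```

Note the order: copying package.json and running npm install before copying the rest of the source means Docker can reuse the cached dependency layer whenever only your application code changes.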

And finally, we’ve reached the point where we can install StrongLoop Process Manager and breeze through the rest of the process! So, here’s the step-by-step guide you’ve been waiting for:

Installing StrongLoop Process Manager in a Docker container

Download and run the StrongLoop Process Manager container

# curl -sSL https://strong-pm.io/docker.sh | sudo /bin/sh

Verify the Docker image

# docker images

Verify the Docker container and ports

# docker ps

Note: Port 8701 is the deployment port, while ports 3001–3003 are used by the manager for the applications it runs.
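If you want to double-check those port mappings, `docker port` prints them for a given container. The container name strong-pm below is an assumption based on what the install script typically creates; verify the actual name with `docker ps` first:

```sh
# Show the published ports of the Process Manager container.
# Replace "strong-pm" with the container name reported by `docker ps`.
docker port strong-pm
```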
Once StrongLoop Process Manager is up and running, it’s time to write the Dockerfile for the application; this goes a long way toward ensuring we don’t end up running multiple complex executables by hand.
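For reference, deploying an application to a running StrongLoop Process Manager is normally done with the slc client from the application's root directory. A typical sequence looks like the sketch below; docker-host is a placeholder for your Docker host's address, and this assumes the strongloop package is installed globally:

```sh
# Install the StrongLoop tooling once on your workstation.
npm install -g strongloop

# From the root of your Node.js application:
slc build                            # package the app for deployment
slc deploy http://docker-host:8701   # push it to the PM's deployment port
```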

With that, we’ve saved ourselves the hassle of building a separate Node.js environment just for that single, awesome app!

The next time you come across an application built on Node.js that you want to deploy without worrying about OS environments, you know what to do! And now that we already have a container up and running, you may choose to deploy any other app on the same host, irrespective of its OS requirements.

Doesn’t that save you a good bit of cost and all the trouble involved in building a new environment from scratch? Trust us, it does! If you have any questions, we’re here to answer. Just drop us a comment and we’ll be glad to reply in double-quick time.