I assume you already have basic knowledge of the main Docker commands (run, pull, etc.).

I have been using Docker 1.12.3 and Docker-Compose 1.8.1 (be sure your docker-compose version supports version 2 of the docker-compose file format).
We can directly pull the images for Elasticsearch and Kibana (I am using the latest version, 5.0.1):
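As a sketch, the pull commands look like this (assuming the 5.0.1 tags are available on Docker Hub):

```shell
# Pull the Elasticsearch and Kibana images at the version used in this post
docker pull elasticsearch:5.0.1
docker pull kibana:5.0.1
```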

I defined a docker-compose.yml file to ship two containers based on the previously pulled images, exposing the default ports: 9200 for Elasticsearch and 5601 for Kibana. The environment variable defined within the Kibana service represents the Elasticsearch URL (within Docker you just need to specify the service name; it is automatically resolved to an IP address).

```yaml
version: "2"
services:
  elasticsearch:
    image: elasticsearch
    ports:
      - "9200:9200"
  kibana:
    image: kibana
    ports:
      - "5601:5601"
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
```

With docker-compose file version 2 you do not have to specify the links between the services: they are automatically placed on the same network (unless you specify a custom one).
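For instance, a user-defined network could be declared like this (a sketch; the network name elk is illustrative):

```yaml
version: "2"
services:
  elasticsearch:
    image: elasticsearch
    networks:
      - elk
  kibana:
    image: kibana
    networks:
      - elk
# Declare the custom bridge network shared by both services
networks:
  elk:
    driver: bridge
```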

The latest version of Elasticsearch is stricter about the bootstrap checks, so be sure to correctly set vm.max_map_count and the number of file descriptors (Wiki: file descriptor).
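On the Docker host, those limits can be checked and raised roughly like this (262144 is the minimum mmap count that Elasticsearch 5.x expects):

```shell
# Raise the mmap count limit required by the Elasticsearch bootstrap checks
sudo sysctl -w vm.max_map_count=262144
# Show the current open-file-descriptor limit for this shell
ulimit -n
```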

In this post we are going to see how to build a web application using Django and Docker Compose. Django is a high-level Python web framework that encourages rapid development and clean, pragmatic design. Docker-Compose is a tool that allows you to define and run multi-container Docker applications (see my previous post for more details and to see how to install it). I ran this example on an Ubuntu 14.04 machine with Docker 1.12.1 and Docker-Compose 1.8.0.

Create a directory to hold the files needed for this example.

```shell
mkdir compose_django
```

Create a Dockerfile (named exactly Dockerfile, no extension needed):

```dockerfile
FROM python:2.7
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD requirements.txt /code/
RUN pip install -r requirements.txt
ADD . /code/
```

This Dockerfile defines a basic image based on Python 2.7: it creates a folder named /code, adds the requirements.txt file to the folder, and runs pip install on it.

We now have to create the requirements.txt file, with the following content:

```
Django
psycopg2
```

These requirements are needed to run the Django web framework and to connect to a PostgreSQL database.

Now create a docker-compose.yml file that will contain our service definitions: a Django web server and a PostgreSQL database.

```yaml
version: '2'
services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
```

The command section defines the instruction Django uses to run the development server (see this link for the full documentation about manage.py).

To build the Django web service container we can use the docker-compose run command.

```shell
docker-compose run web django-admin.py startproject composeexample .
```

After we run this command we can see that the Django project (the composeexample folder) has been created.

The files created by django-admin are owned by the root user. Use this command to change their ownership:

```shell
sudo chown -R $USER:$USER .
```

We now need to edit the Django configuration file (composeexample/settings.py) to set the database connection string. Edit the DATABASES section and add this new configuration (the parameters of the connection have been defined in the docker-compose.yml file).

```python
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
```

Check this link to see the full Django database configuration documentation.

To start the two containers we can use the docker-compose up command:

```shell
docker-compose up -d
```

Both services (shipped in two different containers) are now running.

To list all the containers (with extra information like ID, status, names, etc.) you can use the docker ps command:

```shell
docker ps
```

The Django web server is now running on port 8000.

In case we would like to connect to the running container to perform some Django operation (like creating a superuser) we can use the docker exec command.

```shell
docker exec -it 9c91da7fd7a7 sh
```

The ID of the container is the one shown by the docker ps command.

We can now create a superuser to log in to our Django web application (http://ip_address:8000/admin).
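A sketch of that step (the container ID is illustrative; substitute the one reported by docker ps):

```shell
# Run Django's interactive superuser creation inside the web container
docker exec -it 9c91da7fd7a7 python manage.py createsuperuser
```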

We combined Docker and Docker-Compose to create two containers for our Django web application: the first contains the Django web server and the second contains the PostgreSQL database. These containers can be hosted on the same machine in a development environment but can be split up when the application is deployed to a production environment.

What is Docker?

Recently I had the opportunity to use Docker for a small project and I realized how cool it is!

But what is Docker? “Docker is the world’s leading software containerization platform” (Docker official site).

It allows you to pack your application into a standardized unit for software development. Docker describes itself as:

Lightweight: Containers running on a single machine share the same operating system kernel; they start instantly and use less RAM

Open: Docker containers are based on open standards, enabling containers to run on all major Linux distributions and on Microsoft Windows

Secure by default: Containers isolate applications from one another and the underlying infrastructure, while providing an added layer of protection for the application

Is the Docker approach similar to the virtual machine approach? Containers and virtual machines have similar resource isolation and allocation benefits, but a different architectural approach allows containers to be more portable and efficient.

Virtual machine architecture: an entire guest operating system is necessary for each VM.

Docker container architecture: the kernel is shared between the containers.

So Docker allows you to host different applications (shipped in containers) that share the same operating system kernel while keeping the applications isolated.
Docker comes with a lot of tools, like Docker Engine, Machine, Kitematic and Docker Compose.

Compose is a tool for defining and running multi-container Docker applications. You can create a file (called docker-compose.yml) where you define the services that compose your application.

Using this approach you can build an application (composed of different services) and keep the services in separate containers (which can be deployed anywhere: on the same host or on different hosts).

Install Docker

Now we are going to see how to install Docker. I am using Ubuntu 14.04.4 LTS (codename: trusty). If you want to use a different version of Ubuntu, your kernel must be at least 3.10.
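You can verify the running kernel version like this:

```shell
# Print the running kernel release; it must be 3.10 or newer for Docker
uname -r
```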

Now add the Docker repository. Edit the file /etc/apt/sources.list.d/docker.list (create it if it does not exist) and add the following line:

```
deb https://apt.dockerproject.org/repo ubuntu-trusty main
```

Now fetch the added repository and purge any old package that may exist:

```shell
sudo apt-get update
sudo apt-get purge lxc-docker
apt-cache policy docker-engine
```

And finally install the docker-engine:

```shell
sudo apt-get update
sudo apt-get install docker-engine
sudo service docker start
```

To check if Docker is correctly installed, you can check your Docker version with:

```shell
docker version
```

You should see the version of the Docker client and server.

If you see the error “Docker command can’t connect to docker daemon” while connecting to the Docker server, you need to add your current user to the docker group as follows (assuming you are using an account called ubuntu; log out and back in for the change to take effect):

```shell
sudo usermod -aG docker ubuntu
```

To run a test image in a container you can use:

```shell
sudo docker run hello-world
```

Now you can download, create and run your own Docker images (to see the full list of Docker commands and images, I suggest looking at the official Docker documentation).

Install Docker-Compose

Now that we have installed Docker, we can install Docker-Compose. On the official site you can find all the different installation methods, but I suggest using pip (the Python Package Index tool) to install it.
If you do not have pip installed, you can get it with:

```shell
sudo apt-get install python-pip
```

Now you can easily install docker compose:

```shell
sudo pip install docker-compose
```

To check the Docker-Compose installation run:

```shell
docker-compose --version
```

Now that we have Docker-Compose, we can define a new configuration file to deploy the services of our application in different containers and compose (link) them. Here is an example of a docker-compose.yml file:

```yaml
version: '2'
services:
  db:
    image: mysql:5.7
    volumes:
      - "./.data/db:/var/lib/mysql"
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: wordpress
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: wordpress
  wordpress:
    depends_on:
      - db
    image: wordpress:latest
    links:
      - db
    ports:
      - "8000:80"
    restart: always
    environment:
      WORDPRESS_DB_HOST: db:3306
      WORDPRESS_DB_PASSWORD: wordpress
```

We defined a db service (with the mysql image) and a wordpress service (which depends on the db service). These two containers can be deployed on the same host or on two different hosts. As you know, when you start a new application project it is common to build everything on the same machine, and it is often impossible to split the services later; this approach will help us split up the application when it grows or the demand for its different parts changes.

So we moved from a single monolithic application to a multi-container application (that can be deployed anywhere!). In the next post I will describe how to build a Python Django web application using Docker-Compose (web server and PostgreSQL services split into two containers).
