
Setting up a new local environment can be challenging and really time-consuming if you're doing it from scratch. While this might not be a big deal when working as a single developer on a project, in a team-based scenario it's important to share the same infrastructure configuration. That's why we highly recommend using a tool like Docker to simplify the process.

Last summer, Jesus Manuel Olivas (Project lead) and I started working on a new project, and we had to discuss which setup we should use for the local environments. Since the project was already set up to use Lightning and BLT, we both agreed to use DrupalVM with Vagrant. Everything seemed to work great apart from some permissions conflicts, which we could easily resolve since the project only had two developers at the time.

DrupalVM is a tool for creating Drupal development environments quickly and easily. It comes with the option to use Docker instead of, or in addition to, Vagrant, but it is mostly known for and used with Vagrant.

Why We Switched to Docker

After a few weeks of development, more developers came on board and we started running into issues. Vagrant was not working as expected on some machines, and we were spending way too much time researching and fixing provisioning issues. Jesus and I had to go back to the drawing board to come up with a comprehensive solution, and we decided to switch from Vagrant to Docker.

Trying Docker

Docker is a tool for building and deploying applications by packaging them into lightweight containers. A container can hold pretty much any software component along with its dependencies (executables, libraries, configuration files, etc.), and execute it in a guaranteed and repeatable runtime environment.

This makes it very easy to build your app once and deploy it anywhere - on your laptop for testing, then on different servers for live deployment, etc.

There are plenty of 'ready to use' tools for implementing Docker with Drupal, such as Lando and Docksal, just to mention a few.

At this point we didn't want to add an extra layer or tool to the setup process, so we decided to go straight to a plain vanilla Docker configuration.
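A plain vanilla setup like the one described here typically boils down to a single docker-compose.yml file. The sketch below is our illustration, not the project's actual file: the service layout and the wodby images are assumptions based on the stack this article describes (pin specific image tags in practice).

```yaml
# Hypothetical minimal docker-compose.yml for a Drupal project.
# Image names are illustrative; wodby publishes ready-made Drupal images.
version: "2"

services:
  mariadb:
    image: wodby/mariadb        # pin a tag in a real project
    environment:
      MYSQL_ROOT_PASSWORD: password
      MYSQL_DATABASE: drupal
      MYSQL_USER: drupal
      MYSQL_PASSWORD: drupal
    # volumes:
    #   - ./mariadb-init:/docker-entrypoint-initdb.d  # uncomment to import a dump

  php:
    image: wodby/drupal-php     # pin a tag in a real project
    volumes:
      - ./:/var/www/html

  nginx:
    image: wodby/drupal-nginx   # pin a tag in a real project
    depends_on:
      - php
    ports:
      - "8000:80"
    volumes:
      - ./:/var/www/html
```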

How To Implement a Basic Docker Configuration For Drupal

Installing Docker

This should be an easy step; you can grab the installer from the Docker download link. Once the installation is complete, you should have the Docker daemon running. Confirm by running docker in your terminal; you should see the list of available commands.

Step 4. Starting the containers

To start the containers, execute docker-compose up -d, then grab some coffee or a beer and be patient while the images are downloaded to your local computer.

Step 5. Importing a database dump (optional)

You can import a previously exported DB dump by copying the dump file into the mariadb-init directory and uncommenting the following line in your docker-compose.yml file.

- ./mariadb-init:/docker-entrypoint-initdb.d
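Concretely, any .sql file placed in that directory is executed by the container's entrypoint the first time the database initializes. A quick sketch (the placeholder dump here is ours; use your real export instead):

```shell
# Create the directory the compose file mounts into the mariadb container
mkdir -p mariadb-init

# Illustrative placeholder dump; in practice, copy your exported .sql file here
echo 'CREATE TABLE example (id INT);' > mariadb-init/backup.sql

# Anything listed here will be imported on the container's first start
ls mariadb-init
```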

Step 6. Checking for used ports

One common issue you'll likely run into while starting the containers is finding the ports already in use. This could mean an instance of Apache, Nginx, MySQL, or another service is already running. To find out what is using a port, run this command in your terminal:

lsof -i :<PORT_NUMBER>
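If a port turns out to be busy and you'd rather not stop the local service, another option is remapping the host side of the port binding in docker-compose.yml. The service name and mapping below are assumptions for illustration:

```yaml
services:
  nginx:
    ports:
      - "8080:80"   # host port 8080 now forwards to the container's port 80
```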

Useful docker-compose commands

Starting the containers in detached mode

docker-compose up -d

Stopping the containers

docker-compose stop

Destroying the containers

docker-compose down [-v]

NOTE: You can pass the -v flag to destroy the shared volumes as well. Be careful: this will destroy any data on the volumes shared between the container and the local machine.

Checking the logs

docker-compose logs -f <CONTAINER_NAME>

Executing CLI commands

While working with containers, it's common to see developers ssh-ing into the machine to execute commands. To avoid this practice, you can take advantage of the docker-compose exec command.

docker-compose exec <CONTAINER_NAME> <COMMAND_NAME>
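To avoid retyping the container name and flags every time, you might add a small wrapper function to your shell profile. This helper is our own convenience sketch, not part of docker-compose; uid 82 is the web-server user in Alpine-based PHP images such as wodby's:

```shell
# Hypothetical wrapper that builds a docker-compose exec invocation.
# Usage: dcx <container> <command...>
dcx() {
  container="$1"
  shift
  # echo shows the command that would run; drop the echo to actually execute it
  echo docker-compose exec --user=82 "$container" "$@"
}

# Example: rebuild Drupal caches inside the php container
dcx php drush cr
```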

Using Composer

Drupal 8 takes full advantage of Composer: you can install and uninstall dependencies and apply patches with it. It's good practice to run these commands inside your container, because if the PHP version on your local machine differs from the container's, you could install dependencies that are not suitable for your container instance.

docker-compose exec --user=82 php composer <COMMAND_NAME>
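The patch workflow mentioned above typically relies on the cweagans/composer-patches plugin. A sketch of the composer.json additions follows; the patch URL and description are placeholders, not a real issue file:

```json
{
  "require": {
    "cweagans/composer-patches": "^1.6"
  },
  "extra": {
    "patches": {
      "drupal/core": {
        "Example fix description": "https://www.drupal.org/files/issues/example-patch-file.patch"
      }
    }
  }
}
```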

Using DrupalConsole

If you want to use DrupalConsole on your project, you can add an alias file to the repo at console/sites/site.yml containing the following configuration.

On the other hand, if you prefer to run the commands directly, you can use

docker-compose exec --user=82 php drupal <COMMAND_NAME>

Wrapping up

The new setup worked really well on everyone's computer, and we haven't had any more issues since we made the change. The project has since gone live, the experience was great, and we plan to keep using Docker for future projects.

If you have the feeling that Docker's architecture is hard to understand and could be complex to get up and running, you can take advantage of the projects mentioned earlier (Lando, Docksal, etc.) to make it easier to start working with containers.

UPDATE: Because the project is a Drupal site, we based our Docker configuration on the docker4drupal project by wodby. For other projects using technologies such as Symfony, ReactJS, or MeteorJS, we create our own custom Dockerfiles and custom images.