In this post, I'll show how to containerize an existing project using Docker. I've picked a random project from GitHub with an open "Dockerize" issue, so I could contribute to it and use it as an example here.
Why in the world do you want to Dockerize an existing Django web application? There are plenty of reasons, but if you don’t have one just do it for fun!
I decided to use Docker because one of my applications was getting hard to install: lots of system requirements, multiple databases, Celery, and RabbitMQ. So every time a new developer joined the team or had to work from a new computer, the system installation took a long time.
Difficult installations lead to time losses, time losses lead to laziness, laziness leads to bad habits, and it goes on and on… For instance, one can decide to use SQLite instead of Postgres and not see truncation problems on a table until the code hits the test server.
If you don’t know what Docker is, just picture it as a huge virtualenv that, instead of containing just some Python packages, has containers isolating everything from the OS to your app, databases, workers, etc.
Ok, talk is cheap. Show me some code, dude.
First of all, install Docker. I did it on Ubuntu and macOS without any problem, but on Windows Home I couldn’t get it working.
To tell Docker how to run your application as a container, you’ll have to create a Dockerfile:
FROM python:3.6
ENV PYTHONUNBUFFERED 1
RUN mkdir /webapps
WORKDIR /webapps
# Installing OS Dependencies
RUN apt-get update && apt-get upgrade -y && apt-get install -y \
    libsqlite3-dev
RUN pip install -U pip setuptools
COPY requirements.txt /webapps/
COPY requirements-opt.txt /webapps/
RUN pip install -r /webapps/requirements.txt
RUN pip install -r /webapps/requirements-opt.txt
ADD . /webapps/
# Django service
EXPOSE 8000
So, let’s go line by line:
FROM python:3.6
Here we’re using an image from Docker Hub, i.e. a pre-built image that we can build on top of. In this case, python:3.6 is a Debian-based image that already has Python 3.6 installed on it.
You can create all sorts of environment variables using ENV.
# Here we can create all environment variables for our container
ENV PYTHONUNBUFFERED 1
For instance, if you use them for storing your Django’s Secret Key, you could put it here:
ENV DJANGO_SECRET_KEY abcde0s&&$uyc)hf_3rv@!a95nasd22e-dxt^9k^7!f+$jxkk+$k-
And in your code use it like this:
import os
SECRET_KEY = os.environ['DJANGO_SECRET_KEY']
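Note that os.environ raises a KeyError when the variable is missing. If you want a fallback for local development, os.environ.get lets you supply one. A minimal sketch (the fallback values and the DJANGO_DEBUG variable below are just placeholders, not part of the project):

```python
import os

# Fall back to a throwaway key when the environment variable is not set.
# Never use a hardcoded fallback like this in production.
SECRET_KEY = os.environ.get('DJANGO_SECRET_KEY', 'insecure-dev-key')

# Environment variables are always strings, so booleans need a comparison.
DEBUG = os.environ.get('DJANGO_DEBUG', '0') == '1'
```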
RUN commands are kind of obvious: you’re running a command “inside” your container. I’m quoting “inside” because Docker caches the result of each instruction as a layer, so when you rebuild the image it doesn’t have to run unchanged commands again.
RUN mkdir /webapps
WORKDIR /webapps
# Installing OS Dependencies
RUN apt-get update && apt-get upgrade -y && apt-get install -y \
    libsqlite3-dev
RUN pip install -U pip setuptools
COPY requirements.txt /webapps/
COPY requirements-opt.txt /webapps/
RUN pip install -r /webapps/requirements.txt
RUN pip install -r /webapps/requirements-opt.txt
ADD . /webapps/
In this case, we are creating the /webapps directory that will hold our files.
WORKDIR is also kind of self-evident. It just tells Docker to run the following commands in the indicated directory.
After that, I am installing one OS dependency. A requirements.txt usually doesn’t cover the OS requirements of a project, and believe me, for large projects you’ll have lots and lots of them.
COPY and ADD are similar. Both copy a file from your computer (the host) into the container (the guest). In my example, I’m just copying the requirements files to pip install them. ADD can additionally copy from a URL and auto-extract local archives.
The EXPOSE instruction documents which port the container listens on, so that it can be published to the host (with docker run -p, or the ports section of docker-compose).
# Django service
EXPOSE 8000
Ok, so now what? How can we add more containers and make them work together? What if I need PostgreSQL inside a container too? Don’t worry, here we go.
Docker Compose is a tool for running multiple Docker containers. It’s configured through a YAML file: you just need to create a docker-compose.yml in your project folder.
version: '3.3'

services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=postgres
  web:
    build: .
    command: ["./run_web.sh"]
    volumes:
      - .:/webapps
    ports:
      - "8000:8000"
    links:
      - db
    depends_on:
      - db
In this case, I’m using an Image of Postgres from Docker Hub.
Now, let’s change the settings.py to use Postgres as Database.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': '5432',
    }
}
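If you’d rather not hardcode credentials, the same settings can read from environment variables. This is just a sketch: the POSTGRES_* names mirror the variables in the compose file, but POSTGRES_HOST and POSTGRES_PORT are my own additions, not something Compose sets for you.

```python
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # Defaults match the docker-compose.yml above; override via env vars.
        'NAME': os.environ.get('POSTGRES_DB', 'postgres'),
        'USER': os.environ.get('POSTGRES_USER', 'postgres'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', 'postgres'),
        'HOST': os.environ.get('POSTGRES_HOST', 'db'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}
```

This way the same settings.py works both inside and outside the containers.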
We’re almost done. Let me talk a little about the docker-compose file.
Remember vagrant?
Once upon a time, there was Vagrant, a way to run a project inside a Virtual Machine while easily configuring it: forwarding ports, provisioning requirements, and sharing volumes. Your machine (the host) could share a volume with your Virtual Machine (the guest). With Docker, it’s exactly the same: when you write a file on a shared volume, that file is written in the container as well.
volumes:
In this case, the current directory (.) is shared as /webapps in the container.
links:
You can refer to another container that belongs to your compose file by its name. Since we created a db container for Postgres, we can link it to our web container. That’s why I used ‘db’ as the host in our settings.py.
For your application to work, your database has to be ready for use before the web container starts, otherwise it will raise an exception.
depends_on:
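A caveat: depends_on only orders container startup; it doesn’t wait for Postgres to actually accept connections. A common workaround is a small retry loop run before the migrations. Here is a sketch; the wait_for_port helper and the wait_for_db.py filename are my own, not part of the original project:

```python
# wait_for_db.py -- a hypothetical helper, not part of the original project.
import socket
import time


def wait_for_port(host, port, timeout=30.0, interval=0.5):
    """Return True once host:port accepts TCP connections, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # Attempt a plain TCP connection; Postgres accepting the
            # connection is a good-enough readiness signal here.
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False


if __name__ == '__main__':
    # Inside the web container, the Postgres host is the service name 'db'.
    raise SystemExit(0 if wait_for_port('db', 5432) else 1)
```

run_web.sh could then call `python wait_for_db.py` before `python manage.py migrate`.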
command is the default command that your container runs right after it is up.
For our example, I’ve created a run_web.sh, that will run the migrations, collect the static files and start the development server.
#!/usr/bin/env bash
cd django-boards/
python manage.py migrate
python manage.py collectstatic --noinput
python manage.py runserver 0.0.0.0:8000
One can argue that running migrate automatically every time the container comes up is not a good practice. I agree. You can run it directly on the web container instead. You can access your container (just like the good ol’ vagrant ssh):
docker-compose run web bash
If you’d like, you can run it without accessing the container itself; just change the last argument of the previous command:
docker-compose run web python manage.py migrate
The same goes for other commands:
docker-compose run web python manage.py test
docker-compose run web python manage.py shell
With our Dockerfile, docker-compose.yml and run_web.sh in place, just run it all together:
docker-compose up
You can see this project here on my GitHub.
Originally published at Fernando Alves.