Django Docker Deployment
Introduction
Docker has revolutionized how we develop, ship, and run applications. For Django developers, Docker provides a consistent environment that works identically across development, testing, and production stages. This eliminates the infamous "it works on my machine" problem and streamlines the deployment process.
In this tutorial, you'll learn how to containerize a Django application using Docker. We'll cover the basics of Docker, create a development environment with Docker Compose, and prepare your application for production deployment.
Prerequisites
Before we begin, make sure you have:
- Basic knowledge of Django
- Docker installed on your machine
- A Django project ready to containerize
If you don't have Docker installed, visit the official Docker website for installation instructions.
Understanding Docker Basics
What is Docker?
Docker is a platform that uses OS-level virtualization to deliver software in packages called containers. Containers are isolated from one another and bundle their own software, libraries, and configuration files.
Key Docker Concepts
- Dockerfile: A text document with instructions to build a Docker image
- Image: A lightweight, standalone package that contains everything needed to run an application
- Container: A running instance of an image
- Docker Compose: A tool for defining and running multi-container Docker applications
Creating a Dockerfile for Django
Let's start by creating a Dockerfile in your Django project's root directory:
```dockerfile
# Use an official Python runtime as a parent image
FROM python:3.9-slim

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Set work directory
WORKDIR /app

# Install dependencies
COPY requirements.txt /app/
RUN pip install --no-cache-dir -r requirements.txt

# Copy project
COPY . /app/

# Run the application
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "myproject.wsgi:application"]
```
Let's break down this Dockerfile:
- We start with an official Python image
- Set environment variables to optimize Python in a Docker container
- Create and set a working directory
- Install the project dependencies
- Copy the project files
- Specify how to run the application using Gunicorn
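The two `ENV` lines deserve a closer look. `PYTHONDONTWRITEBYTECODE=1` stops the interpreter from writing `.pyc` files into the image, and `PYTHONUNBUFFERED=1` forces unbuffered stdout/stderr so log lines reach `docker logs` immediately. A short sketch (assuming a standard CPython interpreter) confirms the first mapping at runtime:

```python
# Demonstrate that PYTHONDONTWRITEBYTECODE=1 maps to sys.dont_write_bytecode,
# i.e. the interpreter will not emit .pyc files inside the image.
import os
import subprocess
import sys

env = {**os.environ, "PYTHONDONTWRITEBYTECODE": "1"}
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.dont_write_bytecode)"],
    env=env,
    capture_output=True,
    text=True,
)
print(out.stdout.strip())  # True
```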
Creating a Docker Compose File
For development, we'll use Docker Compose to manage our application services. Create a `docker-compose.yml` file:
```yaml
version: '3'

services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    depends_on:
      - db
    environment:
      - DEBUG=1
      - DATABASE_URL=postgres://postgres:postgres@db:5432/postgres
  db:
    image: postgres:13
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_DB=postgres

volumes:
  postgres_data:
```
This compose file:
- Defines two services: `web` (our Django app) and `db` (PostgreSQL database)
- Maps port 8000 to our host machine
- Mounts the current directory as a volume for live code changes
- Creates a persistent volume for the database
- Sets environment variables for both services
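One thing to note: the compose file exports a single `DATABASE_URL`, while the `settings.py` shown later reads the individual `POSTGRES_*` variables. If you prefer the URL form, it can be split with nothing but the standard library (the popular `dj-database-url` package does this for you). The helper below is a minimal sketch, not production code:

```python
# Minimal sketch: turn a postgres:// URL into the pieces Django's
# DATABASES dict expects. Assumes a URL shaped like the one in the
# compose file above.
from urllib.parse import urlparse

def parse_database_url(url):
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": str(parts.port or 5432),
    }

cfg = parse_database_url("postgres://postgres:postgres@db:5432/postgres")
print(cfg["HOST"], cfg["NAME"])  # db postgres
```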
Setting Up Django for Docker
Update settings.py
Modify your Django settings to work well with Docker:
```python
# settings.py
import os

# Allow specific hosts
ALLOWED_HOSTS = ['localhost', '127.0.0.1', '0.0.0.0']

# Use environment variables for database configuration
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_DB', 'postgres'),
        'USER': os.environ.get('POSTGRES_USER', 'postgres'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', 'postgres'),
        'HOST': os.environ.get('POSTGRES_HOST', 'db'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}

# Static files configuration
STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')
```
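Environment variables always arrive as strings, so `DEBUG=1` from the compose file is the string `"1"`, not a boolean. A small helper like this (hypothetical, not a Django built-in) keeps the conversion in one place:

```python
import os

def env_bool(name, default=False):
    """Interpret common truthy strings ("1", "true", "yes", "on") as True."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "yes", "on"}

os.environ["DEBUG"] = "0"
print(env_bool("DEBUG"))  # False
os.environ["DEBUG"] = "1"
print(env_bool("DEBUG"))  # True
```

In `settings.py` you could then write `DEBUG = env_bool('DEBUG')` instead of comparing strings inline.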
Create a requirements.txt file
Ensure you have a `requirements.txt` file listing all your dependencies:

```
Django>=3.2,<4.0
psycopg2-binary>=2.9.1
gunicorn>=20.1.0
```
Development Workflow
Now let's run our application in development mode:
```bash
# Build the images
docker-compose build

# Start the containers
docker-compose up

# In another terminal, run migrations
docker-compose exec web python manage.py migrate

# Create a superuser
docker-compose exec web python manage.py createsuperuser
```
Your Django application should now be running at http://localhost:8000.
Production Deployment Setup
For production, we need additional security and performance considerations. Let's enhance our setup.
Create a production Docker Compose file
Create a `docker-compose.prod.yml` file:
```yaml
version: '3'

services:
  web:
    build: .
    command: gunicorn myproject.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static_volume:/app/staticfiles
      - media_volume:/app/mediafiles
    expose:
      - 8000
    environment:
      - DEBUG=0
      - SECRET_KEY=${SECRET_KEY}
      - DATABASE_URL=postgres://${POSTGRES_USER}:${POSTGRES_PASSWORD}@db:5432/${POSTGRES_DB}
    depends_on:
      - db
  db:
    image: postgres:13
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_DB=${POSTGRES_DB}
  nginx:
    build: ./nginx
    volumes:
      - static_volume:/home/app/web/staticfiles
      - media_volume:/home/app/web/mediafiles
    ports:
      - 1337:80
    depends_on:
      - web

volumes:
  postgres_data:
  static_volume:
  media_volume:
```
Create an Nginx configuration
Create a directory named `nginx` and add a `Dockerfile` inside it:

```dockerfile
FROM nginx:1.21

RUN rm /etc/nginx/conf.d/default.conf
COPY nginx.conf /etc/nginx/conf.d
```
Now create an `nginx.conf` file in the same directory:

```nginx
upstream django {
    server web:8000;
}

server {
    listen 80;

    location / {
        proxy_pass http://django;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    location /static/ {
        alias /home/app/web/staticfiles/;
    }

    location /media/ {
        alias /home/app/web/mediafiles/;
    }
}
```
Create a .env file for production
```
DEBUG=0
SECRET_KEY=your_super_secret_key_here
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres_password
POSTGRES_DB=postgres
```
Make sure to add `.env` to your `.gitignore` file.
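Don't ship the placeholder `SECRET_KEY`. A strong value can be generated with the standard library alone (Django also provides `get_random_secret_key` in `django.core.management.utils`, which serves the same purpose):

```python
# Generate a random, URL-safe secret suitable for the SECRET_KEY entry
# in the .env file.
import secrets

def make_secret_key(nbytes=50):
    return secrets.token_urlsafe(nbytes)

print(make_secret_key())
```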
Running in Production Mode
To run your application in production mode:
```bash
# Build and start production containers
docker-compose -f docker-compose.prod.yml up -d --build

# Run migrations
docker-compose -f docker-compose.prod.yml exec web python manage.py migrate

# Collect static files
docker-compose -f docker-compose.prod.yml exec web python manage.py collectstatic --no-input
```
Your production Django application should now be running at http://localhost:1337 with Nginx serving static files and proxying requests to Gunicorn.
Deploying to a Cloud Provider
Most cloud providers support Docker containers. Here's a general approach for deployment:
- Push your Docker images to a container registry (Docker Hub, AWS ECR, Google Container Registry)
- Set up a container orchestration service (Kubernetes, AWS ECS, Google Cloud Run)
- Configure environment variables for production settings
- Set up a CI/CD pipeline to automate deployments
For example, to deploy to AWS ECS:
- Create a repository in Amazon ECR
- Build and push your Docker images
- Create a task definition in ECS
- Create an ECS service and configure load balancing
Common Issues and Troubleshooting
Database Connections
If your Django app can't connect to the database, check:
- Database service name matches the hostname in your Django settings
- Environment variables are correctly set
- Database container is running before the web container
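On the last point: `depends_on` only controls start order; it does not wait for Postgres to actually accept connections. A common workaround is a small wait loop in the web container's entrypoint. This is a sketch under the assumption that the database answers on `db:5432` once it is ready:

```python
# Poll a TCP port until it accepts connections or the timeout expires.
import socket
import time

def wait_for_port(host, port, timeout=30.0, interval=0.5):
    """Return True once host:port accepts TCP connections, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False
```

In an entrypoint script you would call `wait_for_port("db", 5432)` before running migrations or starting Gunicorn.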
Static Files
If static files aren't serving correctly:
- Ensure `STATIC_ROOT` is set correctly in settings.py
- Run the `collectstatic` command in the container
- Check the Nginx configuration for the static files location
Permissions Issues
Docker containers may have permission conflicts with mounted volumes:
- Check the user running the process inside the container
- Set appropriate permissions on host directories
- Consider using named volumes instead of bind mounts
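A quick way to spot such a mismatch is to compare the uid that owns the mounted files with the uid of the process inside the container (POSIX-only sketch; run it with `docker-compose exec web python ...`):

```python
# Compare the owner of a path with the current process's uid; a mismatch
# on a bind-mounted directory is the usual source of permission errors.
import os

def owner_uid(path):
    return os.stat(path).st_uid

print("file owner:", owner_uid("."), "process uid:", os.getuid())
```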
Summary
In this tutorial, you've learned how to:
- Create a Dockerfile for a Django application
- Set up Docker Compose for development
- Configure Django to work with Docker
- Prepare a production-ready Docker setup with Nginx
- Deploy your containerized application
Containerizing your Django application with Docker provides consistency across environments, simplifies deployment, and makes your application more portable. It's a modern approach to deployment that works well with CI/CD pipelines and cloud platforms.
Exercises
- Extend the Docker setup to include Redis for caching
- Add a Celery worker container for background tasks
- Configure a CI/CD pipeline using GitHub Actions to automatically build and push Docker images
- Set up automated database backups for your containerized Postgres database
- Implement health checks for your Docker containers to improve reliability