Python – How do I deploy a Flask or Django application using jwilder/nginx-proxy?


How do I deploy a Flask or Django application using jwilder/nginx-proxy?

I’m thinking about migrating some of our web servers into Docker containers. The jwilder/nginx-proxy image looks interesting and seems to meet our needs, but how do we properly deploy a Flask application in a container and get it to work with the jwilder/nginx-proxy server? To be clear, the Flask application will also run in a Docker container.

As a separate but related question, how would I do this for a Django application?

It looks like there is a popular tiangolo/uwsgi-nginx-flask image, and a similar dockerfiles/django-uwsgi-nginx image. In this setup, I understand that the nginx-proxy container directs traffic to the uwsgi-nginx-flask or django-uwsgi-nginx containers. Is this a common way to set things up?

My main concern is that in such a setup we end up running additional nginx instances, one for each Python/Django application. Is this common? Or is it possible (and beneficial, or common) to have nginx-proxy talk directly to uWSGI inside the application container?

I see that the nginx-proxy image supports a VIRTUAL_PROTO=uwsgi option that other containers can be started with. Is this something that can be used to improve efficiency, or is it more effort than it is worth?
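To make this concrete, the kind of layout I have in mind looks roughly like the following docker-compose sketch (the image name and hostname are placeholders; VIRTUAL_HOST is what nginx-proxy uses to route requests, and the commented-out variables are the optional knobs mentioned above):

version: "3"

services:
  nginx-proxy:
    image: jwilder/nginx-proxy
    ports:
      - "80:80"
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro

  flask-app:
    image: my-flask-app                 # placeholder for an image such as tiangolo/uwsgi-nginx-flask
    environment:
      - VIRTUAL_HOST=flask.example.com  # nginx-proxy routes requests for this hostname to this container
      # - VIRTUAL_PORT=8080             # needed if the app listens on a non-default port
      # - VIRTUAL_PROTO=uwsgi           # only if the container exposes a raw uwsgi socket instead of HTTP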

EDIT: Or is the nginx instance that comes bundled with the Flask/Django image worth having precisely because it can serve static content, whereas without it you would need to configure nginx-proxy with the location of every project’s static files?

Solution

Personally, I prefer to run Django in one container, NGINX in a separate container, other applications in further containers, and so on. For this I use docker-compose; see my implementation using Django + NGINX + PostgreSQL here. (I haven’t used jwilder/nginx-proxy, only the official NGINX Docker image.)
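As a rough sketch of that layout (service names, the WSGI module path and the credentials below are placeholders rather than values from my actual project), the docker-compose file could look something like this:

version: "3"

services:
  db:
    image: postgres:alpine
    environment:
      - POSTGRES_PASSWORD=changeme            # placeholder credential

  web:
    build: .                                  # your Django image
    command: gunicorn myproject.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static_volume:/usr/src/app/static     # collected static files, shared with nginx
    depends_on:
      - db

  nginx:
    image: nginx:mainline-alpine
    ports:
      - "80:80"
    volumes:
      - ./config/nginx:/etc/nginx/conf.d:ro   # your conf: proxy / to web:8000, serve /static/ directly
      - static_volume:/usr/src/app/static:ro
    depends_on:
      - web

volumes:
  static_volume: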

That said, putting NGINX and the Python server in the same container doesn’t sound so bad either. I use lightweight Alpine-based images to deploy Python, for example:

FROM nginx:mainline-alpine

# --- Python Installation ---
RUN apk add --no-cache python3 && \
    python3 -m ensurepip && \
    rm -r /usr/lib/python*/ensurepip && \
    pip3 install --upgrade pip setuptools && \
    if [ ! -e /usr/bin/pip ]; then ln -s pip3 /usr/bin/pip ; fi && \
    if [ ! -e /usr/bin/python ]; then ln -sf /usr/bin/python3 /usr/bin/python; fi && \
    rm -r /root/.cache

# --- Work Directory ---
WORKDIR /usr/src/app

# --- Python Setup ---
COPY . .
RUN pip install -r app/requirements.pip

# --- Nginx Setup ---
COPY config/nginx/default.conf /etc/nginx/conf.d/
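# Let nginx run without root privileges (e.g. under an arbitrary UID in the root group)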
RUN chmod g+rwx /var/cache/nginx /var/run /var/log/nginx
RUN chgrp -R root /var/cache/nginx
RUN sed -i.bak 's/^user/#user/' /etc/nginx/nginx.conf
RUN addgroup nginx root

# --- Expose and CMD ---
EXPOSE 5000
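# Run gunicorn (assumed to be listed in app/requirements.pip) in the background
# and keep nginx in the foreground as the container's main process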
CMD gunicorn --bind 0.0.0.0:5000 wsgi --chdir /usr/src/app/app & nginx -g "daemon off;"

Although it looks a bit messy, it works well. See my full implementation here.
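The config/nginx/default.conf copied in above is not included in this answer. A hypothetical minimal version (assuming nginx listens on 8080, proxies application traffic to the gunicorn process on 127.0.0.1:5000 and serves the project’s static files itself; adjust the ports, paths and the EXPOSE line to your setup) could look like:

server {
    listen 8080;                              # nginx port; gunicorn already uses 5000

    location /static/ {
        alias /usr/src/app/app/static/;       # adjust to your project's static root
    }

    location / {
        proxy_pass http://127.0.0.1:5000;     # the gunicorn process from the CMD above
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}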

Depending on how you want to deploy the Docker image, you can use either approach, but using docker-compose would, IMHO, be the better solution. In both setups you can use NGINX to serve your static content (without having to configure the proxy with each project’s static file locations).
