Docker meets Python & Flask

Jaden Lemmon
6 min read · Mar 25, 2022


Modernize your Flask application by introducing containerization and ephemeral environments.

If you’re not familiar with Docker yet, I highly encourage you to check out some great blog posts on Medium and watch some videos. The benefits are endless and moving towards containers has increased our efficiency 10x.

Out of the box, Flask is a great, minimal framework. Lately, I prefer to use minimal frameworks so I can start as small as possible with no bloat. When I’m building a micro-service, I don’t want to use a bulky framework for something that may only handle a few functions.

A Flask application can be as simple as this.
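For example, a minimal app might look like this (a sketch; the route and message are placeholders, not the Island List code):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from Flask!"

if __name__ == "__main__":
    # Bind to all interfaces so the app is reachable from outside a container.
    app.run(host="0.0.0.0", port=5000)
```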

Along with containerizing a Python App, we’re also going to look at how to utilize ephemeral environments to speed up the feature release process.

An ephemeral environment is one that lives for only a short amount of time.

I’m always a little annoyed when a tutorial uses such a simple application that makes it difficult to relate to a real-world problem. So today we will reference an example project called Island List.

Island List is an illustrious and groundbreaking application that allows you to buy your very own island. (Very relatable, yeah?)

Here’s a quick Live Preview thanks to ephemeral environments.

Application Tech Stack

I’m hoping that you’ve found this blog post because you’re familiar with Python and Flask so I’m not going to spend too much time inside the application. Instead, we’ll spend most of our time talking about how to set it up via Docker.

The stack of this application consists of:

Python and Flask for the server-rendered app
Tailwind CSS (installed via npm) for styling
PostgreSQL for the database
Docker and docker-compose for containerization

For this particular application, I’m going to ship it as a single container. Everything will be rendered by the server without any client-side frameworks. I will be using a single multi-stage Dockerfile.

Depending on your needs, you may opt to ship separate containers for the front end and back end.

Building a Dockerfile

To start off my Dockerfile I’ll add these lines at the top.

FROM voyageapp/node:17.6-alpine as node
WORKDIR /app
COPY package*.json ./
RUN npm ci
...

You might be wondering why I’m importing a Node image when we’re working with a Python application. I do this for two reasons. First, I need to install npm packages like Tailwind CSS, so I’ll need a Node environment.

Secondly, this is what Docker refers to as a Multistage Build. Multistage Builds allow us to walk through several steps within the build process while keeping our images as small as possible. I import Node to start, but in a few steps it will be replaced by the final stage.

The voyageapp/node:17.6-alpine image is a pre-built image that includes the dockerize library. I like to include this library as I like to use it to ensure my database is ready before starting the service inside the container.

After I’ve imported the node image, I set the working directory within the container.

Next, I’m copying in the package.json and package-lock.json and installing my npm dependencies within the container’s working directory /app. When dealing with dependencies, it’s important to copy in only these files first in order to fully take advantage of Docker layer caching.

Then I install my dependencies via npm ci, which is preferable to npm install in environments like this because it performs a clean install pinned to package-lock.json.

To finish up the first stage in my Dockerfile, I’m going to add two more lines.

FROM voyageapp/node:17.6-alpine as node 
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run css
...

The COPY command will now copy all my local files into the container.

You may be thinking “what was the point of copying the package.json files if we’re just going to overwrite them?” Again, this is because we want to take advantage of the layer caching Docker provides. “But Jaden, what about my local node_modules directory? Won’t it overwrite the directory we just created via line 4?” That is correct. Ideally, you will create a .dockerignore file and add node_modules to it so that your local files won’t overwrite the files in the container.
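A .dockerignore for a project like this might look something like the following (the Python entries are my own additions for a typical Flask project):

```
node_modules
__pycache__
*.pyc
.git
```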

Since I’m using Tailwind CSS, I’m going to use it via their preferred method by installing via npm and building with their CLI.

To build my CSS file, I need to have a tailwind.config.js file in my project. This tells Tailwind to scan my templates directory for class name usage to only bundle the classes I’m using.
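A minimal tailwind.config.js along those lines could look like this (the templates path is an assumption based on the lib/ layout used elsewhere in this post; adjust it to wherever your templates live):

```javascript
module.exports = {
  // Scan templates so only the classes actually used get bundled.
  content: ["./lib/templates/**/*.html"],
  theme: {
    extend: {},
  },
  plugins: [],
};
```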

Lastly, the npm run css is bundling the CSS via the Tailwind CLI. I’ve added this command inside my package.json.

{
  ...
  "scripts": {
    "css": "npx tailwindcss -i ./lib/static/css/style.css -o ./lib/static/dist/main.css",
    ...
  }
}

The second stage of my Dockerfile is going to handle the Python application. I will start by using a pre-built Python image from Dockerhub. Following the minimal, lightweight theme for today, I’m using an alpine image.

...
FROM python:3.8-alpine
WORKDIR /app
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
...

Again, I set the working directory with WORKDIR /app, and then the COPY and RUN commands use the same strategy as the first stage to utilize Docker layer caching, so I’m only copying the requirements.txt file first.

I’ll cap off the Dockerfile with these final lines.

...
COPY --from=node /usr/local/bin/dockerize /usr/local/bin/dockerize
COPY --from=node /app/lib/static/dist ./lib/static/dist
CMD dockerize -wait "tcp://$DB_HOST:5432" -timeout 60s ; python3 app.py

The COPY commands will copy dockerize and the CSS bundle into the final image. The CMD will run dockerize first to ensure my database is ready and then execute my Flask application.

Docker-Compose

For local development, I like to use docker-compose. It makes it incredibly easy to spin up a Full-Stack application with a single command.

The voyageapp/postgres image is a pre-built convenience image with the username, password, and database already set to “voyage”. This makes it easier when testing and developing locally, since you don’t have to specify more environment variables.
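A compose file for this setup might look roughly like this (the service names, port, and environment wiring are my assumptions; the DB_HOST variable matches the one the Dockerfile’s dockerize command waits on):

```yaml
version: "3.8"
services:
  app:
    build: .
    ports:
      - "5000:5000"
    environment:
      DB_HOST: db
    depends_on:
      - db
  db:
    image: voyageapp/postgres
```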

Running the Application Locally

This application is not optimized for production, so when it comes time to deploy you will likely want to make some tweaks; for the sake of this demo, I wanted to keep it simple.

The command docker-compose up is all that is needed to run this application.

In my app.py file I’ve configured the app to create and migrate the tables and then seed the database.

When it comes to development I like my application to start, migrate, and seed with a single command. I encourage everyone to include migrations and seeders in their applications. You may also read another article I’ve written with more details on that.
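The start-migrate-seed pattern looks roughly like this; here is a simplified sketch using the standard-library sqlite3 in place of Postgres (the table name, columns, and seed rows are my own placeholders):

```python
import sqlite3

def init_db(conn):
    # "Migrate": create tables if they don't exist yet.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS islands ("
        "id INTEGER PRIMARY KEY, name TEXT NOT NULL, price INTEGER)"
    )
    # Seed only when the table is empty, so repeated startups are idempotent.
    if conn.execute("SELECT COUNT(*) FROM islands").fetchone()[0] == 0:
        conn.executemany(
            "INSERT INTO islands (name, price) VALUES (?, ?)",
            [("Palm Cay", 250000), ("Coral Atoll", 480000)],
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
init_db(conn)
```

Calling init_db on every startup is safe because both the table creation and the seeding check are guarded.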

Ephemeral Environments

As a real-world example, I’ve decided to update the colors inside my application. I’ve got it ready to play with and checked out a new branch called feat/colors. I’d like to simply deploy my progress to share with the team and get their feedback.

I’m going to integrate Voyage to deploy this Flask application for each opened pull request in Github. Following the Voyage docs, I’m going to create a .voyage/config.yml file inside the project.

The file is fairly straightforward and now my application will auto-deploy when a PR is opened for that branch and stay up to date as new commits are pushed. When the team approves and the branch is merged, the environment will automatically destroy itself.

See the deployed application here.

That’s it. Hopefully, this helped someone. Feel free to ping me on Twitter with any questions.
