This step-by-step tutorial takes you through the process of deploying a simple Python application on Kubernetes.

Kubernetes is an open source platform that offers deployment, maintenance, and scaling features. It simplifies management of containerized Python applications while providing portability, extensibility, and self-healing capabilities.

Whether your Python applications are simple or more complex, Kubernetes lets you efficiently deploy and scale them, seamlessly rolling out new features while limiting resources to only those required.
In this article, I will describe the process of deploying a simple Python application to Kubernetes, including:

Creating Python container images

Publishing the container images to an image registry

Working with persistent volumes

Deploying the Python application to Kubernetes

Requirements

You will need Docker, kubectl, and this source code.
Docker is an open platform for building and shipping distributed applications. To install Docker, follow the official documentation. To verify that Docker is running on your system:
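The verification command itself is missing from this copy of the article; a common check (assuming Docker is installed and the daemon is running) is:

```shell
# Prints client and server version details; fails if the daemon is not running.
docker version

# Optional end-to-end check: run and then remove a throwaway container.
docker run --rm hello-world
```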

Containerization at a glance

Containerization involves packaging an application, together with its dependencies, in an isolated container. Unlike full machine virtualization, containers share the host operating system's kernel, which keeps them lightweight while still letting an application run on any machine without concerns about dependencies.
Roman Gaponov's article serves as a reference. Let's start by creating a container image for our Python code.

Create a Python container image

To create these images, we will use Docker, which enables us to deploy applications inside isolated Linux software containers. Docker can automatically build images using the instructions in a Dockerfile.
This is a Dockerfile for our Python application:
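The Dockerfile listing did not survive in this copy of the article; a minimal sketch consistent with the surrounding text (Python 3.5 base image; the entry point `app.py` and port 5035 are assumptions):

```dockerfile
FROM python:3.5

# Copy dependency list first so Docker can cache the install layer.
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt

# Copy the rest of the application source.
COPY . /code/

# Port the sample app is assumed to listen on.
EXPOSE 5035

CMD ["python3", "app.py"]
```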

This Dockerfile contains the instructions to run our sample Python code, using the Python 3.5 environment.
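The sample application's source is linked rather than reproduced in the article; for orientation, a hypothetical stand-in `app.py` (the handler text and port are illustrative, not the article's actual code) might look like this:

```python
# app.py - hypothetical stand-in for the article's sample application.
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Hello from Kubernetes!\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep request logging quiet for this sketch

def make_server(port=5035):
    # Bind to all interfaces so the port can be published from a container.
    return HTTPServer(("0.0.0.0", port), HelloHandler)

if __name__ == "__main__":
    make_server().serve_forever()
```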

Build a Python Docker image

We can now build the Docker image from these instructions using this command:

docker build -t k8s_python_sample_code .

This command creates a Docker image for our Python application.
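You can confirm the image was built (this check is not in the original text, but is a standard Docker CLI command):

```shell
# Lists local images whose repository name matches.
docker images k8s_python_sample_code
```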

Publish the container images

We can publish our Python container image to different private/public cloud registries, like Docker Hub, AWS ECR, Google Container Registry, etc. For this tutorial, we'll use Docker Hub.
Before publishing the image, we need to tag it with a version. Note that pushing to Docker Hub requires the image name to be prefixed with your Docker Hub username (shown here as a placeholder):

docker tag k8s_python_sample_code:latest <your-dockerhub-username>/k8s_python_sample_code:0.1

Push the image to a cloud repository

Using a Docker registry other than Docker Hub to store images requires you to add that container registry to the local Docker daemon and the Kubernetes nodes' Docker daemons. You can look up this information for the different cloud registries. We'll use Docker Hub in this example.
Log in with docker login, then execute this Docker command to push the image (again using your Docker Hub username as the prefix):

docker push <your-dockerhub-username>/k8s_python_sample_code:0.1

Working with CephFS persistent storage

Kubernetes supports many persistent storage providers, including AWS EBS, CephFS, GlusterFS, Azure Disk, NFS, etc. I will cover Kubernetes persistent storage with CephFS.
To provide CephFS-backed persistent storage to Kubernetes containers, we will create two objects: a PersistentVolume definition (persistent-volume.yml) and a matching PersistentVolumeClaim.
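The YAML listings are missing from this copy of the article; a sketch of a CephFS-backed PersistentVolume and claim follows. The object names, capacity, monitor address, Ceph user, and Secret name are all placeholders you must adapt to your cluster:

```yaml
apiVersion: v1
kind: PersistentVolume
metadata:
  name: k8s-python-pv
spec:
  capacity:
    storage: 2Gi
  accessModes:
    - ReadWriteMany
  cephfs:
    monitors:
      - "10.0.0.1:6789"   # placeholder Ceph monitor address
    user: admin
    secretRef:
      name: ceph-secret    # Secret holding the Ceph client key
    readOnly: false
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: k8s-python-pvc
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 2Gi
```

Create both with kubectl create -f persistent-volume.yml so the claim is available for the deployment to mount.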

Deploy the application to Kubernetes

To manage the last mile of deploying the application to Kubernetes, we will create two important files: a service file and a deployment file.
Create a file and name it k8s_python_sample_code.service.yml with the following content:
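The contents of the service and deployment files are missing from this copy of the article; a sketch of both follows. The labels, NodePort service type, port 5035, image name, mount path, and claim name are assumptions carried over from earlier steps:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: k8s-python-sample-code
  labels:
    app: k8s-python-sample-code
spec:
  type: NodePort
  ports:
    - port: 5035
      name: http
  selector:
    app: k8s-python-sample-code
---
# k8s_python_sample_code.deployment.yml (hypothetical file name)
apiVersion: apps/v1
kind: Deployment
metadata:
  name: k8s-python-sample-code
spec:
  replicas: 1
  selector:
    matchLabels:
      app: k8s-python-sample-code
  template:
    metadata:
      labels:
        app: k8s-python-sample-code
    spec:
      containers:
        - name: k8s-python-sample-code
          image: <your-dockerhub-username>/k8s_python_sample_code:0.1
          ports:
            - containerPort: 5035
          volumeMounts:
            - name: app-data
              mountPath: /var/data       # placeholder mount path
      volumes:
        - name: app-data
          persistentVolumeClaim:
            claimName: k8s-python-pvc    # placeholder claim name
```

Apply both files with kubectl create -f k8s_python_sample_code.service.yml and kubectl create -f k8s_python_sample_code.deployment.yml.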

Your application was successfully deployed to Kubernetes.
You can verify whether your application is running by inspecting the running services:

kubectl get services
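Beyond the service listing, you can also check the pods themselves (standard kubectl commands, not in the original text):

```shell
# Confirm the pod reached the Running state.
kubectl get pods

# Inspect a specific pod's application output.
kubectl logs <pod-name>
```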

May Kubernetes free you from future deployment hassles!

Want to learn more about Python? Nanjekye's book, Python 2 and 3 Compatibility, offers clean ways to write code that will run on both Python 2 and 3, including detailed examples of how to convert existing Python 2-compatible code to code that will run reliably on both Python 2 and 3.