
Get storage for your Serverless Functions with Minio & Docker

In this post I'll show you how to make use of storage for your Serverless functions with OpenFaaS using Minio - a lightweight S3 alternative. That means you can run your functions with storage on any cloud on your own terms.

Introduction

Let's talk a little about storage for Docker. Typically you will find two types of storage available:

Block storage

Block storage is effectively like a disk mounted on your PC or within a container. You can read/write exactly as if it were an SSD or HDD plugged into the physical host.

Block storage is usually attached by mounting a volume from the host directly into the running container. This often causes unexpected behaviour with file permissions. When you start to do this across a cluster, mounting a volume from an ephemeral host doesn't make much sense. Network-distributed block storage like Ceph aims to solve this problem. A good example of where you would want this kind of block storage is with an existing application such as a Postgres DB or a legacy system.
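To make that concrete, here is what a host-mounted volume looks like in practice (the paths and container are illustrative):

```shell
# Mount the host directory /srv/pgdata into a Postgres container as
# block-style storage. Files written inside the container land on the
# host, owned by the container's UID - which is where the unexpected
# file-permission behaviour comes from.
docker run -d --name db \
  -v /srv/pgdata:/var/lib/postgresql/data \
  postgres:latest
```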

Given that Serverless functions are:

short-lived (seconds)

stateless (no in-memory state/direct storage)

single-purpose

It should be considered an anti-pattern to attach permanent storage volumes to a function. That means we should evaluate the other options, such as using APIs and object storage.

In this tutorial we will setup Minio and write two Serverless Functions for OpenFaaS that work together to process images in a pipeline. You can then take that template and make it do whatever you need.
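If you don't already have Minio running, a quick way to try it out locally is with Docker. This is a sketch of a single-node setup; the access and secret keys are placeholders you should replace with your own values:

```shell
# Run a single-node Minio server, exposing the S3-compatible API and UI
# on port 9000. Data is stored inside the container at /data.
docker run -d --name minio -p 9000:9000 \
  -e "MINIO_ACCESS_KEY=<your-access-key>" \
  -e "MINIO_SECRET_KEY=<your-secret-key>" \
  minio/minio server /data
```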

Pre-requisites

These are the pre-requisites for the tutorial; you'll need to configure OpenFaaS before you start.

You'll see that the convertbw function is using an image from the Docker Hub, so we can add a skip_build entry. It also doesn't need access to Minio, so we don't provide the environment variables for it.
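Put together, the relevant part of the stack file might look like this. The image names and environment variable keys are assumptions to illustrate the shape; use whatever your own handler reads:

```yaml
functions:
  convertbw:
    # Pre-built image pulled from the Docker Hub - no build step needed
    skip_build: true
    image: functions/convertbw

  processimages:
    lang: python
    handler: ./processimages
    image: processimages
    environment:
      # Hypothetical variable names - match them to your handler
      minio_hostname: "minio:9000"
      minio_access_key: "<your-access-key>"
      minio_secret_key: "<your-secret-key>"
```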

processimages/requirements.txt

minio
requests

Our handler for processimages is very similar to the one we made for loadimages. It does the following:

Parses the input as JSON

Downloads each file from a Minio bucket - one-by-one

Sends each file as a binary HTTP POST to the convertbw function and saves the result to disk
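A minimal handler along those lines might look like the sketch below. The gateway URL, bucket names and environment variable names are assumptions; adapt them to your deployment:

```python
import json
import os
import uuid


def parse_request(req):
    """Parse the JSON input, e.g. {"bucket": "incoming", "files": ["a", "b"]}."""
    payload = json.loads(req)
    return payload["bucket"], payload["files"]


def handle(req):
    # Deferred imports so the module still loads where these packages are absent
    import requests
    from minio import Minio

    bucket, files = parse_request(req)
    client = Minio(os.environ["minio_hostname"],
                   access_key=os.environ["minio_access_key"],
                   secret_key=os.environ["minio_secret_key"],
                   secure=False)

    processed = []
    for name in files:
        # Download each file from the Minio bucket - one-by-one
        data = client.get_object(bucket, name).read()

        # Send it as a binary HTTP POST to the convertbw function
        resp = requests.post("http://gateway:8080/function/convertbw",
                             data=data)

        # Save the black-and-white result to disk, then upload it to
        # the "processed" bucket under a fresh name
        out_name = str(uuid.uuid4())
        path = "/tmp/" + out_name
        with open(path, "wb") as f:
            f.write(resp.content)
        client.fput_object("processed", out_name, path)
        processed.append(out_name)

    return json.dumps({"bucket": "processed", "files": processed})
```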

The output from the processimages function gives the new locations of the black and white photos in the processed bucket.

You can copy them back to your computer using mc cp or view them with the Minio UI at localhost:9000 (or wherever you have set this up). For instance:

$ mc cp myminio/processed/6b70561c-35a4-410c-a216-5582c6da6afe .

In this case, if you add a .jpg extension to the file you should be able to open it in Finder, Windows Explorer, etc.

Summing up

We now have two functions which form part of a pipeline, so that we can separate the downloading of files from the processing of them. Object storage is provided by Minio, which means we can store the results of the functions easily and pass them along from one function to the next by filename.