Running OpenFaaS on GKE - a step-by-step guide

In this tutorial I'm going to show you how to set up OpenFaaS on Kubernetes with monitoring and alerting using Weave Cloud. Using OpenFaaS for serverless avoids lock-in to a single cloud vendor. It's easy to run on Kubernetes, letting you mix different services depending on your application's needs.

OpenFaaS is a serverless framework that runs on top of Kubernetes. It comes with a built-in UI and a handy CLI that takes you from scaffolding new functions to deploying them on your Kubernetes cluster. What's special about OpenFaaS is that you can package any executable as a function, and as long as it runs in a Docker container, it will work on OpenFaaS.

What follows is a step-by-step guide on running OpenFaaS with Kubernetes 1.8 on Google Cloud.

This setup is optimized for production use:

- a multi-zone Kubernetes cluster
- the OpenFaaS system API and UI are username/password protected
- all OpenFaaS components have 1GB memory limits
- the gateway read/write timeouts are set to one minute
- asynchronous function calls with NATS Streaming and three queue workers
- an optional GKE Ingress controller with Let's Encrypt TLS

Create a GCP project

Log in to GCP and create a new project named openfaas. If you don't have a GCP account you can apply for a free trial. After creating the project, enable billing and wait for the API and related services to be enabled. Download and install the Google Cloud SDK from this page. After installing the SDK, run gcloud init, then set the default project to openfaas and the default zone to europe-west3-a.
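The steps above can be sketched as follows (your actual project ID may differ if the name openfaas is already taken):

```shell
# initialize the SDK and authenticate
gcloud init

# set defaults for subsequent gcloud commands
gcloud config set project openfaas
gcloud config set compute/zone europe-west3-a
```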

You can access the Kubernetes dashboard at http://localhost:9099/ui using the kubectl reverse proxy:

kubectl proxy --port=9099 &

Create a Weave Cloud Instance

Now that you have a Kubernetes cluster up and running, you can start monitoring it with Weave Cloud. You'll need a Weave Cloud service token; if you don't have one, go to Weave Cloud and sign up for a trial account.

Deploy the OpenFaaS functions

With the com.openfaas.scale.min label you can set the minimum number of running pods. With com.openfaas.scale.max you can set the maximum number of replicas for the autoscaler. By default OpenFaaS will keep a single pod running per function and under load it will scale up to a maximum of 20 pods.
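As a sketch of what these labels might look like in a stack.yml entry (the handler path and image name here are illustrative):

```yaml
functions:
  nodeinfo:
    lang: go
    handler: ./nodeinfo
    image: functions/nodeinfo:latest
    labels:
      com.openfaas.scale.min: "2"
      com.openfaas.scale.max: "10"
```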

Deploy nodeinfo and echo on OpenFaaS:

faas-cli deploy

The deploy command looks for a stack.yml file in the current directory and deploys all of the functions it defines into the openfaas-fn namespace.

Weave Cloud extends Prometheus with a distributed, multi-tenant, horizontally scalable backend. It hosts the scraped Prometheus metrics for you, so that you don't have to worry about storage or backups.

Weave Cloud comes with canned dashboards for Kubernetes that you can use to monitor a specific namespace:

You can also make your own dashboards based on OpenFaaS specific metrics:

Create functions

With the OpenFaaS CLI you can choose between using a programming language template, where you only need to provide a handler file, or a Docker template that you build yourself. Many languages are supported, including Go, JavaScript (Node.js), Python, C#, and Ruby.

Let's create a function with Go that fetches the SSL/TLS certificate info for a given URL.

First create a directory for your functions under GOPATH:

mkdir -p $GOPATH/src/functions

Inside the functions dir use the CLI to create the certinfo function:

cd $GOPATH/src/functions
faas-cli new --lang go certinfo

Open handler.go in your favorite editor and add the certificate fetching code:

Build the function image:

faas-cli build -f certinfo.yml

This will check your code for proper formatting with gofmt, run go build and go test, and pack your binary into an Alpine-based image. If everything goes well, you'll have a local Docker image named username/certinfo:latest.

Push this image to Docker Hub. First create a public repository named certinfo, log in to Docker Hub using the docker CLI, and then push the image:

docker login
faas-cli push -f certinfo.yml

Once the image is on Docker Hub you can deploy the function to your OpenFaaS GKE cluster:
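For example (the gateway address placeholder below is deployment-specific; replace it with your own gateway's URL):

```shell
# deploy using the YAML file generated by faas-cli new
faas-cli deploy -f certinfo.yml

# invoke the function through the gateway
curl -d "weave.works" http://<your-gateway-address>:8080/function/certinfo
```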

Full source code of the certinfo function can be found on GitHub at stefanprodan/openfaas-certinfo. In the certinfo repo you can see how easy it is to do continuous deployments to Docker Hub with TravisCI for OpenFaaS functions.

Conclusions

Like most serverless frameworks, OpenFaaS empowers developers to build, ship and run code in the cloud. What OpenFaaS adds to this is portability between dev and production environments with no vendor lock-in. OpenFaaS is a promising project, and with over 64 contributors it has a vibrant community. Even though it's a relatively young project, you can run it reliably in production backed by Kubernetes and Google Cloud. You can also use Weave Cloud to monitor and alert on any issues in your OpenFaaS project.

About the author

Stefan is a Developer Experience engineer at Weaveworks. Previously he worked as a software architect and a DevOps consultant, helping companies embrace DevOps and the SRE movement. Stefan has over 15 years of experience with software development and he enjoys programming in Go and writing about distributed systems.