The Bitesize Serverless Architecture

Software Engineering and Web Development at the BBC has been undergoing a major change over the last few years.

In the recent past, all teams worked on a small number of shared platforms and, despite differing needs, had a limited choice of technologies available to use. As more and more applications were built atop these platforms, it became increasingly difficult to change them: the risk of breaking one of the applications running there was too great.

To combat that, we have been migrating our products to cloud based architectures. In many ways this has been an amazing enabler for the teams within Design & Engineering. We are all free to choose the most appropriate technology for our products and to make use of the varied services available.

This change, however, has led to an increasing requirement for our teams to learn skills that have previously been the domain of our Operations team. We must make more effort to understand security issues, handle deployments and take responsibility for the performance and operation of our servers.

In a lot of cases the capabilities offered are worth the added responsibility. But where we have simpler requirements it would be extremely useful to regain some of the simplicity we previously enjoyed: not having to worry about server maintenance or auto scaling, and being able to rapidly create new sites without the overhead of infrastructure set-up.

Towards a new Bitesize web

We’re constantly looking for ways to make Bitesize more relevant to our users. As part of that process we’ve brought the exam-board-specific content that is already available in the Bitesize Revision app to the web.

When planning the project we evaluated the technical architecture that would be required to support the new content.

In the industry we’ve observed interesting developments in the Serverless Architecture movement. With Amazon Web Services adding official support through the Serverless Application Model it felt like a great opportunity to explore it as one of the options to power this new version of Bitesize.

The Serverless Application Model is focussed on a number of AWS technologies, a subset of which we felt would work for our requirements.

API Gateway provides a scalable ‘front door’ on the web that can in turn route requests to a number of different services.

Lambda allows us to run a simple piece of code in response to a request to the Gateway.

Combined, they provide a simplified model for processing and responding to web requests. We prototyped the architecture as part of a small scale ‘Alpha’ version of the application and the exciting results confirmed it was suitable for use.

Reusing existing services

Morph also provides the ability to create simple React-based view components that work hand in hand with its data services. Following this pattern allows us to create simple, isolated components that we can reuse across multiple pages, speeding up development and promoting software engineering best practice.

The typical pattern for using these components has been to set up a new application to embed them within — hosting it on cloud infrastructure comprising EC2 instances, load balancing, and auto scaling.

The alternative model provided by the Serverless Architecture pattern simplifies this by allowing us to implement a simple composition layer in API Gateway and Lambda.
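At heart, a composition layer of this kind is a small function that fetches rendered fragments and stitches them together. A minimal sketch, assuming a fetchFragment function that returns a component’s rendered HTML (the function and component names are illustrative, not the Morph API):

```javascript
// Sketch of a composition layer: fetch each component's rendered markup in
// parallel, then stitch the fragments into a single page. fetchFragment and
// the component ids are illustrative stand-ins, not the real Morph API.
const composePage = async (fetchFragment, componentIds) => {
  const fragments = await Promise.all(componentIds.map(fetchFragment));
  return `<main>\n${fragments.join('\n')}\n</main>`;
};

// Stand-in fetcher so the sketch is self-contained; in practice this would
// make an HTTP request to the component's rendering service.
const fakeFetch = async (id) => `<section data-component="${id}"></section>`;

composePage(fakeFetch, ['revision-notes', 'exam-questions'])
  .then((html) => console.log(html));
```

Because the components are fetched in parallel, the page costs roughly one round trip however many components it embeds.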

Some big successes

Following this architectural pattern brought us a number of advantages.

New pages were very quick to pull together once the individual components had been created

The code for each Lambda was simple, easy to understand and easy to maintain

Code changes can be deployed very quickly — changes can be made live in a matter of minutes

We don’t have to worry (as much) about scaling. Lambda scales automatically to 100 simultaneous requests. This might not be enough for every application, but it works for Bitesize traffic levels

API Gateway offers a built-in caching layer. This means we can protect our systems from high levels of load and offer a fast response to requests

Generally we maintain multiple environments for developing and testing our applications, which means running duplicate resources. With the serverless model we pay only a very small amount per request, making test environments very cheap to run

We don’t need to worry about much of the additional complexity that the move to the DevOps model introduced. Operating system patching and security issues are taken care of, as is auto-scaling. We sacrifice a little in terms of the ‘latest’ technology: Lambda took a long time to update to the latest LTS release
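Many of these benefits follow from routing, caching and deployment being declared rather than hand-built. Purely as an illustration, a SAM template for this kind of setup could look something like the fragment below; the resource names, runtime and cache settings are assumptions, not our production configuration:

```yaml
# Illustrative SAM template fragment, not the production configuration.
Resources:
  TopicFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs14.x
      Events:
        TopicPage:
          Type: Api
          Properties:
            RestApiId: !Ref TopicApi
            Path: /topics/{topicId}
            Method: get

  TopicApi:
    Type: AWS::Serverless::Api
    Properties:
      StageName: live
      # Built-in API Gateway caching, applied to every method on the stage.
      CacheClusterEnabled: true
      CacheClusterSize: "0.5"
      MethodSettings:
        - ResourcePath: "/*"
          HttpMethod: "*"
          CachingEnabled: true
          CacheTtlInSeconds: 300
```

Deploying a change to a template like this is a single command, which is what makes the minutes-not-hours release cadence possible.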

Some small issues

Despite seeing a lot of success with this model, we also discovered a number of areas that didn’t work as well as we had hoped.

The standard cloud authentication model between our service and other internal BBC services wasn’t available when using Lambda. Thankfully we were able to work with our Platforms team to create a solution that will be available to everyone deploying to Lambda within the BBC

API Gateway caching is useful, but we have a long tail of content. We needed to introduce stale caching to ensure that long-tail content requests are still served quickly and refreshed in the background

Not having a traditional web server meant we couldn’t do things that we previously took for granted — proxying to other services, certificate based access, hosting static files and setting up fast web server level redirects to other URLs
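The stale-caching approach mentioned above can be sketched as a small ‘stale-while-revalidate’ helper. This is an illustrative in-memory version; the real implementation would sit in front of whatever cache store is actually used:

```javascript
// Sketch of stale-while-revalidate caching: serve an expired entry
// immediately and refresh it in the background, so long-tail pages stay
// fast. The in-memory Map is a stand-in for the real cache store.
const cache = new Map(); // key -> { value, expiresAt }

const staleWhileRevalidate = async (key, ttlMs, fetchFresh, now = Date.now()) => {
  const entry = cache.get(key);

  if (entry && now < entry.expiresAt) {
    return entry.value; // fresh hit: serve straight from the cache
  }

  if (entry) {
    // Stale hit: return the old value right away, refresh in the background.
    fetchFresh(key)
      .then((value) => cache.set(key, { value, expiresAt: Date.now() + ttlMs }))
      .catch(() => {}); // keep serving stale content if the refresh fails
    return entry.value;
  }

  // Cold miss: nothing to serve stale, so the caller must wait.
  const value = await fetchFresh(key);
  cache.set(key, { value, expiresAt: now + ttlMs });
  return value;
};
```

Served this way, an expired long-tail page costs one fast stale response rather than a slow round trip to the origin.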

What’s next?

The serverless application is currently serving requests for all GCSE English Language content for Year 10s and Year 11s in England and Wales. We’ll continue to monitor and evaluate the architecture and look for opportunities to improve the model as we bring Exam Specifications to more of the Bitesize audience.

Serverless Architecture really appeals to us but as we continue to develop Bitesize we’ll likely find use-cases and features for which it isn’t a great fit. For cases like this though, where we’re predominantly making calls to other services and combining them, it makes great sense and offers us a lot of advantages.