Over the past few months I’ve spent hours working out how to manually gzip files and set the appropriate headers on S3, looking to build it into my simple deployment pipeline. And now, I don’t have to!
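For anyone curious what the manual version looks like, here’s a minimal sketch. The file contents, file names, and bucket name are all made up for illustration; the upload part assumes boto3 and configured AWS credentials, so it’s shown commented out:

```python
import gzip
import pathlib

# Hypothetical asset; any text file works the same way
html = b"<html>" + b"<p>hello</p>" * 500 + b"</html>"
pathlib.Path("index.html").write_bytes(html)

# Write the gzipped copy that actually gets uploaded
compressed = gzip.compress(html, compresslevel=9)
pathlib.Path("index.html.gz").write_bytes(compressed)

print(f"{len(html)} bytes -> {len(compressed)} bytes")

# Upload sketch (assumes boto3 + credentials; bucket name is a placeholder):
# import boto3
# boto3.client("s3").put_object(
#     Bucket="my-blog-bucket",
#     Key="index.html",           # keep the original name so URLs don't change
#     Body=compressed,
#     ContentEncoding="gzip",     # browsers decompress transparently
#     ContentType="text/html",    # otherwise S3 serves it as binary
# )
```

The key detail is the `Content-Encoding: gzip` header: without it the browser receives raw gzip bytes instead of a page.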

So if you’re reading this on a browser released in the last 10 years, chances are this post was served up compressed!

The Impact?

Overall, it’s gone from ~4.9 seconds down to ~3.4 seconds, so I’m looking at roughly a 30% page speed improvement just from gzip. Note this only covers text resources, so there’s still more to do!

Measurements?

Is 30% scientific? Hah! Not remotely: a single sample from a single connection on slightly different browser versions. I grabbed the .HAR download from a page speed analyser (I used GTmetrix since it popped up in a Google search), but you can also capture one with a browser extension or directly in Chrome’s developer tools.
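A .HAR file is just JSON, so pulling numbers out of one is easy. Here’s a sketch using a tiny hand-made stand-in for a real export; the `_transferSize` field is the Chrome-style extension field, and other tools may name it differently:

```python
# Tiny hand-made HAR stand-in; a real one comes from GTmetrix or
# Chrome's dev tools (Network tab -> "Save all as HAR")
har = {
    "log": {
        "entries": [
            {"response": {"content": {"mimeType": "text/html"}, "_transferSize": 4200}},
            {"response": {"content": {"mimeType": "image/png"}, "_transferSize": 90000}},
            {"response": {"content": {"mimeType": "text/css"}, "_transferSize": 1800}},
        ]
    }
}

# Total bytes on the wire for text resources (the ones gzip helps with)
text_bytes = sum(
    e["response"]["_transferSize"]
    for e in har["log"]["entries"]
    if e["response"]["content"]["mimeType"].startswith("text/")
)
print(text_bytes)  # 6000
```

Comparing that total before and after enabling gzip gives a rough picture of the saving, though a single sample is still just that.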

My inner geek wants to remind myself that this is one sample of one request on a remote service whose environment I cannot control. I’m also very aware there’s a lot for me to do to make this the perfect front end (the JS and CSS are far from my greatest work). However, a 30% page speed improvement for 60 seconds of work is good enough for me!