
Fun with Squid and CDNs

Wed, 18/02/2009 - 12:29pm — tumbleweed

One neat upgrade in Debian's recent 5.0.0 release was Squid 2.7. In this bandwidth-starved corner of the world, a caching proxy is a nice addition to a network, as it should shave at least 10% off your monthly bandwidth usage. However, the recent rise of CDNs has made many objects that should be highly cacheable, uncacheable.

For example, a YouTube video has a static ID. The same piece of video will always have the same ID; it'll never be replaced by anything else (except a "sorry, this is no longer available" notice). But it's served from one of many delivery servers. If I watch it once, it may come from v3.cache.googlevideo.com.

But the next time it may come from v15.cache.googlevideo.com. And that's not all: the signature parameter is unique (to protect against hot-linking), as are other non-static parameters.
Basically, any proxy will probably refuse to cache it (because of all the parameters), and even if it did, it'd be a waste of space, because the unique signature ensures that no one will ever request that cached item again.

I came across a page on the squid wiki that describes a solution to this.
Squid 2.7 introduces the concept of a storeurl_rewrite_program, which gets a chance to rewrite any URL before an item is stored in (or looked up in) the cache. Thus we could rewrite our example video's URL to a canonical form such as http://video-srv.youtube.com.SQUIDINTERNAL/videoplayback?id=<video-id>&itag=<itag> (the pseudo-hostname is never fetched; it only affects the cache key).

We've normalised the URL, keeping the only two parameters that matter: the video id, and the itag, which specifies the video quality level.
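To make that concrete, the normalisation boils down to something like the following minimal Python sketch. The regex and the .SQUIDINTERNAL pseudo-host follow the squid wiki example; the helper name is mine:

import re

# Matches YouTube/Google Video delivery URLs; anything else passes
# through untouched.
YOUTUBE_RE = re.compile(
    r'^http://[^/]+\.(?:youtube|googlevideo)\.com/'
    r'(?:get_video|videoplayback|videodownload)\?')

def normalise(url):
    """Return a canonical store URL for a YouTube video, or the
    original URL if it isn't one."""
    if not YOUTUBE_RE.match(url):
        return url
    query = url.split('?', 1)[1]
    params = dict(p.split('=', 1) for p in query.split('&') if '=' in p)
    if 'id' not in params:
        return url
    # Keep only the id and itag; drop signatures, expiry times, etc.
    return ('http://video-srv.youtube.com.SQUIDINTERNAL/videoplayback'
            '?id=%s&itag=%s' % (params['id'], params.get('itag', '')))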

The squid wiki page I mentioned includes a sample perl script to perform this rewrite. It doesn't take the itag into account, and my perl isn't good enough to fix that without making a dog's breakfast of it, so I re-wrote it in Python. You can find it at the end of this post. Each line the rewrite program reads contains a concurrency ID, the URL to be rewritten, and some parameters; we output the concurrency ID and the URL to rewrite to.

The concurrency ID is a way to let a single script process rewrites from several squid threads in parallel. The documentation on this is almost non-existent, but if you specify a non-zero storeurl_rewrite_concurrency, each request and response will be prefixed with a numeric ID. The perl script concatenated this directly before the re-written URL, but I separate them with a space. Both seem to work. (Bad documentation sucks.)
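Putting the protocol together, the guts of such a rewrite helper might look like this (a sketch rather than the exact script at the end of the post; it reuses the normalise() helper above):

#!/usr/bin/env python
import sys

def main():
    # Squid writes one request per line:
    #   <concurrency-id> <URL> <extra parameters...>
    # and expects one reply per line:
    #   <concurrency-id> <rewritten-URL>
    while True:
        line = sys.stdin.readline()
        if not line:
            break
        fields = line.split()
        if len(fields) < 2:
            continue
        channel, url = fields[0], fields[1]
        sys.stdout.write('%s %s\n' % (channel, normalise(url)))
        # Flush immediately, or squid will sit waiting for the reply.
        sys.stdout.flush()

if __name__ == '__main__':
    main()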

All that's left is to tell Squid to use this, and to override the caching rules on these URLs.
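The relevant squid.conf additions look something like the following sketch. The directives are real Squid 2.7 directives, but the ACL name, script path, and refresh_pattern numbers are assumptions drawn from the squid wiki example:

storeurl_rewrite_program /usr/local/bin/storeurl-youtube.py
storeurl_rewrite_children 1
storeurl_rewrite_concurrency 10

acl youtube_videos url_regex -i ^http://[^/]+\.(youtube|googlevideo)\.com/(get_video|videoplayback|videodownload)\?
storeurl_access allow youtube_videos
storeurl_access deny all

refresh_pattern -i (get_video\?|videoplayback\?|videodownload\?) 161280 50000% 525948 override-expire ignore-reload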

# All of the above can cause a redirect loop when the server
# doesn't send a "Cache-control: no-cache" header with a 302 redirect.
# This is a work-around.
minimum_object_size 512 bytes

Done. And it seems to be working relatively well. If only I'd set this up last year, when I had pesky house-mates watching YouTube all day ;-)

It should of course be noted that doing this instructs your Squid proxy to break the rules.
Both override-expire and ignore-reload violate guarantees that the HTTP standards provide to the browser and web server about their communication with each other.
They are relatively benign changes, but illegal nonetheless.

And it goes without saying that rewriting the URLs of stored objects could cause some major breakage, by treating different objects (with different URLs) as the same one.
The provided regexes seem sane enough that this shouldn't happen, but YMMV.

Comments


The thing that bothers me about youtube, more than that it reloads videos on every visit (although that is stupid), is that it reloads videos every time you push the play button. This has happened to me: "Hey, come look at this cool video I just laboriously buffered then watched!" "Oh, OK, let's just wait 20 minutes for it to buffer again"

It seems this will always ensure you get the quality you asked for. But it seems to me that you don't mind, as long as you get *at least* the quality you ask for. But this can't be done by canonicalizing URLs. Can it?

Up to now I don't see the difference between high quality and normal quality (although both are still low quality) in the URL. Maybe they call it high quality because it's the original file uploaded by the user. By the way, HOME is the best quality I've ever seen on YouTube.

By the way, what's the good thing about Python vs Perl? I'm only concerned about speed.
I only know assembly; I'm a beginner with these languages. That's why I'm toying with Squid.

Have you come across Squirm (http://squirm.foote.com.au/)? It was written for redirect_program (now url_rewrite_program), but could easily be customised for storeurl_rewrite_program.

It's a shame they haven't yet ported the storeurl_rewrite feature to the 3.x release, as it would make it really easy to create a repo cache for CentOS or any other distribution that uses a mirror list. Yum first fetches a list of repo mirrors, then chooses one of them to fetch packages from. Using the storeurl_rewrite_program directive, you could then normalise the store URL across all of the returned mirrors.
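For what it's worth, the rewrite logic for that case would also be short. A sketch along the same lines as the YouTube helper above (the pseudo-hostname, the helper name, and the assumption that mirrors share a common /centos/... path layout are all mine, not tested):

import re

# Collapse any CentOS mirror URL onto a single pseudo-host, so the
# same RPM fetched from different mirrors hits one cache entry.
MIRROR_RE = re.compile(r'^http://[^/]+/(?:[^?]*/)?centos/(.+)$')

def normalise_repo(url):
    m = MIRROR_RE.match(url)
    if m:
        return 'http://centos-mirror.SQUIDINTERNAL/centos/' + m.group(1)
    return url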