Squid Puts the Squeeze on Net Wrongdoers

Between online deathmatches, hearts tournaments, and sports bookies, your network might be looking more like a playground than a place to get work done. Here's how to use Squid to button down the traffic and make sure your more slippery users don't slide out of its grasp.

Last week, in part 1, we got Squid up and running as a simple local http caching proxy. This week, we shall exercise our godlike admin powers and use Squid to control Web access for our users. We shall control the time of day they are allowed to surf, make sure their Web surfing goes through Squid, force them to authenticate to Squid, and create blacklists of their favorite Web sites.

Connecting Clients
Last week, we got as far as testing Squid on the server. Connecting clients is simple. For example, this is how Mozilla does it:
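In Mozilla, open Edit -> Preferences -> Advanced -> Proxies, select "Manual proxy configuration," and enter your Squid server's hostname and port (3128 is Squid's default). Command-line clients such as wget can be pointed at the proxy with an environment variable; a minimal sketch, where "squidserver" stands in for your own Squid host:

```shell
# Point command-line clients (wget, curl, etc.) at the Squid proxy.
# "squidserver" is a placeholder hostname; 3128 is Squid's default port.
export http_proxy="http://squidserver:3128/"
```

Put the export line in a shell startup file to make it stick across sessions.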

Diving Back Into squid.conf
squid.conf contains something like 125 separate options. You can run Squid with an empty squid.conf and just let it coast on the defaults. (Though on my Debian system, it would not start until I gave it a visible_hostname directive.) The following examples can be used either in a blank, brand-new squid.conf or added to the original. Remember not to uncomment the defaults. Be sure to run Squid's syntax checker, and restart Squid after making changes:

# squid -k parse
# /etc/init.d/squid restart

Blocking Bad Users
Some folks just never learn. They show up for work, and immediately start crawling all manner of non-work-related Web sites, or fire up their personal MP3 servers, or log in to Deathmatch Quake, and slow traffic to a crawl. Squid makes it easy to stop them in their tracks. Remember our ACLs (access control lists) from last week?
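A minimal sketch of blocking one problem machine, assuming your LAN is 192.168.1.0/24 and the offender sits at 192.168.1.25 (both addresses are placeholders):

```
# squid.conf -- block a single problem user by source IP
# (192.168.1.25 and 192.168.1.0/255.255.255.0 are example addresses)
acl baduser src 192.168.1.25
acl ournetwork src 192.168.1.0/255.255.255.0

# http_access rules are checked top to bottom; first match wins
http_access deny baduser
http_access allow ournetwork
http_access deny all
```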

The order is important: always put your "deny all" rule last. Now you can leave your problem user blocked until you implement a remedy.

Using Blocklists
Here we are with this nice big Internet to play in, and suddenly bosses everywhere are screaming at us to put up fences. It's quite the fashionable trend these days to implement Web blacklists, to prevent users from accessing certain Web sites. You and I both know this is largely a futile exercise. But here is how to do it anyway. All you need is a list of forbidden domain names, and these two rules:
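A minimal sketch, assuming the forbidden domains live in a file such as /etc/squid/blacklist (the path is an assumption; use your own file, one domain per line):

```
# squid.conf -- deny requests to any domain listed in the blocklist file
acl blacklist dstdomain "/etc/squid/blacklist"
http_access deny blacklist
```

Prefixing entries in the file with a dot (for example, .example.com) also matches all subdomains of the listed domain. As always, put the deny rule before your "allow" rules and run squid -k parse after editing.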