The company I work for is looking for something that will block certain websites, such as Facebook and MySpace, along with various gaming sites. I have a FreeBSD system running DansGuardian with an extensive blacklist set up. I haven't put it into production yet, but I'm thinking about it.

I know there are guys on here who do IT for companies, and I'm just curious what they have used (or might use) with FreeBSD to block certain sites, what hardware specs it ran on, and how many users it controlled.

I figured if I was going to block stuff like Facebook and MySpace, I might as well block gaming sites and any other site that has nothing to do with work. We have employees who abuse the internet access too much, and our mechanics depend on it to be able to fix vehicles, since GM has pretty much moved everything to online access.

Having looked into some stuff, I have to decide if the machine I have up and running DansGuardian with a blacklist is going to be enough for the 40 to 50 users we have. The system specs are a P3 1.4 GHz with 1.5 GB RAM and just over 9 GB used on an 80 GB drive.


Squid and DansGuardian love RAM. The more RAM you can put into the system, the less the system will be noticed by the end-users. DansGuardian also loves CPU. The more CPU power (cores, processors, GHz) you can put into the box, the better.

For 50 users, you should be able to get away with just 1.5 GB of RAM. Personally, I'd try to get that as close to 4 GB (max on 32-bit systems) as possible. Then you can give 2 GB to Squid's memory cache, and leave the rest for DansGuardian to use.
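As a rough sketch, the memory split above might look like this in squid.conf (the cache directory path and size limits here are assumptions you'd tune for your own box):

```
# squid.conf (sketch): give Squid ~2 GB for its in-memory object cache
cache_mem 2048 MB

# keep the on-disk cache modest so the 80 GB drive isn't a bottleneck
cache_dir ufs /var/squid/cache 8192 16 256

# cap the size of individual objects held in RAM
maximum_object_size_in_memory 512 KB
```

The rest of the system's RAM is then left free for DansGuardian's filtering processes and the OS.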

Be sure to put in packet filtering rules that bypass Squid/DansGuardian for the really important websites, like the ones the mechanics use. Unless these are static websites, you don't need to cache them.
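Assuming you're intercepting traffic transparently with pf, a bypass could be sketched like this in pf.conf. The table name, addresses, and the `$int_if` macro are hypothetical placeholders, not the actual GM site addresses:

```
# pf.conf (sketch): "gm_sites" is a hypothetical table holding the
# addresses of the GM/mechanic sites that must never be proxied
table <gm_sites> { 192.0.2.10, 192.0.2.11 }

# let traffic to the important sites go straight out, untouched
no rdr on $int_if inet proto tcp from any to <gm_sites> port 80

# redirect everything else on port 80 into Squid/DansGuardian
rdr on $int_if inet proto tcp from any to any port 80 -> 127.0.0.1 port 8080
```

The `no rdr` rule has to come before the general `rdr`, since pf uses the first matching translation rule.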

I'm using OpenDNS to filter loads of sites, and then I've got Squid running that bypasses OpenDNS for the few machines that don't need blocking, like the bosses'. It works pretty well.
The only problem I have with it is that you either block stuff for everyone or not at all.

It would be nice if they implemented a system where you can block sites on a per-host or per-network basis. I do believe it might actually happen one day.

I should explain the way our network is set up. GM controls the router as well as the domain controller, which acts as our DNS server. If we set the PCs to use something other than that DNS server, employees can no longer log into the domain or access some of the sites GM provides us. It won't be until about 2010 that GM relinquishes control of the router, at which point I will have to provide a suitable replacement for it. Until then, I have to come up with another way of filtering the sites people go to.

When I started on this a couple of years ago (before being stopped by management), I had read about Squid and DansGuardian. I set it up on a system that really isn't doing anything else, and it was only going to affect a smaller number of users. We have a total of 35 users: 12 with direct access to PCs, the rest using dumb terminals to access one of two terminal servers. I know it sounds like a strange setup, and it is, but I have to work with what I have. I agree that once I am able to control the router or firewall, I will no longer need DansGuardian.

Phoenix, I'll have to look into the packet filtering rules so that I don't block any sites that are actually needed. Thanks for that.

For what you describe, what you need is a filtering bridge. That's a firewall design that filters at layer 2 (Ethernet, Token Ring, whatever) and can be dropped between the router and your network. No one will notice it is there until they try to browse a forbidden site.

For the lazy: pfSense makes a great filtering bridge when properly set up (and it is a matter of three clicks...).
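If you'd rather roll it by hand on FreeBSD instead of using pfSense, a filtering bridge can be sketched in rc.conf like this. The NIC names (em0/em1) are assumptions; use whatever interfaces face the router and the LAN:

```
# /etc/rc.conf (sketch): a two-port filtering bridge
cloned_interfaces="bridge0"
ifconfig_bridge0="addm em0 addm em1 up"
ifconfig_em0="up"
ifconfig_em1="up"
pf_enable="YES"
```

For pf to actually filter the bridged traffic, you'd also want `net.link.bridge.pfil_bridge=1` (or `pfil_member=1`) set via /etc/sysctl.conf. Since the bridge has no IP address of its own, it's invisible to the users behind it.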

Well, so far the system has been up and running for a couple of days, and the only thing people have complained about is not being able to access certain websites. It's funny to see how many co-workers have Facebook and how pissed they are that they can no longer get there.

Yup. I have a bunch of sites listed in the exceptionsitelist that people want access to. Since it was put into place, I have been asked to unban sites like TSN, rogers.com, Facebook, MySpace, a couple of gaming sites, NASCAR... there were a few others I was asked to remove. I told them that if it's not work related, it isn't going to get done. The only changes I have had to make were taking the mechanics' computers off the proxy, since it was causing havoc with an application they need to program cars, so it looks like the shop won't be protected for the time being. I also had to set things up a little differently in the parts department, since the program we use to look up parts runs through IE, and if IE is set to use the proxy, it won't redirect back to the PC where the files it needs are located.
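For anyone curious, the DansGuardian lists mentioned above are just plain-text files of domains, one per line. A sketch (the exact list paths depend on where the port installed DansGuardian, and these domains are only examples from this thread):

```
# /usr/local/etc/dansguardian/lists/bannedsitelist (sketch)
facebook.com
myspace.com
nascar.com

# /usr/local/etc/dansguardian/lists/exceptionsitelist (sketch)
# work-related sites that should never be blocked
gm.com
```

A domain entry covers its subdomains too, and DansGuardian re-reads the lists when you send it a reload (e.g. restart the service after editing).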