I'd like to set up Linux to cache some commonly requested URLs so that it doesn't have to go out to the net to get them every time. This would be a system-wide URL cache, not just in a particular browser. For example, a program might request http://www.w3.org/TR/html4/sgml/loosedtd.html a few thousand times per day. There are many other URLs I'd like to cache as well.

1 Answer

Yes, such programs are called HTTP proxies, and one example is Squid, which can be installed like this:

sudo apt-get install squid3

(On newer Ubuntu releases the package may simply be called squid.)

After installation, configure your program to use localhost:3128, or to use the system-wide proxy settings if your application supports them. The system-wide proxy settings can be configured in your network settings.
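As a concrete sketch (assuming Squid is running on its default port 3128), most command-line tools can be pointed at the proxy through the standard environment variables:

```shell
# Point HTTP(S) traffic from command-line tools at the local Squid
# instance (3128 is Squid's default listening port).
export http_proxy=http://localhost:3128
export https_proxy=http://localhost:3128

# Tools such as curl and wget honour these variables, e.g.:
#   curl -sI http://www.w3.org/TR/html4/sgml/loosedtd.html
# Repeated requests should then be answered from Squid's cache
# (visible in the X-Cache response header).
echo "proxy set to $http_proxy"
```

To make this apply system-wide for all users, the same variables can be set in /etc/environment.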

Another possibility is to configure Squid as a transparent proxy. In that case you don't have to configure your applications, but it can be more difficult to set up.
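A minimal sketch of the transparent setup, for plain HTTP only (port 3129 is an arbitrary choice here, and "proxy" is the user Squid runs as on Debian/Ubuntu; intercepting HTTPS requires considerably more work):

```shell
# squid.conf needs a listener in interception mode, e.g.:
#   http_port 3129 intercept
#
# Then redirect outgoing HTTP traffic from this machine to that port,
# excluding Squid's own traffic so requests don't loop back into the
# proxy forever (requires root):
iptables -t nat -A OUTPUT -p tcp --dport 80 \
         -m owner ! --uid-owner proxy \
         -j REDIRECT --to-port 3129
```

For traffic from other machines routed through this host, an equivalent rule would go in the PREROUTING chain instead.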

Either way, you should check the Squid configuration and fine-tune the cache settings if the defaults don't work well for you.
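For example, a few commonly tuned directives in /etc/squid3/squid.conf (the values below are illustrative, not recommendations):

```
# Disk cache: 1024 MB under /var/spool/squid3, with 16 first-level
# and 256 second-level subdirectories (Squid's defaults).
cache_dir ufs /var/spool/squid3 1024 16 256

# Allow objects up to 64 MB to be cached (the default is much smaller).
maximum_object_size 64 MB

# For responses without explicit freshness information: treat objects
# as fresh for 20% of their age, up to a maximum of 1440 minutes.
refresh_pattern . 0 20% 1440
```

After editing the file, restart Squid (e.g. sudo service squid3 restart) for the changes to take effect.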