Category: Apache

There are many search bots, spiders, and content scrapers on the internet. Some, like Google and Bing, are completely legitimate; however, many others are overly aggressive or exist simply to scrape content or email addresses from your web site. Analyzing my own web site access logs revealed several extremely aggressive bots hitting my site frequently throughout the day; they appeared to have no true origin or search engine purpose, and they were causing me some concern.
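One way to shut out such bots is to match on their User-Agent strings in an .htaccess file. The sketch below assumes Apache 2.4 (`Require` syntax) and uses placeholder agent names; substitute the actual agents you find in your own access logs.

```apache
# Block unwanted bots by User-Agent.
# "BadBot" and "EvilScraper" are placeholder names -- replace them
# with the agent strings observed in your own access logs.
SetEnvIfNoCase User-Agent "BadBot" bad_bot
SetEnvIfNoCase User-Agent "EvilScraper" bad_bot

<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```

On Apache 2.2 the equivalent would use `Order Allow,Deny` with `Deny from env=bad_bot` instead of the `RequireAll` block.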

It is always a good idea to compress the text content served to your web site visitors. Not compressing content wastes bandwidth and slows down your visitors' experience. All modern browsers support compression, and you can configure it in your Apache server config file, or through an .htaccess file for a specific application. This significantly speeds up the transfer of all requested text-based files, making your web site load faster.
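A minimal sketch of such a configuration, assuming mod_deflate is available on your server (the MIME-type list here is a typical selection, not exhaustive):

```apache
# Compress common text-based responses with mod_deflate.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
    AddOutputFilterByType DEFLATE application/javascript application/json application/xml
</IfModule>
```

Binary formats such as JPEG or PNG are already compressed and are deliberately left off the list, since recompressing them wastes CPU for little or no gain.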

Throughout my experience working with web sites, one of the ways I've found to improve performance and page load time is to leverage the browser cache for static resources. By saving infrequently changed files in the local browser cache, you reduce the number of HTTP requests needed to retrieve required resources and shrink the total payload size of the responses. You will also find that this significantly reduces the bandwidth demands on your web site.
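Browser caching is typically controlled with mod_expires. The lifetimes below are illustrative assumptions; tune them to how often your own static files actually change:

```apache
# Set far-future cache lifetimes for static resources with mod_expires.
# The durations below are examples -- adjust to your site's update frequency.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
    ExpiresDefault "access plus 2 days"
</IfModule>
```

The trade-off to keep in mind: the longer the lifetime, the longer returning visitors may see a stale file after you change it, unless you rename or version the file.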

When working with a PHP framework such as CodeIgniter (and I've found the same configuration works for WordPress as well), you can eliminate the need to reference index.php as part of the URL by using the following .htaccess entry.
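A common form of this rule, assuming mod_rewrite is enabled, routes any request that does not match an existing file or directory through index.php:

```apache
# Rewrite clean URLs to index.php, assuming mod_rewrite is enabled.
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Only rewrite requests that are not real files or directories.
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    # CodeIgniter-style: pass the original path to index.php.
    RewriteRule ^(.*)$ index.php/$1 [L]
</IfModule>
```

Note that WordPress's standard rules rewrite to plain `index.php` (with the request available via `REQUEST_URI`) rather than appending the path, so the exact RewriteRule target may differ between the two.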