Speed matters. People rate snappy, responsive sites as more usable, even when the user interface itself doesn’t change. If your architecture or design isn’t that great, users will give your site more chances before abandoning it, provided it responds quickly. Google knows that speed supports usability. I’d suggest much of the credit for Google’s rise to industry dominance goes to its ongoing obsession with making search results blazingly fast.

So, how does site speed relate to PPC?

To begin, read Google’s announcement about speed and quality score. If you serve Google’s AdsBot slow-loading pages, you’ll eventually get hit with QS penalties that drive up your AdWords click costs.

Bots experience your site differently from browser clients, because bots can skip images and JavaScript. I don’t have inside information on this, but it is a safe assumption that the Google code evaluating AdWords landing pages is simpler than the code assessing natural-search relevance, and likely isn’t pulling in and analyzing external resources.

To serve AdWords landing pages to bots more quickly, use server profilers to determine which components of your web pages take the most time to build. Don’t optimize before profiling; odds are you’ll apply your efforts in the wrong place. (One of my favorite software quotes is from Sir Tony Hoare: premature optimization is the root of all evil.)
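In Python, for instance, the standard-library profiler makes this easy. A minimal sketch, where `build_page` is a hypothetical stand-in for your real page-building entry point:

```python
import cProfile
import io
import pstats

def build_page():
    # Hypothetical stand-in for your real page-building code
    return "<html><body>hello</body></html>"

# Profile one page build and report where the time went
profiler = cProfile.Profile()
profiler.enable()
build_page()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
report = stream.getvalue()
print(report)  # the ten most expensive call paths, slowest first
```

Run that against a real request handler and the report tells you whether the database, the templating engine, or something else entirely deserves your attention first.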

Two likely systems that could be slowing your page-build times are your database and your templating engine. To speed up your database, bring in the best database administrator you can find, check your slow query logs, invest in more memory and faster disks, and build smart database indexes to support common requests.
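To illustrate what a smart index buys you, here is a minimal sketch using SQLite (the table and column names are made up for the example). The query plan shows the difference: before the index, the database scans every row; after, it searches the index.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (sku TEXT, category TEXT, price REAL)")
conn.executemany(
    "INSERT INTO products VALUES (?, ?, ?)",
    [(f"SKU{i}", "widgets", 9.99) for i in range(1000)],
)

query = "SELECT * FROM products WHERE category = 'widgets'"

# Without an index, the planner must scan the whole table
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)

# Index the column your common requests filter on
conn.execute("CREATE INDEX idx_products_category ON products (category)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)
```

The same principle applies to MySQL, PostgreSQL, or whatever backs your store: check the plans for your slowest, most frequent queries and index accordingly.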

To speed up your templating engine, minimize trips to the database, bump up the memory on those machines, and cache common page components or entire pages. For example, if your SKU pages provide customer ratings and reviews (a very good idea), consider generating and stashing the review block each hour; conversion wouldn’t suffer if a product review were 59 minutes late getting onto the page.
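That hourly stashing is just a time-to-live cache. A minimal sketch, where `render_review_block` is a hypothetical stand-in for the expensive query-plus-template step:

```python
import time

_cache = {}        # sku -> (rendered_html, timestamp)
CACHE_TTL = 3600   # one hour, per the example above

def render_review_block(sku):
    # Hypothetical stand-in for the expensive database query + render
    return f"<div class='reviews'>reviews for {sku}</div>"

def get_review_block(sku):
    cached = _cache.get(sku)
    if cached and time.time() - cached[1] < CACHE_TTL:
        return cached[0]  # serve the stashed copy, up to an hour old
    html = render_review_block(sku)
    _cache[sku] = (html, time.time())
    return html
```

Real systems usually reach for memcached or a similar shared cache rather than an in-process dictionary, but the idea is identical: pay the rendering cost once per hour instead of once per pageview.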

Other ideas to increase speed: gzip your pages, remove whitespace from your source, remove needless CSS markup, use shorter class and id names, and use the cascading properties of CSS wisely, applying styling at the highest possible level of the DOM rather than repeating styles ad nauseam on child elements. Be sure to benchmark your pages before and after, because some approaches (like gzipping) could, depending on circumstances, cause slowdowns.
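You can measure the payoff of the first two ideas in a few lines. A rough sketch (the regex is a crude whitespace stripper for illustration only; real minifiers handle `<pre>` blocks, inline scripts, and other edge cases this ignores):

```python
import gzip
import re

# A repetitive sample page, typical of templated product listings
html_page = (
    "<html><body>"
    + "<div class='product'>  Widget  </div>\n" * 500
    + "</body></html>"
).encode("utf-8")

# Crudely strip whitespace between tags (illustration only)
minified = re.sub(rb">\s+<", b"><", html_page)

# Then compress what remains
compressed = gzip.compress(minified)

print(len(html_page), len(minified), len(compressed))
```

Repetitive HTML compresses extremely well, which is exactly why benchmarking matters: the gzip savings on the wire must outweigh the CPU cost of compressing on every request (or you cache the compressed output too).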

To speed up your load times, make sure any redirects between you and Google, or any redirects on your site, are blazingly fast. Avoid tracking redirects that query databases to determine where to send traffic. We know a significant retailer whose home-built e-commerce platform grabs every new visit, teleports it off to a tracking system that queries databases to place cookies, then returns the session to the originally requested page. Slow for humans, slow for bots. Not good.
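One way out of that trap is to answer the redirect immediately and defer the tracking write. A hedged sketch (the handler shape and `visit_log` queue are hypothetical, not any particular framework’s API): the visitor gets the cookie and the 302 at once, while a background worker records the visit.

```python
import queue
import threading
import uuid

# Deferred tracking: a background thread drains the queue, so the
# redirect itself never waits on a database round trip.
visit_log = queue.Queue()

def log_worker():
    while True:
        visit = visit_log.get()
        # ... write the visit to the tracking database here ...
        visit_log.task_done()

threading.Thread(target=log_worker, daemon=True).start()

def track_and_redirect(destination):
    cookie = str(uuid.uuid4())            # place the cookie...
    visit_log.put((cookie, destination))  # ...and defer the DB write
    return {
        "status": "302 Found",
        "headers": [
            ("Location", destination),
            ("Set-Cookie", f"visitor={cookie}"),
        ],
    }
```

The visitor (or bot) sees only a fast cookie-and-redirect; the bookkeeping happens after the response is already on its way.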

Of course, make sure your site lives on fast servers, hosted in top-notch data centers with fast pipes. Memory prices continue to fall, so again, make sure your page-building servers and database servers have generous amounts of RAM.

I’m confident Google’s speed checking will be quite liberal, with the penalty applying only to really slow pages. If you get hit with a page-load QS penalty, Google says the AdWords dashboard will tell you so. If you don’t get these warnings, should you pat yourself on the back for a “fast enough” site and consider speed a non-issue?

Nope.

It is worth investing some additional effort to take your site from “reasonably fast” up to “impressively fast.”

Speeding up sites for browsers is somewhat different from speeding up sites for bots (though every improvement for the latter also helps the former). To display pretty pages for humans, browsers need to load images, render HTML, apply CSS, and execute JavaScript.

There are some amazingly powerful and simple ideas to speed up page rendering time in Steve Souders’s excellent book, “High Performance Web Sites.” Souders was the chief performance engineer at Yahoo! before he recently jumped to Google. His short book gives practical advice on CSS sprites, using page caching and Expires headers correctly, and being smart about DNS. Buy Steve’s book for your engineers; I promise the $30 you spend ($30 at O’Reilly, $20 at Amazon) will be the highest ROI you get all year. You can basically read the whole book online (instructions are on our blog), but it is worth buying the entire book.

Make sure your team is using the amazing Firebug plugin for Firefox, including the YSlow extension written by Steve.

Steve Souders wrote a fascinating post last week titled “How Green Is Your Web Page.” He suggests that if we all followed his simple advice and optimized our sites, the reduced load on our servers would save energy and so reduce carbon emissions. A few fewer CPU cycles or hard-disk rotations might not sound like much, but Steve calculates that if the entire industry went along, the net energy savings could add up.

Lower Google bills. Happier users. Increased sales. And helping the planet a bit, to boot. You have your marching orders: get out there and speed up your site!