Optimize Search Results Now

Make your Web site click with algorithm search engines … and customers.

You may be under the impression that getting your Web site ranked on “spidering” or algorithm search engines such as Google involves the dark arts, or at least a secret handshake. If so, you can’t be blamed.

Some search-engine optimization (SEO) companies make it sound as if high rankings are more hocus-pocus than strategy, as if the algorithms (part computer program, part math equation) that determine page rankings were designed to estimate something other than how relevant a site will be to an Internet searcher.

While there are a number of complicated and technical aspects of SEO—also known as organic optimization—most can be boiled down to a few simple, key principles:

• Your site should be informative and relevant.

• Your site should be easy to navigate.

• Your site should be easy to use.

Content Is King

When it comes to search-engine optimization, no single element of your Web site is more important than content. Content is what the search engine is actually searching for.

Part of what search-engine algorithms measure is what is called keyword density: essentially, the frequency with which a search term appears on a Web page. If a keyword appears too few times, the page is deemed less relevant; if it appears too many times, the spider deems it spam. Why? Conventional wisdom suggests that the people designing these algorithms determined, through research, that copy genuinely relevant to a topic references that topic a certain number of times as a percentage of total copy.

There’s no hard-and-fast rule for keyword density; search engines do not make their algorithms public. But a rule of thumb culled from an informal straw poll of the industry experts interviewed for this story is that 5 percent to 10 percent is a good ballpark.
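If you’re curious how that percentage is actually computed, it’s simple arithmetic: the words accounted for by your keyword phrase, divided by the total words on the page. Here’s a rough sketch in Python; the sliding-window phrase match is our own simplification, since the engines’ real formulas aren’t public:

    import re

    def keyword_density(text, keyword):
        """Percentage of the words in `text` accounted for by `keyword`."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        phrase = keyword.lower().split()
        # Slide a window over the copy and count matches of the phrase.
        hits = sum(
            words[i:i + len(phrase)] == phrase
            for i in range(len(words) - len(phrase) + 1)
        )
        return 100.0 * hits * len(phrase) / len(words)

    copy = "Our digital cameras ship free. Every digital camera includes a case."
    print(f"{keyword_density(copy, 'digital camera'):.1f}%")  # prints 18.2%

Note that even this tiny sample lands well above the 10-percent ceiling, which is exactly the robot-sounding copy the experts warn about.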

“One of the mistakes that new SEO writers make,” says Heather Lloyd-Martin, president and CEO of SuccessWorks, a search-engine marketing firm in Bellingham, WA, “is to think that if putting a keyword in once is great, then 20 times is better.”

This leads to copy that reads as if written by a robot. The idea is for your copy to be the sort of information that would intrigue a customer, not a cyborg. Search-engine algorithms are designed to perform the seemingly impossible task of determining relevant copy without actually “reading” it.

“When I’m teaching people, I tell them I never check keyword density,” explains Lloyd-Martin, sounding almost heretical. “You never want to do anything for the search engine that detracts from the conversion copy.”

In other words, if you’re writing for the search engine, you’re not writing for your customer.

Before you start writing your copy, you need to know which words consumers use to find you. This is where keyword analysis comes into play.

“We do very extensive keyword homework for our clients,” says Andrew Wetzler, president, MoreVisibility. “We encourage them to have content on their site based on words people are actually searching for.” Often, what clients think are their keywords “differ from what empirical data suggests,” he offers.

“The best data out there relative to who is searching for what comes from Overture,” explains Wetzler. With Overture’s keyword statistics (available at www.overture.com) “you can put in any search term, and it will tell you how many people searched that term … It’s like having a focus group on call.”

Along with picking the right keywords, it’s vital that you not limit yourself. “A really big mistake some site owners make is that they believe there are only five money words,” says Lloyd-Martin. “The difference between organic optimization and pay-per-click [a system where site owners bid for keyword positioning and pay when consumers click their link] is that with organic it’s possible that your site can position highly for every single word combination on your page.”

This is where it helps to have a keyword strategy. At its simplest, a keyword strategy involves the “integration of the keywords that you find in your keyword research into the content of the site itself,” says Detlev Johnson, president of technology solutions for SuccessWorks.

Checking your site logs for referring search engines is one step; referring URLs generally include the search terms a user typed into the engine, and a Web analytics product can help collect them. This tells you which terms people are already using to find you, so you can optimize your content for them.
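As a sketch of what that log-checking step looks like, the snippet below pulls search phrases out of a handful of referrer URLs. The parameter names (q for Google, p for Yahoo) reflect the engines of the day, and the sample log entries are invented:

    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # The query-string parameter each engine uses for the search term
    # (an assumption based on the engines of the day).
    ENGINE_PARAMS = {"google.": "q", "yahoo.": "p", "msn.": "q"}

    def search_term(referrer):
        """Pull the search phrase out of a referring URL, if there is one."""
        parsed = urlparse(referrer)
        for engine, param in ENGINE_PARAMS.items():
            if engine in parsed.netloc:
                values = parse_qs(parsed.query).get(param)
                if values:
                    return values[0].lower()
        return None

    # Hypothetical referrer entries pulled from a server log.
    referrers = [
        "http://www.google.com/search?q=discount+digital+cameras",
        "http://search.yahoo.com/search?p=camera+case",
        "http://www.example.com/links.html",
    ]
    terms = Counter(t for t in map(search_term, referrers) if t)
    print(terms.most_common())
    # [('discount digital cameras', 1), ('camera case', 1)]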

The next step is to use a keyword service, such as Overture’s, Google’s AdWords (http://adwords.google.com) or Word Tracker (www.wordtracker.com), to find other commonly searched words that you’d like to incorporate into your keyword strategy.

Ideally, once you’ve optimized for these new terms, they’ll start showing up in your referring links as well.

Build It (Well) and They Will Come

All the brilliant content and keyword research in the world can be undone by counterproductive site architecture. The construction of your site is an important factor in having it successfully spidered.

“In a perfect world, all sites would be static HTML and all text,” says Mike Gullaksen, senior search strategist for Scottsdale, AZ- and New York City-based search-engine marketing firm iCrossing. Spiders are designed to read HTML; they run into problems with dynamically created sites that cull information from a database when a page is requested, and they can’t read sites designed in Flash at all. (There are ways around these issues, but that’s a topic for another article.)

“Google reads content higher on pages versus lower on the page,” continues Gullaksen. “You want as few images as possible, and label the ones you have with alt tags [which tell the engine what the image is].”
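One way to act on that advice is to audit your pages for images that lack alt text. Here’s a small sketch using Python’s standard-library HTML parser; the sample markup is invented:

    from html.parser import HTMLParser

    class AltChecker(HTMLParser):
        """Collect the src of every <img> that has no alt text."""
        def __init__(self):
            super().__init__()
            self.missing_alt = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                if not attrs.get("alt"):  # absent or empty alt attribute
                    self.missing_alt.append(attrs.get("src", "<no src>"))

    page = """
    <html><body>
      <img src="/images/logo.gif" alt="Megalocorp logo">
      <img src="/images/spacer.gif">
    </body></html>
    """
    checker = AltChecker()
    checker.feed(page)
    print("Images missing alt text:", checker.missing_alt)
    # Images missing alt text: ['/images/spacer.gif']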

Further, design your site with minimum drill-down; customers shouldn’t have to click through more than two levels from the home page to find what they’re looking for.

“A rule of thumb is to not have a site with more than two sublevels,” recommends MoreVisibility’s Laratro. He also advocates the use of subdomains when applicable. A large company like Megalocorp with many subdivisions might have separate subdomains for, say, its music and movie divisions at http://discs.megalocorp.com and http://movies.megalocorp.com.
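If your URL paths mirror your navigation (a simplifying assumption; on many sites they don’t), you can audit drill-down depth by counting path segments, as in this sketch:

    from urllib.parse import urlparse

    MAX_SUBLEVELS = 2  # the two-sublevel rule of thumb

    def sublevels(url):
        """Directory levels between the home page and the page itself."""
        segments = [s for s in urlparse(url).path.split("/") if s]
        return max(len(segments) - 1, 0)

    # Hypothetical pages on the Megalocorp site.
    for url in ("http://megalocorp.com/cameras/digital.html",
                "http://megalocorp.com/shop/products/cameras/digital.html"):
        verdict = "too deep" if sublevels(url) > MAX_SUBLEVELS else "fine"
        print(url, "->", verdict)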

Another structural consideration is a site map: essentially, a page containing all of a site’s links. A site map functions like a table of contents; it’s helpful for humans, and even more so for spiders.

“The site map allows you to store all the internal links on your site within one page,” says Gullaksen. “It’s like spider food for a spider.”

More than a mere list of links, the site map allows you to show the spider how to read your site. “A site map allows you to incorporate the priority of the pages in your Web site,” says Gullaksen. The site map will “tell the spider what pages to read first.”

“A good site map can have category levels,” says Laratro. “You can put keywords or even descriptive sentences as to what the landing page is. This is very powerful for helping Google know what’s on a page.”
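To make that concrete, here’s a sketch that renders the kind of site map Laratro describes: categories, then links, with a descriptive sentence doing double duty as anchor text. The categories and URLs are invented:

    # Hypothetical site structure: category -> list of (url, description).
    SITE = {
        "Digital Cameras": [
            ("/cameras/compact.html",
             "Compact digital cameras for travel and everyday use."),
            ("/cameras/slr.html",
             "Digital SLR cameras and lenses for serious photographers."),
        ],
        "Accessories": [
            ("/accessories/cases.html", "Padded camera cases in all sizes."),
        ],
    }

    def render_site_map(site):
        """Render the structure above as a simple, spider-friendly HTML page."""
        lines = ["<html><body><h1>Site Map</h1>"]
        for category, pages in site.items():
            lines.append(f"<h2>{category}</h2>\n<ul>")
            for url, description in pages:
                lines.append(f'<li><a href="{url}">{description}</a></li>')
            lines.append("</ul>")
        lines.append("</body></html>")
        return "\n".join(lines)

    print(render_site_map(SITE))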

Of course, there are many other considerations. SEO is an ongoing process, and the Internet is a constantly evolving beast. Ongoing, however, does not mean constant. Says Lloyd-Martin: “It’s a big scam of some unethical SEO companies to make clients feel like they have to rewrite every month.”

The only way to know whether your site needs to be re-optimized is to keep track of your rankings. Minor dips and burps in ranking are expected, so checking daily isn’t necessary. A sudden major drop, however, can mean that your competition has figured out optimization too, or that an engine’s algorithm has changed.
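A bare-bones sketch of that kind of tracking, assuming you record your rank for a keyword at each check (the numbers, and the tolerance for dips and burps, are invented):

    # Hypothetical rank history for one keyword (1 = top result); lower is better.
    history = [3, 4, 3, 3, 5, 4, 18]

    DIP_TOLERANCE = 3  # minor dips and burps this size are expected

    previous, latest = history[-2], history[-1]
    if latest - previous > DIP_TOLERANCE:
        print(f"Rank fell from {previous} to {latest}: time to look at "
              "competitors or a possible algorithm change.")
    else:
        print(f"Rank moved from {previous} to {latest}: within normal wiggle.")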

An optimized site, if done properly, should keep well. “I’ve written content that has done well for years,” explains Lloyd-Martin. “With the spidering engines, the nice thing is that it’s always been about content.”