ONLINE EVENT: How complicated is SEO anyway?

"Keywords" and Search Engine Optimization. Is this truly a science, where a non-scientist would need to purchase expensive software, or is there an easy rule of thumb to bring our sites to the forefront of the search engines?

Like anything else, you can get into it as deeply as you want and drive yourself crazy. But you can also learn a little and get very, very good results without making a career change.

There are different kinds of software that can help, but most fall short in one area or another, as software tends to do. Most are not too expensive, but I don't recommend them unless you really have no time and no interest in optimizing your site yourself.

From our point of view, optimizing a web site consists of:

1) Having a "good" site.

2) Finding out what keywords and phrases people use to find sites like yours.

3) Using the keywords and phrases on your site as much as possible AND still using good English.

4) Submitting to as many search engines and directories as possible.

Many sites say mostly the same things about this process, each with details that reflect its own opinions. You will read a lot of conflicting information, so the best thing is to consult several sources of information so you can see what the most popular ideas are.

The foundation starts with keyword research. We use the free program from www.GoodKeywords.com but many professionals use the paid service from WordTracker.com, which has a free trial.

You make a list of keywords and phrases that you think describe your site and then do some research to see what people are really using. For example, "free tech forum" may not be used while "tech support forum" is. Here is a small sample for "tech forum":

The number is the number of times each phrase was used in a search in the past month. As you can see, some are very good and some are not. These results strip out plurals and combine the numbers. People also often use different word orders; these are normalized as well, so you have to judge what common usage would be.

Each keyword can return up to 100 variations, so you can quickly build a very large list to work with. Short keywords generally have large numbers of searches but are not well targeted. Multi-word phrases can be very targeted but draw few searches per month. I recommend using a blend of both on your site.

You need to edit your page title, description, and keywords tags for the entire site so that each describes what the page has. You also should work the keyword phrases into the text of each page.
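If you want to see exactly what a page's tags currently say before editing them, a short script can pull them out. This is only a sketch using Python's standard library; the example page and its tag contents are made up for illustration:

```python
from html.parser import HTMLParser

# Collect the <title> text and any <meta name="..."> tags from a page
# so you can review what each page currently says about itself.
class HeadTagAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical example page; in practice, read your own HTML files.
page = """<html><head>
<title>Tech Support Forum - Free Help</title>
<meta name="description" content="A free tech support forum for nonprofits.">
<meta name="keywords" content="tech support forum, computer help">
</head><body>...</body></html>"""

audit = HeadTagAudit()
audit.feed(page)
print("Title:      ", audit.title)
print("Description:", audit.meta.get("description", "(missing)"))
print("Keywords:   ", audit.meta.get("keywords", "(missing)"))
```

Run it over each page and you can quickly spot pages with missing or generic descriptions.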

At this point the search engines that spider your site will pick up the changes and should start to bring you more traffic in 5-14 days generally.

The next step is to try to get some other sites to link to you; submissions to free search engines that don't spider, and to web directories, can also be very effective.

So, while there are a lot of other things you can do which can make SEO complicated, the core of it is really pretty simple.

Now, if you want to talk about the optimization of a dynamic site like TechSoup, that does involve another layer of complexity, but it's really not all that bad, and the same principles hold true. :-)

I think that's a great question. In my experience, working with Search Engine Optimization consultants as well as doing it myself, I've found that Search Engine Optimization (SEO) is not a science, but it does benefit from orderly thinking and attention to detail.

It does not require expensive software, or maybe any software at all. Some good web analytics tools are helpful for finding keywords, but they are not essential to basic SEO.

And, there is no rule of thumb to bring your site to the forefront. There are too many factors. At the most basic level, you want to make sure that you have good content that addresses the reasons people would be interested in your site.

One of the most important things that I learned in interacting with SEO consultants who worked on our site is that the basics are simple but time consuming. There is no easy answer. If you have a little basic HTML knowledge, and some basic knowledge of SEO, you can do well.

When I am posting a new page to our website now, I go through some simple (but sometimes time consuming) steps to optimize the page.

First, I write a short synopsis of the content of the page.

Then I identify keywords by thinking of the most important words that describe the topic of the page. I also search on Google to find related terms, and check my web analytics software to see what search terms related to the page's topic people are using to arrive at our web site.

I then post the description and keywords in the tags on my page.

The only part of this that requires special software is checking the search terms that people use to get to my site (and that part is nice but optional). The software I use is Urchin. I understand that Urchin is the software that was developed into Google Analytics, so it is available in that way.

I am not saying that web analytics software isn't important, because it is. But what I'm saying is that at the most basic level, SEO doesn't need software.

1) Your site needs to be well-designed so that search engines can crawl its pages. So use valid, semantic HTML; write meaningful text in your anchor links; have proper page titles and description meta tags; etc.

2) The content should be well-written, concise, informative and appealing. Don't try to stuff keywords into your text, just use the normal language you'd use to describe your organisation and its activities.

3) Your site needs to have many relevant, good-quality links from other websites. So ask the best and best-known websites in your area of work to link to your site.

4) Time is a factor. Google rates pages higher if they've existed longer. It also rates pages higher if they are substantially updated frequently.

You need to know what works and what doesn't in order to assess what you're getting right or wrong. So use tools such as your own web stats, Google Analytics, Google PageRank, W3C validators, etc., to test the performance of your website over time.

Ok, sorry for the long post but SEO is really the "make or break" factor online.

Rule #1: No matter how pretty your site may be: If you build it, no one will come.

Here's more of the "what to do" with a little "how" sprinkled on top.

Research your market

Get to know the field you're playing on.
Who else is there?
Where are they?
Who's linked to whom?
Who has external links off the field to others (links to .edu's, .gov's and blogs)?
How many other sites link to your competitors (ask them to link to you)?

You can use the Google Advanced Search (link to the right of the Search box on Google)

Check out the SEO for Firefox extension, here:

http://tinyurl.com/rl9yr

This'll give you a bit more data to chew on like:

- site age
- Google PageRank
- inbound link count
- if any governmental or educational sites link to their site
- if they are listed in major directories
- if bloggers link to their sites

Keyword Research:

All of the above is right on. (Wordtracker and Google keyword tools)

I would add a few others:

- checking your web analytics or server logs to see how people found you.
- looking at page content of competing websites
- looking through topical forums and community sites to see what issues people frequently discuss
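On the first of those, if your stats package doesn't already report search terms, you can pull them out of the referrer URLs in your raw server logs. Here's a rough Python sketch; the example referrers are made up, and the query parameter names are just the common ones ("q" for Google, "p" for Yahoo!):

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

# Hypothetical referrer URLs as they might appear in a server log.
referrers = [
    "http://www.google.com/search?q=tech+support+forum",
    "http://www.google.com/search?q=free+computer+help",
    "http://search.yahoo.com/search?p=tech+support+forum",
]

# Search engines put the visitor's query in a URL parameter;
# decode it and tally how often each phrase brought someone in.
counts = Counter()
for url in referrers:
    qs = parse_qs(urlparse(url).query)
    for param in ("q", "p"):
        for phrase in qs.get(param, []):
            counts[phrase] += 1

for phrase, n in counts.most_common():
    print(n, phrase)
```

The resulting tally is a ready-made starting list for your keyword research.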

Site Structure:

Not negotiable. Do the following:

- link to the most important categories or pages sitewide
- link to every page on your site from at least one other page on your site
- use consistent anchor text in your navigation
- link to other content pages (and especially to action items) from within the content area of your website

Before you start building, decide which keywords are most important and how you're going to accomplish content creation and page placement/linking while successfully integrating keywords.
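If you're comfortable with a little scripting, you can sanity-check that structure. Here's a rough Python sketch that flags pages nothing else links to; the site map is made up, and in practice you'd build the link list by parsing your own pages:

```python
# Hypothetical site map: each page and the internal pages it links to.
links = {
    "index.html":    ["about.html", "forum.html"],
    "about.html":    ["index.html"],
    "forum.html":    ["index.html"],
    "old-news.html": ["index.html"],  # nothing links to this page
}

# Gather every page that is the target of at least one internal link.
linked_to = set()
for page, targets in links.items():
    linked_to.update(targets)

# Any page not in that set is an "orphan" a spider can't reach by crawling.
orphans = [page for page in links if page not in linked_to]
print("Orphan pages:", orphans)
```

Orphan pages violate the second rule above: a spider following links from your home page will never find them.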

On Page Optimization

Write for humans.

Use unique page title and descriptions for EVERY page. The old school FIND/REPLACE in Dreamweaver for title/meta tags won't cut it. Spend the time and match page meta tags to page content.
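If you want to check your own site for repeated titles, a quick script can do it. This is just a sketch with made-up filenames and titles; you'd fill the dictionary from your real pages:

```python
from collections import defaultdict

# Hypothetical page titles pulled from a small site; in practice you
# would extract these from your own HTML files.
titles = {
    "index.html":  "Acme Nonprofit - Home",
    "about.html":  "Acme Nonprofit - Home",   # copy/paste leftover
    "donate.html": "Donate to Acme Nonprofit",
}

# Group pages by title, then report any title shared by more than one page.
pages_by_title = defaultdict(list)
for page, title in titles.items():
    pages_by_title[title].append(page)

for title, pages in pages_by_title.items():
    if len(pages) > 1:
        print("Duplicate title %r on: %s" % (title, ", ".join(pages)))
```

The same grouping works for description meta tags; every duplicate it reports is a page that needs its own title and description written.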

Using a CMS? Install a "friendly URL" component. Take the time to assign meta tags when given the option. Joomla makes this easy.

- submit your site to general directories like DMOZ, the Yahoo! Directory, and Business.com
- submit your site to relevant niche directories
- if you have a local site submit to relevant local sites (like the local chamber of commerce)
- join trade organizations
- get links from industry hub sites
- create content people would want to link to
- try to link to your most relevant page when getting links (don't point all the links at your home page)
- mix your anchor text
- use Yahoo! Site Explorer and other tools to analyze top competing backlinks
- don't be afraid to link out to relevant high quality resources

When you say RATES, did you mean RANKS? I have a site I finished building 3 weeks ago and submitted 9 days later. I am in the top 8 for 3 primary keywords. It appears to have more to do with good content, amount of content, size of market, and solid SEO implementation.

A site with longevity is apt to fulfil many of the above principles plus nice link juice, linkbacks, even more content etc.

As I mentioned, everyone has their own ideas about what to do and how to do it. :-)

The main thing that I have to comment on was mentioned by two different people:

Yes, checking your web traffic reports can reveal some keywords and phrases that you may not be aware of that you should research. But keep in mind that the things you find there are things your site is already being found for!

Adding more keywords that you are already being found for may or may not make sense. Rather, I would make sure those are included when you do your keyword research, as I mentioned earlier. You may find that you are using terms that get very few searches per month when you could be adding others that you are not being found for that have higher search volume.

I've built new sites, had them jump right to the top of Google's results soon after launch - then slowly sink back down again. Other sites have crept up and up over time.

I've also seen some really awful ancient sites keep hold of the #1 spot because they've been around for years and got lots of incoming links.

I don't know how strongly Google rates time (longevity, newness, frequency of updates) as a factor in determining a page's rank, compared to the number of links, etc., but it's mentioned in their patent "Information Retrieval Based on Historical Data." The patent gives an interesting insight into their likely ranking methods.

I think some of those factors like age may have something to do with ranking, but I have a very old site with a PR0, so if it's a factor, it's not a big one.

The patent is interesting, but just because they have things in there, it doesn't mean that they use them. I know when people first started talking about the patent, one thing they noticed was the part about the length of a domain registration being a factor, the thought being that if someone registered a new domain for only one year, it might be a sign that they were going to use it only that long before discarding it. That prompted many to go out and register their domains for 5, 10, and in some cases many more years.

They didn't stop to think that just as Google can see how long the domain is registered for, they can also see how long it's been registered. So my domain that has been registered since 1999 should not have any problems, despite the fact that it's never been registered for more than one year at a time.

Yes, checking your web traffic reports can reveal some keywords and phrases that you may not be aware of that you should research. But keep in mind that the things you find there are things your site is already being found for!

Adding more keywords that you are already being found for may or may not make sense. Rather, I would make sure those are included when you do your keyword research, as I mentioned earlier. You may find that you are using terms that get very few searches per month when you could be adding others that you are not being found for that have higher search volume.

I wonder if you could comment further on this? Wouldn't consistency in repeating keywords add to the strength of ranking in those areas? Is the point that you want to make sure that keywords are relevant to your site goals, as well as popular?

These are keywords with very low search volume that are nonetheless relevant to your site/product.

These will outnumber the popular keyword searches and help identify niche areas of content you may want to include in order to bring additional traffic to your site.

Yes, your site is already being found when long-tail searches are done (i.e., they show up in your log files); however, measuring your search placement (are you in the top 5? top 3?) for these long-tail keywords is very useful.

Depending on the size of your "tail," you'll want to further optimize for these words. You never know when a search term will begin to move up in popularity as search behavior evolves.

I didn't hear you mention doing any keyword research beyond the stats, which can return up to 100 variations for each keyword or phrase that you check. Also, just optimizing further for what is already working may or may not help, and depending on the search volume for the keywords, you may be focusing on things that will provide little future traffic.

If you are only looking to optimize for the most popular keywords, then you would want to be highly optimized for those terms. The problem with this traditional approach to SEO is that it is dependent on YOUR site getting a high ranking for the few terms you choose. With increased competition and search engine changes, you are going to have to keep a constant watch so that when your rankings drop, you can make changes to get them up again.

This is the most common type of optimization offered by SEO companies. Check some sites, get some quotes, and you will see that not only does it generally command a large cost, there is also ongoing maintenance that is required and is also pretty expensive. If you have the budget, then it may be worth going this route.

With General optimization, you don't have to worry about working on your SEO each month. It's like being diversified in the stock market. Some keywords go up and others go down, but overall your site does well over time because you are being found for so many different things that are appropriate for your site.

If you are looking at your stats, I'm sure you will see that your site was found for some pretty strange things that you would not expect. This happens because search engines will match requests on sites even if the words are not together, but occur somewhere on the page. In the same way, if you have more and more of the keywords and phrases that people use in your site, it will be found for more and more combinations of what people search with. This happens naturally, but you can increase it by using more of what you can confirm that people use to search with.

My advice is to not worry about placement or rankings. Focus on the amount of traffic you get and the things you are being found for, which will happen easily if you do good keyword research and use the results on your site as much as possible.

Good SEO is not cheap, but increasing your qualified traffic by 2-4 times may be worth it to many. I think that many of the larger SEO companies price their services to match a company's budget, although I have no proof of that.