Many people have said to me that they can't afford to employ an SEO company, but they would still like to get better search results on Google, Yahoo and MSN. Are there any basic rules they can follow to improve their ranking on search engine results pages (SERPs)? In response to that question, here are some basic things that people who have coded their own web sites can do to make their pages more spiderbot friendly, and so more attractive for search engines to crawl.

Let's start with the actual information on the home page of the web site. What is the first thing a spiderbot sees when it crawls your site: complex graphics, tables of information that do not directly relate to your products or company, or some well-written, grammatically correct text? We have found that the first 100 words on a web page are of particular interest to spiderbots, and a lot of emphasis is now placed on what is found there. This is the place to put your keywords: the search words and phrases you want to be found under.

But what about keywords in the meta tag statement? This was a great idea, but it unfortunately became abused by people doing keyword stuffing. Look at your own site: what keywords do you have there? We would suggest putting up to 10 keywords in the meta tag, and these words should reinforce the keywords you use in your actual text. I saw a site for buying property that repeated the words "house sales" a hundred times; do you think that fooled Google? Search engines now try to ensure that the content and keywords the actual web surfer sees relate directly to how the page is ranked.
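As a sketch, a sensible keywords meta tag with around ten terms that mirror the visible page text might look like the fragment below (the site name and keyword values are made-up examples, not recommendations):

```html
<head>
  <title>Smith & Co Estate Agents - House Sales in Leeds</title>
  <!-- Up to ~10 keywords, each one reinforcing words actually used in the page text -->
  <meta name="keywords" content="house sales, estate agent, property, Leeds houses, buy a house, sell a house, flats, valuations, mortgages, lettings">
  <!-- The description tag is what many engines display as the SERP snippet -->
  <meta name="description" content="Independent estate agents offering house sales, lettings and valuations in Leeds.">
</head>
```

The point is that each keyword earns its place by appearing in the real text; a hundred repetitions of "house sales" is exactly the stuffing the engines now punish.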

On a similar theme, have you ever been to a web site that is so difficult to navigate that you give up because you just cannot find the information you require? Well, think how the spiderbot feels when it hits such a site. Do you think it will spend time and effort trying to build a complex navigation map, or will it put the site in the "too difficult" pile and move on? Make your site easy to navigate; on a small site, being able to reach any page from any page is a useful concept.
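On a small site, "any page from any page" can be as simple as repeating one plain HTML link list at the top of every page (the page names here are purely illustrative):

```html
<!-- The same navigation block, included on every page of the site,
     so both visitors and spiderbots can reach any page in one click -->
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="products.html">Products</a></li>
  <li><a href="about.html">About Us</a></li>
  <li><a href="contact.html">Contact</a></li>
</ul>
```

Plain text links like these are far easier for a spiderbot to follow than navigation buried in scripts or image maps.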

This brings us nicely on to the whole question of Inbound Links (IBLs). Some sites have very few external IBLs but still do well because of the structure and linkage of their own pages. Unfortunately, achieving this can mean a fairly major redesign and rewrite of a site, but if you can do it, you will improve your visibility. Other ways of getting IBLs are link farms, reciprocal links and one-way links.

We believe that link farms are a dangerous way forward. Search engines do not like the idea, as it cuts across their central principle that search results should reflect the true popularity of sites. Either now or in the future, sites using link farms will be penalised and maybe sandboxed, a bit like being sent to the sin bin for six months. Not recommended.

Reciprocal links can again be seen as a false way of increasing the popularity of sites. Many high-ranking sites use this idea and to date have not suffered. However, Google and other search engines are starting to take the relevance of the linking site into account. So if you are a property company and you link to a frozen food supplier and vice versa (yes, I've seen this one), the search engines will ignore it as an IBL. We are aware that more sophisticated companies are using a 3-way reciprocal link procedure that is far more difficult to track: three property companies work together, so company A links to company B, which links to company C, which links back to company A. At the moment the search engines probably cannot see these 3-way reciprocal links, although this may change in the future.

An update on reciprocal links: the latest Google Dance (update) is taking place as I write (end of October 2005), and many sites with lots of reciprocal links are reporting that their Google search rating has disappeared. This leads us to believe that such links are definitely not a good idea.

By far the best way of getting links is by having a quality site that other people want to link to. So again, we are back to content. If you provide quality information, over time, more and more sites will link to you, and if they are in the same business sector as you, so much the better.

Finally, we would like to stress the need to write the cleanest and most error-free html code that you can. Again, it is fairly obvious if you think about it: spiderbots want to be able to crawl a web site quickly and efficiently.

If you have bad html, broken links, or tables that don't have the correct number of entries, then spiderbots will struggle to classify your site. We have invested in a very sensitive html validator that picks up many errors and potential errors. To date we have not found any web site that validates 100% error free with it.
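As an illustration of the sort of thing a validator flags, the made-up fragment below contains two of the most common errors mentioned above: an unclosed tag and a table row with the wrong number of cells.

```html
<!-- Broken: the <b> tag is never closed, and the second
     row has only one <td> where the first row has two -->
<p>Our <b>best-selling products
<table>
  <tr><td>Widget</td><td>£5</td></tr>
  <tr><td>Gadget</td></tr>
</table>

<!-- Fixed: every tag is closed, and both rows have two cells -->
<p>Our <b>best-selling products</b></p>
<table>
  <tr><td>Widget</td><td>£5</td></tr>
  <tr><td>Gadget</td><td>£7</td></tr>
</table>
```

A browser will often guess its way past errors like these, but a spiderbot parsing thousands of pages an hour has no reason to be so forgiving.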

In summary, good clean readable text, backed up by clean html programming and some good solid inbound links, will give you a far better chance of good search engine rankings. So, if you do all of this, will you need an SEO company? If you can achieve all of the actions above, we believe you will have done about 40% of the work we would expect to do for a client. Maybe that will be enough for you; if not, feel free to have a look around our website.

Do you want to learn more about Internet Marketing? I have just completed my brand new guide to Search Engine Marketing Success. Discover The *Secret Formula* We've Used To Stay In Google's Top 3 Rankings For Over 3 Years For Some Of The Toughest Keywords Around (18 MILLION Competitors!)

Serge Daudelin is a Search Engine Optimization Specialist who has written over 300 articles in print and 5 published ebooks. Serge is dedicated to helping others and offering the best information on how to make more money online.