SEO - HTML Blog - Tutorial

From time to time we will include information we have found useful as a guide to Search Engine Optimization, beginning with basic HTML principles.

An SEO HTML educational blog, intended to teach the relationship between good HTML code and SEO.

Monday, September 05, 2005

Proper Coding Ethos

Paramount in any commercial website is the usability of the site itself. Glitches, malfunctions, inconsistencies and errors are business killers, and must be eliminated in order to provide an efficient and friendly browsing experience for customers and readers. The implementation of proper coding practices will keep customers at your website longer, and will vastly decrease their levels of stress. It is crucial to gain the customer's trust in your security and professionalism. A buggy website is often seen as an insecure website, and an amateur affair. These perceptions can crumble the credibility of a business in an age where the website is not only the first portal to a company, but often the ONLY portal.

Alongside the obvious issues of usability are issues of search engine exposure. Search engines employ sophisticated programs called "spiders" or "robots" to automatically trawl the internet for content, which is then indexed according to the search engine's own specific (and secret) algorithm. The methods these spiders use to index a page are unknown, and change frequently. Certain things hold true (for the time being): the text drawn inside an image is not seen by the spider, but the "alt=" text most likely is. Spiders are code based and thus are incapable of "fuzzy" interpretations of your website. This is why proper code is crucial; if the spider hits an error, it may improperly index your page, or ignore it completely, severely limiting your exposure to search engines.
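As a small illustration of the alt text point above (the file name, dimensions and wording here are invented for the example):

```html
<!-- The promotional text drawn inside the image file itself is invisible
     to a spider, which reads only the markup. The alt attribute, by
     contrast, is plain text in the code and can be indexed. -->
<img src="widgets-sale.gif" alt="Spring sale: 20% off all widgets"
     width="468" height="60">
```

The alt text also serves visitors browsing with images turned off, so it earns its keep twice.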

It is impossible to know which glitches a spider can overlook, and which ones will stop it in its tracks. For this reason, we have taken the stance that the only code which is sure to be read properly by a spider is code that has passed W3C validation. The more of your code that can be validated, the more a spider can access. Once a page is indexed, more customers will find your site among search results, increasing revenue. There is a demonstrable link between proper coding practice and revenue generation in today's online marketplaces, one that is unfortunately often ignored.
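For readers who have not run a page through the W3C validator before, here is a sketch of a minimal document that passes HTML 4.01 Strict validation (the title and text are placeholders, not from any real site):

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">
<html>
<!-- The doctype above tells the validator (and, presumably, a spider)
     exactly which set of rules the page claims to follow. -->
<head>
  <title>Widget Company - Home</title>
  <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
</head>
<body>
  <h1>Welcome to Widget Company</h1>
  <p>Every element here is properly opened and closed, so the validator
     has nothing to object to.</p>
</body>
</html>
```

Starting from a skeleton like this and validating as you add content is far easier than debugging a finished page after the fact.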

It is possible that search engine spiders aren't quite as stringent as the validation rules, but on an issue of this importance it's better to be safe than sorry. Next week, we will delve into proper code declarations to maximize the ability of a spider to properly detect and index your site.

Mission Statement

In a world dominated by the internet, design and compatibility are crucial to a company's success. Care must be taken to properly and formally code a webpage, so it works across the board on a variety of systems, and remains fully compatible with the latest search and information management technologies. This weblog will teach designers the fundamentals of such coding, including the crucial need for proper validation, the methods by which one can create code that requires less debugging after the fact, and the ways in which Search Engine Optimization can be used to promote the stature of a company within a page of otherwise nondescript search results.