Tuesday, 6 September 2011

We all want to be found in a search engine when someone searches for something relevant to what we offer. The tricky part is how a search engine determines which site is most relevant to each individual query. In the mid-1990s the answer was META tags.

The idea is to provide a snapshot of key page information without a search engine having to actually read the page. As you can imagine, the honesty system only works for so long before people realise they can trick a search engine into thinking they're relevant to a specific search when they're not. 'Keyword stuffing' was a very successful practice for a time, but constant abuse, primarily from adult service websites, led to an abrupt overhaul in the way search listings were determined.
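As an illustration, the search-related meta tags of that era sat in the page's head and looked something like the sketch below (the site name and wording are hypothetical):

```html
<head>
  <title>Widget Co - Hand-made Widgets</title>
  <!-- "keywords" and "description" were the meta tags search engines once relied on -->
  <meta name="keywords" content="widgets, hand-made widgets, buy widgets online">
  <meta name="description" content="Widget Co sells quality hand-made widgets with fast delivery.">
</head>
```

Because these values are invisible to visitors, there was nothing stopping a site from listing keywords that had no relation to its actual content, which is exactly how keyword stuffing worked.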

It is unknown exactly how much weight current-day search engines give to meta tags, but it is small to the point of insignificance when compared to known and documented factors such as keyword density in page headings and content, inbound external links, and newer metrics like +1's.

It has also been suggested that if the listed meta information doesn't match the actual page content, your website may be penalised in the search results. This is of particular concern for websites whose content is updated frequently.

Although they were significant 10 years ago, Digerati no longer recommends the use of search-related meta tags, but we can certainly implement them upon request if it's something you would like to self-manage for your own website.

Talk to us today about a custom strategy for maximising the search engine friendliness of your site, and other ways we can build up the external links to your website.