Category Archives: meta tags

Hi, my name is Cameron and I run I Want a Credit Card, an Australian credit card review site. I think I’m doing everything I should be to get good Google traffic. I have loads of unique content and lots of incoming links (some from PR7 and PR8 sites).

I get about 20-30 Google visitors per day, mostly from very specific keywords. I don’t show up anywhere in the first 300 results for ‘credit card’ or ‘credit cards’ on Google Australia, which is frustrating because I think my site is a good resource (certainly more relevant than some of the sites in the first few pages of results).

I’ve read countless articles on optimizing my site for search engine traffic and I’ve tried to follow all the appropriate principles (SEO-friendly URLs etc). My site doesn’t contain any content which may cause it to be penalized (gambling, porn etc) and I have no outbound links to bad or PR0 sites. Google Webmaster tools reports no problems. What am I doing wrong?

Cameron

Dear Cameron

The existence of the Google Sandbox, the (ageing) filter reportedly put in place by Google’s spam team to fight web spam, is debatable. Many SEO professionals now believe that, while it existed in the past, it no longer does, though Rand Fishkin wrote a post arguing otherwise. I believe you are hinting that the website’s inability to rank for the keywords ‘credit card’ or ‘credit cards’ is due to the Google Sandbox effect. However, a brief analysis of the website did not show any signs of the ageing filter playing a role in its failure to rank for these keywords.

Since you have been reading around the subject, I am sure that you would have come across various resources detailing search engine ranking factors. The important thing to remember here is that these factors change with time; new factors get added, some lose their sheen while others gain prominence. With this background knowledge, I would like to give you some potential reasons for low ranking and suggest a better approach.

While analyzing your website, I spotted a couple of flaws that will hurt its potential to rank highly on search engines. Many of the web pages seem to use the same (duplicate) title, which is not healthy. The page title is one of the most important on-page ranking factors, and it is imperative that each web page has a unique title, in sync with the keywords being targeted for that particular page. In addition, the website’s backlink profile looks very unnatural. While building links, it is essential that you rotate anchor texts and use semantic variations of the targeted keywords. It has to be a proper mix, and I am afraid it is not at the moment, because more than 95% of the links have ‘credit card(s)’ as the anchor text. Also, the majority of the links come from a handful of websites. The existing backlink profile is bound to raise red flags, and many links will be (or already are) devalued by search engines.
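To illustrate the duplicate-title problem, here is a minimal sketch (in Python, using only the standard library) of how you might scan a set of pages and flag titles shared by more than one page. The page contents here are placeholders — in practice you would fetch your own URLs:

```python
# Sketch: flag duplicate <title> tags across a set of pages.
# Assumes you already have each page's raw HTML (e.g. fetched separately).
from collections import defaultdict
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collects the text inside the <title> element."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def find_duplicate_titles(pages):
    """pages: dict of url -> raw HTML. Returns titles used by 2+ pages."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        seen[parser.title.strip()].append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}
```

Anything this function returns is a candidate for rewriting with a unique, keyword-targeted title.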

Ranking for competitive terms such as ‘credit card’ or ‘credit cards’ will require concerted effort over a prolonged period of time. It is pertinent to add that domain authority, trustworthiness and age play a crucial role in ranking for competitive terms; these cannot be built overnight and will come with time. I would therefore recommend that you adopt a slightly different approach.

You can begin by targeting less competitive keywords like ‘credit card comparison’, ‘compare credit cards’, ‘low interest credit cards’, ‘student credit cards’, ‘low rate credit card’, etc. Keywords which are 3-5 words long are not only easier to rank for than a generic term like ‘credit cards’, but are also more likely to convert better. The best part of this approach is that as you work towards ranking for less competitive but better-converting keywords, you gain significant link equity and domain trust. This in turn will help you rank for more competitive, generic keywords like ‘credit card’. By adopting this approach you would accomplish your end goal and, in the process, achieve high rankings for a wider keyword portfolio.

Is there a way to target phrases with this? It seems to be parsed by spaces, so any phrase would be taken as individual words. Is this assumption correct?

Thank you,
Cy

Hi Cy

No, your assumption is not correct. Multiple keywords can be integrated into the KW tag as phrases and both the individual keywords and the grouped phrases should be picked up by those few search engines that support the META Keywords tag.

You can either separate the phrases with commas, or just include all your keywords and phrases without commas separating them. Commas are a personal choice and I prefer not to use them in a META Keywords tag because I feel they can act like a stop word to some search engines.
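As a toy illustration of the two formats, here is a hypothetical helper that assembles a META Keywords tag with or without commas. The function name and defaults are my own, not part of any standard:

```python
# Sketch: build a META Keywords tag from a list of keywords/phrases.
# Whether to use commas is, as noted above, a personal choice.
def keywords_meta(phrases, use_commas=False):
    """Return a META Keywords tag string for the given phrases."""
    sep = ", " if use_commas else " "
    return f'<meta name="keywords" content="{sep.join(phrases)}">'
```

Either form keeps the phrases intact in the tag; the difference is purely in the separator.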

[ADDED: A few people reading this have asked me *which* search engines still support the META Keywords Tag. Judging by Danny’s tag retrieval test, it looks like only Yahoo and Ask continue to support the tag, so don’t stress about it!]

I just noticed that my company’s homepage has five meta description tags within the head tag. Will this have any negative ramifications?

Thank you, Heather

Dear Heather,

I am curious as to how your site got that many meta description tags. In fact, you also seem to have multiple keyword meta tags. You may get penalized by Google for duplicate content, but regardless, it’s bad design practice to have improperly formatted meta tags, so I would remove the extra tags ASAP. I wrote a post about meta tags once on my own blog, but here are a few tips.

(1) Keep the description tag down to about 120 characters. (2) Include a compelling call-to-action, since this tag usually shows up on the SERPs (search engine results pages). (3) Be sure your important keywords are used at the beginning of the description. (4) Don’t keyword-stuff, and don’t use the same keywords more than a couple of times, even as different variations.
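If you want to sanity-check descriptions against these tips automatically, here is a rough sketch. The 120-character limit, the "opening" window and the stuffing threshold are taken from the tips above (and my own choice of cutoffs), not from any official guideline:

```python
# Sketch: lint a meta description against the four tips above.
def check_description(description, keywords, max_len=120):
    """Return a list of warnings for a meta description string."""
    warnings = []
    # Tip 1: keep it to about 120 characters.
    if len(description) > max_len:
        warnings.append(f"too long: {len(description)} chars (aim for ~{max_len})")
    lowered = description.lower()
    # Tip 3: important keywords should appear early (here: first 60 chars).
    for kw in keywords:
        if kw.lower() not in lowered[:60]:
            warnings.append(f"keyword '{kw}' missing from the opening")
    # Tip 4: flag keyword stuffing (same keyword more than twice).
    for kw in keywords:
        count = lowered.count(kw.lower())
        if count > 2:
            warnings.append(f"keyword '{kw}' repeated {count} times")
    return warnings
```

An empty list means the description passes these (mechanical) checks; the call-to-action in tip 2 still needs a human eye.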

I have recently optimized a friend’s website. The site was already listed with Google and Yahoo etc. I have noticed that since uploading the site a few weeks ago, the new description and title for the home page are now listed, along with a few of the new pages.

In the SEO 201 course, you recommended submitting different listing descriptions for each search engine/directory. However, all the search engines are just using the title and description from each page they have listed.

1) Should I be submitting pages not listed on the popular search engines, or wait till they find them?

2) Should I only submit alternative descriptions where the site is not currently listed, and do I only need to submit the home page?

With thanks

Peta

Dear Peta

You generally don’t need to submit sites to search engines, as they will be discovered provided there is at least one site linking to them. But you should make sure that each page on your site is being indexed. You can do this by creating an XML sitemap of your site and submitting it to Google via Webmaster Tools (and also to Yahoo). More info is available at www.sitemaps.org.
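For a small site, a sitemap in the sitemaps.org format can be generated with a few lines of code. This is a minimal sketch with placeholder URLs; real sitemaps can also carry optional fields such as lastmod and priority:

```python
# Sketch: build a minimal sitemaps.org-style XML sitemap from a URL list.
from xml.etree import ElementTree as ET


def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url_el, "loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")
```

Save the result as sitemap.xml at your site root, then point Webmaster Tools at it.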

Regarding different descriptions and titles – search engines will use whatever they think is the most relevant snippet from a page in relation to the search query. This could be taken from the title tag, the description or from the text on the page itself. You can control this to some extent by making sure each page on your site is optimized for a small range of target keywords/phrases, so that each page has the opportunity to rank on its own merit.

When I talk about submitting different descriptions, I am generally talking about when submitting your site to niche directories and search engines that don’t automatically crawl sites to discover new pages. If you use different descriptions for these submissions, you can easily track keyword referrals in your log files and recognize which sites are bringing you the most traffic. I hope this answers your question.

I am hoping that you can help me, as this has been driving me crazy! For certain search keywords such as “buy Taser”, my Index (home) page goes to page 1 in Google for ONE day, then jumps to page 13 and climbs up to page 20 or so, then goes back to page 1 for one day. So, about once every two weeks those two keywords are on page one in Google for one day!

I do not have this problem with MSN. I am totally baffled. I was running Google Base thinking maybe there was a connection, but I inactivated the product search a few days ago, so I guess that is not the problem. The site is about 9 months old. Is it because I have a yearly expiration domain and Google thinks it will expire soon? I tried so many things. Please help ASAP. Thank you SO much!

To answer your question about shifting rankings: in a nutshell, it’s because Google uses different datacenters to show results and shuffles between them (they each have slightly different ranking algorithms). So on one datacenter your site might be on page 1 for a search query, but for that same query done a day (or an hour) later you might be on page 4, because the results are being served from a different storage facility.

Different pages from your site and your competitor’s sites might also be stored on different datacenters, meaning that pages that normally rank well may not appear at all depending on which datacenter Google is using to fetch search results and whether or not all your indexed pages are listed in that datacenter. Your competitors may have more pages indexed by Google across all datacenters so they seem to be consistently outranking you. Or else they have simply done a better job of optimizing their pages to match search queries.

But the datacenter issue is the least of your worries. Here are just some of the problems I see with your site:

The Title and META tags are poorly constructed and not optimized for performance on search engines. This can partly be blamed on the tag limitations of Yahoo SiteBuilder, but mostly it is just poor keyword choices and incorrect formatting. For example, your META Description tag contains a bunch of keywords instead of a readable sentence! Your poor Title and META tags are limiting the ability of each of your site’s pages to be found in search engines.

You’ve got the worst case of code bloat that I’ve seen in years, thanks to excessive code added to your HTML pages by the SiteBuilder program and the page author. Code bloat happens when unnecessary code snippets are added to your HTML during the editing of your pages. A very common way this happens is when you cut and paste text from one program into your web editing software. For example, if you cut and paste from MS Word into your web editor, you can often find extraneous code (such as span tags) added. These snippets build up, add to your file size and can often lead to invalid code, meaning that Googlebot and other search robots may have problems indexing your pages and abandon the site, so fewer pages are indexed and included in their datacenters. Apart from that, code bloat impacts the ranking relevancy of your site because it affects the keyword density of your pages. For example, if your competitor mentions “buy taser” in their page text the same number of times as you do, but their page has less code to wade through, it is likely that their page will rank higher than yours for the search query “buy taser”.
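One crude way to put a number on code bloat is to measure a page’s text-to-code ratio. The sketch below is illustrative only (there is no official metric), and the regex-based tag stripping is deliberately rough:

```python
# Sketch: estimate what fraction of a page's raw bytes is visible text.
# A low ratio suggests code bloat. Rough heuristic, not an official metric.
import re


def text_to_code_ratio(html):
    """Return the fraction of the raw page that is visible text (approx.)."""
    # Drop script/style bodies, then strip all remaining tags.
    stripped = re.sub(r"(?is)<(script|style).*?</\1>", "", html)
    text = re.sub(r"(?s)<[^>]+>", "", stripped)
    text = " ".join(text.split())
    return len(text) / max(len(html), 1)
```

Comparing the ratio before and after cleaning out the extraneous markup gives a quick sense of how much bloat you removed.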

As I suspected, your code does not validate. When I ran it through the W3C Markup Validator, it spat back 236 errors, including a missing DOCTYPE! Google and other engines are pretty forgiving these days when it comes to invalid code, but even if some pages are being successfully indexed, the errors could well be sabotaging your site’s ability to rank well.

There don’t seem to be many internal or external links pointing to your site. You should try to gain some links from other web sites in your industry as theme-based links will help boost your position in Google.

There’s lots more wrong with your site, but I think I’ve given you plenty to get on with for your $10 coffee donation. The rest is up to you. As I always say to businesses using free or cheap web design and hosting tools online, you get what you pay for. If you want potential customers to take your business seriously, YOU need to take it seriously and spend some time and money addressing your site’s compatibility with search engines.

You should consider paying a site designer to build you a better looking site that can be properly optimized. If you can’t afford a professional site design, consider installing the (free) WordPress blogging platform on your server and taking full control over your site that way. Teach yourself – it’s free – and then hire a search engine optimizer to get your site ranking better. If you can’t afford a search engine optimizer, consider posting your requirements on our Search Engine College jobs board as there are a lot of SEO students just itching to sharpen their skills on a real site.

I’d also recommend downloading the free Search Engine Optimization lesson from Search Engine College so you can better understand what makes a site rank well in search engines and take control of your own site’s destiny. Good luck!