5 ways to use Google Webmaster Tools for SEO

Most people attempting to optimise their website for search are familiar with Google Analytics and how to use it to analyse and improve user experience and search traffic.

Did you know that Google Webmaster Tools also offers several useful resources for SEOs? We’ll run through the most important features available in GWT – from those that help you make sure your site is easily crawlable by search engines to those that generate new content ideas.

To get started, you’ll need to verify yourself as a website owner in GWT. This is easy if you already have a Google Analytics account associated with your site. If you don’t have a GA account, you’ll need FTP access to your site so you can verify ownership using one of the methods GWT provides – usually by uploading a file to your server or inserting a line of code into your homepage – though you can also verify ownership via your DNS settings.
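As an illustration, the line-of-code method involves pasting a snippet like the one below into your homepage’s `<head>`. The token here is a made-up placeholder – GWT generates a unique one for your account:

```html
<!-- Hypothetical example: GWT issues a unique verification token per site -->
<meta name="google-site-verification" content="EXAMPLE-TOKEN-REPLACE-WITH-YOURS" />
```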

Done that? Great. Then let’s get started!

1. Fixing broken links with the ‘Crawl errors’ report

Under the Health tab you’ll find a report labelled Crawl Errors. This is where Google reports all the pages on your site that it tried to crawl but couldn’t reach, for whatever reason. The error type an SEO is most interested in is ‘Not found’, so click on that.

A not found error means Google has followed a link to a page on your site but received a 404 error instead – either because the link points to the wrong place or because the page no longer exists.

In the above example you’ll see that the site in question has a growing number of 404s – these need to be remedied in order to avoid losing link equity. The more 404s Google encounters on your site, the less it will consider your site worth crawling.

Below the graph is a table of all the 404s Google has found. You can download this report to analyse in Excel, but that loses the most useful feature: clicking on each link opens a pop-up telling you which pages contain the links pointing to the 404-returning URL.
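If you do download the report, a short script can still do some of the triage for you. Here’s a minimal sketch in Python that tallies the 404s in an export – note that the column names (‘URL’, ‘Response Code’) are assumptions, so adjust them to match the headers in your actual download:

```python
import csv
import io

def count_404s(csv_text):
    """Return the URLs in a crawl-errors export that report a 404.

    Assumes columns named 'URL' and 'Response Code'; adjust these
    to match the headers in your actual GWT download.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["URL"] for row in reader if row["Response Code"] == "404"]

# Hypothetical sample rows standing in for a real export
sample = "URL,Response Code\n/old-page,404\n/contact,200\n/deleted,404\n"
print(count_404s(sample))  # ['/old-page', '/deleted']
```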

If the links you see are the broken links from your own pages, you now know where to go to fix them. If the target pages no longer exist, you can remove the links or apply internal 301 (permanent) redirects. If the links are on external sites, apply 301 redirects to the dead URLs to recapture the lost link equity – a 302 is a temporary redirect and won’t pass that equity on.
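On an Apache server, for example, such a redirect can be added to your .htaccess file along these lines. Both paths here are hypothetical – substitute your own dead URL and its closest live equivalent:

```apache
# Hypothetical paths: send a dead URL permanently to its replacement
Redirect 301 /old-page/ http://www.example.com/new-page/
```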

2. Using ‘Fetch as Google’ to index new content

If your site is relatively new, you might need to wait a while before Google finds your new content via links – unless you submit the new URLs via the ‘Fetch as Google’ tool and request that Google crawls the pages immediately.

Once the page has been fetched, click the ‘Submit to Index’ button that appears next to it to make sure the page – and, if you choose, the pages it links to – is added to Google’s index. You only get to do this ten times a week but, for a small site, that’s probably all you need.

3. Finding content opportunities with ‘Search Queries’

The ‘Search Queries’ report is probably the one you’ll use most frequently for SEO, as it tells you which are your top keywords and pages over the last 12 weeks by how often they appear in SERPs (search engine results pages) and how many clicks they’ve attracted.

The tab for ‘Top Queries’ tells you which are your top keywords, and you can sort these by the number of impressions, clicks, average position in search results and the changes in these stats over a selected period.

The above site has recently seen a dip in the number of search queries it’s making impressions for – perhaps due to the same cause as all the 404s.

By ordering the results by ‘Avg. position’ you can see which important keywords you rank for on page 1 and which you don’t but might be able to with a bit of on-page optimisation or targeted link-building.
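Once you’ve exported the Search Queries data, this ‘page 2 opportunity’ check is easy to script. A minimal sketch in Python – the keyword rows below are invented purely for illustration:

```python
def page_two_opportunities(queries, low=11, high=20):
    """Return keywords ranking on page 2 of the SERPs (positions 11-20),
    sorted best-first - prime candidates for on-page optimisation or
    targeted link-building."""
    candidates = [q for q in queries if low <= q["avg_position"] <= high]
    return sorted(candidates, key=lambda q: q["avg_position"])

# Hypothetical export rows: keyword, impressions, average position
data = [
    {"keyword": "free cash point", "impressions": 900, "avg_position": 14.0},
    {"keyword": "atm locator", "impressions": 2400, "avg_position": 3.0},
    {"keyword": "cash machine near me", "impressions": 500, "avg_position": 18.5},
]
for q in page_two_opportunities(data):
    print(q["keyword"], q["avg_position"])
# free cash point 14.0
# cash machine near me 18.5
```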

The ‘Top Pages’ tab gives you the same data for pages and, crucially, where a black arrow appears next to the page, you can see which keywords it’s getting impressions for. This is useful for identifying which pages to use to target certain keywords:

So, for example, if in the above example the keyphrase ‘free cash point’ was valuable, then the page in question, which already garners some traffic for this phrase, could be better optimised to rank for it.

4. Tracking indexation with sitemaps

The ‘Sitemaps’ tool requires that you set up and submit an XML sitemap to Google. The best tool I know of for doing this is Xenu Link Sleuth, which crawls your site and then generates an XML sitemap on request. A sitemap tells search engine crawlers where all your pages are, and this report makes tracking how many of your pages have made it into Google’s index a cinch.
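If you’d rather not use a third-party tool, generating a minimal sitemap yourself is straightforward. A sketch in Python using only the standard library – the page URLs are placeholders for your own:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs - substitute your own pages
pages = ["http://www.example.com/", "http://www.example.com/about/"]
print(build_sitemap(pages))
```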

The above site has an indexation percentage of 94%, which is pretty good. Tracking these numbers from week to week or month to month lets you see how Google algorithm updates have affected the indexation of your site. This is also where you can spot Google penalties: if, after the recent Penguin update, you saw a sharp dip in indexation, you might reason that you need to de-spam your backlink profile.

5. Avoiding duplicate content issues with ‘HTML Improvements’

Under the ‘Optimization’ heading you’ll also find the ‘HTML Improvements’ tool. This is where Google tells you about duplication issues and non-indexable content.

This page also shows whether (and where) you have duplicate meta descriptions or title tags – or whether these are too long or too short. Google provides an itemised list, so you can go in and change these manually or identify where you could use dynamically generated metadata.
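At scale you can run the same duplicate-title check yourself against crawl output. A sketch in Python – the page data below is made up for illustration:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group page URLs by title tag and return any title used by
    more than one page - candidates for rewriting."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl output: (URL, title tag) pairs
pages = [
    ("/products/red-widget", "Widgets | Example Shop"),
    ("/products/blue-widget", "Widgets | Example Shop"),
    ("/about", "About Us | Example Shop"),
]
print(find_duplicate_titles(pages))
# {'Widgets | Example Shop': ['/products/red-widget', '/products/blue-widget']}
```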

Where content appears to be duplicated, Google will normally index only one of the pages in question – even if the actual on-page content is different. Unique title tags and meta descriptions send the right signals to the search engines, so make sure that all your valuable, unique pages are properly discovered and recognised.

From the above example, it looks like some work needs to be done to fix that site.