Understanding the proper usage of Google Webmaster Tools

Google Webmaster Tools is a set of free utilities that helps webmasters optimize their website or blog.

Preference of verification methods

We prefer uploading an HTML file to the domain root to verify the site, but there is nothing wrong with the other verification methods; you can use any of them.

In Google Webmaster Tools, you will find five main links on the left-hand side, namely:

Messages

Site configuration

Your site on the web

Diagnostics

Labs

We will go into a bit of detail on each of them.

Messages in Google Webmaster Tools

Unless you have made a major change (like changing the crawl rate), you will not get any new messages here. All messages are also forwarded by email.

Site configuration

If you expand this option, you will get five sub-links:

Sitemaps

Crawler access

Sitelinks

Change of address

Settings

First, we can handle the sitemap, an .xml file that tells Google which URLs to crawl. You can also see the number of submitted and indexed URLs here. Check it weekly.
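
To make the idea concrete, here is a minimal sketch of how such a sitemap file could be generated, using Python's standard library. The URLs are hypothetical placeholders; a real sitemap lists your own site's pages.

```python
import xml.etree.ElementTree as ET

# Hypothetical page URLs; replace with your own site's pages.
urls = ["https://example.com/", "https://example.com/about/"]

# The sitemap root element uses the sitemaps.org namespace.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in urls:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# This string is what you would save as sitemap.xml and submit
# in the Sitemaps section of Google Webmaster Tools.
sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

A blogging platform usually generates this file for you (WordPress has plugins for it), but the underlying format is just this simple list of URL entries.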

In Crawler access, you are able to test your robots.txt before making any change, to see whether it works. For example, we wanted to disallow the /author URLs from being crawled, in order to avoid duplicate content issues. So, we added the line:

Disallow: /author

Then, write the full URL in the field labeled "URLs - Specify the URLs and user-agents to test against" and click the Test button:

The path parameters work like those of Linux/Apache.

We got our expected result:

So, we can change robots.txt using this parameter.
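
You can also rehearse the same test offline. The sketch below uses Python's standard robots.txt parser with the rule added above; the domain and paths are hypothetical examples.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content mirroring the rule added above.
robots_txt = """\
User-agent: *
Disallow: /author
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any path starting with /author is blocked; other paths remain crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/author/abhishek"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))            # True
```

This mirrors what the Crawler access tester reports: the /author URLs come back as blocked, while the rest of the site stays open to crawling.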

Sitelinks is a section where you can check whether Google has generated sitelinks for your website; if you search for thecustomizewindows.com on Google, you will have seen ours. From this area, we can block certain sitelinks from appearing.


We wrote earlier on how to get sitelinks. They are a kind of reward for a quality website, because Google is promoting your content for free. Sitelinks change over time as your top landing pages change.

The next category, Change of address, is rarely used and should only be handled by an expert.

Finally, we can handle minor parameters in Settings, such as geographic targeting, the preferred domain (with www or without www), and the frequency of crawling. Keep the crawl frequency set to "Let Google determine my crawl rate (recommended)", mention your preferred domain, and set your geographic target only if you need it (for example, a website selling potatoes online in India will set it to India, not globally). If you want to sell your potatoes globally, do not touch anything there (i.e. keep it unchecked).
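
Setting the preferred domain in Webmaster Tools only tells Google which form you favor; it is good practice to also enforce it on the server itself. As a hedged sketch, assuming an Apache server with mod_rewrite enabled and a hypothetical example.com domain, an .htaccess rule like the following would 301-redirect the non-www host to the www version:

```apache
# Hypothetical sketch: redirect example.com to www.example.com with a 301,
# assuming Apache with mod_rewrite enabled.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Swap the hosts if you prefer the non-www form; the point is that server and Webmaster Tools should agree on one canonical domain.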

Your site on the web

If you expand this option, you will get five sub-links:

Search queries

Links to your site

Keywords

Internal links

Subscriber stats

There is little to explain about this category; it is quite easy. Just click and look through the items. The important one is the Keywords section, where you will see the most important keywords for your website as determined by Google.

For example, if the keyword Abhishek appears for our website(s), we definitely need to remove that word from the sites as much as possible, because it is irrelevant.

Diagnostics

In the same way, if you expand this option you will get:

Malware

Crawl errors

Crawl stats

Fetch as Googlebot

HTML suggestions

This is the most important part of Google Webmaster Tools in our opinion.

Crawl errors shows the errors Google has encountered while crawling; ideally, you should not have any. Somewhat confusingly, URLs you disallowed in robots.txt also show up here as "Restricted by robots.txt". That is not, however, an error.

Crawl stats gives us data on how Google crawls our websites: the figures can be high in some cases, and we are especially interested in having a fast-loading website.

HTML suggestions is very important; it is what makes this section as a whole so valuable. You can disallow in robots.txt any of the URLs reported here.

Labs

Instant Previews are page snapshots that are displayed in search results. In general, Google generates these preview images at crawl time. If the Instant Preview looks different from what users see, it could indicate that Google is having difficulty crawling your page.

Page speed can be reviewed retrospectively in Site performance. You will notice that the data is almost the same as what webpagetest showed during that period.