
Wednesday, March 14, 2018

The 2018 Guide to Google Webmaster Tools – Google Index, Crawl, and More

Back in 2010, we wrote a thorough beginner’s guide to Google Webmaster Tools. But since then, the tool has changed significantly.

We’ve updated this guide to include new ways to set up your website with Webmaster Tools, the new data included in Webmaster Tools about your website, important data you might have forgotten about, and how to continually monitor for any issues that might affect your search engine rankings.

Setting Up Your Website with Webmaster Tools

If you haven’t already, the first thing you will need to do is set up your website with Webmaster Tools.

To do this, visit the Google Webmaster Tools website and sign in with your Google Account – preferably the one you are already using for Google Analytics.

Click the red Add Property button to begin.

Next, you will have to verify this site as yours.

Previously, this involved having to embed code into your website header or upload an HTML file to your web server.

Now, if you already have Google Analytics, you can verify your site by connecting Webmaster Tools to Google Analytics.

To do so, select the Use your Google Analytics account option.

Once your site is verified, you will want to submit a sitemap if you have one available.

This is a simple XML file that tells Google what pages you have on your website.

You’ll find the option to add a sitemap under the “Crawl” tab of your toolbar.

If you have one already, you can usually find it by typing http://yourdomain.com/sitemap.xml into your browser.

To create a sitemap if you don’t already have one, you can use online tools like XML Sitemaps.
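If you'd rather see what a sitemap actually contains, here is a minimal sketch that builds one with Python's standard library. The URLs are placeholders – substitute your site's real pages.

```python
# Minimal sketch of what a sitemap.xml contains, built with the Python
# standard library. The URLs below are placeholders for your real pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML listing each page URL in a <loc> tag."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = page
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["http://yourdomain.com/", "http://yourdomain.com/about/"])
print(sitemap)
```

Each page gets its own `<url>`/`<loc>` entry; that's really all a basic sitemap is.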

If your site runs on WordPress, you can use a sitemap plugin instead. Once you have activated the plugin, look under Settings in the WordPress dashboard and click on XML-Sitemap.

The plugin should have already generated your sitemap, so there’s nothing else you have to do.

You’ll find your sitemap URL at the very top of the page.

Copy the link address and head back over to your Webmaster Tools page.

Then paste the portion of the URL that comes after http://yourdomain.com/ into the box to submit your sitemap to Google Webmaster Tools.
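If you're unsure which portion of the URL that is, this small sketch with Python's standard library shows the split (the sitemap URL is hypothetical):

```python
# Sketch: extract the part of a sitemap URL that comes after the domain --
# the part Webmaster Tools asks you to paste in. The URL is a placeholder.
from urllib.parse import urlparse

sitemap_url = "http://yourdomain.com/sitemap.xml"
relative_path = urlparse(sitemap_url).path.lstrip("/")
print(relative_path)  # sitemap.xml
```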

You’ll now be able to refresh the page and see the sitemap you’ve submitted.

It may take a few days for Webmaster Tools to start pulling information about your website if this is your first time setting it up.

Be sure to wait a bit, then continue on to see what you can learn from Webmaster Tools.

Valuable Information within Webmaster Tools

Once you have data in Webmaster Tools, you will be able to view the following about your website.

These are only highlights: the newer types of data in Google Webmaster Tools and the most important reports you should remember to check periodically.

Dashboard

When you visit your website in Webmaster Tools, you will first come to your dashboard.

This is an overview of the important data within Webmaster Tools. You can visit specific areas such as your Crawl Errors, Search Analytics, and Sitemaps from this screen by clicking on the applicable links.

You can also navigate to these areas using the menu in the left sidebar.

Search Appearance

In the left sidebar, the first option you’ll see is Search Appearance.

This section gives you an overview of how your site will appear on the Search Engine Results Page.

Optimizing how your site appears here is one way to improve your site’s SEO.

Structured Data

The Structured Data tab will send you to a report page.

This gives you insight into how Google is viewing the content on your page, and if there are any errors that stand out.

If you use Accelerated Mobile Pages (AMP), you’ll be given a boilerplate piece of code that you can customize to your site.

When you set it up, Webmaster Tools will break down your page and show a chart of successfully indexed pages as well as any AMP-specific errors.
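Structured data is commonly added to a page as a JSON-LD script block. As a rough illustration, here is a sketch that builds a schema.org Organization snippet with Python's standard library – the organization name and URL are placeholders, not anything Google provides:

```python
# Sketch: build a JSON-LD structured-data snippet (schema.org Organization).
# The name and URL are placeholder values.
import json

data = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Your Company",
    "url": "http://yourdomain.com/",
}

# This <script> block would go in the page's <head>.
snippet = '<script type="application/ld+json">\n' + json.dumps(data, indent=2) + "\n</script>"
print(snippet)
```

The Structured Data report then tells you whether Google parsed markup like this without errors.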

Search Traffic

The Search Traffic section is the second dropdown you can select in the sidebar. Here, you’ll find a breakdown of how your site is performing from a variety of angles and where you can improve in the future.

Manual Actions

The Manual Actions tab shows whether Google has flagged your site for violating its quality guidelines. It’s one of the ways that Google has taken action against web spam.

Mobile Usability

On the Mobile Usability tab, you can check to make sure that all of your website’s pages are aligned with what Google considers best practice.

You can have issues with text size, viewport settings, or even the proximity of your clickable elements.

Any of these problems, as well as other errors, can negatively affect your mobile site’s rankings and push you lower on the results page. Finding and fixing these errors will help your user experience and results.

Google Index

Index Status

This report gives you data about the URLs that Google has tried to index on your selected property over the past year.

As Googlebot crawls the Internet, it processes each page it comes across to compile an index of every word it sees on every page.

It also looks at content tags and attributes like your Titles or alt texts.
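To make that concrete, here is a sketch of the kind of extraction a crawler performs, pulling the page title and image alt text out of HTML with Python's standard library. The sample page is made up.

```python
# Sketch: extract the <title> text and <img> alt attributes a crawler would
# index from a page. The sample HTML below is hypothetical.
from html.parser import HTMLParser

class TagAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "img":
            self.alts.append(dict(attrs).get("alt"))  # None flags a missing alt

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = ('<html><head><title>Home</title></head>'
        '<body><img src="a.png" alt="Logo"><img src="b.png"></body></html>')
auditor = TagAuditor()
auditor.feed(page)
print(auditor.title)  # Home
print(auditor.alts)   # ['Logo', None]
```

A `None` in the alt list is exactly the kind of missing attribute worth fixing before Google crawls the page.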

This graph shows a breakdown of the URLs on your site that have been indexed by Google and can thus appear in search results.

As you add and remove pages, this graph will change with you.

And don’t worry too much if you have fewer indexed pages than you think you should. Googlebot filters out URLs it sees as duplicates or non-canonical, along with any that carry a noindex meta tag.

You’ll also notice a number of URLs that have been disallowed from crawling by your robots.txt file.

And you can also check on how many URLs you’ve removed with the Removal Tool. This will most likely be a low number.

Blocked Resources

If you want to know whether any part of your site is somehow beyond Google’s crawling ability, take a look at your Blocked Resources report.

This report doesn’t show every resource, though – just the ones Google thinks you can control and fix.

Since Googlebot needs access to a great deal of information on your page in order to index you correctly, a blocked resource can affect how your page performs in Google’s search rankings.

Finding and fixing these errors will help Google consistently rank you accurately.

Remove URLs

If for some reason you need to temporarily block a page from Google’s search results, this is where you would go.

You can hide a page for approximately 90 days before the removal wears off.

If you want to permanently keep a page out of Google’s index, you’ll have to do it on your actual website – for example, by deleting the page, returning a 404, or adding a noindex meta tag.

Crawl

This section of tools helps you break down the performance of your site according to how Googlebot sees it.

By utilizing the insights here, you can find broken links, see how often Google takes a look at you, and set up technical parameters for how you want Google to crawl your site.

Crawl Errors

It’s never good to have broken links on your website.

But when you do, or suspect that you might, you can verify what’s actually going on with this section of your Webmaster Tools.

You’ll be given an overall graph of your site’s performance.

This lets you know how errors have increased or decreased over time. You can see here we have quite a few broken links to fix.

Below the graph, you’ll find a list of URLs, response codes, and the date that the error was detected.

Google gives these to you in a priority ranking, so you know which ones to deal with first.

By clicking on the URL, you’ll also be given an in-depth look at what exactly is going on.

If you have a lot of errors like this, focus on redirecting the ones with the most incoming links.
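The response codes in that list are standard HTTP status codes, so interpreting them is straightforward. Here is a sketch using Python's standard library – the URLs and codes are made-up examples of what a report might contain:

```python
# Sketch: interpret the response codes listed in a crawl-errors report.
# The (path, code) rows below are hypothetical examples.
from http import HTTPStatus

crawl_errors = [
    ("/old-post", 404),
    ("/members-only", 403),
    ("/broken-redirect", 500),
]

for path, code in crawl_errors:
    print(f"{path}: {code} {HTTPStatus(code).phrase}")
# /old-post: 404 Not Found
# /members-only: 403 Forbidden
# /broken-redirect: 500 Internal Server Error
```

404s are usually fixed with redirects, while 5xx codes point to a server-side problem rather than a broken link.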

Crawl Stats

For a more in-depth analysis of how often Googlebot is looking at your site, you can select the Crawl Stats tab.

Here, Webmaster Tools shows you how often the pages of your site are crawled, how many kilobytes are downloaded per day, and what your site’s download times are.

According to Google, there is no “good” crawl number, but they do have advice for any sudden spikes or drops in your crawl rates.

Fetch as Google

This tool is helpful as it lets you actually do a test run of how Google crawls and renders a specific URL on your site.

It’s a helpful way to confirm that Googlebot can access a page, rather than leaving it to guesswork.

If you’re successful, the page will render, and you’ll be able to see if any resources are blocked to Googlebot.

When you get to the debugging point of web development, you can’t beat this free tool.

Robots.txt Tester

If you’re using a robots.txt file to block Google’s crawlers from a specific resource, this tool allows you to double-check that everything is working.

So if you have an image you don’t want to appear in a Google Image search, you can test your robots.txt here to make sure that your image isn’t popping up where you don’t want it.

When you test, you’ll either receive an Accepted or Blocked message, and you can edit accordingly.
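You can reproduce a basic version of this check on your own machine with Python's standard-library robots.txt parser. The rules below are a hypothetical file blocking an images directory:

```python
# Sketch: test robots.txt rules locally with the standard library.
# The rules and URLs below are hypothetical.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private-images/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

blocked = parser.can_fetch("Googlebot-Image", "http://yourdomain.com/private-images/photo.jpg")
allowed = parser.can_fetch("Googlebot-Image", "http://yourdomain.com/blog/")
print(blocked)  # False -- the image directory is disallowed
print(allowed)  # True  -- everything else is crawlable
```

This mirrors the Accepted/Blocked verdicts the tester gives you, just without Google's own parsing quirks.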

Sitemaps

I mentioned sitemaps earlier, so I’ll only cover this briefly.

Here, you will see information about your sitemap.

If you notice the last date your sitemap was downloaded is not recent, you might want to resubmit your sitemap to refresh the number of URLs submitted.

Otherwise, this helps you keep track of how Google is reading your sitemap and whether or not all of your pages are viewed as you want them to be.

URL Parameters

Google themselves recommend using this tool sparingly, as an incorrect URL parameter setting can negatively impact how your site is crawled.
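The underlying problem the tool addresses is that parameters like session IDs or sort orders can make one page appear under many URLs. As a rough illustration, here is a sketch with Python's standard library showing how stripping a non-essential parameter (the "sessionid" name here is hypothetical) collapses those duplicates into one canonical URL:

```python
# Sketch: show how URL parameters create duplicate URLs for a single page,
# and how stripping a non-essential parameter ("sessionid", hypothetical)
# collapses them to one canonical URL.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_param(url, param):
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

urls = [
    "http://yourdomain.com/shop?item=42&sessionid=abc",
    "http://yourdomain.com/shop?item=42&sessionid=xyz",
]
canonical = {strip_param(u, "sessionid") for u in urls}
print(canonical)  # {'http://yourdomain.com/shop?item=42'}
```

Telling Google a parameter doesn't change page content has the same effect – which is also why getting the setting wrong can hide real pages from the crawler.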