
Understanding the Most Useful Tool in the SEO Toolbox: The XML Sitemap

An XML sitemap is a powerful tool for optimizing a website for search engines, but using it properly requires the right knowledge and skills. Many SEO experts believe that an XML sitemap gets the pages of a website indexed. Unfortunately, this is a severe misconception, and it needs to be corrected right away. What actually happens when you submit an XML sitemap to Google Search Console is that you tell the search engine that, in your opinion, the pages in the sitemap are high-quality search landing pages. Google may then choose to index them.
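For reference, an XML sitemap is just a plain XML file listing the URLs you consider high-quality landing pages, following the sitemaps.org protocol (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blue-widgets</loc>
  </url>
</urlset>
```

Only the loc element is required per URL; lastmod is optional but useful when a page's content changes regularly.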

Consistency

It is important to maintain consistency when creating your XML sitemap. Consistency means never including a page that is set to "noindex,follow." The "noindex" directive tells Google not to index that particular page, so listing the same page in your sitemap sends a contradictory signal. The "follow" directive, in contrast to "nofollow," tells Google it may still follow the outbound links from the page.
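In practice, that directive is a meta tag in the page's head section. A sketch of a utility page that should stay out of the index while still passing link equity:

```html
<head>
  <!-- Keep this page out of the index, but let Google follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```

Any page carrying this tag should then be left out of the XML sitemap.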

Webpage categorization

Every page of a website broadly falls into one of two categories:

Utility pages: pages that are useful to your users but are not really search landing pages, and

Top-quality search landing pages

Utility pages should never be part of the XML sitemap. Block them either with robots.txt or with meta robots "noindex,follow." On the other hand, almost every top-quality search landing page should be included in the XML sitemap, and none of them should be blocked by meta robots "noindex,follow" or by robots.txt.
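A minimal robots.txt sketch for blocking utility pages (the paths here are hypothetical examples, not defaults of any platform):

```
User-agent: *
Disallow: /login/
Disallow: /password-reset/
Disallow: /share/
```

Remember the trade-off discussed below: robots.txt prevents crawling entirely, while meta robots "noindex,follow" lets link equity keep flowing.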

Google understands it is natural for every site to have a certain number of utility pages meant for users, such as pages for sharing content, replying to queries and comments, logging in, and retrieving passwords. If the XML sitemap you create contains all of these pages, you are simply telling Google that you do not know which content on your site is high quality and which is not.

Website quality

Suppose you have a website of 100 pages, of which 37 are top-quality search landing pages and the rest are utility pages. When generating your XML sitemap, include only those 37 pages and leave out the rest. Now, when Google crawls the 37 pages in the sitemap, suppose it grades 10 pages "A", 8 pages "B+", 12 pages "B", and the remaining 7 pages "B-". You score a good average for the entire site. If, instead, you submit all 100 pages in your XML sitemap, Google grades 63 of the pages "D" or "F", and the average score for your website drops sharply. On the basis of that poor score, Google will send fewer visitors to your site.
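The grades above are a metaphor rather than a real Google metric, but the arithmetic is easy to make concrete. A small sketch, assuming an illustrative 4.0-style point scale for the letter grades:

```python
# Illustrative grade points on a 4.0-style scale (the values are assumptions,
# not anything Google publishes).
GRADE_POINTS = {"A": 4.0, "B+": 3.3, "B": 3.0, "B-": 2.7, "D": 1.0, "F": 0.0}

def average_grade(page_grades):
    """Average grade across all pages evaluated, given {grade: page_count}."""
    total = sum(GRADE_POINTS[g] * count for g, count in page_grades.items())
    pages = sum(page_grades.values())
    return total / pages

# Submit only the 37 strong landing pages:
curated = {"A": 10, "B+": 8, "B": 12, "B-": 7}

# Submit all 100 pages, so the 63 utility pages also get graded:
everything = {"A": 10, "B+": 8, "B": 12, "B-": 7, "D": 40, "F": 23}

print(round(average_grade(curated), 2))     # high site-wide average
print(round(average_grade(everything), 2))  # average dragged down by utility pages
```

The exact split of the 63 weak pages between "D" and "F" is made up for the example; the point is that the curated submission always averages higher.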

Google will treat the pages you include in your XML sitemap as important, although it will not ignore the rest. There is also a subtle but important difference between using robots.txt and using meta robots to keep a page out of the index. When you block a page with robots.txt, you throw away any link equity it has, whereas with meta robots "noindex,follow" the link equity of that page flows out to the other pages it links to.

Different situations

There are occasions when you will have crawl-bandwidth issues and Googlebot will spend a long time fetching your utility pages. When that keeps Googlebot from reaching your vital pages, block the utility pages with robots.txt. Blogs, product category pages, and new product pages usually have their content updated regularly. There may be other pages worth indexing, but not at the cost of neglecting these core pages. In that case, include only the core pages in the sitemap to signal their importance to Google relative to the other, unblocked pages.

When creating an XML sitemap, you may apply any of these three criteria. Omit from indexing the pages that have:

No product image

Less than 200 words of exclusive description

No comments or reviews
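A sketch of how those three checks might be applied programmatically. The Page fields and the example thresholds are hypothetical, not from any real CMS:

```python
from dataclasses import dataclass

@dataclass
class Page:
    """Hypothetical page record; field names are illustrative assumptions."""
    url: str
    has_product_image: bool
    description_word_count: int
    review_count: int

def belongs_in_sitemap(page: Page) -> bool:
    """Apply the three exclusion criteria listed above."""
    if not page.has_product_image:
        return False                        # no product image
    if page.description_word_count < 200:
        return False                        # thin, non-exclusive description
    if page.review_count == 0:
        return False                        # no comments or reviews
    return True

pages = [
    Page("/widget-a", True, 350, 4),   # passes all three checks
    Page("/widget-b", True, 120, 9),   # thin description: excluded
]
sitemap_urls = [p.url for p in pages if belongs_in_sitemap(p)]
print(sitemap_urls)  # ['/widget-a']
```

The same filter can feed a sitemap generator so the sitemap and the index directives stay consistent.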

Remember, XML sitemaps do not need to be static files, so they do not even need a .xml extension.

The following tips will help you get the most out of submitting your XML sitemap to Google Search Console.