3 key reasons to use an SEO log analyzer

Using an SEO log analyzer is well worth your time: it provides a wealth of technical and on-page information about your SEO performance and about how Google behaves on your website.
Log files record every connection to your website: HTTP status, date and time, referrer, user agent, client IP, response port, etc. Exploring this data unlocks tons of new possibilities to improve your rankings! Furthermore, combining your log file data with your crawl data gives you an exhaustive view of your performance and helps you optimize your Google crawl budget.
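As a concrete illustration, here is a minimal sketch of how the fields listed above can be extracted from one line of a standard Apache/Nginx "combined" access log. The log format and the sample line are assumptions; adapt the regex to your own server configuration.

```python
import re

# Assumed Apache/Nginx "combined" log format; adjust to your server config.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<datetime>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_log_line(line):
    """Return a dict of log fields, or None if the line doesn't match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

# Illustrative log line (made-up IP, URL, and timestamp).
line = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
        '"GET /pneus-4-saisons HTTP/1.1" 200 5120 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

hit = parse_log_line(line)
print(hit["url"], hit["status"], "Googlebot" in hit["user_agent"])
```

Once every line is parsed into fields like these, questions such as "which URLs does Googlebot actually visit?" become simple filters and counts.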

What can I learn using an SEO log analyzer?

Basically, a log analyzer helps you detect whether or not Google is dedicating enough of its crawl budget to your important pages. In some cases, Google visits old pages that don’t generate any value. That would be fine if Google had an unlimited crawl budget, but it doesn’t, and these old pages are crawled at the expense of valuable, recent ones.
We have listed some of the questions that log analysis can answer.

Are all my URLs being crawled properly?

In this example, we can clearly see that pages outside the structure are crawled by Google bots, while a significant share of pages belonging to the structure (around 51%) are not. Optimizations need to be implemented to make sure that 100% of the pages found by OnCrawl are also crawled by Google.
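The gap described above boils down to a set comparison between the URLs your crawler found in the site structure and the URLs Googlebot actually requested in the logs. A minimal sketch, with made-up URL lists for illustration:

```python
# URLs found in the site structure by a crawl vs. URLs Googlebot hit in the logs.
# Both sets are illustrative assumptions.
crawled_by_oncrawl = {"/", "/pneus-ete", "/pneus-hiver", "/pneus-4-saisons"}
crawled_by_google = {"/", "/pneus-ete", "/old-promo-2015"}

in_structure_not_crawled = crawled_by_oncrawl - crawled_by_google  # crawl budget gaps
orphans_crawled = crawled_by_google - crawled_by_oncrawl           # pages out of the structure

ratio = len(in_structure_not_crawled) / len(crawled_by_oncrawl)
print(f"{ratio:.0%} of structure pages were not crawled by Google")
print("Pages out of the structure hit by Googlebot:", sorted(orphans_crawled))
```

The first set points at pages Google is missing; the second reveals orphan pages that consume crawl budget without being part of your structure.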

Do I need to reduce my Google crawl budget on some parts of my website?

It is important to understand the relationship between your most crawled page groups and the page groups that drive the most SEO visits. Groups that are crawled very often by Google but that don’t generate any SEO visits are a waste of crawl budget. You are missing chances to convert on important pages from the SERPs because your crawl budget is not targeted at the right pages.
With the OnCrawl SEO log analyzer, you can identify these pages and take action. The graph below shows the relationship between SEO visits and crawl frequency.

Pages located in the bottom left corner are pages that are often crawled by Google but don’t generate many SEO visits, while the ones in the upper right corner are often crawled and drive many SEO visits. Here, the page group “Pneus 4 saisons”, in orange, is crawled quite often by Google but does not generate that much traffic.

The goal here is to reduce the crawl volume on pages that don’t generate traffic in order to favor the pages that do.
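The trade-off above can be sketched as a simple filter over per-group metrics: flag groups with a high crawl frequency but few SEO visits. The group names, numbers, and thresholds below are illustrative assumptions, not real OnCrawl data.

```python
# Illustrative per-group metrics; thresholds are assumptions to tune per site.
page_groups = {
    "Pneus 4 saisons": {"crawl_hits": 12000, "seo_visits": 300},
    "Pneus hiver":     {"crawl_hits": 9000,  "seo_visits": 8500},
    "Old promos":      {"crawl_hits": 7000,  "seo_visits": 40},
}

def wasted_crawl_budget(groups, min_hits=5000, max_visits=500):
    """Groups crawled often (>= min_hits) but driving few visits (< max_visits)."""
    return sorted(name for name, g in groups.items()
                  if g["crawl_hits"] >= min_hits and g["seo_visits"] < max_visits)

print(wasted_crawl_budget(page_groups))  # candidates for crawl-budget reduction
```

Groups flagged this way are natural candidates for actions such as robots.txt rules, noindex tags, or internal linking changes.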

How is my HTTPS migration performing?

Using an SEO log analyzer combined with a crawler can help you monitor and verify that your HTTPS migration has behaved as expected. There are several KPIs you should investigate. Check that:

100% of URLs respond with a 200 OK HTTP status;

0% of URLs return an X-Robots-Tag: noindex header;

All old URLs return a 301 HTTP status pointing to the new URLs (if a new URL pattern or the HTTPS protocol is enabled);

All old URLs have been crawled at least once by Googlebot.
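The checklist above can be automated against a crawl or log export. Here is a minimal sketch, assuming each exported record carries the status code, redirect target, X-Robots-Tag value, and a flag for whether Googlebot has hit the URL; the field names and sample data are assumptions.

```python
# Validate a post-migration export against the four KPIs listed above.
# Record fields (status, location, x_robots_tag, googlebot_crawled) are assumed names.
def check_migration(old_urls, new_urls):
    """old_urls/new_urls: dicts mapping URL -> response record. Returns problems found."""
    problems = []
    for url, r in new_urls.items():
        if r["status"] != 200:
            problems.append(f"{url}: expected 200, got {r['status']}")
        if "noindex" in r.get("x_robots_tag", ""):
            problems.append(f"{url}: X-Robots-Tag noindex")
    for url, r in old_urls.items():
        if r["status"] != 301 or not r.get("location", "").startswith("https://"):
            problems.append(f"{url}: expected 301 to an HTTPS URL")
        if not r.get("googlebot_crawled"):
            problems.append(f"{url}: not yet crawled by Googlebot")
    return problems

old = {"http://example.com/a": {"status": 301, "location": "https://example.com/a",
                                "googlebot_crawled": True}}
new = {"https://example.com/a": {"status": 200, "x_robots_tag": ""}}
print(check_migration(old, new))  # an empty list means all KPIs pass
```

An empty result means the migration meets all four criteria; any entries pinpoint the URLs to fix.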

With OnCrawl, you can:

Launch a crawl before and after the migration to get an exhaustive picture of each state;

Run a CrawlOverCrawl to compare data quality before and after the migration: compare your staging version against your live version to identify changes. Has internal popularity been affected? What is the difference in page volume between your staging and live versions? OnCrawl lets you compare your crawls and see how your internal popularity has evolved;

Monitor how your live version behaves by looking at status codes. That way, you will be able to identify increases or decreases in 302s or 404s in real time. Why wait for a Search Console alert when you can get this information directly?

Export your reports as CSV to compile your metrics.
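The status code monitoring mentioned in the list above amounts to tallying watched status codes per day from parsed log records and watching for spikes. A minimal sketch, with made-up records:

```python
from collections import Counter

# Illustrative parsed log records; the record structure is an assumption.
records = [
    {"date": "2023-10-10", "status": 200},
    {"date": "2023-10-10", "status": 404},
    {"date": "2023-10-11", "status": 404},
    {"date": "2023-10-11", "status": 404},
    {"date": "2023-10-11", "status": 302},
]

def error_trend(records, watched=(302, 404)):
    """Count watched status codes per (date, status) pair."""
    counts = Counter((r["date"], r["status"]) for r in records if r["status"] in watched)
    return dict(counts)

print(error_trend(records))
# A jump in 404s from one day to the next signals a problem worth investigating.
```

Comparing each day's counts against the previous day is enough to raise an alert long before a Search Console notification arrives.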

These 3 examples are just a sample of all the possibilities offered by SEO log file analysis. We would be pleased to hear your thoughts and learn more about the metrics you are monitoring or would like to monitor.

Want to know more about our SEO log analysis services? Request a demo today with a member of our team.