Note: Make sure to verify the HTTPS and HTTP versions of your site. This means adding https://example.com AND http://example.com as separate properties.

Once you’ve done that, you can proceed to step 2.

Step #2: Decide on a “Preferred Domain”

Your second step is to set a preferred domain.

This tells Google whether to use the www or non-www version of your site.

For example, Google can see your URLs as:

https://Example.com
Or
https://www.Example.com

I personally don’t like “WWW”. But that’s just me. It honestly has zero impact on your SEO.

So:

Why does this matter?

The version you choose here is the version that will show up in the search results.

So if you go with a “WWW” version, your site will have a “WWW” in the search results.

Important Note: You can also choose “Don’t set a preferred domain”…

…but I wouldn’t recommend it. As Google says: “If you don’t specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages.”

This can make your backlinks MUCH less powerful. So make sure not to pick “Don’t set a preferred domain”.

Pro Tip: Set up an automatic 301 redirect that sends traffic and links FROM your non-preferred domain TO the preferred version. For example, all links that point to https://www.backlinko.com/ automatically redirect to https://backlinko.com/. This is best for user experience and SEO.
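If your site runs on Apache, that redirect can be a few lines in your .htaccess file. Here's a sketch, assuming mod_rewrite is enabled and that you prefer the non-www version (swap the domains around if you prefer www):

```apache
# Permanently (301) redirect www.example.com/* to example.com/*
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

You can sanity-check the redirect with `curl -I https://www.example.com/` and look for a `301` status plus a `Location: https://example.com/` header.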

Step #3: Set Your Target Country

Google does a pretty good job figuring out which country your site is targeting. To do that, they look at data like:

Just load up the “HTML Improvements” report and Google will tell you what’s up:

Need specifics? Just click on a category for a full list of pages with issues.

Simple.

Chapter 3: Get More Organic Traffic with the Performance Report

In this chapter we’re going to deep dive into my favorite part of the GSC: “The Performance Report”.

Why is it my favorite?

Because I’ve used this report to increase organic traffic to Backlinko again and again.

I’ve also seen lots of other people use the Performance Report to get similar results.

So without further ado, let’s get started…

What Is The Performance Report?

The “Performance” report in Google Search Console shows you your site’s overall search performance in Google. This report not only shows you how many clicks you get, but also lets you know your CTR and average ranking position.

And this new Performance Report replaces the “Search Analytics” report in the old Search Console (and the old Google Webmaster Tools).

Yes, a lot of the data is the same as the old “Search Analytics” report. But you can now do cool stuff with the data you get (like filter to only show AMP results).

But my favorite addition to the new version is this:

In the old Search Analytics report you could only see search data from the last 90 days.

(Which sucked)

Now?

We get 16 MONTHS of data:

For an SEO junkie like me, 16 months of data is like opening presents on Christmas morning.

(In fact, I used to pay for a tool to automatically pull and save my old Google Webmaster Tools data. Now, thanks to the beta version of the new GSC, it’s a free service)

You want to filter out everything that’s beating that expected CTR of 4.35%. That way you can focus on pages that are underperforming.

So click the filter button again and check the “CTR” box.

(Make sure you leave the “Position” box ticked)

Then, set the CTR filter to “Smaller than” 4.35.

So what have we got?

A list of keywords that are ranking 5 or lower AND have a CTR less than 4.35%.

In other words:

Keywords you could get more traffic from.

We just need to bump up their CTR.
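If you'd rather work from an exported performance report, the same two filters are easy to script. Here's a minimal sketch in Python, assuming each row has query, clicks, impressions, and position fields (the field names are my assumption; match them to your actual export, e.g. rows from csv.DictReader):

```python
def underperformers(rows, min_position=5.0, max_ctr=0.0435):
    """Return queries ranking at position 5 or lower with CTR under 4.35%."""
    results = []
    for row in rows:
        position = float(row["position"])
        clicks = int(row["clicks"])
        impressions = int(row["impressions"])
        ctr = clicks / impressions if impressions else 0.0
        if position >= min_position and ctr < max_ctr:
            results.append((row["query"], round(ctr * 100, 1), position))
    return results

# Sample rows shaped like a GSC performance export (values are illustrative)
rows = [
    {"query": "best helmet brands", "clicks": "43", "impressions": "1504", "position": "5.2"},
    {"query": "bike lights", "clicks": "90", "impressions": "1000", "position": "6.0"},
]
print(underperformers(rows))  # [('best helmet brands', 2.9, 5.2)]
```

The second sample query is dropped because its 9% CTR already beats the 4.35% benchmark.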

So:

Let’s see if we can find a keyword with a lower-than-expected CTR.

When I scroll down the list… this keyword sticks out like a sore thumb.

1,504 impressions and only 43 clicks… ouch! I know that I can do better than 2.9%.
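For reference, that 2.9% figure is simply clicks divided by impressions:

```python
clicks, impressions = 43, 1504
ctr = clicks / impressions * 100  # click-through rate as a percentage
print(f"{ctr:.1f}%")  # 2.9%
```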

Now that we’ve found a keyword with a bad CTR, it’s time to turn things around.

2. Find the page

Next, you want to see which page from your site ranks for the keyword you just found.

To do that, just click on the query with the bad CTR. Then, click “Pages”:

Easy.

3. Take a look at ALL the keywords this page ranks for

There’s no point improving our CTR for one keyword… only to mess it up for 10 other keywords.

So here’s something really cool:

The Performance report can show you ALL keywords that your page ranks for.

And it’s SUPER easy to do.

Just click on “+ New” in the top bar and hit “page…”.

Then enter the URL you want to view queries for.

Bingo! You get a list of keywords that page ranks for:

You can see that the page has shown up over 42,000 times in Google…but only got around 1,500 clicks.

So this page’s CTR is pretty bad across the board.

(Not just for this particular keyword)

4. Optimize your title and description to get more clicks

I have a few go-to tactics that I use to bump up my CTR.

But my all time favorite is: Power Words.

What are power words?

Power words show that someone can get quick and easy results from your content.

And they’ve been proven again and again to attract clicks in the SERPs.

Here are a few of my favorite Power Words that you can include in your title and description:

Today

Right now

Fast

Works quickly

Step-by-step

Easy

Best

Quick

Definitive

Simple

So I added a few of these Power Words to the page’s title and description tag:

5. Monitor the results

Finally, wait at least 10 days. Then log back in.

Why 10 days?

It can take a few days for Google to reindex your page.

Then, the new page has to be live for about a week for you to get meaningful data.

With that, I have great news:

With the new Search Console, comparing CTR over two date ranges is a piece of cake.

Just click on the date filter:

Select the date range. I'm going to compare the two-week period before the title change to the two weeks after:

Finally, filter the data to show search queries that include the keyword you found in step #1 (in this case: “best helmet brands”).

Boom!

We’ve increased our CTR by 63.2%. And just as important: we’re now beating the average CTR for position 5.

Pro tip: You’ll find that different title formats work better in different niches. So you might have to experiment to find the perfect format for YOUR industry. The good news: Search Console gives you the data you need to do just that.

How To Find “Opportunity Keywords” With GSC’s Performance Report

If the last example didn’t convince you of just how awesome the new Performance Report is, then I guarantee this one will.

What Is An Opportunity Keyword?

An opportunity keyword is a phrase that ranks between positions 8-20 AND gets a decent number of impressions.

Why is this such a big opportunity?

1. Google already considers your page to be a decent fit for the keyword (otherwise you wouldn't be anywhere close to page 1). When you give your page some TLC, you can usually bump it up to the first page.

2. You're not relying on iffy keyword volume data from third party SEO tools. The impression data you get from the GSC tells you EXACTLY how much traffic to expect.

Mining For Gold With Google Search Console’s Performance Report

Finding these gold nugget keywords in the Performance report is a simple, 3-step process.

1. Set the date range to the last 28 days:

2. Filter the report to show keywords ranking “Greater than” 7.9

3. Finally, sort the results by "Impressions". And you get a huge list of "Opportunity Keywords":
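Those three steps can also be scripted against an exported report. Here's a rough sketch using the definition above (positions 8-20, sorted by impressions); the field names are my assumption and should match whatever your export uses:

```python
def opportunity_keywords(rows, min_position=8.0, max_position=20.0):
    """Keywords ranking in positions 8-20, sorted by impressions (highest first)."""
    candidates = [
        row for row in rows
        if min_position <= float(row["position"]) <= max_position
    ]
    return sorted(candidates, key=lambda row: int(row["impressions"]), reverse=True)

# Illustrative sample rows
rows = [
    {"query": "seo checklist", "impressions": "4200", "position": "9.3"},
    {"query": "keyword research", "impressions": "8700", "position": "12.1"},
    {"query": "backlinks", "impressions": "15000", "position": "3.2"},  # already page 1; excluded
]
for row in opportunity_keywords(rows):
    print(row["query"], row["impressions"])
```

The biggest-impression keywords float to the top, so the best opportunities are the first rows you see.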

“Making a site faster improves the users’ experience while also increasing crawl rate. For Googlebot a speedy site is a sign of healthy servers, so it can get more content over the same number of connections. On the flip side, a significant number of 5xx errors or connection timeouts signal the opposite, and crawling slows down.”

Bottom line? Make sure your site loads SUPER fast. You already know that this can help your rankings.

As it turns out, a fast-loading site squeezes more out of your crawl budget too.

3. Get more backlinks to your site

As if backlinks couldn’t be any more awesome, it turns out that they also help with your crawl budget.

“The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we’ll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we’ll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline.”

The takeaway:

More backlinks = bigger crawl budget.

Get The Most Out of “Fetch As Google”

I already covered the “Fetch As Google” tool in Chapter 3.

But that was one part of a big process. So let’s take a look at Fetch As Google as a standalone tool.

Specifically, I’m going to show you 3 cool things you can do with the Fetch As Google tool.

1. Get new content indexed (in minutes)

Fetch As Google is the FASTEST way to get new pages indexed.

Just published a new page?

Just pop the URL into the box and hit “Fetch”.

Next, hit “Request indexing” to send Googlebot to that page.

Finally, choose “Crawl only this URL” and hit “Go”…

…and Google will normally index your page within a few minutes.

2. Use "Fetch as Google" to reindex updated content

If you’re a regular Backlinko reader, you know that I LOVE updating old content.

I do it to keep my content fresh. But I also do it because it increases organic traffic (FAST).

For example, in this case study, I reveal how relaunching an old post got me 260.7% more organic traffic in just 14 days.

And you better believe I always use the “Fetch As Google” tool to get my new content indexed ASAP.

Otherwise, I have to wait around for Google to recrawl the page on its own.

As Sweet Brown famously said: “Ain’t nobody got time for that!”.

3. Identify Problems With Rendering

So what else can the “Fetch As Google” tool do?

“Fetch And Render” shows you how Google and users see your page.

And because it shows them side-by-side, you can EASILY spot differences.

In this case…

…Houston, we have a problem!

Looks like Googlebot can’t load some of the images on that page.

What’s going on here?

Well, I scrolled down to the bottom of the report. And it looks like the images were temporarily unreachable.

Maybe it was a fluke.

So I went ahead and ran the “Fetch And Render” again.

And Google still couldn’t access those images. Hmmm.

Next, I wanted to see if I got the same problem if I fetched using Google’s mobile crawler.

To do that, choose “Mobile: Smartphone” here.

So what happened?

That’s better.

Now, I should point something out:

Google recently updated their guidelines for the Fetch as Google tool.

Back in the day you got unlimited single URL submissions. You also got up to 10 URL + “crawl this URL and its direct links” page submissions a day.

WOW… great post as always. I think this tool is underrated and certainly underused because it’s free so people don’t think it’s valuable. It allows us to see data that can’t be seen with other tools. What do you think about the keywords everywhere tool, have you ever used it? I’m just curious, either way, keep up the epic posts.

Great depth as usual Brian! It's crazy how many website owners solely focus on Google Analytics when Search Console actually provides a much clearer picture of SEO progress, especially during the early stages of campaigns when a lot of people lose hope. Awesome advice on identifying the "power house" pages for internal linking!

Well said, Sam. I think part of the reason for that is that GA is easier to use and understand. "Users" is more intuitive than most of the terms the GSC uses (even the new one that's designed to be more user-friendly).

Thanks Niall. You’re not alone. There’s A LOT to the GSC that people miss (myself included until recently). As it turns out, even though millions of people technically use it, the GSC is an untapped goldmine.

Thanks Jamie. To be honest, you still need to use the old version for some stuff (like submitting pages for crawling). But they should have the new one out of beta before the end of the year. Also, thanks for the topic suggestion. I might cover that in a future post (or write an entire post on it).

I keep reverting to the old version as well. They’ve moved things around in the new version and it seems like getting to things requires more clicks. In other words, the new version is not very user friendly compared to the old one. (At least in my opinion.)

This guide is super awesome. Thanks for sharing such in-depth information. Whether it's STW videos or your articles, everything is explained in a super simple way. As far as my experience is concerned, I uncovered lots of potential keywords from Google Search Console data that we hadn't thought of targeting in the beginning.

No problem, Navneet. In my opinion, that’s the best part of the GSC: finding untapped keywords that you’re already ranking for. And unlike most keyword research tools, the data comes straight from the horse’s mouth.

Thanks for the guide, Brian. Hopefully, I will now be able to fix my structured data errors. I currently have a couple of errors: Missing: author
Missing: entry-title
Missing: updated. Do you have a fix for the entry-title and the updated error?

Hey Brian, another awesome post.
On statement you might want to revisit:
“Note: Make sure to verify the HTTPS and non-HTTP versions of your site. This means adding: http://example.com AND http://www.example.com as separate properties.” You mentioned HTTPS and non-HTTP, but your example cited the www and non-www versions.
Thanks!

Thanks Joe. I have a few pages in that category. When I check them, it’s exactly as Google describes: they’re pages that are indexed but not in my sitemap. In my case, they’re pages that I actually don’t want indexed.

Awesome, Umesh. Fortunately, most of the same features are in the new GSC. But it can take some time to get used to it (in fact, that’s one of the reasons that I wrote this guide: to force myself to learn about the new GSC).

Your guides are pretty easy. Very helpful for Beginners as well as Professionals. I’ve been reading Backlinko posts for a long time. I have commented one or two times here.
Your writing style and easy-to-understand methods + long in-depth posts help us to learn more and grow our blog.
Thank you, Brian!

The most understated part of this whole guide is the fact you get 16 months of data. This is really useful if you’re new to working with a site and have no history of rankings etc.

Being able to spot that a specific keyword dropped 20 positions at the time of a Google update 6 months before you took on a client or joined a new company is pure bliss. In my experience, people are much better at setting up Search Console for a site than analytics or any form of keyword tracking.

Thanks once again for the super informative post. It’s awesome.
I was expecting you to talk even briefly about the backlinks section in Search Console. Do you think you will ever do an analysis on the correlation between search console backlinks and the main SEO tools backlinks reporting at any point? Now, that would be an interesting post.
Thanks again

Thanks, Brian! As always, great tips & insights. GSC has indeed great features & it’s just a case of using them in the right context. I loved the way you show CTR optimization based on The Performance Report. It really works, I’ve tested that myself. That’s also a low-hanging fruit anyone can just go ahead and implement right away. Other tips are also super helpful, especially for beginners who are just starting out with GSC.

You're right and make a great point: the GSC has a ton of pretty amazing features that most other SEO tools would charge thousands for. The downside is that, unlike tools like Ahrefs, etc., it can be hard to know when to use the GSC (and how to use it).

That’s a lot of errors actually. I’d recommend working with a technical SEO (or developer who knows a lot about SEO) to get that fixed. It’s hard for me to diagnose the problem without digging deep into your site.

I believe a lot of those are because of redirects. Bluehost may have changed directories (not sure!). What I just did after reading your blog is “fetch as google” my main URL. Once Google re-indexes it, it should resolve the redirects (replace with the right ones), correct? I hope I’m understanding this correctly. Thanks again for your help!

Assuming they’re 404 errors, unless you setup 301 redirects for them, Google will continue to complain about them. Of course, you only want to setup a 301 redirect if a page on your current site directly corresponds to an old url (ex. contact.php is now contact/). If not, you can safely ignore them.

Thanks Matt & Brian. What happened was I switched hosting packages on BlueHost, and the new package has a different directory where everything is stored. Google has the old address, but when I click on that old address, it automatically forwards to the new URL. I still see a crazy # of errors in GSC (new version) though. Will that impact me negatively? Thanks again for taking the time to answer my question.

Dr Brian,
You have totally made me realize how ignorant I was about Google search, just from reading this blog. I didn't know that all along I have been a kid posing as an industry leader.
Thank you again for putting in your time to put up this blog. I'm sure it needed a lot of resources.
Now, let me head straight to my GSC and do as you say.
Kudos Brian.
Your content is always at the top.

GSC helped me to check articles that are not indexed so I can update them or resubmit them. But I didn’t know that this can actually help me improve my blog! I did not regret reading this guide. It is very helpful! Thank you!

Thanks Paul. You’re not kidding: this guide was a BEAST to put together. As you know, there’s so much inside of the GSC to cover. But I’m glad it’s done because I think it will help a lot of people get more out of the GSC.

Hey Mads, happy to help. It was tough to write a guide that included material for people that are new to SEO and those that have been in the field for years. That’s why I included chapters 4 and 5. That way, people that are experienced (like you) can easily find some cool new techniques.

Hi Brian,
Another fantastic and highly informative piece of content. The whole guide was great, but in particular I found Chapter 3 remarkable. My team and I will be reviewing the Performance Report for all clients each month as a tool to increase organic traffic. You set the bar really high, keep up the great work!
-Brian

In the section for “Identify Problems With Rendering”, you mention that “Today, you only get 10 individual URLs and 2 site recrawls”. Interestingly, we used to have limits on it, but it no longer shows those limits in the pop-up box. I also recommend submitting the “Mobile: Smartphone” version because of Google’s mobile index. If you don’t have GSC access, you can still get Google to crawl/index a page quickly by using this form (https://www.google.com/webmasters/tools/submit-url) or by Googling “submit url to google” and using the field at the top of the SERP.

Thanks Justin. I noticed that too. I’m not sure if the limit still stands (I haven’t heard anything from Google either way).

Re: mobile vs. desktop, I've tested that. And as of right now, desktop makes pages index faster (at least on the sites I've looked at). Although that will probably change once the mobile-first index fully rolls out.

Hi Brian, so I submitted a comment and saw the "comment-page-4" part in the URL. That should only happen if you have "break comments into pages" enabled, and I see no way to navigate to the earlier pages of comments. Also, the page seems to have too many comments already, and it doesn't seem you're breaking them down. I even tried to edit the URL to reach page 3, but the same comments still appear.
Can you explain to me why it appears in the URL? Is this some sorta magic implementation?

I always get jealous of the way you write, LOL! You are such a mastermind.

I like your "Pro Tip: Supercharge Key Posts With Internal Links From Powerhouse Pages"; it always works for me because I don't need to waste time making backlinks every time. I can get link juice from the powerhouse pages, thanks to PageRank algorithms.

But you have missed some important features of Google Console (I wish I could post images here).

1. Change of Address. Sometimes you need to change your website address to another one via a 301. Google Console guides you well in 4 simple steps. It can be found in the top right corner of your screen as a wheel icon, below your Google DP.

2. You didn't mention that you need to add all four versions of your site (www, without www, HTTP, and HTTPS) and set the preferred domain that you want to show in the SERPs.

3. From the Site Settings, you can change your crawl rate if you have a great server and zillions of pages that you wanna index faster.

But, Sir, I am not claiming to be a guru. Maybe this is not the subject of your post above, but it could be improved significantly to make this guide the definitive and ultimate guide.

Hey Aamir, I’m glad to hear that you liked the guide. Even though this guide weighs in at 7k+ words, I unfortunately couldn’t cover absolutely everything in the GSC. So I tried to focus on the features that have the biggest impact.

Wow, great post as always, but I have one question: I read somewhere that after the recent Google update, the importance of backlinks in website ranking has decreased and now social signals make more impact on website rankings. Is that right?

Hey,
Good article on Google Webmaster Tools. I am planning to remove bad links via the Disavow feature, but I am afraid it may cause damage to the site. Can you recommend any YouTube video or good guide about using Disavow for bad links? (I'm looking for a detailed guide on this.)
Cheers

You're welcome, Marco. I don't closely track how long I put into each guide (it's an insane amount for sure). Plus, I had a lot of help with this one (especially from the super smart David McSweeney), who was a HUGE part of making this guide a reality.

One question: you said if I update my old article, I should fetch it again in Search Console, right? Previously I thought that when I update my old posts, I should also update the date for that article. Please explain this.

But one complaint: you don't show strategies like you did previously. I think the Skyscraper Technique and the Moving Man Method are great, and I want more techniques like those. Please consider showing new techniques that work quickly, like you previously published.

I want to point out one sentence that doesn’t make sense to me.
So you say: “It’s worth double checking these errors to make what your blocking is meant to be blocked”.
I’m sure that sentence needs one “sure” and “you’re” 🙂
Anyway, I noticed these errors as I was reading.
So I read it and moved on, and realized I suddenly didn't understand what I was reading.
This is the first time I noticed such a confusing error.
In other words: keep up the good work :)
Cheers, Brian!

Hi Brian,
Fantastic work as always! Thank you.
Question: When you add the www and non-www domains and choose to set the preferred domain as non-www, can I delete the www verified domain, or shall I leave it?
Thanks for your work

Thank you Brian. Also, I am using the https version but have both versions verified (https, http). Shall I leave the http version as a property and not delete it, and keep using the https version for sitemaps, etc.?
Thank you

Such amazing detail here, really easy to follow…once again. I haven’t really looked into this much, but I thought you might know:

If you’re trying to rank locally in Canada but the primary customers you’re targeting are in the US, what would you set your Target Country to? And if I made it US, would it affect my local ranking? All my servers are in the US, and using a .com.

Hey Neil, hmmmm, that's a tricky one. In general, at least for the GSC, you want to focus on where your customers search from. I'm not a local SEO expert, but I think your actual location matters much more for Google+ local vs. "normal" SEO.

I’ve actually never had any issues ranking customers locally before, and not even worrying about this targeting. As long as the G+, G My Business and all the information matches locally, with your site details that is.

But everything G does throws in a bunch more questions!

Also, I asked this question in the Webmaster's Forum and got this response, please take it for what it's worth.

“Hi

Not really – I use a dot com domain set to target US since that is the larger market.

It is called ‘Targeting’, not ‘Restriction’. It doesn’t restrict a site to only be visible in one country.”

Hi Brian, this is one of the best GSC guides I have seen. Loads of stuff I already do, but you gave me some new hints and drew my attention to some things I knew but had put at a lower priority. Thanks for this and for refreshing my mind.

Hi Brian. Thanks for such a detailed and knowledge-rich article on GSC. I learned a lot of new things. I will definitely experiment with them and see the results. I must say your SEO articles are a goldmine.
Stay Blessed.

Hey Brian – Gold Standard Content as always… A month of research just goes to show the hard work that goes into creating such high-quality stuff. It's not lost on us and much appreciated. Do they hand out black belts for SEO?

Hey Burak, I agree. The GSC is still missing some features vs. the old one. But what they have so far is leagues better than the old GSC. And once they add the missing features, it should be a lot more useful than the old GSC.

This is definitely a definitive guide to Search Console… A great how-to article as always. I am so glad that Google updated the interface, which to me makes it easier to extract the data webmasters need in order to make the necessary changes to their site(s). When indexing, I typically, if not always, click "crawl this URL and its direct links". Thanks for always providing great insight for any skill level to learn from.

Thanks for the awesome information. I don't have time to go through all of it today, but I checked the first 2 chapters and am sooooo happy you put this out there. Hope it is going to be there for a while. I have to set up all of my sites this weekend and will need to reference this again.

Brian! This is exactly what I need. 🙂 So much money left on the table without this GEM of knowledge. A quick question on your example for pages that are ranking #5 or below. What does #5 mean? Page 5, or position 5 on page 1?

Holy crap. Just went through GSC step-by-step as I read this (took half of the day), and discovered so many things I never knew…

For example, my sites ceased to be verified (because I deleted the HTML verification file to save space — bad mistake). I also learned the two keywords which deliver the most impressions (deskmos and logarithm), and found out that mobile CTR is actually 70% higher than desktop!

In short, I love GSC much more than Google Analytics, which I've always found hard to navigate.

Hi Brian
This tutorial is a complete guide for all SEO learners and also for professionals. This is truly a definitive guide to Google Search Console.
Thank you for all your time making screenshots with detailed descriptions and tips.

No problem. This guide will probably need a major update as Google rolls out updates. But for now, in my humble opinion, it does a great job of showing people how to get the most out of the new and old GSC.

Again, nice work!
When I set the filters as you suggest (position 4.9), I also get results that rank well for their position. For example, a Pos. 9 keyword with a CTR of 4%. That's pretty good. Some people will now edit the metadata of this page according to your guide. I think it would make more sense to go through position by position and look at the keywords that, for their position, underperform.

Hey Cristoph, thanks! You’re right: a 4% CTR at position 9 is excellent. So you make a great point: it’s more about looking at how each page is performing vs. automatically changing pages that are ranking at position X.

I have a question regarding the targeted country. Currently I am getting most of my traffic from India. If I change the targeted country to the USA, does it really make sense? I mean, will I lose Indian traffic or overall traffic?

Taking a seemingly simple toolset & creating a guide that rings true for beginners & still has more experienced users like myself hooked throughout.

Question on crawl budget, specifically the parameter handling. Would you just trust the parameter tool alone? I like to take more steps to ensure the crawl can be optimised. Canonicals/Robots.txt/Using JS for faceted navigation etc.

Nice guide. Search Console is a must-use tool, but lately it has become very slow. In some cases you have to submit a link many times before it gets crawled. Also, the number of links (domains) can drop without any reason. Not sure if it has to do with background data migrations of some sort.

Wow, you did it again Brian.
Amazing post once more. I love how your guides always take the user all the way.
Just want to let you know that you are also getting a lot of attention with shares on LinkedIn here in Denmark 🙂

My understanding is that soft 404s are typically when Google visits a page that they feel should be a 404, but it is returning a non-404 header (such as a 200, 301, etc.). To fix this, you need to consider the following:

1 – Is this a 404 page that returns the wrong header? If so, fix
2 – Is the content on this page thin/worthless? If so, expand or delete
3 – Am I redirecting this page somewhere that doesn’t seem relevant? Redirect to a more relevant page or simply mark as 404

“Submitted URL not selected as Canonical” — Note if you use the info: command with the submitted URL, they will show you what they chose as the canonical, so you can compare and figure out why they think the pages are similar and correct it

You’re welcome, Jake. My understanding is actually the opposite: that a soft 404 is a page that should be up (like a 200), but it’s 404ing for some reason. Like in the example, the page was up, but was showing a 404. And when we sent the crawler back to the page, the soft 404 went away.

Jake is correct. A soft 404 is a page that returns a 200 status code, but Google feels that it should return a 404 status code (because it has an error message or minimal/no content in the body of the page).
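That definition suggests a simple way to screen for candidates: a URL that answers 200 OK but serves an error message or an almost-empty body is worth a look. Here's a rough heuristic sketch in Python (my guesswork, not Google's actual detection logic):

```python
def is_soft_404_candidate(status_code, body_text):
    """Flag pages that return 200 OK but look like error pages.

    Heuristic only: Google's real soft-404 detection is more sophisticated.
    """
    if status_code != 200:
        return False  # a real 404/410 header is correct behavior for a missing page
    text = body_text.lower()
    error_phrases = ("page not found", "404", "no longer available")
    too_thin = len(text.split()) < 20  # near-empty body
    return too_thin or any(phrase in text for phrase in error_phrases)

print(is_soft_404_candidate(200, "Sorry, page not found."))  # True
print(is_soft_404_candidate(404, "Page not found"))          # False
```

In practice you'd feed this the status code and body from your own crawler, then manually review anything it flags.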

Thanks Matt. I wonder what was up in the example/case study from the guide (How to Fix “Soft 404” Errors). The page didn’t have any error messages and had some content (not a lot, but it’s an ecom site). Maybe the page was down when Google visited it.

One question: why do we need to add both the http and https versions of the site?
—–
Make sure to verify the HTTPS and non-HTTP versions of your site. This means adding: https://example.com AND http://example.com as separate properties.

Hey Alexander, good question. Honestly: it's not a big deal. But since Google considers them separate properties (i.e., websites), it's good to have both verified. It also helps you see if HTTP versions of your pages are ranking in Google for some reason.

As usual, such a great article with full details on how to better use GSC. Unfortunately, not many people are even aware of GSC, which is sad, as it helps you understand much better where you stand with your site and how Google sees it.
Thanks for such a nice article about GSC.
Serge

A comprehensive and overall awesome guide to Google Search Console. I've got this excellent reference bookmarked! A question for you. Is there a reason you do not submit Yoast SEO sitemaps individually? For example, you suggested submitting the sitemap as "/sitemap_index.xml". For years now, I've always submitted sitemaps individually (e.g. /page-sitemap.xml, /post-sitemap.xml). My methodology is that they are separate sitemaps, so they should be treated as such. Also, it's easier to identify individual sitemap issues in GSC. Can you offer any insight on why you prefer the method you've recommended vs. my GSC sitemap submission process?

Hey Randy, there’s no real reason actually. My take: as long as Google finds all of your sitemaps, you’re set. Now that you mention it, it does make sense to submit them separately… but I just submit them in one go which also works.
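For what it's worth, the reason submitting the single index works is that a sitemap index is just a list of child sitemaps, and crawlers follow its `<loc>` entries to find each one. A quick sketch of that structure (hypothetical URLs, Python standard library only):

```python
import xml.etree.ElementTree as ET

# A minimal sitemap index like the one Yoast generates (URLs are made up).
SITEMAP_INDEX = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://example.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""

def child_sitemaps(index_xml: str) -> list[str]:
    """Return the child sitemap URLs a crawler would discover from an index."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(index_xml)
    return [loc.text for loc in root.findall("sm:sitemap/sm:loc", ns)]
```

So either approach gives Google the same set of child sitemaps; submitting them individually mainly buys you per-sitemap error reporting in GSC, as Randy notes.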

It's interesting that the keyword average rankings are lower than what my ranking tools are showing. Are you experiencing this as well? Obviously it's an average over more queries than my ranking tools show, but do you think it's taking into account rich snippets and other noise on the SERPs? Awesome guide, by the way. I've been digging into the new SC for the last 4 days because of this. Lots of fun!

At first, I thought we were discussing Girl Scout Cookies (GSC), but I read on anyway to be blasted with an amazing article full of actionable directions to super-tune my site with Google Search Console and Google Analytics. Awesome article; the only thing that would make this read any better would be a glass of milk and some Girl Scout Cookies. I like the maple ones! Time to get to work and get some GSC 😉

It's me again. 🙂 I tried the CTR method. Now I am trying out the opportunity keywords. Say we found one great opportunity: do we start a brand-new page for this KW, or continue beefing up the existing page?

What an in-depth article about GSC!
Thank you, Brian.
Google Search Console is really a goldmine. I love using its “Performance” Report ( “Search Analytics” in the older version). Earlier I used to ignore Index Coverage report because of insufficient knowledge but now as I have bookmarked this post, I can refer to it again & again. 🙂

As always, another great post with real-life examples on how to do it. Hmmm… one thing struck me when reading it, regarding country selection and reach. Can you only pick one country? E.g. in Europe, how do you cover all of Europe? I have investigated this before but never found clear answers. Your advice is appreciated, THX 🙂 Looking forward to your next post

Hey Bruno, good question. You can only choose one country unfortunately. Google hasn’t said how your target country affects visibility worldwide, but I don’t think it’s a huge ranking factor. So I’d go with the country that makes the most sense and go from there.

One of the best guides I've ever read; it took me 30 minutes, but it was so worth it and cleared up many of my doubts. One question: nowadays my website's indexing takes a long time. Can you suggest what I need to do?

My question is, if I set my international targeting to Malaysia, does that mean I am only able to rank for say “b2b lead generation” or other keywords in Malaysia only? The language is English as well.

The dilemma is, my local agency is based in Malaysia and I would like to rank for “digital marketing agency in Malaysia”, so the international targeting is set to Malaysia.

However, I am still unsure if this country setting would affect the ranking of my blog posts, whether they will rank ONLY in Malaysia or worldwide as well.

Not sure if I am missing something, but referring to the "Check Indexed Pages For Possible Issues" section: my website has way too many pages indexed (115, yet it should read 16). How do I fix this, please? That section of the post didn't dive into the fix, unless I missed something 🙂

Wow, thank you for this post about SEO; it really is amazing.
I have a question about international targeting. I have the domain extension .to, but Google automatically geotargets it to the country Tonga. The problem is I can't change it to international targeting in Webmaster Tools. How can I change it to international targeting?
Thank you

Hey Brian,
When clicking the URL parameters in GSC, I have this message from Google: “Currently Googlebot isn’t experiencing problems with coverage of your site, so you don’t need to configure URL parameters.”
Do you think it’s still a good hack for optimizing Crawl Budget anyway?

Hi Brian,
Thank you for the detailed guide. It helped me discover the new Google console, and unlock cool features.
I found that my blog's posts are ranking much better on mobile (avg. position 7.7) than desktop (avg. position 23.7). Would that be a good thing, or an indication of a problem with my desktop website?
Thanks!

Hello Brian and the amazing community out here.
First, an amazing article as always; it always takes a few passes to really get the most out of these. As I was following along with my own site and the tools, I noticed that my Performance report results do not have the "Position" and "CTR" columns. I tried (what I thought was) everything to get them to display.

Hi Brian,
I noticed that you conduct case studies on your audience’s blogs. I have a blog “WinSavvy” with a DA of 10. Although I have guest posted numerous times, the posts have low PA although the DA of the sites are okayish. I was wondering if you would like to conduct a case-study on my blog by having me try your tips and provide you with feedback and the results!

This was literally SO useful! I was looking to create a comprehensive checklist/audit for a few of the websites I managed, and by far this was the most helpful one!

You didn’t just gloss over the setup process and basic features. You really went for it and gave explanation and what to do for different scenarios. I will definitely be coming back to this guide in the future. #Bookmarked

Wow, this is exactly what I needed. Thank you so much. Google Search Console has been the “one door” I have been too scared to open. Now I feel I’ve got a roadmap.

One question about the "Add a Property" part. Right now I have only added ONE property for my website. Since I have an SSL certificate, I added the "https" version. I was nervous to add the "http" version because I thought:
#1. I might have to VERIFY ownership (and it was such a pain doing it for the https version)
#2. I'm not sure if that also means submitting ANOTHER set of XML sitemaps (I am using the Yoast SEO plugin), and I'm not sure how to make those for the "http" property
#3. Wasn't sure how having only the ONE property is affecting my visitors and/or SEO
#4. Kind of thought that since GSC seems so scary… "If it ain't broke, don't fix it!" But then again, maybe that is why the "i" comes up for a second or so when you type my website URL in Chrome's search box?

So I guess I'm wondering if you could clarify whether this step is really needed. And whether you have to verify your site again. And how to make the additional XML sitemap.

Thanks in advance for any feedback. This guide has been incredibly helpful.

Hi, and thank you for this guide! I added both the http and https versions of my site to Google Search Console, but when I try to inspect URLs on my site from the new console page, I get a "URL not in property" error. Any idea why? Thanks in advance!

Hi Brian,
I had already added a property in my Google Search Console, but the moment I tried the second piece of advice here, it asked me to re-verify the site, which I couldn't do. So I added a new property in GSC and then followed the second piece of advice and the complete instructions from chapter one. My question: I now have 2 properties in my GSC. One has "www" with the preferred domain set to none (which I do not want), and the other, newer one has "non-www" with my preferred domain selected. Is it going to harm me? Shall I delete the older one? Would deleting one, or leaving both properties as they are, harm me? lol… confused now. Can I get some help with this?

Fantastic!!!
This is a post that leaves you with your mouth open. Without a doubt the best GSC tutorial there is. I’ll keep it in my favorites for every time I have any questions.
Thank you very much Backlinko 🙂

An incredible guide on how to use GSC that I never would have thought of. I'd appreciate it if you could expand on the section about checking internal links to do proper internal link building, as I am still unclear. Can you point out how to link correctly using those reports? I am not sure how you extract those reports and how to add links based on them. Thx

If you did extensive keyword research (albeit with free tools) and your best keyword was ABC but the search console revealed that over the past 6 months, the best word order was BAC, would you trust the console? Would you change your homepage to read “BAC” rather than “ABC?” Thanks!

Thanks for this great info!
I use GSC a lot for tracking my keywords but I discovered some great features thanks to you. I was happy to see my average position and CTR is higher on mobile than desktop. Now I’m not surprised my mobile traffic is really higher than my desktop traffic.

Wow, Brian – fantastic info!
You’ve just made me a huge fan of the new Search Console.
But I’ve noticed this:
I used the "Mobile Usability" report to compile a list of pages not listed as "mobile friendly". Fewer than half the pages on one site were listed. I then checked some of those URLs in the mobile-friendly tool (they passed), fetched them using the Mobile: Smartphone bot, and requested indexing. My Mobile Usability report shows an increase in the number of listed pages and AMP pages, but my Valid page count in the Coverage report decreases. I'm not sure what to think; maybe I'm duplicating the content, or maybe it's a Mobile vs. Desktop index issue, IDK.
Anyway, I’ll be referring back to this guide for days until I’ve fully absorbed it all; thanks for this post!

You’re welcome, Jim. I’ve actually noticed a lot of changes with the Mobile Usability report lately. We had a few pages listed as not mobile friendly (“Content wider than screen”) that are now listed as OK. Even though we made no changes.

I have been suffering from some issues on my Wix site. They do like to break sites for you. Until I just read your URL parameters section, I was relying on just canonical tags. But there in the URL Parameters box was a whole bunch of URLs, all related to the issue. Boom! They are sorted in a click, never to be indexed again. Great content as ever, as useful as ever! Thanks

Hey Brian, that was an awesome post.
But you didn't show the solution for rendering issues in chapter 5 (Identify Problems With Rendering). How do you solve the issue when images are blocked? Any definitive guide to resolving it? I have the same issue.

Brian, I’m confused.
I followed your awesome guide but I noticed I keep receiving hits from the http version of my website, even though I switched to https months ago and there are no urls on http anymore.
How is it even possible?
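(For anyone hitting the same thing: stray http hits usually come from old links and bookmarks, and the usual fix is the server-side 301 redirect the guide's pro tip describes, so every http request lands on https. A minimal sketch for Apache, assuming mod_rewrite is enabled and this lives in the site root's .htaccess; nginx would use a `return 301` in the server block instead:

```apache
# Force HTTPS: 301-redirect any http:// request to the https:// version.
# Assumes Apache with mod_rewrite enabled.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

With this in place the http property in GSC should gradually stop accumulating impressions, though old data will remain.)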

Hi Brian,
Not sure if I missed it, but my Index Coverage report is showing 213 pages where Google chose a different canonical than the user. Do you see this often, and do you have any tips for getting Google to choose the proper canonical link?
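(A general tip from other GSC users, not from the guide itself: Google treats a declared canonical as a hint, and it's most likely to honor it when every other signal agrees, meaning internal links, sitemap entries, and redirects all point at the exact URL the canonical tag names. A hypothetical example of a consistent, self-referencing canonical:

```html
<!-- On https://example.com/blue-widgets/ (hypothetical URL): a
     self-referencing canonical that matches the URL used in internal
     links and the sitemap gives Google one consistent signal. -->
<link rel="canonical" href="https://example.com/blue-widgets/" />
```

Conflicting signals, such as a sitemap listing the non-canonical variant, are a common reason Google overrides the user-declared canonical.)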