I have multiple variants of my website for different countries. Based on the country (detected from the user's IP address), we display different content to users, much like YouTube does. For meta information, we currently use the same meta information across all pages and variants, so Google's bot sees the same meta information no matter which IP it crawls from.

Now our business requirements have changed, and we want to show different meta content for each country while keeping the page URL the same.

Problem statement: we want one URL to be indexed for different keywords in different locations.

I am seeking answers to the following questions:

As of 2018, has Google started indexing pages from different locations? If yes, is there any documentation for this? If not, what is the default location of Google's bot?

If Google finds different content on a page from two different locations, how will it process that content and associate it with the URL? Is there any penalty for this?

If I start using subdirectories for SEO purposes, how can I make sure that each audience sees only its own locality-specific page and not pages for other localities?

Google says they support the same URL, but I've never seen that work in practice. The automatic detection doesn't work all that well anyway: many international users have English browsers, and people travel, so IP detection isn't always appropriate. You should have different URLs and put a prominent link up top when you think somebody belongs in a different area. See: How should I structure my URLs for both SEO and localization?
– Stephen Ostermiller♦ May 24 '18 at 10:18

1 Answer

First of all: ranking different content, and serving a different page title and meta description, for the same URL is not possible, and it's against Google's guidelines. In Google's interpretation, this is "cloaking", and cloaking is a pretty easy way to get a penalty if you do it wrong.

Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Of course, in some situations it's acceptable; for example, changing the local phone number in the header based on visitor location. There could be other similar exceptions as well.
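To illustrate that kind of acceptable exception, here is a minimal client-side sketch (the phone numbers, country codes, and element IDs are hypothetical, not from the answer): only a small detail is swapped by location, while the indexable content of the page stays identical for every visitor.

```javascript
// Hypothetical mapping of country code -> local phone number.
const PHONE_BY_COUNTRY = {
  US: "+1 555 0100",
  DE: "+49 30 555000",
};
const DEFAULT_PHONE = "+1 555 0100";

// Pick the phone number for a visitor's country, falling back to a default.
function pickPhone(countryCode) {
  return PHONE_BY_COUNTRY[countryCode] || DEFAULT_PHONE;
}

// In the browser, assuming the server injects a GeoIP-detected country
// code into a data attribute, you might then do:
// document.querySelector("#phone").textContent =
//     pickPhone(document.body.dataset.country);
```

Because only a contact detail changes and the main content does not, this kind of adaptation is generally not treated as cloaking.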

Update:
After many requests to John Mueller on Twitter, and after the Locale-Aware Crawling help page was flagged as unhelpful, Google finally updated the information about locale-aware crawling and removed the sentences that could make you think it works in Google.

Do not use IP analysis to adapt your content. IP location analysis is
difficult and generally not reliable. Furthermore, Google may not be
able to crawl variations of your site properly. Most, but not all,
Google crawls originate from the US, and we do not attempt to vary the
location to detect site variations. Use one of the explicit methods
shown here (hreflang, alternate URLs, and explicit links).

What Google doesn't do:

Google crawls the web from different locations around the world. We do
not attempt to vary the crawler source used for a single site in
order to find any possible variations in a page. Therefore, any locale
or language variations that your site exposes should be communicated
to Google explicitly through the methods shown here (such as hreflang
entries, ccTLDs, or explicit links).

As John Mueller confirmed in a Google Webmasters Hangout, crawling from outside the USA is most likely used only for countries where, for political or other reasons, IP addresses from the USA are blocked. Also, Google will most likely continue to crawl your website from only one country, not several. So, at this stage, locale-aware crawling can't be relied on for multilingual websites if you want Google to see all your page versions.

Please use a different URL structure for your location-related pages if you want to rank them in Google, and use a proper hreflang configuration for those pages, as mentioned below.

END of update

Where do Google's bots crawl your website from? In most cases, from the US, so Google will most likely index your US content, and all your rankings will be based on that version of the page.

The best localisation solution is to make separate pages/domains for each location and do a proper internationalisation setup. In that case, Google reads all the versions, and you have the option to customise all meta tags and content for each location and rank for different keywords.

For this type of setup, the most important part is a properly set up hreflang attribute in your HTML head. Here is an example:
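The original answer's example did not survive; here is a minimal sketch of the standard hreflang markup (the example.com domain and the /us/ and /de/ paths are illustrative assumptions). Each localised page lists every variant, including itself, plus an x-default fallback:

```html
<!-- In the <head> of every localised version of the page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Note that the annotations must be reciprocal: every variant has to carry the same full set of tags, or Google may ignore them.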

This answer, while helpful, is incorrect. Adapting or changing your content based on a user's perceived location or settings is not a violation of Google's guidelines, and a few years ago Google announced they now support crawling and indexing of locale-adaptive pages: webmasters.googleblog.com/2015/01/… However, as both this answer and Google mention, using separate URLs is a better approach to making SEO-friendly multi-language/locale pages than using a single dynamic page.
– Max Jun 13 '18 at 5:38

Hi @Max, please read my updated answer on this. After some requests to John Mueller, Google updated the official help page about locale-aware crawling, and it clearly says that in reality it does not work as announced back in 2015.
– gintsg Jul 12 '18 at 22:21