Separating bots from humans

How do I separate Googlebot from human visitors for my tests? I want to make sure the original version is shown to all traffic and the variations are shown to everyone except Googlebot (and possibly other spiders), so that our SEO efforts are not harmed by the tests. Or please let me know if bots are excluded by default.

Re: Separating bots from humans

My first question to you is: what is in your robots.txt file? That is the place to control which bots visit your site. If you can successfully block the robots, then your test will be clean. You should also be able to control access granularly by file path if you want more specific bot management. More bot info can be found here: https://support.google.com/webmasters/answer/6062608?hl=en
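For example, a minimal robots.txt sketch that keeps Googlebot out of a test area (the `/experiments/` path is an assumption for illustration; substitute the paths your test pages actually live under):

```
# Block Googlebot from the hypothetical test path only
User-agent: Googlebot
Disallow: /experiments/

# All other crawlers may access everything
User-agent: *
Disallow:
```

Keep in mind that robots.txt only controls which URLs compliant crawlers fetch; it does not change what JavaScript runs on pages they do fetch.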

Re: Separating bots from humans

Perhaps this has changed recently (not my area of expertise, so do your own research to confirm this is still true), but as of a few years ago bots and spiders did not run JavaScript, so they would be automatically unaware of any changes your experiments make.

Re: Separating bots from humans

Thanks, NAPOLEON. That's why I want to make sure Google doesn't pick up our variation title or anything else that would harm our SEO. Is there a way around it in Optimizely, or should I look for another A/B testing platform that supports this?

Re: Separating bots from humans

With this information, a custom JavaScript audience condition, which is available on our enterprise plans, can be used to exclude visitors with a certain user agent. Here is an example of how the custom JavaScript would appear:

navigator.userAgent.indexOf('Googlebot') < 0

It appears that you are using our free Starter plan, which doesn't include the "Custom JavaScript" audience condition. In that case, I recommend taking the check above and wrapping your variation code in an if statement so that none of the variation code runs for Google's bots.
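A minimal sketch of that if-statement wrapper. The `applyVariation` callback is a hypothetical stand-in for your actual variation code (e.g. swapping the headline), and the crawler token list is illustrative, not exhaustive:

```javascript
// Returns true when the user agent string looks like a known crawler.
// Googlebot/bingbot/Slurp are illustrative tokens; extend for other spiders.
function isLikelyBot(userAgent) {
  return /Googlebot|bingbot|Slurp/i.test(userAgent);
}

// Wrap the variation so it only runs for (apparent) human visitors.
function runVariation(userAgent, applyVariation) {
  if (!isLikelyBot(userAgent)) {
    applyVariation();
  }
}

// In the browser you would call it with the real user agent:
// runVariation(navigator.userAgent, function () { /* variation code here */ });
```

Note that user-agent checks are best-effort: they will not catch bots that spoof a browser user agent, so treat this as a filter rather than a guarantee.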