The Moz Q&A Community


Is there such a thing as white hat cloaking?

We are near the end of a site redesign and have just found out that it's built in JavaScript and is not search-engine friendly. Our IT team's fix is to show crawlable content to Googlebot and other crawlers based on their user agents. I told them this is cloaking, and I'm not comfortable with it. They said that, based on their research, it's an acceptable way to cloak as long as the content is essentially the same. About 90% of the content will be identical between the version served to regular users and the version served to Googlebot. Does anyone have experience with this? Are there any recent articles or best practices on the subject?

2 Responses

It is acceptable and completely common. Imagine you had a 100% Flash site. The bots can figure out some of the content, but not much, so they actually need you to serve up a different version of your site so that they know what's there and can index you properly. As long as the content is the same, it shouldn't be an issue.
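The "serve a crawlable version to bots" approach described above starts with detecting crawlers from the User-Agent header. As a minimal sketch (the bot list and the `isCrawler` helper are illustrative, not a complete or official list):

```javascript
// Hypothetical helper: decide whether a request comes from a known crawler
// by matching the User-Agent header against common bot tokens.
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

function isCrawler(userAgent) {
  // Treat a missing header as a regular user.
  return BOT_PATTERN.test(userAgent || "");
}

// A server would then branch on the result, e.g.:
//   if (isCrawler(req.headers["user-agent"])) { /* serve static HTML snapshot */ }
//   else { /* serve the normal JavaScript application */ }
```

Note that the safety of this depends entirely on the snapshot matching what users see; serving materially different content to bots is exactly what the cloaking guidelines prohibit.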

We have the same issue with our site, HelloCoin: it's pure AJAX/JavaScript, so we build a second, no-JavaScript version of every page for Googlebot to crawl, keeping it as close as possible to the original user-facing version. Just don't hide anything; show everything as it is. Some functionality might not work, but that's not an issue. Google just wants to see how the page looks to the user, not how it works.
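For AJAX sites like this, Google documented an AJAX crawling scheme (since deprecated): pages whose state lives after a `#!` in the URL signal that a static snapshot exists, and the crawler requests it by moving that fragment into an `_escaped_fragment_` query parameter. A rough sketch of that URL mapping (the `snapshotUrl` helper is illustrative):

```javascript
// Map a "#!" (hash-bang) URL to the snapshot URL a crawler would request
// under Google's (now-deprecated) AJAX crawling scheme.
function snapshotUrl(hashBangUrl) {
  const [base, fragment] = hashBangUrl.split("#!");
  if (fragment === undefined) return hashBangUrl; // no AJAX state to expose
  const separator = base.includes("?") ? "&" : "?";
  return base + separator + "_escaped_fragment_=" + encodeURIComponent(fragment);
}

// snapshotUrl("http://example.com/#!/page") would yield
// "http://example.com/?_escaped_fragment_=%2Fpage", and the server
// answers that request with the pre-rendered, no-JavaScript version.
```

The server then recognizes the `_escaped_fragment_` parameter and returns the static no-JS page, which keeps the crawler and the user looking at the same content.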
