Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
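To make the practice concrete (as a description of what Google warns against, not a recommendation), user-agent-based cloaking boils down to logic like the following hypothetical sketch; the function and page names are illustrative, not taken from any real site:

```python
# Hypothetical illustration of user-agent cloaking.
# A cloaking site inspects the User-Agent header of each request and
# serves crawlers a different page than it serves human visitors.

def select_content(user_agent: str) -> str:
    """Return different page content depending on who is asking."""
    crawler_tokens = ("Googlebot", "bingbot")
    if any(token in user_agent for token in crawler_tokens):
        # Version shown only to search engine crawlers.
        return "crawler-targeted page"
    # Ordinary version shown to human visitors.
    return "regular page"
```

The same idea extends to the headers and redirects Cutts mentions: instead of swapping page content, the server might send Googlebot a different HTTP status code or redirect target than it sends users.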

Matt’s tweet read:

Google will look more at cloaking in Q1 2011. Not just page content matters; avoid different headers/redirects to Googlebot instead of users.

Honestly, I thought Google already did a good job of detecting cloaking via different headers and redirects, as well as content cloaking. But I guess I was wrong. It is hard to tell exactly what Matt meant in the 136 characters he posted on Twitter, and it is leading to a lot of concern in WebmasterWorld and Cre8asite Forums threads.

So if you are doing any cloaking and getting away with it, I guess you may have to be on the lookout.

Postscript From Danny Sullivan: Over the years, Google has (in my view) redefined some things that would have been considered cloaking in the past to be acceptable, such as its First Click Free program. This has led, again in my view, to some confusion. For some background on this, please see:

Personally, I’ve long wanted Google to take a less technical approach to examining cloaking (and whether it should be banned) and instead one based more on intention. That would help in cases where people end up innocently cloaking and getting penalized, as happened when Google had to penalize itself last year:

My view was that it was cloaking, since something special was being done just for Google (regardless of whether users saw the same thing coming from Google). Cutts was away on vacation when this came up, and we’ve just not had a chance to connect further on this since then. I’m on vacation myself now, but I’ll follow up on this after the New Year, as well as more about what else Google may be planning in terms of cracking down on cloaking.