Google’s rolling out a new system where ad landing pages will be automatically spidered by a new AdsBot. The content of landing pages will help determine the quality of an ad campaign. That quality score, along with the amount you are willing to pay, is then used to determine an ad’s AdRank, the position where an ad will appear in the results. A high quality score means you can rank higher even if you pay less than others. And not participating in the new spidering system can hurt your AdRank.
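Google hasn't published the exact formula, but AdRank at this point is commonly described as the product of your maximum bid and your quality score. Here's a minimal sketch of that multiplicative model — the function name and the specific numbers are my own illustration, not Google's:

```python
def ad_rank(max_cpc, quality_score):
    """Illustrative only: AdRank modeled as bid times quality score.

    Google's real formula isn't public; this just shows why a high
    quality score can let a lower bidder outrank a higher one.
    """
    return max_cpc * quality_score

# A $1.00 bidder with quality score 8 outranks a $1.50 bidder
# with quality score 4, since 8.0 > 6.0.
ads = [("high-quality ad", ad_rank(1.00, 8)),
       ("low-quality ad", ad_rank(1.50, 4))]
ads.sort(key=lambda pair: pair[1], reverse=True)
print(ads[0][0])
```

In other words, under this kind of model, quality is a lever you can pull instead of simply raising your bid.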

What’s the deal? Didn’t Google already spider landing pages, as part of the announcement back in December that landing page content would be assessed? From my conversations with Google, my understanding is that this happened only if the AdSense spider had already seen the page for ad content placement purposes or if regular Googlebot had already indexed the page for inclusion in the web search index. If the page wasn’t visible to these or perhaps some other Google spiders, or had been specifically blocked from spidering, then AdWords couldn’t assess it.

Sometime in the coming weeks, a new AdsBot crawler will be grabbing all landing pages independently of AdSense, Googlebot or other Google spiders. Can you still block being spidered? Yes. But if you do so, Google AdWords will consider you a “non-participating advertiser” in the review process. As a result, you’ll take a ding on your overall AdWords quality score. Google’s help page on the subject puts it this way:

While you can exclude your site from review, this will provide us with little information about your landing page’s quality and relevance. Therefore, if you restrict AdWords from visiting your landing pages, you will experience a drop in Quality Scores for your related keywords. (This will cause higher minimum bid requirements for any landing page for which you’ve restricted access.)

That page also explains how to block AdsBot from getting your pages, how the visits won’t cost you money even though AdsBot is following your ad links and how blocking or allowing AdsBot to your site will have no impact on what Googlebot thinks about it in terms of ranking it for free, organic results.
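For those who do decide to block it, the crawler identifies itself with its own user-agent string. A robots.txt entry along these lines should do it — AdsBot-Google is the user-agent token Google has documented for this crawler, but check Google's own page for the current details:

```
# Block Google's AdsBot from crawling ad landing pages.
# Note: per Google's documentation, AdsBot must be named explicitly;
# a blanket "User-agent: *" disallow is not enough to keep it out.
User-agent: AdsBot-Google
Disallow: /
```

Just remember the tradeoff described above: blocking AdsBot means being treated as a non-participating advertiser.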

I talked with Google about the coming changes during my visit there last week. They’re all part of Google’s efforts to improve relevancy.

Google’s aware of concerns from both users and some advertisers about low quality landing pages, in particular sites that do nothing but show you the same ads you already saw on Google. This type of arbitraging recently got attention in our Search Engine Watch Forum thread, AdWords Abuse & Search Arbitrage, which in turn begat a video illustrating some examples.

From an advertiser point of view, Google’s not finding that ads of this nature are bad, even if some advertisers disagree. Nicholas Fox, senior business product manager for AdWords, told me:

Advertisers are in two camps. Some don’t want to be on the pages and don’t want these ads competing with them. Others say “these ads are fine, they convert, I’m getting leads and no complaints.” We look at this closely internally. There’s no reason to believe we are delivering poor ROI through these sites. We are measuring, and it doesn’t set off the alarm bells.

Instead, Google’s more concerned that these types of landing pages might hurt the user experience. Said Fox:

I am very concerned about the user experience. They do the search, see both unpaid listings and ads, then click and get to another page of ads. That’s not a good user experience. It’s a pretty bad experience. My guess is that as a result of seeing these ads, users are going to become blind to them.

To solve the problem, Google shifted away in the middle of last year from the clickthrough rate × cost per click model it had used for ranking ads. Google found it could pick up quality signals from landing pages, which it started making use of last December, then relied on more heavily last April (which is one reason why some people saw CPC spikes, Google said). Other signals, such as whether people click on an ad and then select another ad, can also be used to help assess potential quality.
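To see why that shift matters for arbitrage pages, here's a toy comparison. The landing-page factor and all the numbers are invented for illustration — Google's actual signals and weighting aren't public:

```python
def old_rank(ctr, max_cpc):
    """Old-style ranking: clickthrough rate times cost per click."""
    return ctr * max_cpc

def new_rank(ctr, landing_page_factor, max_cpc):
    """Quality-score-style ranking: fold a landing page signal
    (here a made-up 0..1 relevance factor) into the score."""
    return ctr * landing_page_factor * max_cpc

# An ad-filled arbitrage page can have a strong CTR...
arbitrage = {"ctr": 0.05, "lp": 0.3, "cpc": 1.00}
quality   = {"ctr": 0.04, "lp": 1.0, "cpc": 1.00}

# ...so it wins under the old CTR x CPC model (0.05 > 0.04)...
arbitrage_wins_old = (old_rank(arbitrage["ctr"], arbitrage["cpc"]) >
                      old_rank(quality["ctr"], quality["cpc"]))

# ...but loses once landing page quality counts (0.015 < 0.04).
quality_wins_new = (new_rank(quality["ctr"], quality["lp"], quality["cpc"]) >
                    new_rank(arbitrage["ctr"], arbitrage["lp"], arbitrage["cpc"]))
```

The point is only that a landing-page term in the score penalizes pages whose clicks don't lead anywhere useful, however Google actually computes it.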

Google said it’s not certain how often AdsBot will refresh its collection. It won’t be continuous at first but is expected to ramp up in activity over time. As for what makes a good landing page, Fox didn’t have specifics. However, human reviewers remain part of the process, especially to provide benchmarks to help train the automated system and do double-checking.

Google: AdWords, part of the Search Topics area for Search Engine Watch members, has a number of articles and posts about quality issues.

Finally, on a separate issue, Fox said Google’s aware of concerns about high minimum bids being suggested for low volume ads (such as John Battelle encountered recently), and it’s working to reduce how often this happens. Also, as ads run over time, Google learns whether their costs should come down; a predictive system is involved.