Again, this is launching as soon as tomorrow and will look specifically at spammy queries, such as [payday loans], [casinos], [viagra] and other highly spammy terms.

What is Google's definition of a "spammy query"? The examples [payday loans], [casinos], [viagra] don't seem to have much in common, except that they are disreputable or unseemly in some people's minds. (Although another thing they may have in common is a lot of AdWords competition.)

Not too many spammers try to elbow each other out of the way for "mother teresa" or "history of borneo." "Casinos," "viagra," and "payday loans" are a different story, and it isn't too difficult to believe that Google has a pretty good idea of what the most popular "spammy queries" are.

I don't get it. A query on its own cannot be spam - surely only the websites that rank for that term can be classed as spam, so this update is going after spammy websites?

OK, so maybe Matt Cutts should have said "spam-prone queries." The point (IMHO) is that, if we're to believe what was announced, the newest version of the Payday Loan Algorithm will target search queries that attract spammers the way a picnic attracts ants. But--as always--the ranking decisions will be about pages and sites.

Well if Google has a separate part of the algorithm just for "spammy queries", as apparently they do, then don't they need to be able to identify them in some algorithmic way?

Of course.

Something else to consider about the "Payday Loan Algorithm": If I were a gambling man, I'd bet that Google isn't nearly as worried about collateral damage when ranking results for "spammy queries" as it might be for more general queries. Aside from anything else, the Payday Loan Algorithm is probably a great test bed for new or more aggressive spam-detection tools.

Info queries get treated one way, ecom queries get treated another, and certain uber-niches have special rules (e.g. health, travel, p0rn, news).

Now spam gets its own micro-algo, its own on-the-fly SERP-building rule set.
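That routing idea can be sketched in a few lines. To be clear, this is pure speculation made concrete: the categories, signals, and weights below are all invented for illustration, and nothing is known about how Google actually implements any of this.

```python
# Hypothetical sketch: route a query to a category-specific rule set
# ("micro-algo"). All categories, signals, and weights are invented.

SPAMMY_SEEDS = {"payday loans", "casinos", "viagra"}

def classify(query: str) -> str:
    """Very crude query classifier, just enough for the sketch."""
    if query in SPAMMY_SEEDS:
        return "spam-prone"
    if query.startswith("buy "):
        return "ecom"
    return "info"

# Each query category gets its own signal weighting.
RULE_SETS = {
    "spam-prone": {"link_quality": 0.7, "domain_age": 0.2, "relevance": 0.1},
    "ecom":       {"relevance": 0.5, "link_quality": 0.3, "domain_age": 0.2},
    "info":       {"relevance": 0.7, "link_quality": 0.2, "domain_age": 0.1},
}

def score(page_signals: dict, query: str) -> float:
    """Score a page under the rule set chosen by the query's category."""
    weights = RULE_SETS[classify(query)]
    return sum(w * page_signals.get(s, 0.0) for s, w in weights.items())
```

The point of the sketch is only that the same page signals can produce very different rankings once the weighting is chosen per query category rather than globally.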

The final ranking is based on the indexed scoring of the page... how each of those scores is calculated would be the focus of V2.0.

Actually, I would think that is why Brands have been over-emphasised until recently. Whatever scoring criteria (e.g. domain longevity, traffic numbers) represent brand-iness have been dialled up because they are unachievable for churn-and-burn sites. If there is a new separate algo for spammable queries, churn-and-burn sites will automatically become less of an issue for the rest of the web, allowing those "brandy" signals to be dialled back.

***** As for identifying target queries... once you have a seed set of spammy sites, track what they rank for. Frequency analysis will give you the spammy queries. The advantage is, doing it that way means a query gets covered as soon as it gets a history of being targeted by spammers.
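The seed-set idea above is simple enough to sketch. This is only one possible reading of it, with invented site names, data, and a made-up threshold parameter:

```python
# Hypothetical sketch of the seed-set approach described above: start
# from known-spammy sites, collect the queries they rank for, and use
# frequency analysis to flag queries targeted by multiple seed sites.
# All names and the `min_hits` threshold are invented for illustration.
from collections import Counter

def spammy_queries(rankings, seed_spam_sites, min_hits=2):
    """rankings: dict mapping site -> list of queries it ranks for.
    Returns the set of queries hit by at least `min_hits` seed sites."""
    counts = Counter()
    for site in seed_spam_sites:
        counts.update(rankings.get(site, []))
    return {q for q, n in counts.items() if n >= min_hits}

# Example with made-up data: only "payday loans" is targeted by
# more than one seed site, so only it gets flagged.
rankings = {
    "spam1.example": ["payday loans", "casinos"],
    "spam2.example": ["payday loans", "viagra"],
    "clean.example": ["history of borneo"],
}
flagged = spammy_queries(rankings, {"spam1.example", "spam2.example"})
```

As the comment notes, a query flagged by only one seed site is ignored until a second seed site targets it, which matches the "covered as soon as it gets a history of being targeted" idea.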

I guess the real question is whether this impending refresh of the spammy-query part of their algorithm will actually get rid of any spam in the search results. Evidently they created this special part of the algorithm out of frustration with their continuing inability to get rid of spam with the main algorithm all these years.

Whatever scoring criteria (e.g. domain longevity, traffic numbers) represent brand-iness have been dialled up because they are unachievable for churn-and-burn sites.

There clearly are other signals that can outweigh these though, perhaps dependent on query as you suggested, because the last time Google proudly announced they were cleaning up the payday loans SERPs they were handed their a** on a plate within a few days - see [seroundtable.com...]

There clearly are other signals that can outweigh these [brand signals] though

Sure, especially under the general algo. Imagine there are a bunch of signals. Some of these are usually really good indicators of quality, but are also gameable with enough effort. There are other blunt-instrument signals which are usually less helpful, but harder to game.

When people start spamming, you can dial up the un-gameable signals to minimise spam, but it impacts quality across the board. Or, you can aim for wide-based quality at the cost of highly spammed niches.
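The trade-off described above can be made concrete with a toy formula. Everything here, including the single `spam_pressure` knob, is invented purely to illustrate the reasoning:

```python
# Hypothetical illustration of the trade-off above: blend a rich but
# gameable quality signal with a blunt but hard-to-game signal (e.g. a
# brand proxy). The `spam_pressure` knob is invented for illustration.

def blended_score(quality, bluntness, spam_pressure):
    """spam_pressure in [0, 1]: 0 trusts the gameable quality signal
    entirely; 1 leans entirely on the blunt, ungameable signal."""
    return (1 - spam_pressure) * quality + spam_pressure * bluntness

# A spam page fakes high quality (0.9) but scores low on the blunt
# signal (0.1); a genuine brand has quality 0.8 and bluntness 0.5.
# With spam_pressure dialled up to 0.8, the spam page loses...
spam_page = blended_score(0.9, 0.1, 0.8)
brand_page = blended_score(0.8, 0.5, 0.8)
# ...but so does a high-quality small site with no brand signal,
# which is exactly the "impacts quality across the board" cost.
small_site = blended_score(0.9, 0.2, 0.8)
```

Dialling `spam_pressure` up suppresses the spam page, but the genuine small site gets dragged down with it, which is the across-the-board quality cost the post describes.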

That seems to have been the choice for quite a long time, until the simultaneous (coincidence?) Spam / Panda update. Suddenly, Brand signals had been dialled down on most niches, as if another tool had been deployed to clean the cesspit.

I would suggest Google actually got better at detecting spam. They then could dial down the Brand prophylactic. If I were Google, the next thing I would do is turn Brandiness back up on spammed niches, to further neutralise spamming attempts in those areas.

Whatever scoring criteria (e.g. domain longevity, traffic numbers) represent brand-iness have been dialled up because they are unachievable for churn-and-burn sites.

I agree that they have been dialed up, wayyyy up. As for unachievable for churn-and-burn sites... Months ago I got screwed by a well-known company with a big brand. The company has been around for 10 years and has had countless mentions and links from big, well-established sites like Amazon, Forbes...

One month ago I launched a new site on a new domain name. I put most of my focus into making the site look like the official website of the company that screwed me (without actually saying it or even coming close). I only have 3 links to the site and it has made it to the first page for most of its targeted terms, including the name of the company that screwed me. It is listed above Amazon and countless other much more established sites than mine.

The biggest thing I've learned about launching a new site these days is this... finesse. I waited about 2 weeks after launching the site before pointing a link at it. Then I watched its positions in Google, and when it stopped moving up and down in the results, I added another link. Waited for it to stop moving up (and down) again, and pointed another link at it. After the 3rd link, it's on the front page.

I agree with Shaddows. By creating a special separate part of the algorithm to deal with spammy queries, Google doesn't have to use the main core part of the algorithm for that purpose, and thus has more flexibility in finding ways to improve the general search results. Previously there was a conflict between efforts to get rid of spam and efforts to provide the most relevant and highest quality results. That's why they've never had much success at getting rid of spam in the past. But now they should be able to do a better job of it. And they should be able to improve the general results too. In fact it seems to me that this new approach could mean that Penguin and Panda might not be needed anymore.