Some dodgy sites have been hit hard by the Farmer update. But some non-dodgy sites have also been hit hard -- so badly that Google is already trying to fix that collateral damage. As the dust begins to settle on the Farmer update, a simple question for our "Discussion of the Week" -- how should this update have worked? What did Google get right, and what did it get wrong?

Comments

The thing that strikes me the most about this is just how reactionary Google is being on this -- and how reactionary they've been recently. They're usually out front leading the way, but on this issue and so many others recently they're being led by public sentiment, media reports, and so forth. Go back to the Borker thing, then the JC Penney and Overstock news, and now content farms -- it's all reactionary and it seems rushed, even.

Google says this Farmer update was an algo change they've been working on for many, many months ... but I can't help but wonder if they pushed it out too soon in reaction to the recent spate of articles about search quality. That's my main thought when I hear about non-farm sites that have been hit.

I'm thrilled with this and other algorithmic tweaks that Google has rolled out as of late. It's simply making SEO tougher, which is a good thing for folks who know how to roll with the punches and build sites that are either a) immune to SEO tweaks or b) not built to live or die based on Google referrals.

P.S. I feel like the definition of a "non-farm" site is very subjective.

Matt M., I agree entirely that "Farmer" seems to have been a rushed implementation, and the driving force behind it appears to be more Google's concern over a chorus of commentators protesting the perceived amount of poor-quality content ("spam") in their SERPs than any urgently needed improvement to the SERPs.

I don't think, however, that their responses to JCP and Overstock can be lumped together with "Farmer" here. These were everyday penalties meted out to sites caught violating Google's guidelines, rather than algorithmic responses, and were extraordinary only in relation to the high profile of the sites impacted. Borker probably falls somewhere in between: it was an algorithmic update (giving further weight to something akin to sentiment analysis), but one much more limited in scope than "Farmer."

I certainly can't recall an update since "Florida" with such a wide-ranging impact on search rankings globally, and with such collateral damage reported. "Florida," I think, actually did represent a major innovation, and was driven by an actual need to correct the SERPs because rankings were being fairly easily manipulated by generating a high volume of poor quality links.

"Farmer," on the other hand, as a reaction to criticisms regarding the quality - as opposed to relevance - of sites appearing in the search results, has achieved the same sort of scale of collateral damage as "Florida" without actually hitting the mark. That is, in the simplest possible terms, the prominence of eHow pages that everyone had been complaining about did not appear to be diminished with (the initial phase of) this update. This to me is hardly surprising, as trying to carry out an objective (algorithmic) response to subjective (human) assessments of quality is - as we've seen - fraught with danger.

Google seems to be just responding to mainstream news, but they're not really doing anything to truly repair their results. They let eHow slip out of the Farmer update, while many smaller blogs, etc., were left to drown. The issue is more that Google is trying to find a way not to hit their pocketbook while keeping good organic results, and they may just have found that spot that's nestled between a rock and a hard place. There may not be a balance, and that's the issue I see.

Needed? Yes. Rushed? Probably. Is it a big deal? Not really. AltaVista's Black Monday and, as Aaranged mentioned, Google's Florida had a much bigger impact on more sites. And I agree with Jill, they've let their results slide a little in recent years. Too much spam, farmed content, and duplicate content has been indexed and ranking lately.

Since Google has been addressing spam and content farms in 2011, they are going to have to do something major about duplicate content, as that's likely to become the next target for improvement.

I don't think Google should get rid of eHow until it gets rid of Wikipedia. If we can find some more low quality sites to dominate their search results, we should insist that Google give the top five positions to only five Websites that provide the least amount of value to the general public (who don't realize just how little value they are actually obtaining).

I have to disagree with you, Michael. I know we all wish Wikipedia would not dominate the #1 position in the SERPs, but I almost always find useful information there, whereas on eHow I almost never find anything valuable or insightful. But I agree with your second point, and on the whole Farmer update in general we are taking the approach that there is now a void to be filled with excellent content, and we are working with our clients to fill it.

Well, at least they waited until after Xmas. On the positive side, there are a lot of companies that will spend the next month figuring out how to make their sites better for both bots and people. You can be sure that regardless of whether or not their site was hit, most site owners are having discussions about how to improve their products so they are less reliant on GOOG which can only mean better SERPs in the long run for GOOG.

The change may have been in response to criticism of Google's search results. However, it seems to have impacted sites that follow good practices of building content and either distributing it on the web or encouraging links to their content. The tactic could have been relied on too much by some and abused, while others were doing it above board. The result is that now you have to control who is linking to your content (and make sure they have a decent link profile).

I've noticed impacts on the company site I work for, where we saw decreases in rankings. The pages that dropped had been picked up by AdSense sites and had a larger number of total external domains (albeit a high percentage of low quality), while pages that were not impacted had no or very few external links. It seems like a portion of the algorithm change addresses the notion that some links, regardless of quality, are better than no external links. But how is one to control who links to their content?

I agree with @Aaranged comment "This to me is hardly surprising, as trying to carry out an objective (algorithmic) response to subjective (human) assessments of quality is - as we've seen - fraught with danger."

I agree Matt, it does seem reactionary and rushed. And while Google claims to have been working on (or thinking about) this for over a year, the attention and references to the Personal Blocklist Chrome Extension data are curious.

It's been very interesting to see the way Google is dealing with the media. First in the way they reacted to Rich Skrenta and Blekko, and then Vivek Wadhwa and Bing. These were all very 'inside baseball' type of engagements from my perspective. Ask anyone on the street (outside of San Francisco) if they've heard of Blekko, and I'd be surprised if more than 2 out of 100 said yes.

So was this more about their own vanity and image within the industry? By addressing it so vigorously, they just seem to invite more media to the table -- and the 'mainstream' media still doesn't have a fair grasp of search. But maybe that's the point.

But after rolling out Panda/Farmer, I think Google may have been surprised at the speed with which the community built a picture of the impact. This is the first major algorithm change to land with micro-blogging and social media so ubiquitous. Within a day we had multiple data sources, had identified collateral damage, and had begun speculating about how the change was implemented. I'm not sure Google was prepared to be as ... accountable as we're requiring them to be this time around.

Aaron is spot on with the distinction between relevance and quality. But setting aside whether the subjective opinion of site quality is accurate, the update was clearly an exercise in 'addition by subtraction'. It targeted sites that they classified as low-quality and demoted them. Cross your fingers that the stuff that was below it is better! On average, I'd say it's not that much better. Different? Sure. Better? Meh.

This technique also treats all content on these low-quality sites the same. So a great and very useful article on one of these sites is treated the same way as a terrible, grammatically incorrect mess of an article.

Shouldn't the goal be to find the best content regardless of source?

I know Google meant to do the right thing, but I've come to expect more from them.

Well, Wikipedia's information is always changing so I suppose that as far as dynamic, user-generated pap goes it's probably the best quality the Web has to offer right now.

So, if we're all agreed that Wikipedia can stay at number 1 for every keyword any eager Wikipedia editor can contrive, we can go back to discussing the Great Big Panda Update.

Ooo, I just want to give it a HUG!

I see things happening that only a couple of other people have so far alluded to. I suspect there is a layer to the new algorithm that has so far escaped most people's notice.

On the other hand, I was just telling a couple people on Saturday that I thought Natural Language Processing might have something to do with this -- machine learning and all that. I should have blasted that from the rooftops and taken credit for an ingenious 2-day analysis of Google's algorithm.

Then again, now that we know it was named Panda, I may still have an opportunity to earn my place in Google SERP history by writing a great piece of link bait...which, actually, I don't do.

Technically, I don't see any reason for Google to back off on this update. I believe they gave every site affected by it an escape path. If that's the case, search engine optimization will probably spawn a new sub-industry or two based on this update.

It's really not about the quality of the content so much as it seems to be about the quality of the presentation and user experience. After all, every major newspaper in the world has the same pictures and facts about Colonel Whatshisname in Libya -- they don't seem to be whining about lost traffic and search visibility.

I think the day of Feel Good Optimization may have arrived. Soon, we'll all be FGOs in addition to being SMOs and SEOs. So the good thing for us is that we'll be able to optimize for new acronym-laced blog brand names.

We've been in business for almost 3 years and have now been put out of business by Google's updates. We have a 100% original content site and maybe someone can take a look at it before we de-activate everything and tell me why we got canned. Two more people in the job market now! It's so sad because we are a tiny business trying to help dogs with our product. www.dermapaw.com

I agree with Hugo - bring it, Goog. And by bring it I also mean please humble the content farms with oh, 5 articles about reheating quiche (namely eHow.) To me, superfluity is an important content farm indicator that should've been examined more closely in the update.

Speaking of quiche, I'm disappointed that the update hasn't affected recipe SERPs and other nichey farms much. Google's new recipe search options are awesome but sites like Cooks.com (a recipe farm that attacks you with all-caps sidebars) are still on the first page for nearly every query with "recipe" in it.