Wired Magazine interviewed both Matt Cutts and Amit Singhal and in the process got some helpful insight into the Farm Update. I note that some of the speculation we've had at WebmasterWorld is confirmed:

Outside quality raters were involved at the beginning

...we used our standard evaluation system that we've developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: "Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?"

Excessive ads were part of the early definition

There was an engineer who came up with a rigorous set of questions, everything from: "Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?"

The update is algorithmic, not manual

...we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons.

Wow! Take a gander at some of those sites that people are reporting losses for. Go ahead and drill down to the MFA pages. I visited more than a few; all of them would have failed my initial sniff tests. Some of those folks have definitely stretched the limits of AdSense.

I agree that many of these sites have too much ad space, but there are several notable exceptions that lead me to suspect that ad space is not the primary factor here.

The one common thing I'm seeing in these sites has to do with overall look-and-feel and is hard to define, but it's basically this: upon landing on one of the internal pages of these sites, it's not immediately obvious that you've found the answer to your query (even in cases where a good answer really is there). This has to do with layout, typesetting and placement, which when badly done create an overall impression of confusion.

My hypothesis, based on this, is that user interaction data of some kind is the primary factor here. People land on these pages and react negatively in some way that Google can measure as indicating "this page doesn't immediately appear to be what I want."

This issue of the overall impression of the page as a landing page is something I've been working on for my site recently, as it was a common criticism from testers. I've made several changes over the past year or so that are designed to make it more immediately obvious that the searcher is in the right place: I changed the color scheme to look more like other sites in the genre, I made titles stand out more, and I put several graphical elements above the fold that make it more visually clear what the site is about. I gained traffic in this update, so perhaps my efforts have paid off.

Because it's something I've been thinking about, I noticed that most of these sites that lost traffic seem to have problems in this area. For example, one of them is an art site, optimized for art-related keywords -- but most of the pages have few or no actual artistic images above the fold. If I'm looking for art and I land on a page that's almost all text, with maybe one tiny image above the fold, I'm going to look elsewhere. I don't want text, I want art! I bet if that site were to put one or two large relevant images at the top of each page, it would do just fine.

I'm thinking this has more to do with visual design and overall look than anything else. If you've lost in this update, maybe consider hiring a top-notch graphic/layout designer.

However, I will add that bounce rates are gamed by most big sites. If you look for a product, they never make it available on the first page. There will always be a separate page to buy things.

Smaller sites tend to give it away on the landing page.

Bounce rate is a noisy signal, not a perfect one.

I think G is redefining most of these things these days, though they had ignored them in the past for good reasons. Is it because the people working on it now are different from those who worked on it earlier?

indyank, I doubt it's as simple as just measuring bounce rates, but in the spirit of sharing potentially useful data: my bounce rate is about 40% and has not changed significantly in years. The changes I speak of above did not affect bounce rate (that was kind of disappointing, I thought they would) but did have some impact on average pageviews and time on site.

Agreed with the sentiments about bounce rate. Often larger sites will make you click through to get the information you are looking for (e.g. click here for price or phone). I am of the frame of mind to provide that to end users without hassle. This affects both our bounce rate and time on site, but I feel it provides the better user experience. It would be a shame if that was somehow negative in Google's eyes.

I agree that many of these sites have too much ad space, but there are several notable exceptions that lead me to suspect that ad space is not the primary factor here.

The impact of the update is not necessarily site-wide. I have one site that was hit in part. The section that was hit (down roughly 40-50%) had very different HTML and CSS from the section that was not, but used the same advertisers and ad networks. Most of the pages in the affected section had fewer ad units per page than the unaffected section. I've recoded the pages and should know soon enough if that makes a difference.

If I'm looking for art and I land on a page that's almost all text, with maybe one tiny image above the fold, I'm going to look elsewhere.

freejung, everyone will have their own idea of what is good and what is bad. Human beings are not all the same. Even cultures are not the same.

I always prefer stories that flow naturally. Images should be in the right places within the overall flow.

G should focus more on whether the page has the content the user was looking for, rather than focusing on things that some may consider bad. If they tend to listen to stories from a few people, they will miss striking a balance with other cultures.

My feeling is G is being confused by outsiders to a great extent these days. They shouldn't be attempting to do everything. They should define a boundary at some point.

I look at bounce rates for my site as being significant, even if Google doesn't. Google Analytics was showing a bounce rate of 45-48%, which I thought was high, given the nature of my content.

I found a bit of code to add to GA's tracking that changes the bounce rate to not include single-page visits that last ten seconds or more. If somebody spends more than ten seconds on a page, I figure he's found something he was looking for, even if he doesn't go further.

At any rate, my bounce rate with the 10 second filter is anywhere from 8%-18% for most pages. Any pages higher than that are ones I need to scrutinize.
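Mechanically, the 10-second filter described above amounts to reclassifying which visits count as bounces. Here's a minimal sketch of that logic; the function names, parameters, and the 10-second threshold are just my reading of the description, not anything from Google Analytics itself:

```javascript
// Illustrative only: count a visit as a bounce only when it is a
// single-page visit AND shorter than the 10-second threshold.
function isAdjustedBounce(pagesViewed, secondsOnPage) {
  return pagesViewed === 1 && secondsOnPage < 10;
}

// Adjusted bounce rate over a set of visits, each shaped like
// { pagesViewed: number, secondsOnPage: number }.
function adjustedBounceRate(visits) {
  var bounces = visits.filter(function (v) {
    return isAdjustedBounce(v.pagesViewed, v.secondsOnPage);
  }).length;
  return bounces / visits.length;
}
```

In practice this kind of adjustment was typically done by having the page push a timed event to the tracker (e.g. a setTimeout that fires an event after 10 seconds), so that Analytics no longer treats the visit as a single-interaction session; the snippet above only shows the resulting definition of a bounce.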

If I were a conspiracy nut, I'd be inclined to think these new changes are intended to rotate a number of (not big name) sites out of the top spots in favor of a new set of wannabe sites trying to get into AdSense earnings and failing, since their sites just aren't quite good enough... after all, Google is an advertising company, and if they're not getting new advertisers, they're losing money.

freejung (and others who've mentioned above the fold): having to look or scroll down past a lot of advertising space is surely not ideal for a visitor. Yet there are such pages; at times they have decent info, but they aren't designed to present it without users making some effort. A bit like having the store to a museum, say, right on the way in. Nor does fixing such a massive issue really require a top-notch designer.

I've got a site with a bounce rate of 83% and Google traffic is up since the algo change. The information on this site could have been spread across hundreds of pages, but instead it's on a dozen long pages with some graphical navigation aids at the top. If a visitor doesn't see what they want there, it's not going to be on any other pages of the site, which the limited sidebar navigation makes obvious. It's not a major site, draws something over 500/day from Google.

I read through the entire list of sites reporting in on the WebMaster Tools thread. Yes, there were a number of MFAs, but what really stood out to me was all the 10-year-old sites that got slammed. One guy even had PR=8 for the homepage.

I don't have a great critical design eye, but some of the serious content sites (the basket mine falls into) looked awfully good to me. Many of them commented on getting ripped off all the time and suggested duplicate content issues.

And yes, I would have given them my credit card AND followed their medical advice.

but what really stood out to me was all the 10 year old sites that got slammed

I don't have a great critical design eye, but some of the serious content sites (the basket mine falls into) looked awfully good to me. Many of them commented on getting ripped off all the time and suggested duplicate content issues.

Yes to both. The site of mine that got hit the hardest is literally 10 years old. It has also been constantly scraped and ripped off more times than I can even put into words.

A pro SEO making money at the six-figure level told me this update is about Google Maps. Yes, Google Maps. Go figure... as I'm still trying to figure this out. I know he's a reputable source, so... perhaps this helps.

#1 It's not us, it's you
#2 Don't complain in public
#3 Accuse yourself of blackhat SEO
#4 Accuse yourself of bad content
#5 Spend your time on SEO rather than content
#6 Don't offer suggestions, we're smarter than all of you put together

Tashi - whatever happened to the "a picture is worth a thousand words" mantra?

We will write an additional 500 words of article text around the photo gallery on this page, widen the page, double the length of the descriptions, and see what happens. We live, work and play in our niche, so it's no problem to generate content.

She was just trying to be helpful. I got a lot out of her article. She just tried to point out that if you want to survive long term, it helps not to take algorithm changes personally. Most site owners aren't going to stay at the top of Google or any other search engine year after year without doing some analysis of what types of pages the algorithm is favoring and what types of pages are not doing as well. My older pages got hit because they were designed for an algorithm in existence 10 years ago.

I never designed for an algo, I designed for people. That's still Google's official advice; it simply may not be true anymore.

When you redesign your sites and regain your positions, you can thank Google if you think it's appropriate. I think you're getting a little ahead of the game thanking them for making work for you before you know the outcome.

I agree she was trying to be helpful, I just find it offensive how she went about it.

When you redesign your sites and regain your positions, you can thank Google if you think it's appropriate. I think you're getting a little ahead of the game thanking them for making work for you before you know the outcome.

I wasn't thanking Google. I was thanking an ex-employee at Google for sharing her insights.

I never designed for an algo, I designed for people. That's still Google's official advice; it simply may not be true anymore.

In the competitive categories, there are often millions of search returns for a single search term. Good content alone usually isn't enough to get you in the top ten spots.

If you want a good rule of thumb for where Google is aiming, think "measuring engagement". I know that "engagement" sounds like a social media metric, but more and more it will also be important for SEO. We could even understand the Panda update as a first venture into that territory.
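To make that idea concrete, here is a purely hypothetical toy "engagement score". None of these signals, weights, or caps are confirmed ranking factors; this only illustrates what folding several interaction signals into a single number might look like:

```javascript
// Hypothetical example only: combine time on site, visit depth, and
// return visits into one 0..1 score. The weights and caps are invented
// for illustration, not known Google values.
function engagementScore(visit) {
  var timeScore = Math.min(visit.secondsOnSite / 180, 1); // cap at 3 minutes
  var depthScore = Math.min(visit.pageviews / 5, 1);      // cap at 5 pages
  var returnScore = visit.isReturnVisitor ? 1 : 0;
  return 0.5 * timeScore + 0.3 * depthScore + 0.2 * returnScore;
}
```

Under a model like this, the "click through for the price" pattern discussed earlier could actually score well (more pageviews, more time), which is one reason raw bounce rate alone seems too crude to be the whole story.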