Another forum's data that I've seen lines up with what dataguy shared. Some pages significantly up, others significantly down. I agree: this looks more like a document-level multiplier than a site-wide one to me.

And that makes sense with the softer information we have. Remember, this new algo factor was more than a year in development. That sounds like something quite fine-tuned, and not something that uses only broad-brush factors like backlink quality or bounce rates.

A couple of people have floated the idea that reading level might be in the mix, something we know Google measures. It couldn't be the whole thing, though.
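For anyone curious what "reading level" even means algorithmically: the classic measure is the Flesch Reading Ease formula. Here's a toy sketch with my own crude syllable counter; this is just the textbook formula, not anything Google has confirmed using:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels; every word gets at least one.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    # Flesch Reading Ease: higher score = easier text.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

simple = "The cat sat on the mat. It was warm."
dense = ("Comprehensive algorithmic evaluation necessitates "
         "sophisticated multidimensional characterization methodologies.")
```

A simple sentence scores way higher (easier) than keyword-stuffed jargon, so a factor like this could plausibly separate readable articles from word soup, but as said, it couldn't be the whole thing.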

I also noticed that Google people have gone to some lengths not to use the phrase "content farm", which would suggest a domain-level multiplier. Instead, they talk about quality and usefulness.

I read an interview with someone from eHow yesterday, talking about this update. He said that eHow also has some pages up and some pages down. I'd suspect they have more pages up than down, though.

I have checked a few e-commerce sites we work with, and all seem to be fine. Can someone chime in? If you actually sell and ship widgets, rather than just copy/pasting manufacturer descriptions, are you down as well?

We sell and ship a wide selection of widgets across numerous product categories on a 17-year-old site. Original, detailed, well-written content. 40% traffic loss.

I feel like it can't be based mostly on backlinks. Many of these big companies, and the sites I know of personally, have tons of quality backlinks (.edu, .gov, NYT, ...). Take suite101, for example: they have 1.5 million backlinks according to Yahoo, yet they are getting hurt in this update. So if this is an algorithmic change, then backlinks can't be part of the equation, or at least not the authority backlinks we used to know and love...

By the way, I met with my staff Friday and reduced everyone's hours by 10 per week, effective immediately. Come Monday we're on a 30-hour work week. I feel bad for myself, but even worse for them, as they truly cannot afford this loss in wages.

@krikomoush, I quite agree. This update does not seem to be based on authority backlinks, judging by my research.

Many of the clever ideas and theories expounded in the various threads seem to have some merit. I think there are clearly site-wide penalties, which are perhaps being applied to those who do not deserve them. Most of us would agree that spam in the form of "content farms" is not ideal (most of us do not wish to be replaced by robots), but it is obviously not fair for those who "play by the rules" as they empirically understood them to be placed in the same boat when their content is original and valuable. Hopefully the factors used will evolve quickly, and those who are seeing falling traffic will recover. Yes, there is no "entitlement," and "life is not fair," but there are thousands of days and nights devoted to putting up pages which many of us rightfully feel are valuable.

G does have a long record of being reasonably consistent, but there is a chance that big revenue producers may get whacked for no reason, which is not good for anyone.

See, the problem is fresh content. The internet has been a cheap and easy way to produce content, but new websites have a problem getting new content in. Google's way may or may not be fair, but who is it unfair to? Is it the old guy who has not changed his website in years and expects to stay at the top, or the new guy on the scene who doesn't have a chance? The key, in my opinion, is that Google wants new content to show, because a lot of the old content is making them boring. If you search for something new, you will find it on page 37 of a Google search.

Now, SEO is only good for a month or two without providing fresh content to a website. Change the keywords and so on, and that may help for a day or two, but the bots will pick it up quickly. I have found that Craigslist is a leading way to get and stay at the top of Google. The sole reason is fresh content all of the time. It is local and it works. The only problem is that you will find some old shoes for sale.

This being said, the way to get fresh content is that everyone else writes it, not the same old person sitting back and answering every post with "I agree" or "I disagree". Forums do well with the agree-or-disagree approach, but what do they offer other than reading the opinion of everyone who posts? I think Google is correct in what it is doing. Staying at the top requires fresh content without cheating. You will see in the future that SEO will be much less effective, too. Links to pages will be falling, and negatives against websites will be discounted as well. Google has a business model, and the model must be rewritten all the time to stay profitable. I don't know about you, but I get tired of seeing the same old Google ad when I visit a website, just because I did numerous searches looking for something I already found a month ago. You don't profit on yesterday's news.

A lot of people are saying it, and I just want to add to the discussion that I am noticing sites that have scraped our original content are ranking ahead of us in the SERPs. I have also noticed that other commerce sites that used the manufacturer's description are ranking high (without any additional unique content). What is happening is that for many of the key terms you will find 50% of the results being for the same item, whereas before it would have been less homogeneous.

Personally, I wonder if it's the 'new' content you're talking about that they're looking for, or if it's the 'freshness' (or 'staleness') of the page itself that makes the difference.

I've seen many results where the date of the "blog entry" is older than our content, yet it's ranking higher where we used to rank... So I don't think it is necessarily fresh content... Though I have been noticing a lot more Craigslist and eBay than there used to be...

grimmer, the way around this is to add a new product with new content. Do not use the same description your supplier may have on their site. Keywords for the new product must be different, with at least one new keyword; you might also think of backlinking the new product on other sites, and change that frequently too. The key to SEO is fresh content on a nearly daily basis, which requires a lot of work for the people doing SEO; either that, or no SEO at all, provided the website is fresh and offers something different. A branded website, even a popular one with a customer following, will lose sales if it does not get the new product noticed. You will see search engines changing in the near future to bump websites that either pay for advertising or have new and exciting content. So the bottom line will be the bottom line for all search engines. It is possible for a search engine to lose advertising dollars even while making record profits.

That could depend on your definition of fresh... In rankings, fresh does not mean new... It means fresh, and freshness cascades much like PR. (That's to the best of my understanding, and about as simple as I can put it.)

The bottom line for any business is the bottom line. For a long time this has been overlooked by anyone who does SEO and has taken the search engines for granted. This is the very reason you will see paying sites moving ahead of free ones. Think about what Lenny2 posted: a lot more Craigslist and eBay than there used to be. What they have in common is fresh content with new names popping up. Also, remember that Google is trying to find the key to local searches. SEO will never be able to hide that. You may find that a company comes up first when someone searches in California, because the company is based in California, while it comes up on page 5 in Maine. Again, no matter what, fresh content will be the key in the future.

I am working on a very high-traffic, high-PR site that has lost 20% of its traffic. I believe it is site-wide. Most of my rankings have dropped; some have not. But in each and every case where I maintained a number 1 position, there was no real competition, so even if I took a modest ding, it wouldn't push me down.

And the dings have been modest. But if everything goes down 4 places, you are going to lose a lot of your traffic as things get knocked to the second page.

Also, every part of my site has been hit, every page type, all with a similar hit. Even pages I believe are much higher quality.

So I believe it is site-wide, similar to a PR ding (but I don't think it is that either, because my higher-PR pages would still have maintained their number 1 rankings). Just a new judgment on a "site". When Google announced the update, they talked about the quality of sites, not pages.

So how are they judging quality? What makes a good page, algo-wise? And can they now be making a site-wide assessment based on behavior? (For instance, a second click on a search result within 15 seconds of the last click is a definite quality signal.) These are the things I am wondering about.
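To make that behavior signal concrete: a quick return to the results page, a "short click", could be tallied against a result like this. The 15-second threshold comes from the post above; everything else (the function, the log format) is purely hypothetical, since nobody outside Google knows how, or whether, this is counted:

```python
from collections import defaultdict

# Hypothetical threshold: returning to the SERP within 15 seconds
# counts as a "short click" (pogo-sticking), a negative signal.
SHORT_CLICK_SECONDS = 15

def short_click_rate(click_log):
    """click_log: list of (url, dwell_seconds_before_returning_to_serp)."""
    stats = defaultdict(lambda: [0, 0])  # url -> [short_clicks, total_clicks]
    for url, dwell in click_log:
        stats[url][1] += 1
        if dwell < SHORT_CLICK_SECONDS:
            stats[url][0] += 1
    return {url: short / total for url, (short, total) in stats.items()}

log = [("example.com/a", 4), ("example.com/a", 120), ("example.com/b", 300)]
rates = short_click_rate(log)
```

A page where half the visitors bounce straight back to the SERP would look very different from one where people stay and read, and that is the kind of aggregate a site-wide assessment could be built on.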

What are the negative quality signals?
- Large amounts of duplicate content from other domains
- Pages assembled from small pieces of content from your own site (typically done on large dynamic sites to make up for lack of content)
- Lack of original images
- Heavy cross-linking, perhaps (content done for spiders)
- Lack of complete sentences
- Content that does not make sense (again, stuff assembled just to create a page; beyond the title and maybe a sentence, the rest does not hold up thematically)
- Poorly organized content (not sure how to do this with an algo, but eHow's content is certainly well organized)
- Content that is just too similar to other content on your site (shuffling the same content to get different results, or even eHow's case of writing 10 articles on how to tie a bow tie: not the same text, but really the same content with different titles, purely for SEO purposes)
- Poor ad-to-content ratio

As someone said earlier in the thread, it is likely a percentage of overall indexed pages that they judge as low quality that generates a low site quality rating.

This could explain why some sites with really good content are getting knocked down. How does their overall pool of content look? Are they generating thousands of tag pages or something like that, which are drowning out the good content in the overall pool of indexed content?
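The percentage theory is easy to sketch. All the numbers here (the 50% threshold, the 0.6 multiplier) are invented for illustration; the only point is that under this model, a minority of good pages can't save a site whose index is dominated by thin pages:

```python
# Toy model of the "percentage of low-quality pages" theory: if the share
# of indexed pages flagged low-quality crosses a threshold, every page on
# the site gets the same ranking multiplier. All constants are invented.
LOW_QUALITY_SHARE_THRESHOLD = 0.5
SITE_WIDE_MULTIPLIER = 0.6  # hypothetical demotion applied to every page

def site_multiplier(page_flags):
    """page_flags: list of booleans, True = page judged low quality."""
    if not page_flags:
        return 1.0
    share = sum(page_flags) / len(page_flags)
    return SITE_WIDE_MULTIPLIER if share > LOW_QUALITY_SHARE_THRESHOLD else 1.0

# A site with 2,000 good articles drowned out by 16,000 thin tag pages
# gets demoted site-wide; a small all-quality site is untouched.
flags = [False] * 2000 + [True] * 16000
```

Under this model, pruning the thin pages from the index (rather than improving the good ones) is what moves the needle, which would match what people are reporting.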

Anybody have other ideas on what other signals might indicate quality of content and any indication that it is based on user behavior?

@seoguy9999, I think you are correct on most of your points, mainly "Pages assembled from small pieces of content from your own site".

I have several pages with tags, and when you click on these pages they show all the topics for that tag (kind of like Stack Overflow tags). I have around 16,000 tags for 2,000 articles. I also have some small pieces of content. I've decided to remove these pages from the sitemaps and block Google from crawling them. Not sure if I need the reconsideration option. I will wait and see if this helps. My competitors are doing great after this update, and they don't have this many pages indexed. I think the number of quality pages indexed in Google is the key to solving this problem.
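For anyone trying the same thing, the crawl block can be a couple of robots.txt lines, assuming your tag pages all live under a /tag/ path (adjust to your own URL structure):

```
User-agent: *
Disallow: /tag/
```

One caveat: robots.txt only stops crawling; pages that are already indexed may need a noindex robots meta tag on the pages themselves before they drop out of the index.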

Coming to your point about repetitive elements on each page: if you look at answers.com, for every question, at the end of the page they add "Our contributors said this page should be displayed for the questions below" and generate 10 variations of the same subject, but they are doing fine.

seoguy9999, I had a paragraph to post, but I erased it. All I care to say is the bottom line. You all need to think about the bottom line of any business. What makes Google successful, and what doesn't? I will give you an idea. I used to work for a company that required me to make at least 5 service calls a day. I only averaged 3; yet I always ranked in the top 5 of the 60 people every month (and neither did any of the other top 5 make their quota). I never made my 5 required service calls, so how did I rank so high? Again, the bottom line is the bottom line.

seoguy9999, 99% of 2010-2011 pure spam is better than what you described, minus the interlinking (though "related stories" and "you may also like" are on almost all sites). Spam has moved up; it's no longer a bunch of words mish-mashed together by a PHP script. Take all the sites hit and you see full sentences, albeit cheesy ones like "Cancer is a really bad disease." Can Google grade each page like a prof grades student papers? I don't know.

Maybe Google took the words individually and compared them with the words on other pages/sites, without the arrangement, but that's a bit scary and would result in many false positives.
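That order-free comparison is easy to sketch, and so is the false-positive worry: a plain bag-of-words cosine similarity scores two pages as identical whenever they use the same words, no matter how the words are arranged. Toy code, not anything Google has described:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Compare two texts by word counts alone, ignoring word order."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(c * c for c in a.values()))
            * math.sqrt(sum(c * c for c in b.values())))
    return dot / norm if norm else 0.0

page = "cancer is a really bad disease"
shuffled = "a really bad disease is cancer"  # same words, different order
```

Here the shuffled sentence scores a perfect 1.0 against the original even though it was never copied word for word, which is exactly why comparing words "without the arranging" would flag a lot of innocent pages.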

Algorithms and hits. Let's talk about these. What is creating hits, and what is not? You see that many of the sites with old content are staying at the top, while many with new content are not placing at all. Now, I am guessing that most of you are SEO pros. If I did a search for SEO, where does your site place in the search engine results, and why? If you are the best, you may place on page 23, while the worst may place number one. 15 minutes of fame, and again, the bottom line. Why else would a company change their algo? You may see the largest social networking site getting less notice from search engines in the future, too.

Just saw this link [sistrix.com ]. The biggest loser is suite101.com; answerbag is on there, but eHow is not in the top 25 losers after the Farmer update. So we need to check why eHow is doing better than the others.

Ironically, our positions on items on our ecomm site are now lower than those of the content farms. On one keyword we went from the 2nd page to the 18th, and eHow is now the listing above us. Also, the page from our site that Google has chosen to show for this particular keyword is much less relevant than the old one.