Comments

Michael, it could be as simple as this: they don't care what the SERPs look like for a query like "7 things airlines should jettison from their planes now". Showing duplicate stories for such a low-competition phrase is not a big deal and might even be ideal in their eyes.

Michael, those other phrases you gave as examples are the same type of long-tail, multi-word terms. There are likely so many identical items because there's likely not much else to show, so instead of showing only one result, they show the ones they have. It's a bit of a lame example. How about checking how many times one of your example articles shows up for a search term like "palin pipeline", which should still be vague but would produce more variety in the result pages?

@SanDiegoSEO: in many cases Google will show a message along the lines of "we found a bunch of results really similar but didn't show them; click here if you want to see them". However, when trusted domains are involved, the algo rules are different. Showing the Reddit page, which has no content, no comments, and only a title, is interesting because it ranks purely on its domain trust.

- mg

@graywolf: Are you sure only trusted sites escape the supplemental index this way? I recall seeing a few results where a site only a few months old was also listed in the SERPs (with copied/syndicated content) along with the original site.

@SanDiegoSEO: It's just that since two-word keywords also have a lot of other applicable results, you don't see many such duplicates on the first page. But they only move to the later pages (they don't go to the supplemental index as they should). See this for an example: http://www.google.com/search?q=acne+formation&pws=0 — can you see ezinearticles at #3 and AmericanChronicle at #6? And on SERP page 2, you will see articlesbase.com at #13. All three have the same article inside.

@incredihelp: I agree that it's not a competitive phrase. But SanDiegoSEO was framing this as only a long-tail, multi-word problem; he wasn't referring to competitiveness.

Secondly, of course when the SERPs are competitive there will be many sites optimizing for the first page, so you would see a variety of results in the first few pages. But you can't check the 978th result to see if it matches any of the other 1000. I can't think of any feasible way of testing whether this holds true for competitive phrases, also because after the 1000th result you don't know what's in the supplemental index and what's not (that's why I didn't address @you!).

But even for "acne formation", you can find around 210 monthly searches in the Google AdWords tool, and it also appears in Google Suggest. Although these tools aren't perfect, if Google's own system shows that a few people consistently search for a term, then Google isn't serving its own purpose by showing the same results.

The point is that not all duplicate content gets filtered right away (especially if it comes from trusted sites), and it doesn't go to the supplemental index where it should.

It all boils down to crediting the original source. MSNBC and Tripso both reprinted the article without linking to the original, and their PR or trust outweighs elliot.org. The problem would be fixed if they linked to the original article in their reprints.

What cracks me up is the diggbait article (at #1 for the title) printed on gadling.com, which was passed off as original content but was really just a warmed-over summary of his article (complete with hot girls at the bottom to try and get the extra Diggs).

In my understanding, duplicate content should be distinguished from so-called plagiarism. Say I have an article or a press release syndicated via different PR and/or article sites. All those pages are indexed and appear in Google search, none excluded, including my blog if I post the content there as well. Google probably filters or diminishes the value of my backlinks if I have many links from the same URL pointing to my site.