I have a blog that used to be an autoblog. It now has roughly 1,200 copied articles that are being spun and also uniquified with some plugins I have.

Google has not given me any issues yet and has indexed all the pages of that blog. I would like to keep it that way, so I am trying to brainstorm how I can add a plugin or something to that blog that would make each page even more unique. Any ideas?

I was thinking of maybe having Twitter feeds displayed in the sidebar etc., but to be honest I am not sure that would help. Any thoughts?

You guys are overthinking this, dupe content really isn't that big of a deal. You really don't see dupe content penalties unless you have multiple pages on your own site that have the same content. There's just no way for Google to automatically penalize content that isn't unique; it would be impossible for them to automate that because there is no way to know which is the original and which is the duplicate.

For example, try searching Google for this phrase: "The earth was without form and void, and darkness was upon the face of the deep". You will see over 640,000 pages that contain the same text.

You are better off spending your time building links or optimizing for the SERPs. Besides, spinning makes your site look like spam and is therefore far more likely to get dropped from the Google index altogether.
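For what it's worth, comparing pages for near-duplicate text is a well-studied problem, and the basic idea fits in a few lines. Here is a toy sketch (purely illustrative; no claim that this resembles whatever Google actually runs) using w-shingling and Jaccard similarity, with the example phrase above:

```python
# Toy near-duplicate detector: w-shingling + Jaccard similarity.
# Illustrative only -- real search engines use far more elaborate pipelines.

def shingles(text, w=4):
    """Return the set of w-word shingles (sliding word windows) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "The earth was without form and void, and darkness was upon the face of the deep."
scraped = "The earth was without form and void, and darkness was upon the face of the deep. Read more on our blog!"
unrelated = "Build links and optimize your pages instead of spinning content."

print(jaccard(shingles(original), shingles(scraped)))    # high -- near-duplicate
print(jaccard(shingles(original), shingles(unrelated)))  # near zero -- unrelated
```

Spotting *that* two pages share text is cheap; the genuinely hard part, which the argument below turns on, is deciding which copy is the original.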

You guys are overthinking this, dupe content really isn't that big of a deal. You really don't see dupe content penalties unless you have multiple pages on your own site that have the same content.

Maybe... or multiple pages (all of your pages) scraped from other sites.

There's just no way for Google to automatically penalize content that isn't unique; it would be impossible for them to automate that because there is no way to know which is the original and which is the duplicate.

Yes... the first page indexed with the original content would get some form of "credit". If your site turns up with that scraped content two months later, Google cannot see that?
You're kidding, right? The algo misses that? Then what would be the point of having complex search and content algorithms if they couldn't even do that?
Go to any competitive search term and see if you can find scraped content on the first pages (on sites that have been sitting there a long time).
Most of the time you cannot. Why?
Yes, I realise the odd black hatter can get away with it for a while with backlinks etc. Big deal. And it gets worse: why in hell do so many software packages bust their arse to mix up content, spin content, make content original, change content? What would be the point? All those software writers and programmers are wasting their time, yet they are all too stupid to work this out; they could save the time and effort, just scrape and slap up the same content, and it would all be rosy... if you build backlinks, of course.
Houston, we have a problem. If your hypothesis were right, it's amazing that we don't see word-for-word duplicate content and affiliate pages on every site and page for any decent term, because it would just require backlinks, right? But we don't even see this. Sorry, you're not even wrong.

For example, try searching google for this phrase: "The earth was without form and void, and darkness was upon the face of the deep". You will see over 640,000 pages that contain the same text.

Great example supporting my argument. So what? Is that a competitive buying term?
And some sites will have word strings that are the same as other sites'. So what?

Besides, spinning makes your site look like spam and is therefore way more likely to get dropped from the google index altogether.

To a manual examination, yes, that's bleeding obvious.
But if you are implying some form of algorithm picks it up, that's crap.
The algo, according to your ideas, misses duplicate content and ranks you anyway (crap), but is sophisticated enough to pick up poorly written grammatical strings that are all original content?!
If it could do that, I (and everyone else) would have been losing cloaked sites for years, and that's just Markov-text garbage. I only lose cloaked sites when I go nuts with crap software like AutoPligg etc. Otherwise they sit there for years.

Yes... the first page indexed with the original content would get some form of "credit". If your site turns up with that scraped content two months later, Google cannot see that?


Just because Google indexed one site first doesn't mean that site has the original content. Plus, the same content on two sites doesn't mean the content is duplicate spam. Just Google any AP news story and see how many places it shows up. There's no way Google would penalize two sites just because they both carry the same story.

Go to any competitive search term and see if you can find scraped content on the first pages (on sites that have been sitting there a long time). Most of the time you cannot. Why?


I have run hundreds of autoblogs over the years and deal with tens of thousands of customers and their autoblogs (I wrote AutoBlogged). I have had many sites ranking higher than the original content and I have NEVER spun content.

The reason you don't often see scraped content ranking high is usually that no one ever links to the scraped version; most people link to the original content, which is usually already on a highly ranked site with strong internal links. It has nothing to do with dupe content penalties.

why in hell do so many software packages bust their arse to mix up content, spin content, make content original, change content? What would be the point? All those software writers and programmers are wasting their time...


Because a lot of people buy that software; it's a huge money maker. But the fact that there is so much content-spinning software doesn't validate the misconception about dupe content penalties.

Awesome feedback, guys/gals. I am trying to come clean with all of my sites, which is why I am getting rid of autoblog plugins. You have made me rethink my stance on unique content. I know duplicate content refers to having the same content on your own site multiple times. In my case it's on my site just once... but likely also on thousands of other sites.

I wish people would just shut up about duplicate content within the same site. The idea is fucking stupid anyway. I mean, who gives a fuck if one of your pages is penalized over another of your pages that has the same content?

As for "that" duplicate content, well, the brainiacs at Google would like us to believe that they can tell whose content came first, but I don't think so. Their whole system is based on trust, so if your site is somehow trusted you'll have no problem copying content from anywhere. Unless there's a DMCA involved; then it sucks to be you, more so if you live in the US of A.

You guys are overthinking this, dupe content really isn't that big of a deal. You really don't see dupe content penalties unless you have multiple pages on your own site that have the same content. There's just no way for Google to automatically penalize content that isn't unique; it would be impossible for them to automate that because there is no way to know which is the original and which is the duplicate.

For example, try searching Google for this phrase: "The earth was without form and void, and darkness was upon the face of the deep". You will see over 640,000 pages that contain the same text.

You are better off spending your time building links or optimizing for the SERPs. Besides, spinning makes your site look like spam and is therefore far more likely to get dropped from the Google index altogether.


Of course this isn't strictly true, otherwise I would still have my autoblogs in the SERPs. Plus, most CMSs have post-date data which the SEs can read, as well as reading the properties of the page itself to determine its age.
