As my site is quite new, I thought it would take a while before anything on it would be seen, let alone copied. So I thought I would check here first so you guys could have a look at it for me and tell me whether I'm thinking about this correctly.

This article is mine: it was written by my article guy, the copyright is mine, and I have not authorised anyone else to use it.

To reduce the theft somewhat, try to limit access by known scraper bots.
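One simple way to do that is to match each request's User-Agent header against a blocklist of scraper signatures and deny matches. Here is a minimal sketch; the signatures below are illustrative examples, not a vetted list, and a real deployment would need to keep the list maintained:

```python
# Minimal User-Agent blocklist check. The signatures are illustrative
# examples only; real scrapers often spoof browser User-Agents.
SCRAPER_SIGNATURES = ("httrack", "wget", "curl", "python-requests")

def is_known_scraper(user_agent):
    """Return True if the User-Agent matches a known scraper signature."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in SCRAPER_SIGNATURES)

print(is_known_scraper("Wget/1.21.2"))            # a known downloader -> True
print(is_known_scraper("Mozilla/5.0 (Windows)"))  # ordinary browser -> False
```

In a real server you would run this check in middleware or in your web server config and return a 403 (or serve something else) on a match.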

The best way to combat the theft (besides limiting access by bad bots) is to show the search engines that content FIRST. With the search engines it is all about where they found the content FIRST. It does not matter to the SEs who the content really belongs to; all they can go by is where they found it FIRST.

***

You can contact this guy if you want, but in my experience that will be a waste of your time. Your article is most likely on dozens, maybe even hundreds, of pages already.

Oh, I forgot to mention: if it is really important to you to get this content removed from the search engines, you can go to the SEs and file a DMCA notice to get the copied content taken out of their index.

As far as showing the SEs first: this content has been on my site for weeks and the page has been indexed by Google. The copied version was posted three days ago.

Checking Copyscape, it does not appear my pages have been copied anywhere else; it shows the content is not on any other site (as far as it knows).


Real visitors are not going to check Copyscape. Search for unique phrases from your article in Google/Bing: use quotation marks and search for entire sentences. Chances are, if Google found your article first, the copies will not rank or come up for those specific searches.

You only really run into trouble when other sites are outranking you for unique phrases from your article.

What I do with hotlinkers and scrapers is not block them but randomise my content (slightly, so it's still legible to a human, but incorrect), so it f&*ks up their own websites. This way they automatically get HTTP 200 responses, and for a while they might not even know that their scraped content is useless; hopefully it kills their SEO and Google SERPs at the same time.
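One way to implement that randomisation, as a rough sketch: swap a fraction of adjacent word pairs so the text stays mostly legible but is no longer a verbatim copy. This assumes you have already decided the request is from a scraper; the swap rate and the approach itself are illustrative choices, not the poster's actual code.

```python
import random

def garble(text, swap_rate=0.15, seed=None):
    """Swap a fraction of adjacent word pairs so the text stays roughly
    legible to a human reader but is useless as an exact copy."""
    rng = random.Random(seed)
    words = text.split()
    i = 0
    while i < len(words) - 1:
        if rng.random() < swap_rate:
            words[i], words[i + 1] = words[i + 1], words[i]
            i += 2  # skip past the swapped pair so it isn't swapped back
        else:
            i += 1
    return " ".join(words)

original = "the quick brown fox jumps over the lazy dog"
print(garble(original, swap_rate=0.5, seed=1))
```

The response still returns HTTP 200 with plausible-looking text, which is the whole point: the scraper has no cheap signal that it was served corrupted content.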


How do you actually safeguard something like that? I mean, unless you can determine with 100% accuracy that it's a scraper, I could see this having a spillover effect, which could backfire and cause you to lose ranking. I believe Google would see this as cloaking... just sayin'.
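That cloaking risk is exactly why the one crawler you must never garble is the real Googlebot. Google's documented way to verify Googlebot is a reverse DNS lookup on the requesting IP, a check that the hostname is under googlebot.com or google.com, and then a forward lookup confirming the hostname resolves back to the same IP. A sketch of that check (the live DNS calls obviously depend on network access):

```python
import socket

def hostname_is_google(hostname):
    """Real Googlebot hosts resolve under googlebot.com or google.com."""
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def is_real_googlebot(ip):
    """Forward-confirmed reverse DNS: reverse-resolve the IP, check the
    domain, then confirm the hostname resolves back to the same IP."""
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        if not hostname_is_google(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False  # no PTR record or lookup failure: treat as not Googlebot
```

Even with that in place, serving one version of a page to bots and another to humans is still the textbook definition of cloaking, so the concern in the post above stands.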