>>This means a bigger site can steal content and push the original content producer completely out of the results!?

and

>>It wasn't stolen, it was published with permission,

So, it doesn't mean a bigger site can steal content and then push the original out. You'd use the DMCA for that. What it does mean is that if you syndicate your content or let other people publish it, those sites can outrank your site for the same content. When Google sees duplicate content, it'll pick whichever version it thinks best serves the searcher's query.

Welcome to the club, man; you are not alone, many people have had the same issue. I have the exact same problem, except scraper sites are stealing my content without my approval and ranking at the top, and some forums, like Topix, are ranking higher than my site with just a snippet.

I also have no manual penalty. The only things I can think of are an algorithmic penalty or some bad links, but I never got an unnatural links message. Or maybe some Google users who don't like the site reported it and triggered a Google human review. It's hard to even know what to do anymore.

How does Google rate user experience? You could have been hit by a human reviewer who just looked at your site and gave it a low rating. I used to have a site ranked in the 100,000s on Alexa, and now it has dropped to nearly 2 million when I type in my domain, and the majority of scrapers are using my articles and ranking higher than me.

I think Google should just change the algorithm back to date and time: whichever site published the article first should get the reward, not the ones copying it. User experience seems unfair because it's subjective; some users might favor a site and some might not like it. I know people who hate the big news sites' layouts; should those well-established news sites suffer or be deranked because some users said the site sucks?

I've given up hope. It really sucks, to be honest, because Google is the big dog; if you can't rank well in Google, you basically can't rank at all, since the majority of people use Google search.

>> whichever site published the article first should get the reward, not the ones copying it.

How can Google know that exactly? Scenario: Site A publishes an article but the bot doesn't come around till three days later to crawl. Site B copies the article (legally or illegally) and publishes the article within 24 hours of Site A's publication but the bot comes around to Site B right away. To Google, Site B published first.

>> whichever site published the article first should get the reward, not the ones copying it.

I'll dispute this in another way. Google doesn't really care who published it first. Google wants to send users to relevant sites with a great user experience. If other sites are beating you in many other ways and you give them permission to post your content - why wouldn't they rank? You're not Google's customer. The searcher is.

Okay, about that crawl issue: every time I publish an original article that I never gave anyone permission to use, I always check for my article in Google search. Sometimes it takes a few minutes to show up. Once I find my article by searching the article title, I notice it is always in the first spot on the second page.

So I know right there that Google already indexed my article.

Now, a day later, I search for my article title, or do a quotation or snippet search for it, and scraper sites are all over the first page. What baffles me is that I'll see three or four different sites with duplicates of my content on the front page, while my article, the original, isn't there. Why are several different sites all showing before me? I don't get it; you can clearly see the time of posting, since the results sometimes say posted 1 hour ago or 1 day ago.

I decided to check a couple of my favorite blogs that I read daily about football news and opinions, to see if they are having an issue similar to mine. When I saw they had a fresh article, I typed in the title of their article or did a snippet search, and their article is on the first page as the first result, followed by the scraper sites. The difference here is that the original uploader of the article gets the top spot. I've gotten to the point of maybe just canning my site and starting over from scratch.

Zazoo, if you want help with your own site, it's best to start your own thread. We need to make sure we don't hijack the thread.

If we're talking about the OP's comment about snippets outranking the original, I go back to my comment: if scrapers are outranking my site, I need to ask myself what quality signals I'm missing that make the algorithm think the scraper sites should outrank me.

>> if scrapers are outranking my site, I need to ask myself what quality signals I'm missing that make the algorithm think the scraper sites should outrank me.

Thanks Suzanne,

That's what I'm trying to figure out. We try our best to produce original, high-quality content, and we just don't seem to be making any traction. The point of the second example is that there's clearly an issue with Google's algorithms: they say they want to give the user the best experience possible, yet they list a short snippet above the original article. I don't get it. It seems this is an ongoing issue, and I have no idea why it's happening.

Well, they do know when an article was published; in the results you always see the date and, if it was recent, the time. I'm also using structured data to tell Google the date (and all the other info) for the article.

Plus, this was published on my site a week before Yahoo's.
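For anyone unfamiliar with the structured data mentioned above, marking up the publish date typically looks something like this minimal JSON-LD sketch (the title, author, and URL here are placeholders, not the actual site's values):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "datePublished": "2014-05-01T09:00:00-05:00",
  "author": { "@type": "Person", "name": "Example Author" },
  "mainEntityOfPage": "https://www.example.com/example-article"
}
</script>
```

This goes in the page's `<head>` (or anywhere in the HTML) and gives Google an explicit, machine-readable publication timestamp rather than leaving it to guess from the page text.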

I understand Ashley's point:

Google doesn't really care who published it first.

But it still seems quite crap, especially if the republishing site links back to your original post.

I get frustrated too, and I understand what the OP is feeling. He clearly isn't running a spam site, and he's basically been given the same vague answer; how will he know what he's doing wrong if he has no clue what's going on? "Someone is doing it better" is hard to hear, especially when you're the owner of original content that's getting outranked by those who duplicate it.

>> Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
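The noindex tag that guideline refers to is a one-line addition to the syndicated copy's HTML. A sketch of what the syndicating site would need to add (URLs are placeholders):

```html
<!-- On the syndicating site's copy of the article -->
<head>
  <!-- Ask search engines not to index this duplicate version -->
  <meta name="robots" content="noindex">
</head>
<body>
  <p>...article content...</p>
  <!-- Link back to the original, as the guideline also suggests -->
  <p>Originally published at
    <a href="https://www.example.com/original-article">example.com</a>.</p>
</body>
```

Of course, this only works if the syndicating site cooperates, which is exactly the sticking point raised below.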

I read the link, and that is the closest answer above. Since he is giving permission to syndicate his article, he must ask the other sites to noindex it, and I doubt Yahoo or the other sites and forums will do that, because they want the search traffic. Another thing: how does Google pick which version is appropriate? Is it a human picking it or a computer algorithm? This baffles me, because in this situation it seems the Google system is punishing the one who publishes first. And yes, Google knows who published the article first, because you can search by date and each result will say published 1 day ago, 1 hour ago, or 10 minutes ago. I've seen other sites rank at the top for their own article while the sites and forums that use snippets always rank below. Apparently Google doesn't want to give specific answers because, as Matt Cutts said with his bad-acting analogy, they don't want to teach spammers to be better spammers. I understand his point and it makes sense, but it's not good for honest bloggers like the OP, who aren't spammers and are the victims here.

Another thing that could be a problem for the OP: multiple people reporting a site for duplicate content causing a negative user experience. Could Google see it as a red flag if it gets multiple spam reports about his site? Maybe that's why a forum is ranking higher for a snippet of his article, and other duplicate sites are ranking while he isn't; maybe someone reported the original content site and got it penalized?

Thanks for explaining it better than me Zazoo! That's exactly the problem.

I'm going to try it again soon, but this time ask that they only publish part of the article and link to the original article in a sort of "Read the rest" kind of way.

If the same thing happens again, that clearly indicates it's an issue with the algorithm, as the user (the one whose experience Google is trying to optimize) would want to see the full article, not a portion of it.
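For what it's worth, if full republication comes up again, another mechanism Google has documented for syndicated copies is a cross-domain rel=canonical on the republished page, pointing search engines back at the original. A sketch with a placeholder URL (and again, the syndicating site has to agree to add it):

```html
<!-- In the <head> of the syndicated (republished) copy -->
<link rel="canonical" href="https://www.example.com/original-article">
```

Unlike noindex, this lets the copy stay in the index while consolidating ranking signals onto the original.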

Trying to outrank Yahoo! Autos!, even for your own content which you published first, is most definitely the long way round when it comes to the answer here - although partial syndication is a good start.

Allowing others to publish the same content and then attempting to compete with those sites is probably akin to having your article published in The New Yorker or a local trade-rag and then expecting your magazine with the same article to sell more copies simply because you published first. That probably won't happen. Having The New Yorker publish an excerpt is probably the better way to go unless your business-model has factored in competing directly with other magazines and trade-rags who publish your work.

Google probably isn't about to change their algorithm in this regard any time soon. "Google's mission is to organize the world's information and make it universally accessible and useful."

Scraping is an issue probably best handled directly with the scraping site or in conjunction with a lawyer's advice.

Boa Constrictor Seen Eating Howler Monkey in a First

This website ranks above the LiveScience website and the Yahoo News article that LiveScience gave permission to. I'm not sure whether GAdailynews has permission or not, but they are clearly outranking the original content provider by one spot. Also, when you look at GAdailynews, they only post a snippet and the video from the LiveScience article, not the entire article, and they put a link back to LiveScience. How is GAdailynews doing it better than LiveScience.com?

It's not even as many paragraphs, or as well written, as the LiveScience article it snips from, so unique, well-written, original quality content doesn't get favored in this Google algorithm decision; it rewards a snippet duplicate site with first place. But it's not really as bad as the OP's case, where he doesn't even get a first-page ranking. I noticed that all the major news sites, and high-traffic sites like GAdailynews, have duplicate content; when you search for a news article, you sometimes get 10 duplicates on the same page, all from major news sites or websites that had high traffic to begin with. GAdailynews ranks on the first page for almost all of its duplicated content articles, and they don't even have a list of writers for the site.

What is interesting about GAdailynews is that the site looks nice, clean, and fancy, like a professionally built site, yet the majority of the articles are credited to "STAFF" with no name and are duplicates of other news articles; still, they have first-page rankings and good traffic. I thought Google rewarded unique content; that was the golden rule for bloggers. I was told when I first started blogging that unique content would rank well and help your site, but that was years ago.

The goals of "accessible and useful" do not necessarily equate to showing the original author first.

This is less about which website ranks for the same article, or which one posted first, and more about why a website with a snippet would rank above the full article, especially if the snippet links back to the original article.

For the full article, apparently not. But using the snippet with backlink method, if Google were ranking them properly, absolutely. Currently though, they're not, and that's the issue.

The idea would be that Yahoo likes the article, publishes part of it to give readers a taste, then links to the original article for people to read more if they're interested. It's good for everyone (if it worked correctly in the SERPs).