Excess pages polluting your website?

Many, many folks who talk about SEO love to parrot the old line, “Content is King”. It’s a concept I’ve addressed many times in the past.

The common approach to SEO seems to be to throw tons of pages up on a website and hope that it will result in some random traffic.

I do not agree with that approach, mainly because I do not think that random content creates conversions the way some people think it does. Branding and trust create conversions. But here’s another reason to remove all those random pages – they are bringing down your search engine rankings.

Some folks have noticed a sudden spike in v7n’s Alexa rank, from circa 2,000 to circa 700. This spike is due mostly to search engine referrals. It’s the result of a little experiment of mine.

Folks say content is king. They suggest that adding more and more content to a website can only be good. I disagree. I figure that adding non-performing content to a website will actually hurt your search engine visibility by diluting and wasting link weight on non-performing pages. By removing tens of thousands of non-performing pages, you conserve and direct that link weight and focus it on performing pages, and by doing so increase the ranking of performing pages in SERPs.

It’s a bit of a risky experiment – truly one that requires you to put your money where your mouth is.

About a month or so ago, I committed myself and removed pages from v7n that were more than xxx days old, had fewer than xxx page views, and had fewer than xxx responses. Just to be sure that I didn’t remove any worthwhile discussions, I went through the list and checked anything that might be remotely worthwhile.

And I did not delete the threads – I simply moved them to a private, hidden, admin-access-only forum.

Within a couple of weeks, I started to see the remaining pages performing much better, and search engine referrals were up 7,000 per day.

Content (marketing copy, etc.) may be king when it comes to converting visitors, but for search engine rankings, domain authority and the intelligent distribution of link weight appear to be much more effective, even when that means removing content.

That’s fascinating, John. You say that you “moved them to a private, hidden, admin-access-only forum.” So I take it they now cease to exist for both search engine spiders and ordinary human visitors.

You could, of course, have deleted them with presumably the same effect. However, you now have the opportunity to reverse the experiment in, say, six weeks and see whether your theory is proven. There’s the dilemma: knowledge or revenues.

It’s easy for me to say, but I’m guessing that there would be no adverse effect if you did that reversal in 6 weeks. I’m seeing a doubling of traffic from Google over the past week or two and have done nothing different. I think there has been a significant improvement in the Google algorithm, although I’ve seen little discussion on that. It makes you wonder.

Like Barry, I would be really interested to see what happened if you did reverse those movements. With those types of increases, I would also be hesitant to undo them. It is a dilemma.

I do think I agree with your conclusion. Many sites have pages that could be disallowed to search engines, and doing so might improve rankings; often many of those pages wouldn’t provide any value even if they were indexed – such as individual contact forms for every article page (with no unique content on them), and similar pages. It makes sense to cull some of those out.

Wouldn’t adding noindex, nofollow to the robots meta tag have been enough to get the same effect?

It has the advantage that you could automate it: store performance data for each thread in the DB and change the meta tag when a thread drops below a certain threshold. The good thing is that the thread would still be accessible to human visitors, just as before, no?
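To make the suggestion concrete, here is a minimal sketch of that automation. All the field names, thresholds, and the sample data are made up for illustration – a real forum would pull these stats from its own database schema.

```python
# Illustrative sketch of the suggestion above: pick a robots meta tag per
# thread from stored performance data. Field names and thresholds are
# hypothetical, not taken from any real forum software.
from datetime import date

# Hypothetical per-thread stats, as they might sit in the forum DB.
THREADS = [
    {"id": 101, "views": 12,   "replies": 0,  "last_post": date(2006, 1, 3)},
    {"id": 102, "views": 4800, "replies": 37, "last_post": date(2007, 2, 10)},
]

def robots_meta(thread, today, min_views=100, min_replies=2, max_age_days=180):
    """Return the robots meta tag for a thread page.

    Old threads with few views and few replies get noindex,nofollow;
    everything else stays fully indexable.
    """
    age = (today - thread["last_post"]).days
    thin = (age > max_age_days
            and thread["views"] < min_views
            and thread["replies"] < min_replies)
    content = "noindex,nofollow" if thin else "index,follow"
    return '<meta name="robots" content="%s">' % content

today = date(2007, 3, 1)
for t in THREADS:
    print(t["id"], robots_meta(t, today))
```

The template that renders each thread page would emit the returned tag in its head section, so a thread automatically drops out of the index when it goes stale and comes back if activity picks up.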

If your internal links point to supplemental results (pages with extremely low PageRank/link weight/what have you), you’re basically wasting juice by linking to pages that don’t exist (they’re not in the main index). Why do that if you can link instead to a page that’s already ranking, right?

“It has the advantage, that you could automate that by storing performance data for each thread in the DB and change the meta tag if a thread drops below a certain threshold.”

That’s a good idea Carsten. I’m using that tactic on one of my sites to keep thin pages from getting crawled.

The page on v7n with the most links is the forum home page. Try to get to the example page from there and you will find it is a solid 4 clicks away from the forum home page.

Forum threads that are 3 clicks away from the forum home page tend to be indexed, and forum threads that are 2 clicks from the forum home page often generate search engine traffic.

But the way we have the forum set up, there are only 9,240 spots available within the 3-click range. Of course, we could easily change that and even double the number of threads within 3 clicks, but it would further dilute the link weight.

So, we have 9,240-odd spots. Should I fill those spots with threads titled “Hi I’m new”, which will drive absolutely no search engine traffic, or should I fill them with threads that stand a chance of driving traffic?
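The 9,240 figure falls out of the forum’s structure: sub-forums reachable from the home page, listing pages reachable within two clicks, and threads per listing page. The actual v7n numbers aren’t given, so the figures below are one illustrative combination that happens to produce the same total.

```python
# Back-of-the-envelope arithmetic for thread "spots" within 3 clicks of the
# forum home page. These per-level numbers are hypothetical; only the
# multiplication structure is the point.
forums = 22            # sub-forums linked from the forum home page (1 click)
pages_in_reach = 12    # listing pages per forum reachable in <= 2 clicks
threads_per_page = 35  # threads per listing page (each thread = 1 more click)

spots = forums * pages_in_reach * threads_per_page
print(spots)  # 9240
```

Whatever the real per-level numbers, the capacity is a fixed product, which is why every throwaway thread in that range displaces one that could rank.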

John, I completely agree with your last point. Those are great reasons to ditch the pages altogether.

And even if you did add noindex,nofollow to those pages, chances are they’d remain in Google’s index as supplemental results. Quite often Google ignores those directives and includes (or keeps) the pages – they just don’t rank for anything and sit in the supplemental index, wasting space and probably dragging you down anyway.