Link Development Forum

I'm hopefully launching a new site this week. Part of the launch involves taking a site from 21 pages to 5,000 pages of content. The content is all good stuff, not what you're thinking.

Because I've got so much content, it's hard for me to sort through it all to plan internal links. My intention is to let the content get indexed first. Then I'm going to search my site on Google for the juicy terms I want to rank for, and order the pages by whatever Google feels is the order of relevance.

Then I'll take a few of those top-ranking pages for a term, find the term on those pages, and link it to another page on my site that I want to rank for that term. Maybe even link to the homepage in some instances.
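For what it's worth, the mechanical part of that last step can be scripted. This is just a rough Python sketch, not anyone's actual workflow; the function name and the naive first-match replacement are my own assumptions, and it will happily match text inside tags or attributes, so eyeball the output before publishing:

```python
import re

def add_internal_link(html, term, target_url):
    """Wrap the first occurrence of `term` in a link to `target_url`.

    Naive sketch: replaces only the first (case-insensitive) match and
    does NOT check whether the match is already inside a tag or link.
    """
    pattern = re.compile(re.escape(term), re.IGNORECASE)

    def replace(match):
        # Preserve the original capitalization of the matched text.
        return '<a href="%s">%s</a>' % (target_url, match.group(0))

    # count=1: link only the first mention, leave the rest of the copy alone.
    return pattern.sub(replace, html, count=1)
```

Run it over each top-ranking page's HTML with the target page's URL, then review the diff by hand.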

Now, I'm no PR distribution wizard. Does this seem like a good way to approach this? Is there a different technique I should use? Any limits I should be careful about?

Thanks for the detailed response. I appreciate the time you put into that.

The nature of the content has dictated the site architecture. The individual pages are 'bound together' in groups. Think of it as a collection of discrete articles: these 10 pages belong to article A, those 10 pages to article B. And the various articles don't have a whole lot in common, theme-wise, other than fitting the overall theme of the site. So my architecture is roughly 'Articles', then a list of articles, then the individual pages.
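So the layout looks roughly like this (made-up paths, purely to illustrate the three levels):

```python
# Hypothetical URL layout: an Articles index, one listing page per
# article, then the individual content pages underneath each article.
site = {
    "/articles/": ["/articles/article-a/", "/articles/article-b/"],
    "/articles/article-a/": ["/articles/article-a/page-%d.html" % i
                             for i in range(1, 11)],
    "/articles/article-b/": ["/articles/article-b/page-%d.html" % i
                             for i in range(1, 11)],
}
```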

What I'm trying to achieve is circulating PageRank, or something like it, using new internal links to get other pages to rank for search terms. Wikipedia does it, so I suspect it's a valid ranking/link building technique.

I always post large chunks of content slowly - I have a newish site I am still releasing that I have 8,000 entries for. I am at 2,500 live so far after 5 months (I really slacked off over Xmas - must get back on it). Post 'em one by one and look at each individually to see what linking it needs. Tedious, but it returns the most value.

One comment on this: the site: operator can return some funky results. In essence, some processing goes on when Google determines what to return, and the results may not be indicative of what Google really sees/knows about your site.

It's an interesting idea, but it might be unreliable as Google degrades the site: operator, just as they have link: in the past. If you'd like to see what I'm talking about, check the site: operator against the number of pages indexed from your XML sitemaps in GWT. The variance can often be in the high multiples.
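If you want to put a number on that variance yourself, the submitted side is easy to count: parse your sitemap and compare against whatever site: reports. A quick sketch (the function names are mine, and the site: count has to be read off the results page manually, since scraping Google results isn't reliable or permitted):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap file."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

def coverage_ratio(site_operator_count, sitemap_count):
    """Rough indexed-vs-submitted ratio; a sanity check, not gospel."""
    return site_operator_count / float(sitemap_count) if sitemap_count else 0.0

# Tiny example sitemap; in practice, read your own /sitemap.xml.
SAMPLE = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/articles/a/page-1.html</loc></url>
  <url><loc>http://example.com/articles/a/page-2.html</loc></url>
  <url><loc>http://example.com/articles/b/page-1.html</loc></url>
  <url><loc>http://example.com/articles/b/page-2.html</loc></url>
</urlset>"""
```

If site: says 1 page and your sitemap lists 4, coverage_ratio(1, 4) gives 0.25, and that gap is the "high multiples" variance I mean.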

I don't think he's wasting time. I think this is a brilliant idea. I do agree he should release the articles maybe 50 at a time in distinct periods.

Famous story: a man sees an archer in the woods, his quiver of arrows empty, resting at a tree. He sees hundreds of arrows embedded in the trees, all perfect bullseyes. "How did you hit every bullseye?" the man asks in amazement. The archer answered, "I simply fired the arrows, then drew the bullseyes around the arrows!"

Let Google draw the bullseyes on the trees, then stick your arrows (relevant keyword links) into them!

I look forward to seeing the results of Wheel's test. Please keep us informed.

I'm wary of the accuracy of doing that for pages that are not well established. Maybe give it a try with Bing and Yahoo for a second and third opinion, because they seem to focus on content a little better. Google is heavily weighted by links, sometimes ranking a site with zero content, just 404s, meaning that Google is overlooking content altogether for some sites and ranking solely on links.

I think your biggest flaw is assuming all 5,000 pages are going to get indexed [right away]. When you run your site: commands, you could have hundreds of pages that would be best for ranking but just haven't been indexed yet.