Does it matter how many internal links / subpages a site has?

rDolay

Expert

Posts: 540

3+ Months Ago

We are going to build a huge knowledge base / user manual / FAQ database, and accordingly there will be 1,000+ pages with real, original content.

However, I am really curious about the number of subpages, since Google started banning high-PR sites with 1k-10k subpages in the last updates:
- Some sites with over 1k subpages of duplicate content were banned from the Google index and their PR was erased.
- Even high-PR sites with over 1k subpages of real content had their PR lowered very badly in the last updates.

Maybe, since this was applied to all sites, it doesn't matter that the subpages' PR was lowered, because it won't change the rankings if it hit all the competitors equally. However, I don't understand the details of it.

I mean, would Google reward a good folder/link structure built by categorising the content like that, or do we have to link all the subpages from the main pages and link all the subpages back to the main pages?

What else can you suggest so we don't get banned because of the huge number of subpages, and so the subpages keep a PR only 1-2 points below the homepage PR?

Oh, thank you very much, I hadn't thought of that, lol. They were already sitting at PR 4, but does that mean Google treated them not as subpages but as separate sites? Will Google consider the sub-sites part of the main domain's content, or as completely different sites?

You know, the total content of a website really does matter for rankings.

UNFLUX

Genius

Posts: 6367

Loc: twitter.com/unflux

3+ Months Ago

Title fixed -- please do not use all caps in your topic titles. Thanks.

madmonk

Mastermind

Posts: 2115

Loc: australia

3+ Months Ago

Quote:

-Also, I have noticed that some high-PR sites with over 1k subpages of real content had their PR lowered very badly in the last updates.

There are exceptions where having more subpages has a negative effect on PR -- try finding such a case documented on a search engine site, though.

But if you have real content (and assuming it is good as well), you shouldn't have to worry about subpages lowering your site's PR.

rtchar

Expert

Posts: 606

Loc: Canada

3+ Months Ago

If I wanted to build a large data directory, I would look at the largest knowledge base in the world ... Microsoft.

They use a combination of subdomain servers (office.microsoft.com) and directories (http://www.microsoft.com/windows) without penalty.

I have created graphs of link structures for several large sites, you might want to look at those as well ...
http://www.mrktcity.com/viewlink.html

darksat

Proficient

Posts: 487

Loc: London (via the rest of the world)

3+ Months Ago

The higher the PR you have, the more pages Google will index on your site.
Microsoft is PR 10, so they don't have the same problems.

rtchar

Expert

Posts: 606

Loc: Canada

3+ Months Ago

Maybe I should make myself clear ... I thought I was answering his question.

I wouldn't use MS as an example, since people link to it naturally; they don't need to use SEO tactics to get good PR/rankings. Their site is also laid out very poorly: they use navigation that is not readable by search engines, especially in the knowledge base. I'm sure they use different servers to balance the traffic load, but the example you are showing uses subdomains; technically they could all sit on the same server. I would look at directory sites to get a good idea of how to architect a huge site.

rDolay

Expert

Posts: 540

3+ Months Ago

Finally, I have completed the huge knowledge base. Now there are 3 pages that have 160, 70, and 80 links to the subpages, but I am afraid to upload them... To tell the truth, our rankings have only just recovered, and if they sink back down again I don't think I could stand the failure.

Will those three pages get the whole site banned just because they have over 50 links to the subpages?

P.S. Those pages are not just there for Google ranking; they are very important for visitors, so please do not suggest scattering them across multiple pages. If I get a negative answer, I will just remove the URLs from the linking (directory file linking).

rtchar

Expert

Posts: 606

Loc: Canada

3+ Months Ago

We looked at this very question a while back ...

phaugh wrote

Quote:

It's a guideline, not a rule. Here's a PR 5 page with hundreds of links: http://www.internetbusinesscoach.net/linklist.asp

Here's another: http://www.buyerbrokerorlando.com/realtors.htm

The only thing you need to worry about on a page with hundreds of links is whether your own link falls within the first 100 or so. The Google spider will crawl the page and bail out somewhere around the 100th link, and then you will not get credit for the backlink.

rtchar wrote

Quote:

I checked the first one and there are 310 links on the page (297 are external links).

So then I looked at the backlinks of the last group "Web Site Hosting" ... and go figure ... the back link is there in Google!

So I guess the bot does NOT stop after seeing 100 links.

I didn't think it would affect the PR of the offending page, since that is calculated from incoming links.
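
For anyone who wants to repeat that kind of count, here is a minimal sketch of one way to tally the internal and external links on a page. The class name and the URL handling are my own assumptions for illustration, and the example URL is simply the first page quoted above, which may no longer resolve.

Code:

# Minimal sketch: fetch a page and count its links, split into internal
# and external by comparing hostnames. Uses only Python's standard library;
# the LinkCounter name and the example URL are illustrative assumptions.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkCounter(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.base_host = urlparse(base_url).netloc
        self.internal = []
        self.external = []

    def handle_starttag(self, tag, attrs):
        # Record every <a href="..."> and classify it by hostname.
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)
        if urlparse(absolute).netloc == self.base_host:
            self.internal.append(absolute)
        else:
            self.external.append(absolute)


def count_links(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    counter = LinkCounter(url)
    counter.feed(html)
    return len(counter.internal), len(counter.external)


if __name__ == "__main__":
    internal, external = count_links("http://www.internetbusinesscoach.net/linklist.asp")
    print(f"internal: {internal}, external: {external}, total: {internal + external}")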

- Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
- Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
- Create a useful, information-rich site and write pages that clearly and accurately describe your content.
- Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
- Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images.
- Make sure that your TITLE and ALT tags are descriptive and accurate.
- Check for broken links and correct HTML.
- If you decide to use dynamic pages (i.e., the URL contains a '?' character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them small.
- Keep the links on a given page to a reasonable number (fewer than 100).
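
If those big link pages ever did need to be split, the "break the site map into separate pages" advice above comes down to simple chunking. Here is a minimal sketch, assuming made-up file names and example.com URLs in place of the real knowledge base pages; the 160-link list mirrors the largest page mentioned earlier.

Code:

# Minimal sketch: chunk a long list of subpage URLs into groups of at most
# 100 and write one plain HTML index page per group. File names, titles and
# URLs below are made up for illustration.

def chunk(links, size=100):
    """Yield successive groups of at most `size` links."""
    for start in range(0, len(links), size):
        yield links[start:start + size]


def write_index_pages(links, prefix="sitemap"):
    pages = []
    for page_number, group in enumerate(chunk(links), start=1):
        filename = f"{prefix}-{page_number}.html"
        items = "\n".join(f'    <li><a href="{url}">{url}</a></li>' for url in group)
        html = (
            f"<html><head><title>Site map page {page_number}</title></head>\n"
            f"<body>\n  <ul>\n{items}\n  </ul>\n</body>\n</html>\n"
        )
        with open(filename, "w", encoding="utf-8") as handle:
            handle.write(html)
        pages.append(filename)
    return pages


if __name__ == "__main__":
    # 160 hypothetical subpage URLs, like the largest page described earlier.
    subpages = [f"http://www.example.com/kb/article-{n}.html" for n in range(1, 161)]
    print(write_index_pages(subpages))  # -> ['sitemap-1.html', 'sitemap-2.html']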

It seems like Google does not ban pages with over 100 links, but if anything gives them more rank. I think I have to place my bet again.

rtchar

Expert

Posts: 606

Loc: Canada

3+ Months Ago

I don't think 100 links per page is a technical limit for Google.

Your site will NOT be banned, and your ranking will depend on CONTENT, not link pages.

I think the limit has more to do with how worthless the PR passed by each link becomes after about 100 links. Remember, PR is divided among the links on a page.

Quote:

I often wondered if it might also be a mathematical limit. I don't believe PR is calculated to 8 decimal places (probably only 2 decimals).

If this is the best format for your VISITORS then leave it alone.
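
As a toy illustration of the quoted reasoning above (my own made-up numbers, not how Google actually calculates PageRank), dividing an assumed raw PR value evenly among an ever larger number of links and rounding the result to two decimal places shows how quickly the per-link share shrinks:

Code:

# Toy illustration only: assume a page has some raw PR value to pass on,
# split it evenly across its outgoing links, and round to two decimals as
# the quote speculates. Both constants below are made-up assumptions.
DAMPING = 0.85       # damping factor from the original PageRank paper
PAGE_PR_VALUE = 5.0  # assumed raw PR value of the linking page

for link_count in (10, 50, 100, 300, 1000):
    share = DAMPING * PAGE_PR_VALUE / link_count
    print(f"{link_count:>5} links -> {share:.5f} each, rounded: {round(share, 2)}")

Under those assumptions, the rounded per-link share is already near zero by a few hundred links, which fits the idea that the 100-link figure is a practical guideline rather than a hard crawler limit.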

Johan007

Guru

Posts: 1079

Loc: Aldershot, UK

3+ Months Ago

All your pages will get spidered, presuming you can achieve a natural PR 6. I would keep everything on the same domain and possibly use folders. The more pages in a domain, the bigger your "base rank" is, though not the PR across your pages; those pages may be lower in PR, but you can ignore that, because it is ranking that counts, not PR.

rDolay

Expert

Posts: 540

3+ Months Ago

Wow! I see that Google has just started adding hundreds of new subpages to the "site:www.mysite.com" results.

I noticed 3 important things:
1- I think there is no such thing as a sandbox effect, because those pages are not even 2 weeks old yet.
2- Google agreed to crawl links from a page that has over 100 internal links.
3- The Google spider is hitting very deep into our site, lol. I think there will be a PR or BL update.