Aaaalrighty, then.
Now I have a new issue. If the Google 101k thing is correct, how come the page in question is COMPLETELY cached by Google? Now, again, I have given y'all the number 327k to include both text and images, but does Google differentiate?

Geez, this is getting more complicated than even I imagined... I checked the cache and the entire page is there. I also checked a few downloads some of the folks here were nice enough to give me (thanks again, y'all!) that would check the page in question, and they all came up with more or less the same number.
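For what it's worth, a size checker along those lines is simple to sketch. This is a hypothetical Python version of my own (not one of the posted tools): it counts the HTML bytes separately from the images the page references, which is exactly the text-vs-images distinction in question, since a 101k-style limit would presumably apply to the HTML alone while the images would have to be fetched separately to reach a 327k-style total.

```python
from html.parser import HTMLParser

class ImageCollector(HTMLParser):
    """Collects the src of every <img> tag so image weight can be
    tallied separately from the HTML document itself."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.images.append(src)

def page_size_breakdown(html_text):
    """Return (html_bytes, image_srcs) for a page's raw HTML.
    html_bytes is what a cache limit on the document would see;
    the images listed would each add their own download weight."""
    parser = ImageCollector()
    parser.feed(html_text)
    return len(html_text.encode("utf-8")), parser.images

sample = '<html><body><p>Hello</p><img src="big.jpg"><img src="logo.gif"></body></html>'
html_bytes, imgs = page_size_breakdown(sample)
print(html_bytes, imgs)
```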

Blech... ya know, sometimes I think I should stay off the computer for more than 30 minutes at a stretch and then I'd be able to just ABSORB information... is it possible to have screenburn on your eyes?

Personally, I "lecture" any new customer on the virtue of our 7 second rule of
having the home page come up in less than 7 seconds on a 28.8 modem. I
picked that number up several years ago and I keep using it even though it is
probably outdated.

That is all well and good, and I don't want to lose visitors because the page
won't come up quickly, as one contributor mentioned. The real reason is that I
am starting them off on what I believe is going to be practical SEO down the
line. (I am not saying fast is better for SEO; I am implying that text is better
than a jpeg taking up a bunch of space.) I need to balance that against the
customer's desire (sometimes) to add big pictures and everything else on a single page.

The lecture allows me to start the education process of SEO and web design.
The two should go together. I like to get them off on the right foot; I don't like
surprises any more than the next guy. My motto: "If it were cheap and easy,
everyone would be doing it."

Oh yes, I forgot to mention that CSS will also tend to give you speedier pages,
after the first one is loaded anyway. Speed is important; never PO the visitor.
If you have to hand out a large anything, let the visitor make the choice. Have
small thumbnails that enlarge to larger versions. We have even added the KB size
so the visitor will know that this 900 MB file is gonna take a while. Just joking,
but you get the picture. Don't take them to a page and trap them with a big DL.

Be aware, with today's internet "problems", that 7, 8, 9 or 10 seconds may be
just a hopeful wish. I have seen some of our "fast" pages take forever to load,
and who knows why. The next day everything is just dandy.

Not to in any way suggest that anyone do this, but what would stop keyword stuffing past the 101k mark on a page? That way it wouldn't show up in the cache and would be very difficult to catch. Or does Google not even look past 101k at all, even for indexing? That's important to know as well.

Ian

Ian,

I think your lawyer background is showing with a devious question like that.

Actually, I'm just jealous that I didn't think of it. It is indeed a potential loophole if they read past 101k.

101k is a lot in one way, and may not be enough in another. For example, the first page of this thread is 98.3k. One more post on the page would not have been indexed, and if there was a menu structure or some links at the bottom of that page, they would not have been followed.
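To make the cutoff effect concrete, here is a toy sketch. It assumes the limit works as a straight byte truncation of the raw HTML (taking the 101k figure at face value; the helper names and toy page are mine): any link whose tag starts past the cutoff simply never gets seen, let alone followed.

```python
import re

LIMIT = 101 * 1024  # assumed cutoff in bytes

def links_within_limit(html_text, limit=LIMIT):
    """Truncate the raw HTML at the assumed byte limit and report
    which href targets survive the cut."""
    truncated = html_text.encode("utf-8")[:limit].decode("utf-8", errors="ignore")
    return re.findall(r'href="([^"]+)"', truncated)

# A toy page: 200 bytes of padding push the second link past a tiny 100-byte "limit".
page = '<a href="/seen">x</a>' + "p" * 200 + '<a href="/cut-off">y</a>'
print(links_within_limit(page, limit=100))  # prints ['/seen']
```

Same idea at full scale: a menu at the very bottom of a 105k page would sit past the 101k mark and drop out entirely.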

This would have implications for heavily coded pages, since the HTML must be saved for the cache to be visible as a page instead of a jumble of text. You can check this by doing a view source of the cache that Google shows you.

It would also have implications for a catalog page on an e-commerce site: the bottom items might never get indexed, no matter how highly ranked the page is! Ouch. And some e-commerce sites tend to be code heavy.

I'll put that on my list of questions to ask at the next SEO conference (hopefully I'll be there).

In the meantime, I'll start looking for pages in Google's cache that are more than 100k, and if I find any that are cut off, I'll do a search on any unique text I can find in the original. Hopefully that will tell us.