The Moz Blog

I think that sometimes, we in the field of search marketing try to make the concept of ranking more difficult than it really is. True - there are hundreds of ways to build a link, an infinite number of keywords, thousands of unique sources to drive traffic along with analytics, design, usability, code structure, conversion testing, etc. However, when it comes to the very specific question of how to rank well for a particular keyword in standard organic results at the engines, you're really only talking about a few big key components.

#1 - Keyword Usage & Content Relevance

While I don't believe in keyword density (reference: nonsense), there's no doubt that using your keywords intelligently and creating a page that is actually relevant to the query and searcher intent is critical to ranking well. My general best practice is to use the primary keyword phrase as follows:

In the title tag once, and possibly twice (or as a variation) if it makes sense and sounds good (subjective, but necessary)

Once in the H1 header tag of the page

At least 3X in the body copy on the page (sometimes a few more times if there's a lot of text content)

At least once in bold

At least once in the alt tag of an image

Once in the URL

At least once (sometimes 2X when it makes sense) in the meta description tag

Generally not in link anchor text on the page itself (this is a bit more complex - see this post for details)
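For anyone who wants to sanity-check a page against a checklist like this, here's a rough sketch of the idea. It's my own illustration, not anything the engines publish: a naive regex pass (not a real HTML parser, so it will miss phrases split across tags) that counts occurrences of a phrase in the placements suggested above.

```python
import re

def keyword_placement_report(html, phrase):
    """Rough tally of where a keyword phrase appears on a page.

    A naive regex sketch for illustration only -- not a real HTML
    parser; it will miss phrases split across tags. The placements
    checked are just the ones suggested in the post.
    """
    p = re.escape(phrase)
    flags = re.IGNORECASE | re.DOTALL

    def count_in(tag):
        # Join the contents of every <tag>...</tag> pair, then count the phrase.
        pattern = r"<%s\b[^>]*>(.*?)</%s>" % (tag, tag)
        contents = " ".join(re.findall(pattern, html, flags))
        return len(re.findall(p, contents, flags))

    return {
        "title": count_in("title"),
        "h1": count_in("h1"),
        "bold": count_in("b") + count_in("strong"),
        "alt": len(re.findall(r'alt="[^"]*%s[^"]*"' % p, html, flags)),
        # "body" strips all tags, so it includes title/h1 text too
        "body": len(re.findall(p, re.sub(r"<[^>]+>", " ", html), flags)),
    }

page = """<html><head><title>Hedgehog Care</title></head>
<body><h1>Hedgehog Care</h1>
<p>Caring for a <b>hedgehog</b> is easy. A hedgehog needs warmth.</p>
<img src="h.jpg" alt="a pet hedgehog">
</body></html>"""

print(keyword_placement_report(page, "hedgehog"))
```

Nothing here implies a target number; it's just a quick way to see whether a page hits the placements at all.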

For those who've done the nonsense-word testing to see how the engines respond, you know that you can certainly get some extra value out of going wild and stuffing the keywords all over the page. But we've also seen that once you reach about the level of saturation I've described above, you're getting roughly 95% of the value you can get, and even the tiniest amount of extra link juice can (usually) make a page like this outrank a "super-stuffed" page.

#2 - Raw Link Juice

Some people call this PageRank or link weight or link power - basically it refers to the raw quantity of global link popularity ascribed to the page. You can grow this with internal links (from your own site) and external links (from other sites). A page with a phenomenal amount of global link power, even if the sources aren't particularly relevant and the keywords are barely used, can still rank remarkably well in Google & Yahoo! (MSN & Ask are both a bit more keyword & subject focused from what we've seen).

Link juice operates on the basic principle that was used in the early PageRank formula - that pages on the web have some (low) inherent level of importance and that the link structure of the web could help to point out pages with greater and lesser value. Those pages that were linked to by many thousands of pages were very important and thus, when they linked to other pages, those pages must, by extension, also have great importance.
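The iterative calculation behind that principle is easy to sketch. The following is a toy version of the original formula over a hypothetical four-page graph (no dangling-node handling), showing how links from many pages concentrate "juice" on the page they point to:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Tiny power-iteration PageRank over {page: [pages it links to]}.
    A toy version of the original formula (no dangling-node handling)."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small inherent level of importance...
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                # ...and splits the rest of its juice among its outbound links.
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical mini-web: "hub" is linked to by every other page, so it
# collects the most juice; "a", which hub links to, inherits some of it.
graph = {"hub": ["a"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # -> hub
```

Note how "a" ends up ranked above "b" and "c" purely because the important page links to it, which is exactly the "importance by extension" idea described above.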

Carrying this theory back to your own pages, you can see how raw link juice will have a large impact on how the search engines score their rankings. Growing global link popularity requires both link building (so your site has enough link juice) and intelligent internal link structure (to ensure that you're flowing that juice to the right places).

#3 - Anchor Text Weight

As the search engines evolved in the early 2000s, they picked up on the usage of anchor text and found that by weighting the keywords and phrases pages used to link, they could get an even better idea of what pages were about and which were most relevant to particular subjects. The anchor text of links is now a critical part of the ranking equation, and when seen in great quantity, it can overshadow many other ranking factors. You can see plenty of web pages that are weaker on all three of the other factors I describe here ranking primarily because they've earned (or, oftentimes for commercial terms, bought) many hundreds or thousands of links with the precise anchor text of the phrase they're targeting.

Note that anchor text comes from both internal and external links, so if you're trying to optimize, it's wise to think about how you're linking to material from your own pages - using generic links or image links may cost you some of the ranking power you'd otherwise earn by having internal links with accurate, relevant anchor text. However, you can go overboard here, so be cautious - and note that 100,000 internal pages linking with anchor text doesn't provide the same value as 100,000 external links with that text.
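To make the internal-vs-external point concrete, here's a hypothetical sketch of building an anchor-text profile for a page. The 0.5 discount applied to internal links is an arbitrary number chosen purely for illustration, not a known engine value:

```python
from collections import Counter

def anchor_profile(links, internal_weight=0.5):
    """Tally the anchor text pointing at each URL.

    `links` is a list of (anchor_text, target_url, is_internal) tuples.
    The internal_weight discount is an arbitrary illustrative number,
    reflecting only the post's point that internal anchors carry less
    weight than external ones -- not a known engine value.
    """
    profiles = {}
    for text, target, internal in links:
        weight = internal_weight if internal else 1.0
        profiles.setdefault(target, Counter())[text.lower()] += weight
    return profiles

links = [
    ("hedgehog care", "/care", False),
    ("hedgehog care", "/care", False),
    ("click here",    "/care", False),   # generic anchor: wasted relevance
    ("Hedgehog Care", "/care", True),    # internal link, discounted
]
print(anchor_profile(links)["/care"])
```

The "click here" entry shows why generic internal links cost you: they add to the tally for a phrase you aren't targeting.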

#4 - Domain Authority

This is the most complex of the factors I describe in this post. Basically, domain trust refers to a variety of signals about a site that the search engines use to determine legitimacy. Does the domain have a history in the engine? Do lots of people search for and use the domain? Does the domain have high quality links pointing to it from other trustworthy sources? Does the domain link out primarily to other trusted sites? Do analytics and registration information and temporal link growth fit with expected patterns?

To influence this variable positively, all you really need to do is operate your site in a manner consistent with the engines' guidelines. If you want to earn a lot of trust early on in a domain's life, get lots of sites that the engines already trust to link to you. And if you're looking to spoil that trust, link out to bad neighborhoods, use manipulative link growth practices that don't match up to queries or traffic patterns and play the churn & burn game.

As a wrap up, I'd love to hear your opinions on these four factors and whether you think there should be 5, 3 or 20 instead.

p.s. Remember that this post is my personal opinion only! Sure - I'm basing it on my experience, which is relatively robust, but I don't doubt that others out there have very different conceptions of what comprises the bulk of the rankings equation, so please use your own judgment (note our new blog disclaimer, which applies to everything you read here).

Nice roundup. But you forgot to mention that too many links with the same anchor text might hurt a page. I think this is important to stress, especially when clients finally understand what anchor text is and try to squeeze out that link power any time they get a chance.

Not sure why you don't believe in keyword density. I don't think search engines are smart enough to comprehend text yet. So relevance should be measured by keyword density, among other factors, until contextual analysis becomes a better tool for them.

I consider your advice to be conservative white hat. Which is not a bad thing in itself :-D

C'mon Rand, that link is filled with math equations. I'm a word man (better than a bird man, in Jim Morrison's opinion), not a math man!

Anyway, you can analyse the issue to death, but when you come down to it, if you write naturally about any subject, you'll have a higher density of some words related to that subject. If that's not keyword density, what is?

I agree, carefu, and while I believe keyword density (and distribution) is important, I also believe there is a point of diminishing returns, not only in search engine value but, perhaps more importantly, in user readability.

Could this be what tipped the scale - Jim's domain is registered for 10 years (as Rand recommends) and Matt's domain is registered for 6 (and expires sooner)? Or was it the number of links from del.icio.us (59/17)? Or just the most obvious - Jim's title and URL have the keyword ("linkbait") closer to the beginning?

IMO keyword density gets way too much attention in some SEO circles. You see sites promoting keyword density analyzers and stating definitively that there is a specific standardized threshold. These tools either imply or state outright that if you go over that threshold the page will be demoted or even penalized. I believe that engines are smart enough to differentiate between unnatural keyword stuffing and natural use of targeted keywords. They may not be perfect but if the keyword is used naturally but goes over the arbitrary density threshold the engines will be able to recognize it and not punish the site. That being said I also believe they can tell when a site is just stuffing keywords to increase the number of times it appears on a page. I don't pay a lot of attention to keyword density. I create content that looks and reads naturally while trying to focus it on the targeted keyword. If I go over or under what is recommended by the tools, so be it.

I don't pay a lot of attention to keyword density. I create content that looks and reads naturally while trying to focus it on the targeted keyword.

Isn't writing relevant, natural content while paying attention to 'targeted' keywords done because you want to make your content 'keyword rich'? Whether or not you are trying to hit a specific saturation point, it's all done for the same purpose, isn't it? So can keyword density be defined or applied in different ways?

Does this mean that my Web development company putting "Website by HyperArts" in the footer of every site we develop is killing us instead of helping us? It's a completely legitimate backlink. I don't think we can assume that Google would just wholesale brand these links as "bad" or "spam."

"Not sure why you don't believe in keyword density. I don't think search engines are smart enough to comprehend text yet. So relevance should be measured by keyword density, among other factors, until contextual analysis becomes a better tool for them."

I'm quite new at this, but from my experience Google does comprehend text. I have pages ranking for keywords that are not even in the content of the page - at least not the exact term - nor in the tags. This leads me to believe that (1) Google will rank pages for synonyms, (2) the order of the keywords does not matter, and (3) keywords in between keyword phrases are OK (although exact keyword phrases would rank better).

1. Keyword proximity. When you have multi-word keyword phrases, the proximity of the words is an important factor that search engines also consider. See section 2.3

...it has location information for all hits and so it makes extensive use of proximity in search.
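A simple way to picture proximity scoring is the smallest window of words that contains every query term. This is a toy stand-in of my own for what the paper describes, not its actual algorithm:

```python
def min_phrase_span(tokens, query):
    """Smallest token window containing every query word at least once.
    A toy stand-in for proximity scoring: a smaller span suggests the
    words are used together rather than scattered across the page."""
    wanted = {w.lower() for w in query}
    tokens = [t.lower() for t in tokens]
    best = None
    for i, tok in enumerate(tokens):
        if tok not in wanted:
            continue
        seen = set()
        for j in range(i, len(tokens)):
            if tokens[j] in wanted:
                seen.add(tokens[j])
            if seen == wanted:
                if best is None or j - i + 1 < best:
                    best = j - i + 1
                break  # extending this window further can't shrink it
    return best

text = "cheap hotels in paris are rarely cheap but paris hotels vary".split()
print(min_phrase_span(text, ["cheap", "hotels"]))  # -> 2 (adjacent at the start)
```

A span equal to the number of query words means the terms appear as an exact adjacent phrase somewhere on the page.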

You say:

Never in link anchor text on the page itself

The problem with this one is that you assume the search engine doesn’t associate the text on the link with the source page as well. See section 2.2 for evidence to the contrary:

Most search engines associate the text of a link with the page that the link is on. In addition, we associate it with the page the link points to.

2. Query-specific link analysis vs. non-query-specific. When your page can rank high no matter how relevant the sources of the links are to your keywords, it means the link analysis algorithm in use is not query-specific. HITS is a well-known example of a query-specific one.

3. The text surrounding the link. We often forget that the text surrounding a link can provide more relevance information to the search engine than the text of the link itself. This is particularly the case when the text just says "click here". Search engines cannot expect every site owner to be SEO savvy. See section 6.1:

As for link text, we are experimenting with using text surrounding links in addition to the link text itself.

I'd like to propose that Feedburner subscriptions, item uses, and frequent publication of relevant content that relate to "today's search volume" be part of #4. I don't think that Google bought feedburner primarily because they wanted to provide a nice service. I think that they bought it for the data.

As an experienced .NET developer who's recently been gaining SEO skills, I'd be inclined to add "Ensure Your Site Architecture Is Built With SEO In Mind". I'm specifically referring to Flash, AJAX and CMS issues which, believe me, most developers will incorporate if you aren't in control, thereby negating a lot of the other positive ranking factors you suggest! Duplicate content issues on eCommerce sites (multiple categories, etc.) and URL rewriting are also hard issues to get across the importance of to many (stubborn!) developers.

As the majority of commercial site owners don't actually hand code their own sites I think this is particularly important to add this to your list Rand.

I totally agree that building search-friendly pages is critical to the SEO process, but I'm not sure it needs to be in this post, which is really about what the big factors in search rankings are - I think if you're hitting on these cylinders (particularly the first), you're probably good to go with regards to indexability.

While not ideal, I wouldn't think these will dramatically hurt rankings as it sounds like you are primarily talking about duplicate content, in which case those duplicate pages will be filtered out, assuming that the same product pages are being served up under different URLs.

The impact may be to crawl equity, through the inefficient crawling that comes from duplicate pages.

It may affect rankings depending on how much link power the site has. How many pages does it have in the main index? Over 1,000? Duplicate content will be filtered out, but by avoiding PageRank dilution you may increase the number of pages in the main index.

More pages in the main index = more internal anchor text = (possibly) higher ranking.

Now don't quote me on this... but duplicate content does seem to harm a page's visibility - in fact, for one site, I have the wrong versions of the page ranking in the Yahoo index - we had to quickly start placing some 301s...

Rand, I could kiss ya face. Excellent, excellent post. Don't buy 3092-page SEO books; you've got 90% of the ingredients right there. If you put that into an e-book with another few pages on the best strategy to get those ingredients, you'd have the most efficient SEO book going :)

It is awkward to mention in company meetings that the more I learn about SEO, the fewer elements I find myself focusing on. I guess that everyone supposes that it should be the other way around: that I should be learning about thousands of new elements that affect ranking. What I am learning, however, are the best ways to focus on these few important elements. It just sounds too simple to other people. Maybe I should make up some buzz words just for the meetings: "yeah, I learned about the algorithm update modification section in the source code and am currently researching how we can implement that into our SEO." That oughta do it.

Haha. If nothing else it will satiate everyone. I find that when I start using SEO buzz words everyone just sort of looks at me and nods since they don't know what the hell I am talking about. Besides they can play some buzzword bingo while you are talking.

I've been telling other bloggers who are now discovering SEO that the vast majority of the tips you see are in that last 10%. A lot of newbies focus on that 10% because it's what most of the vocal SEOs spend 90% of their time talking about. The great analogy is that the last 10% is like a sprinter leaning to hit the finish line first: there's a tremendous amount of "easy" stuff you can do to help you win the race, and it's only when you start being competitive in ultra-competitive areas that you need that final 10%.

Well said. Although the old idiom of "if you build it, they will come" is not as true as it used to be... you are dead on the money - get the basics right, and the final bits are just more productive.

Sometimes I shudder to think what would happen to industries that compete with "authority" sites to provide info if those sites actually started focusing heavily on SEO and content...

Hey Rand, I'd say you're right on the money. People make this stuff too complicated. From a BlackHat perspective, this is all I look at:

Link Volume & Anchor Text (these are one and the same for me)

Parasite Domain Weight

Keywords in the places you mentioned

In my opinion there's way too much voodoo out there, and the bottom line is most people don't have 10k sites to see how this stuff works on a repeatable level. They have 5-10 sites that they've hypothesized theories from, but in my opinion that's just not a big enough sample.

The nonsense word test is very powerful and you can learn quite a bit from it. I think most blackhats have been running those tests for years to reverse-engineer what the algos are doing.

I do not strictly follow the method you mentioned above. I do, though, take care of the semantic structure and presentation of my web pages/site content.

Additionally, I concentrate on Google's PhraseRank algorithm which, as everyone here must already know, is an indexing process that includes the identification of phrases and/or related phrases.

That system analyzes sequences of words and evaluates them as good or bad phrases. Good phrases are the ones that occur quite frequently across the indexed documents or have a distinguished appearance. Besides all of the above, it is a very interesting method against manipulation attempts such as the well-known nonsense of "keyword density" or dynamically generated keyword-rich web pages.

I didn't read all the comments, but I did do a search for the word "crawlability" (if that's even a real word) and didn't see it anywhere on this page. I don't know about the rest of you, but I receive the majority of my traffic from long-tail keywords, and while all 4 of these points are great, might help you rank for some competitive keywords, and are all necessary, making sure your site is crawlable by search engine spiders is just as important - especially for an eCommerce site.

I worked with a couple of clients who had everything else but had a javascript only based dropdown menu and Google was only able to index a very small percentage of the products in the database. Once adjusted, traffic tripled month over month as pages were indexed. Another example would be another client that was using a Category page that used a form submission to deliver the product detail pages and needless to say, no product detail pages were indexed. By correcting the crawlability, search engine performance can increase dramatically while if you have crawling issues, it's a giant hindrance and in my book that makes it just as important as the 4 issues in the post. So here's to #5, Crawlability!

Rand, just curious: what are your feelings on OBLs (outbound links) on the web docs you are trying to rank? Sure, relevant ones can and do help the doc rank, but what about "PR hoarding" or "PR leakage" - for example, nofollowing links to shopping cart or contact-us pages? Is this even necessary in your mind?

I'm not a fan of keyword density either and think keyword prominence is far more important.

The problem with "keyword density" in my mind is the perception that there's some mathematical high point that matters - like hitting 5.6% density will make any page rank number one, but if I fall below 5.4% or rise above 5.8%, the page won't be optimal.

Sometimes I think we feel we need to "label" everything in SEO with some special name built on some theory or rule... there's nothing wrong with "keyword usage" - that is, using keywords multiple times on a page where they make sense. But things like how competitive a phrase is aren't taken into account, which means that the right keyword density for any page could be completely different from that of any other page... in which case, isn't keyword density just a measurement, one which doesn't in itself carry a causal relationship with anything else?
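For what it's worth, the measurement itself is trivial to compute - which is part of why it carries no causal weight on its own. A minimal sketch of the usual definition:

```python
def keyword_density(text, phrase):
    """Keyword density as usually defined: occurrences of the phrase
    (counted in words) divided by total words, as a percentage.
    Just a descriptive ratio -- there is no magic target value."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    # Count every position where the phrase's words appear in order.
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100.0 * hits * n / len(words)

copy = "hedgehog care is simple once you know what a hedgehog needs"
print(round(keyword_density(copy, "hedgehog"), 1))  # -> 18.2
```

The same sentence written more or less verbosely shifts that number arbitrarily, which is exactly why the ratio alone says nothing about ranking.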

I think these are all great "factors," and in this light, yes, it probably is far more simple than made out to be.

But I'm not sure that factors is the best term. I would compare these to schools or departments within a university, in that each school covers a very broad topic or theory of ideas, which in general may be rather simple to define... where it gets complex is all of the individual theories and sectors within each school, some of which may conflict with others, and the real complexity comes not only in knowing and understanding the sectors, but how and when to apply the teachings.

If you read the link, you'll see that keyword density is not even a metric the engines would use to measure spam or stuffing. It's really my opinion that density as a number is 100% useless - there's no "best range," no "definitive limit," nothing.

I do like the phrase "keyword prominence" though - that's a great way to describe it.

Yes, I should have clarified that better... by "metric" I wasn't meaning to imply search engines, but as a metric in the most basic sense.

The danger though is exactly as you have identified, being able to measure something is one thing, but it doesn't mean that a metric is used by search engines or of any value, to search engines, SEOs, or anyone else.

Sometimes being able to measure something gives it value that doesn't exist to begin with.

Keyword prominence is certainly easier to nail down than the very squishy concept of keyword density. What I haven't been able to find on the web is a description of how the spiders crawl a page and what they count as content. Is it only the text between the body tags? What exactly does the spider consider to be the top of the page?

We only have that exact term in our meta keywords 1 time and nowhere else on the page. The word developers is only used twice on the page. Once in the meta description (flash developers) and once in the meta keywords (website developers).

We do not rank well for the following scenarios;

inanchor: website developers

intitle: website developers

Google does not see website developers as one of the incoming links into our website per our webmaster tools area.

So, how do we rank #1 out of 105 Million results for Website Developers?

I would just like to point out that you don't need to be an SEO expert to use these recommendations in practice. Ever since we read this article, we changed the way we build websites (we are a design firm, not an SEO firm) and made sure our customers provided their text optimised as per "Factor #1". Just by using this advice, we saw a tremendous increase in organic search traffic for all of them.

Great post Rand. Perhaps I can convince you to take this a step further and address the relationships between pages and their effects on SEO. As demonstrated at SMX Advanced there are still a lot of questions and disagreement in this area.

I typically see one of four points of view.

Raw Unbridled Atomic Theming – One or two major keywords should appear in every <title> element and the more often major keywords appear on every page throughout a web site the more optimized that web site is for those keywords.

Siloing or Pyramiding – Use your major keywords for your home or category pages then break down each category using long tail variations.

Neutral – Search engines evaluate each page separately and will select the most relevant page or pages based on content and inbound link anchor text without regard to the content on other pages, except for inbound internal link anchor text.

Negative – Using major keywords too often on too many pages will dilute the ranking strength of all pages and drive rankings downward.

I have seen each of these in use or heard SEO professionals promote them. I suspect that search engines have pretty much settled on the on page factors that they like and that the relationships between pages is where many on site evaluation advances are happening.

I myself am a silo proponent. This methodology promotes logical structures and, for people, easy navigation and usability. How about you?

I think that the approach you take depends on the site. If a site's enormous, with hundreds or thousands of pages, siloing can work great. If it's a smaller site, you have to look at a more atomic approach (kaboom).

As in most things, I don't think there are any clear-cut, 100% cases. Best to keep all the methods in your toolbelt.

This is, I presume, a reaction to the "Entertainmen" post that went big on Sphinn recently. Any plans to post your thoughts on that, or any lessons it's taught you (there may have been none to learn)? I know that your recent link from the Great To Good post to a dissenting view is one thing that came out of it.

And in case anyone thinks I'm trying to stir up old rubbish here, I have to say that I thought the post in question was totally off base in its tone. It just struck me that it was such a big deal over on Sphinn that a 'one week later' post would be interesting.

Actually - it was in response to some critiques from some of the people I respect most at Cre8asiteforums (although Andy Beard also brought it up). Regarding that post - we actually had a quick talk about that individual at the office and voted unanimously that in the future, no one from SEOmoz would respond to anything from him, so probably no follow up there, sorry :)

But Ciaran, that's a perfect way to phrase it and totally true. We really do "daren't answer" - when there are so many positive things to spend time on (and so much demand on all of our time, especially now), that expenditure is very hard to justify.

If we just do a good job at providing value on the blog, with the tools, in the articles, with Q+A, etc. we'll hopefully be able to earn the respect of folks like that - a much better and more positive way of going about it than to get into catfights :)

I came into work today dead set on posting a YOUmoz piece that, it seems, would have been almost identical to this. I think you hit everything I was going to say, though I was going to use 3 more general factors:

Relevance

Popularity

Authority

I think that the way you broke it down, though, you have a lot fewer interlinking topics. Great post - it makes me feel good that someone got this posted, even if you beat me to it.

Excuse me, I'm not good at English; I'm living in Senegal. This post is really interesting. I just want to ask you this: according to what I learned in SEO, Google attaches the most importance to backlink relevance. That is contrary to your assertion that "A page with a phenomenal amount of global link power, even if the sources aren't particularly relevant and the keywords are barely used, can still rank remarkably well in Google & Yahoo! (MSN & Ask are both a bit more keyword & subject focused from what we've seen)."

I hope you link to this post in the 'beginner's guide' - this is pure practical gold for any new SEO! Theory is great, but a 'what to do and how to do it' post will usually trump a 'why to do it' post...

On Point 4... we certainly experienced domain authority in a positive way for our old domain. Any new blog post or forum post immediately ranked highly in Google (got indexed and ranked within 30 minutes typically).

However, having migrated our site and changed our domain, and despite doing all the 301 redirect stuff, we're more than 2 months post migration and have clearly lost our domain authority. We don't know how long it will take to come back.

The full story for anyone interested in our site migration and the SEO impact is at: http://econsultancy.com/blog/3244-econsultancy-site-migration-and-seo-impact-the-story-so-far

I think another factor is even more important: social engagement. It's no accident that comment spam has been growing. Email is being used less (spam being just one of the factors, and the appearance of new communication technologies surely another), and people are using the Web more and more.

In the previous decade, the Web was just one of many communication channels (NNTP, email, IRC, etc.). But in this ending decade everything has merged onto the Web: chat, forums, social networks, webmail, blogs, even web services! And where the Web was once static, it is now becoming more and more interactive, even more so with AJAX. The tendency is for every new technology to run on top of it, with some people even foreseeing it will take the traditional desktop's place.

In the past, friends liked each other's sites and webmasters asked other sites to trade links. But now content is interactive - this post is a nice example: it talks about a subject, we come and comment on that subject, and with our comments we leave links to our sites. If the subject is good enough, we talk about it on our sites and link to the original post, giving it the "juice". The original site then receives a pingback, learns that we commented about it on our sites, and links back to our sites in turn. Nowadays, instead of references being made of simple link trades, they are based on relevance!

I'm wondering whether a translation has to be authorized by SEOmoz (in which case the link to the article source would be unnecessary), or whether your articles can be freely translated and published elsewhere online?

# 1-2 in <title></title> (variation accepted; the closer to the beginning, the better)
# 1-2 in <meta description></meta description>
# 1 in <h1></h1>
# 3-5 in <p></p> (first, middle and last paragraph)
# 1 in <b></b>
# 1 in <img alt>
# 1 in URL (.com/keyword/)
# not in link anchor text on the page itself

As far as I've been able to tell, updating a page on a regular basis has one effect: it gets the spider to crawl the page more often. That's not going to affect rankings, except in that changes to a page can affect that page's relevance to a given query.

If a page is in the supplemental results, you might try updating it, though you also probably need to crank up the juice to that page for it to get recrawled. Page freshness is one factor (though it may be a very minor factor, who knows) in pages turning supplemental.

This is a great summary of the basic skills SEOs need when they are just starting out. Although there are other factors to consider, I agree that these are the most significant... especially domain age in Google, which I think is almost ridiculously over-weighted...

Hi stufoster. Not to put words in Rand's mouth, but I think what he is saying is as follows: if you want the homepage to rank for 'keyword', then indeed it shouldn't link anywhere with the single-word anchor text 'keyword'. This doesn't mean that you might not have another page that you want to rank for 'keyword services' which is optimised for that and which you link to with the anchor text 'keyword services' (though note that I think that page should link back to the homepage with the anchor text 'keyword' in this example). To do otherwise risks keyword cannibalisation - it can be hard to be disciplined and avoid targeting the exact same word or phrase on multiple pages, but it really is important. Having a canonical page for each search phrase is crucial - if you don't know which page that is, how can you expect the search engines to know?

This doesn't mean that you can't have similar phrases on different pages (or even phrases that include the words you are targeting on other pages) but you need to set up your internal linking structure to make it clear that page x is about 'keyword' and page y is about 'keyword services'.

Will - precisely! For example, with that "hedgehogs" page, you might indeed want to link to a page on "hedgehog mating habits" with that anchor text. Of course, you'd also want to make sure that the mating habits page linked back to the primary "hedgehogs" page with that anchor text, too :)

In that case, Rand, you may want to alter the text of the original post slightly for this point, as it (currently) still implies that any use of the keyword in an anchor text is inadvisable in any case.

I don't wish to rehash the recent discussion (and divergence) about linking and anchor text, especially as other posters got here before me, but I would strongly agree that the use of anchor text such as "hedgehog mating practices" linking to a relevant page on a page targeting "hedgehogs" is not only not bad practice but is in fact good practice on a number of levels...

I guess I'll add another one to the list: you certainly don't want to use your keywords in the anchor text of external links. I am sure everyone already knew that; I just wanted to put it down so it's here. :)

Rand: I think I may have over-analyzed the basic search engine SEO guides - in your example in this post of "hedgehogs" and "hedgehog mating habits", I understood avoiding cannibalization to mean that the single word "hedgehog" in "hedgehog mating habits" should link back to the "hedgehogs" page.

Finally someone has been able to communicate what I have been thinking for over a year. With all the crazy things going on in SEO and what's "HOT" this month that will get you rankings, it comes down to the basics: making a good website and keeping it simple and straightforward - going "back to the basics".

You're obviously the expert - but this is dead-on with my own experiences.

I have pages with zero content - literally, just an index page that says "under construction" - that rank well because I've seeded so many links with accurate anchor text out and about on the web. And these are for somewhat competitive phrases (example: entertainment in {major city}).

And the page rank of the pages holding the links doesn't even seem to matter THAT much in the rankings.

The only thing I would add to this - and it is somewhat implied in your post - is the sheer volume of content.

I know in order to attract the linkerati (and their valuable link juice) you need good quality content. I get it!

But I've had success with ranking for some sites with just regular run-of-the-mill content (a step above PLR articles or MFA articles - but a step below anything that would make it to any page of Digg, etc.) - even when I haven't had the links to back them up.

[Sidebar: My experience has been Yahoo loves lots of content and Google loves lots of links].

And these 'second-tier' articles are actually the ones that end up getting the most "long tail search" referrals.

I think the H1 tag is a bit too old and not as effective. Removing it from our product pages resulted in a substantial traffic increase. If you're not very careful, Google will penalize you for over-optimizing.

Finally, it's important to note that a minor adjustment might win you a better ranking on Google yet pull you down on Yahoo! and others.

I only wish we could indicate in the robots.txt file which bot to exclude without running the risk of generating duplicate content and almost having two versions of the site for specific search engines to see.
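As an aside, robots.txt does let you address crawlers individually by user-agent, though it can only exclude them from paths rather than show them alternate content. A minimal sketch (the directory names are hypothetical):

```text
# Hypothetical robots.txt with per-crawler rules.
# Each bot obeys only the group matching its own user-agent.

User-agent: Slurp          # Yahoo!'s crawler
Disallow: /google-only/

User-agent: Googlebot
Disallow: /yahoo-only/

User-agent: *
Disallow:
```

This avoids the duplicate-content worry only because the excluded pages are never crawled by that engine at all; it doesn't let you serve engine-specific versions of the same URL.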

Neat post! I liked the content and I have to say the images were pretty creative on this one :) Particularly the pattern mentioned in #1 seems to help a great deal, especially if you don't get a whole lot of people linking to your page. In the real estate sites we do, we try to put up unique content about areas, but it hasn't been too popular to link to :(, but good on page optimization has helped us be found in Google :)

I'm rather confused about the last point in your list of optimisation techniques, where you state "Never in link anchor text on the page itself".

The link text to that page in your navigation would have your keyword within the anchor text, right?

Are you saying that if you have a navigation that is constant throughout the site, it should not include the link to the page that you're already on? Example: if you're on seomoz.html, then the link in your constant navigation that points to seomoz.html should be removed for that page only? And the same would apply to all other pages?

Sorry about the confusion - what I'm saying is that if you have a page that's targeting "hedgehogs" you would not want a link on that page pointing to any other page (on your site or any other site) with the exact anchor text "hedgehogs." You're basically diluting your own relevance or telling the engines that there's another page they should be considering as the reference for that particular term.
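To illustrate the pattern Rand describes above (the page names and anchor text here are hypothetical, following his hedgehogs example), a sketch in HTML:

```html
<!-- hedgehogs.html: the page targeting "hedgehogs". -->
<!-- Avoid linking out from here with the exact anchor text "hedgehogs"; -->
<!-- longer, distinct phrases are fine. -->
<a href="/hedgehog-mating-habits.html">hedgehog mating habits</a>

<!-- hedgehog-mating-habits.html: the supporting page. -->
<!-- Linking back with the target anchor text reinforces the canonical page. -->
<a href="/hedgehogs.html">hedgehogs</a>
```

The idea is that each target phrase has exactly one page the internal links consistently point at with that anchor text.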

I don't really have a problem with having more than one page with some degree of relevance to the same keyword. After all, if one page is about pet hedgehogs and the other is about Spiny Norman, the enormous hedgehog who haunted Dinsdale Piranha, both pages will be optimized for "hedgehog" in the process of optimizing them for the respective longer phrases.

Besides, it's always nice to get that second indented listing on the SERP.

I agree to an extent with this, but I always try to get two pages ranked for my heavy traffic keywords using the indented listings. This way, I get both the #1 & #2 organic spots for the term and increase my traffic.

The best way to do this is to point the majority of the keyword-rich links at the main page, and a smaller portion of links at the second page to get it to show up as the additional listing. There is great value in having two organic listings together in the SERPs. I don't see this as diluting, so long as you are not trying to promote the two pages equally; one must be the main page, and the second should not be promoted as much, but should be promoted enough to show as the indented listing.

Picky, but c'mon Rand, it's an ATTRIBUTE. This non-attention to detail is what worries me the most about the majority of people's writing on the web. I know what you meant, but what about a case where you mistype what you meant on a subject that is new to the reader, and then convey the wrong meaning to that reader?

Luckily there are fewer than a dozen other repetitions on the site:
http://www.google.com/search?q=site:seomoz.org+%22alt+tag%22

randfish, great post. We, as SEOs, always have to think of the target audience, and we never forget the purpose of the website. If you promise apples, give them apples; in the long run they will be pleased and come back. So "consistency" is the perfect word to describe what you need: consistency with keywords in the title, description, URL (if possible), content, and anchor text links.

So Rand, how would you explain this? Yes, there is keyword usage and content relevance, but there is no link juice (unless Google is counting juice from links they don't report as being indexed), no anchor text weight, and no domain authority.

Your site ranks #1 for the exact keyword phrase of the domain name? I don't find that surprising at all, especially given how non-competitive it is. Certainly having the domain "keyword" for the exact "keyword" search has been shown time and again to be very easy at Google - they appear to perceive many of those as "navigational" type searches.

I don't want to worry you, but I've seen the following happen a number of times: a new site gets indexed and immediately ranks very well, just like the site in your example. A week or two later, however, it's dropped out of the top ten for everything except maybe its own name, and then, assuming it was optimized well enough, it slowly starts to work its way back up again.

That's precisely what I'm worried about and what I have set up the client to expect just in case it happens. I'm going to have to look around here on the moz and see if there's any more info about this...

But it's actually ranking quite well in the first 10 results for a number of keywords that are in the content but not exact matches with the domain, and I'm puzzled in the first place that it was indexed within a few days of the site going live, since I've never had a site get indexed so fast.

Google, Yahoo and MSN do not pick up the title attributes of images or hyperlinks. If you use them properly, though, they can be helpful in terms of accessibility and usability. If you stuff them with keywords or irrelevant descriptive text, penalty filters may be triggered, though I am not sure about that. I do know it can happen with alt attributes, so I assume it is possible with title attributes too.

I agree that keyword stuffing can be beaten pretty easily with just your recommendations of proper keyword saturation and linking; it's so much about the linking and the authority (which are ultimately married for the long run). That makes keyword stuffing unsafe if you're doing client work.

A great post. As an aside, as much as they are intended just to be fun for most of the readers here, I think the four images could be used all by themselves as a mini-introduction to SEO. They're excellently done.

I disagree that domain authority qualifies as one of the big four. Just run a search on "free seo course" and clickthrough to my post (I'm too lazy to link to it, sorry).

Link trust isn't necessarily tied to domain authority, domain age, etc. A highly authoritative domain (SER, for example) can sell links, and some do. Those links cannot be trusted. The phenomenon of Wikipedia ranking for just about anything and everything under the sun, in my opinion, is largely due to high link trust (most inbounds to Wikipedia are editorial: unpaid for, unreciprocated, many from trustworthy sources), sheer link power (just about every page has high TBPR due to millions of backlinks), relevant on-page content, laser-targeted anchor text on 99% of all backlinks, and internal, contextual anchor text.

Domain authority says v7n can rank on the first page for [britney spears naked] or [free seo course] due to its link popularity (high PageRank). Google wrote several papers citing that exact problem as reasons for exploring solutions beyond PageRank for ranking search results.

Halfdeck - I read your blog post and I am still confused as to what is the point you are trying to make.

For me, trust (TrustRank) is more about filtering the spam than giving weight to a specific site/domain. I agree that the authority of a domain has more to do with the quantity and quality of the incoming links. I do think temporal factors play an important role too.

As Rand mentioned on his post about parasite hosting, it is possible to rent pages on domains with high authority and get those pages to rank high for competitive terms (I've been there, done that). Those pages need to be keyword rich, though. Do you remember the Wordpress fiasco?

I think incoming links with targeted anchor text provide a bigger ranking boost than on-page-optimized pages on a high-authority domain, but I think the latter provides a more stable ranking. Which one provides the most ranking boost is still a judgment call, IMHO. On the other hand, Google can tweak the parameters of their ranking formula at any time and make this type of discussion irrelevant.

I'm not sure I'd put V7N or SEOmoz in the high "authority" category, despite our million+ links. I'm really talking about how a page with no links on a site like Wikipedia or many EDU and GOV domains, big news sites, etc. will rank despite hitting none of the other factors particularly well.

Rand - That is very easy to test. Just make a keyword rich page about Viagra, host it on this domain and link to it from the home page or any other page. I am sure you will be ranking very well for Viagra related terms in a short period of time :-)

I'm not sure I'm OK with you characterizing SEOs who pour their heart and soul into marketing, link building, content creation and the methods of optimization described above as "lazy" - there must be a better way you can phrase your messages. I've heard from folks who know you that in person, you actually come across as quite a reasonable, intelligent, friendly guy - and it's only on the web that you seem to adopt this overly negative persona. Maybe there's a way to work on that?

Your substance is usually good, and we love having you here, but there must be a way you can improve the style, right?