Introduction

Here are 10 tips on SEO for your ASP.NET website. I'm sorry that some of the screenshots contain German-language text, but I hope you'll get my point despite that.

1. Avoid duplicate content

A mistake made very (very) often: when you register a URL and point it to your web server, you're using the "www" subdomain in most cases. But "www" is nothing more than a subdomain. Follow these instructions to get rid of duplicate content:

Imagine you're the owner of the domain "example.com" and you're configuring your web server using the IIS MMC and creating a new website:

The first piece of information we need is the description:

Then we need the host header information:

In the screenshot above, you can see a very important thing: the host header for the configured website is "www.example.com".

After you've finished the configuration, you can type "http://www.example.com" in your browser and everything runs fine. Now, type "http://example.com" and ... the web server answers as well.

For search engines, this is a major problem - which URL is the main URL? When content gets published, which URL is the one where the information was published first? For Google and other search engines, "www.example.com" and "example.com" are two different URLs.

Avoid that by configuring a permanent redirect.

Create a second website on your web server:

Use "example.com" - your URL name without "www" - as the host header.

and configure it as a permanent redirect:

This way, "http://example.com" redirects to "http://www.example.com".

Well, now some of you will think that a JavaScript redirect or a Flash redirect will do the same, or even something like:

Response.Redirect("http://www.example.com");

But this is not true. The JavaScript, Flash, and Response.Redirect variants answer with "HTTP/1.x 302 Found" at best (or with no HTTP redirect status at all), while the permanent redirect configured in IIS answers with "HTTP/1.x 301 Moved Permanently". Only then does the search engine get the information that this resource has been moved permanently. This way, all ranking power, link power, etc. moves to "http://www.example.com", which is a major factor affecting the ranking of your website.
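If you can't (or don't want to) configure the redirect in IIS, the same effect can be achieved in code. The following is only a minimal sketch, assuming a Global.asax handler and the article's hypothetical domain example.com; it sends the explicit 301 status that a plain Response.Redirect (which answers with 302) would not:

```csharp
// Global.asax.cs - a minimal sketch of a code-level permanent redirect.
// "example.com" is the hypothetical domain used throughout this article.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    HttpRequest request = HttpContext.Current.Request;
    if (request.Url.Host.Equals("example.com", StringComparison.OrdinalIgnoreCase))
    {
        HttpResponse response = HttpContext.Current.Response;
        response.StatusCode = 301;                   // not the 302 of Response.Redirect()
        response.Status = "301 Moved Permanently";
        response.AddHeader("Location",
            "http://www.example.com" + request.Url.PathAndQuery);
        response.End();                              // stop further processing
    }
}
```

As a bonus, appending Request.Url.PathAndQuery keeps deep links and query strings intact, which becomes relevant again in tip 11.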

2. Avoid duplicate content, part II

Well, when requesting "http://www.example.com", your web server delivers a certain page - by default, it's "Default.aspx". Many websites, codeproject.com among them, redirect and link (internally) in the wrong way: open a browser and type "http://codeproject.com" - you'll be redirected to "http://www.codeproject.com/index.asp". "index.asp" seems to be the default page for the domain www.codeproject.com - but redirecting to it from "http://codeproject.com", or linking to it from within any page of www.codeproject.com, is simply a big mistake.

No matter how big its influence on your site's ranking is, Google's PageRank is still a ranking factor. While www.codeproject.com has a PR of 7, www.codeproject.com/index.asp has a PR of 5 - yet these two pages deliver the same content. This has a very bad influence on CodeProject's ranking. Google doesn't know that "index.asp" is the root page of CodeProject, so Google tries to evaluate (through algorithms) which page is the more important one. The major problem is that all the link power, ranking power, etc. gets divided - CodeProject could rank a lot better without having "index.asp" in the search engine's index.

To make sure your site's ranking reaches its maximum, concentrate on one URL - make sure, even in internal link building (links to the "Home" page of your website), that you link to your domain name, www.yourdomain.com, instead of www.yourdomain.com/default.aspx.
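In markup, that rule looks like this (yourdomain.com is a placeholder, of course):

```html
<!-- Good: internal "Home" links point at the canonical root URL -->
<a href="http://www.yourdomain.com/">Home</a>

<!-- Bad: the same page under a second URL, splitting the link power -->
<a href="http://www.yourdomain.com/default.aspx">Home</a>
```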

3. Choose individual, good <title>s

This is very easy to achieve and has a big influence on your site's ranking: don't use static titles. The <title> tag is one of the best ways to optimize a site for a specific topic or keyword. Don't use a dozen keywords in the title tag, because you're "paying in" for each and every keyword - "paying in" means that the weight of your site gets divided among all the keywords mentioned in the title tag. Choose one or two keywords, or a keyword combination, to achieve relevance and a good ranking.

So if you have three pages - the Homepage, the AboutMe page, and the Contact page - you should give these three pages individual, useful titles. The fewer keywords you use, the stronger the influence for those keywords will be. The same applies to <meta name="keywords"> and <meta name="description">, but the title tag has the biggest influence nowadays.
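In ASP.NET 2.0 and later, an individual title is one line in the code-behind. A sketch, where the page name and the title text are made-up examples (it requires a <head runat="server"> in the page or master page):

```csharp
// AboutMe.aspx.cs - sketch: one focused title per page.
public partial class AboutMe : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // One keyword combination instead of a dozen keywords:
        Page.Title = "About the Author - Free Source Code Samples";
    }
}
```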

4. Clean up your source code

Something hard to believe but still true: clean up your source code and try to minimize the code as much as possible. Follow these rules:

don't use inline CSS, use external stylesheets whenever possible

don't use inline JavaScript, use external .js files instead

don't leave HTML comments in the delivered source

don't use massive line-breaking (twenty lines containing only a line break, or similar)

don't use ViewState when it's not necessary

don't use a <form runat="server"> when it's not necessary (it comes with hidden fields)

The better the ratio between the content (= text) and the (HTML/CSS/JavaScript) code, the better your ranking will be. The smaller the source code, the better this ratio will be.
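A stripped-down page following these rules might look like this (file names and titles are placeholders):

```html
<%@ Page Language="C#" EnableViewState="false" %>
<html>
<head>
    <title>Free Source Code Samples</title>
    <!-- external stylesheet instead of inline CSS -->
    <link rel="stylesheet" type="text/css" href="styles.css" />
    <!-- external JavaScript instead of inline script blocks -->
    <script type="text/javascript" src="site.js"></script>
</head>
<body>
    <!-- no <form runat="server"> here, so no hidden ViewState fields -->
</body>
</html>
```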

6. Use <strong>, <h1>, <h2>, <h3>

Use the HTML markup <strong> and <h1>...<h6> regularly - use it to structure the site's content. These tags were made to give some words more weight than others - and that's exactly how search engines treat them. Use them whenever it's useful to do so.
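A sketch of such a structure (the content here is a made-up example):

```html
<h1>Free Source Code</h1>                 <!-- one main topic per page -->
<h2>ASP.NET Samples</h2>
<p>Download <strong>free source code</strong> for your ASP.NET projects.</p>
<h2>JavaScript Samples</h2>
<p>Small, documented <strong>JavaScript</strong> snippets.</p>
```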

7. Validate your source code

Use the wonderful HTML validator at http://validator.w3.org to validate your site. If your markup is valid, there will be no punishment for a "bad technical solution" - so use the validator to find problems, get rid of them, and gain a better ranking instead.

8. Get to know your users

I don't know about you, but I'm building websites for users first and foremost. So my intention is to learn how to attract more interest to my site. The first step is to find my audience - the better my ranking, the more visitors I will have. So I use keyword tools to find good keywords and keyword combinations to optimize for (because those are the terms potential visitors will actually search for):

Find the good keywords, look at your competitors, and start optimizing.

9. Keyword density

Keyword density is very important for a site's ranking for a specific keyword. Imagine you have a site about "free source code" - you should definitely try to mention this keyword combination as often as you can - as long as it makes sense, of course. Try to reach a keyword density greater than 3.5 - that is, mention "free source code" about 3.5 times per 100 words.
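"3.5 times per 100 words" is easy to check programmatically. The following is a hypothetical helper (not part of any library) that counts how often a phrase occurs per 100 words of visible text:

```csharp
// Hypothetical helper: occurrences of a phrase per 100 words of text.
static double KeywordDensity(string text, string phrase)
{
    string[] words = text.Split(new[] { ' ', '\t', '\r', '\n' },
                                StringSplitOptions.RemoveEmptyEntries);
    string[] phraseWords = phrase.Split(' ');
    int hits = 0;
    // Slide a window of phrase length over the word list:
    for (int i = 0; i + phraseWords.Length <= words.Length; i++)
    {
        if (string.Join(" ", words, i, phraseWords.Length)
                  .Equals(phrase, StringComparison.OrdinalIgnoreCase))
            hits++;
    }
    return words.Length == 0 ? 0.0 : hits * 100.0 / words.Length;
}
```

A result of 3.5 or more for KeywordDensity(pageText, "free source code") would then meet the threshold mentioned above.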

10. Get linked!

Links to your site have the biggest influence on your site's ranking. Try to get linked from as many other websites as possible. Use social bookmarking and other social services like digg.com to get more links and more visitors. Example: the digg.com integration of this article:

11. Use "speaking" URLs

Use www.urlrewriting.net for speaking URLs. The more keywords you get into your URLs, the better the ranking can be.
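If you'd rather not take a dependency, the core mechanism behind such libraries is ASP.NET's built-in RewritePath. A minimal hand-rolled sketch, where the URL pattern and the page name are hypothetical examples:

```csharp
// Global.asax.cs - sketch of keyword-rich ("speaking") URLs via RewritePath.
// Needs: using System.Text.RegularExpressions; using System.Web;
protected void Application_BeginRequest(object sender, EventArgs e)
{
    string path = HttpContext.Current.Request.Path;
    // Map /articles/free-source-code to the physical page article.aspx
    Match match = Regex.Match(path, "^/articles/([a-z0-9-]+)$");
    if (match.Success)
    {
        // The visitor (and the search engine) sees the keyword-rich URL;
        // internally the request is served by article.aspx.
        HttpContext.Current.RewritePath(
            "~/article.aspx?slug=" + match.Groups[1].Value);
    }
}
```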

If you set up a redirection from example.com to www.example.com, you will lose the query string during the redirection. For example, example.com/product.aspx?id=1 will be redirected to www.example.com/product.aspx.

To solve the problem, you need to change the redirection URL to http://www.example.com$S$Q, and tick the box labelled "The exact URL entered above".

If you are using Google Webmaster Tools, you can specify the preferred domain. For example, your website may be accessible both with and without "www". In that case, you can specify which URL you prefer for your site/blog...

>>11. Use "speaking" URLs

Why do we need to use this site? Since we can write the URL rewriting in our ASP.NET projects ourselves, I don't see any point in using this site for my site/blog.

>>4. Clean up your source code

I agree with half of it... I don't think we should remove all comments from HTML code.

>>5. Make your site crawlable

I think Google will do something to crawl Silverlight websites. I think there are some SEO articles about crawling Silverlight websites.

If you are using Google Webmaster Tools, you can specify the preferred domain. For example, your website may be accessible both with and without "www". In that case, you can specify which URL you prefer for your site/blog...

Well, Google is important, but not the universe (msn/live.com, Yahoo, Ask, ...). Nevertheless: if someone with a strong website for your keyword/topic links to the "wrong" domain, your "right" domain won't achieve a better ranking through this (despite your Webmaster Tools configuration)...

Michael Sync wrote:

>>11. Use "speaking" URLs

Why do we need to use this site? Since we can write the URL rewriting in our ASP.NET projects ourselves, I don't see any point in using this site for my site/blog.

urlrewriting.net is a great, free, open source tool for massively cool URL rewriting. Clean postbacks, very easy to integrate, etc. - it's my favorite, but you can use any other rewriting technique.

Michael Sync wrote:

>>4. Clean up your source code

I agree with half of it... I don't think we should remove all comments from HTML code.

And you "think" that because...?

Michael Sync wrote:

>>5. Make your site crawlable

I think Google will do something to crawl Silverlight websites. I think there are some SEO articles about crawling Silverlight websites.

Well, Google is still not able to follow Flash redirects and Flash menus. The same will DEFINITELY be true for Silverlight. Use Silverlight for cool browser games, etc., but not for site navigation. Otherwise there will be no chance for search engines to follow the navigation.

if someone with a strong website for your keyword/topic links to the "wrong" domain, your "right" domain won't achieve a better ranking through this (despite your Webmaster Tools configuration)...

I don't think so. Google will see the right domain even if the strong website links to my wrong domain, if I have told Google in advance which one is the right one using Webmaster Tools. But I'm not sure about other search engines.

hartertobak wrote:

And you "think" that because...?

Because Google knows HTML very well, so it knows which tags are comments that don't need to be crawled...

Another thing is that those comments help other developers understand your code... so why remove them?

hartertobak wrote:

Google is still not able to follow Flash redirects and Flash menus.

Yeah.

hartertobak wrote:

The same will DEFINITELY be true for Silverlight.

Yes, so far. But we will be able to create search-engine-friendly Silverlight applications. As Silverlight is based on XAML, all Google needs to do is find a way to crawl those XAML tags... Another thing is that Silverlight is not just an object like Flash - it is based on XAML and it can be used with ASP.NET AJAX. So it will be the developer's responsibility to create a search-engine-friendly Silverlight website...

Google will see the right domain even if the strong website links to my wrong domain, if I have told Google in advance which one is the right one using Webmaster Tools. But I'm not sure about other search engines.

That would assume that you can influence the ranking algorithm through Google's Webmaster Tools, which is not true. The preferred-domain setting is for displaying/linking to your domain in the SERPs, not for ranking issues. Other search engines don't even have something like Webmaster Tools or a "preferred domain" setting.
Please get my point: avoid this issue with the simple technical solution I presented in this article.

Michael Sync wrote:

hartertobak wrote:
And you "think" that because...?

Because Google knows HTML very well, so it knows which tags are comments that don't need to be crawled...

Another thing is that those comments help other developers understand your code... so why remove them?

Right, Google knows HTML very, very well. But some people, years ago, tried keyword stuffing and other techniques with HTML comments (and still try it with <noscript> tags...). Today, comments have only one effect: what's inside the comment will not pay in for a site's ranking, but the amount of commented HTML code increases the number of characters in the HTML source - so the "text to source" ratio gets a negative influence. It has a negative influence, although this is not a big ranking factor.

Michael Sync wrote:

hartertobak wrote:
The same will DEFINITELY be true for Silverlight.

Yes, so far. But we will be able to create search-engine-friendly Silverlight applications. As Silverlight is based on XAML, all Google needs to do is find a way to crawl those XAML tags... Another thing is that Silverlight is not just an object like Flash - it is based on XAML and it can be used with ASP.NET AJAX. So it will be the developer's responsibility to create a search-engine-friendly Silverlight website...

Fine, Silverlight is based on XAML - that's true. But does or will Google be able to read XAML? Well, I think you'll get some idea of why this won't happen by reading this blog post: http://www.nikhilk.net/AjaxSEO.aspx
NikhilK describes a good start for addressing this issue, but there is one major (again: major!) issue with it:
don't deliver a search engine robot something other than what the normal visitor gets. That's cloaking, and it will get punished.

So you need to wait until Google supports Silverlight - and, just to help you imagine how long you'll need to wait and how low the chance is at all, just try to figure out what this XAML does while reading it in an editor like Notepad: http://www.microsoft.com/expression/events-training/globalevent/Page.xaml

Nope: if you want your site to be crawlable, don't use Flash for navigation. The same goes for Silverlight until the opposite is proven.
