And he goes further to say that his traffic fell, concluding "it does nothing". From there he decides that the whole field of SEO is snake oil. He even takes the time to put the boot into Tag Clouds, saying that they don't work either.

I'm here to tell you : he is wrong, plain wrong. Patricio Robles covers it well with this blog post:

In that post, it's explained how John may not have correctly redirected his old urls to his new urls using 301 redirects, or that he may not have given the change long enough to work.
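To make the 301 point concrete, here is a minimal sketch of the idea as a bare Python WSGI app. The path mappings are invented examples for illustration, not John's actual urls, and a real site would do this in the web server or CMS rather than by hand:

```python
# Map each old url to its new home. These entries are hypothetical examples.
OLD_TO_NEW = {
    "/Default.aspx?tabid=65&EntryId=57": "/blog/human-friendly-urls",
}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "")
    query = environ.get("QUERY_STRING", "")
    old_url = path + ("?" + query if query else "")
    new_url = OLD_TO_NEW.get(old_url)
    if new_url:
        # A 301 tells search engines the move is permanent, so the old
        # url's accumulated ranking transfers to the new address.
        start_response("301 Moved Permanently", [("Location", new_url)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found"]
```

The key detail is the status code: a 301 (permanent) passes link equity along, whereas serving the old address as a 404, or a 302 (temporary), throws the history away.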

DNN Human Friendly or 'Long' Urls

Where I'd like to step in and help eliminate some of the fuss is to say this : Long Urls with keywords in the address absolutely work. DotNetNuke standard Urls are similar to the standard Wordpress Urls that are mentioned. This blog post, without any sort of Url modification, would look like this : http://www.ifinity.com.au/Default.aspx?tabid=65&EntryId=57 .
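For illustration, generating the keyword part of such a url from a page title might look like this rough Python sketch. The `slugify` helper is my own example, not how DNN or the Blog module actually builds its urls:

```python
import re

def slugify(title):
    """Turn a page title into a lowercase, hyphen-separated url segment."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")

# e.g. slugify("DNN Human Friendly or 'Long' Urls")
# -> "dnn-human-friendly-or-long-urls"
```

The result carries the page's keywords in the address itself, which is exactly what the tabid/EntryId form above fails to do.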

There are several aspects to why good urls are important to Search Engine Optimisation strategies:

1. Keywords in Urls and in Anchor Text Matter

2. Clear Urls which signal content are more 'clickable'

3. Database-Id-ridden Urls are an invention of lazy programmers

I'll discuss each of these points separately.

Keywords in Urls and in Anchor Text Matter

First I'll cover why keywords in the link text are so important. It's because they convey, in an unambiguous way, what the destination Url is all about. You can only choose a few keywords, so generally they indicate the correct content of the destination page.

There's an old experiment that is useful in determining how much value the text in a link matters for optimising for a particular keyword. It's very simple : just google the words 'Click Here'. I'll even put a link in for you (opens in new window) : http://www.google.com/search?q=click+here

What comes up at the coveted number 1 spot? Why, it's the Adobe Acrobat Reader download page. If you search that page for 'click here', or look in the Url, or look in the Page Meta Tags or description, you won't find the words 'click here' anywhere. So how does it rate in the #1 spot? Simple : millions of web pages around the world have a link which says something like 'Click Here to download Acrobat Reader'. Google interprets those links and determines that when it comes to 'click here', the Acrobat download page must be pretty important.

But you will have noted that the Url for the destination page doesn't contain the 'click here' words. So why do keywords in the Url matter?

Two reasons :

1. People are lazy copy/paste addicts.

2. You have to choose a limited number of words for your Url, so the choice of the Url conveys important meaning about the page.

The first reason is clearly displayed in this blog post. Instead of coming up with a meaningful text link for the linked blog posts above, I just posted the Url and left it at that. Windows Live Writer does the heavy lifting and converts it into a link for me, as does Outlook, as does just about every rich text editor out there today. Chances are that most times you get a link for your site, you won't have control over the link text either. This next bit is important : because people are lazy and generally just post the Url as the link, you can get de facto optimisation of the link text by having your chosen keywords in the Url. It's a simple but often overlooked fact - you can't go wrong relying on the laziness of other people. Think of it this way : the DVD industry business model relies on the fact that people are lazy and want to sit on their couch for hours on end. I think yours should too.

The second reason is that search engines very much use the content of the Url to rank pages for keywords. Again, you have limited choice as to what to put in your Url, so search engines reasonably deduce that the contents of a page closely match the Url. It can't be spammed or faked. This is why there are duplicate content penalties in search engines : otherwise we'd all post 100 urls covering all possible search phrases for a page. By rewarding site owners for having canonical Urls, the search engines force us to pick one single set of keywords in a Url. And then, they know, the contents of the page closely match the Url.
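As a rough illustration of what 'one canonical Url' means in practice, here is a Python sketch that collapses trivial variants of an address into a single form. The normalisation rules here are assumptions chosen for the example, not any search engine's actual algorithm:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse trivial variants of a url into one canonical form (a sketch)."""
    parts = urlsplit(url.lower())
    path = parts.path.rstrip("/") or "/"
    # Drop the query string and fragment, the trailing slash and case
    # differences, so every variant of the address maps to one single url.
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

# All of these collapse to "http://example.com/blog":
#   http://Example.com/Blog/
#   http://example.com/blog?utm_source=feed
#   http://example.com/blog#comments
```

When every variant resolves (or 301-redirects) to one address like this, all the inbound links and ranking signals pool on that single url instead of being split across duplicates.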

I'll give you a concrete example : on November 10, 2008, I converted this blog to use the new Blog 3.5 release, which put keywords into the Urls. Comparing the period from August 10 to November 10 with November 11 to February 11 (handily, exactly three months either side), my average pageviews/day for this Blog have increased by 55%. In that time I've done nothing else differently, and the majority of the visits are for old content - I've only managed 7 posts in that 3 month period. It's not properly scientific, but I'm confident that improved rankings and clickability from better links account for at least half of the gains. Who wouldn't like a 55% increase in traffic?

But don't just take my word for it : even Matt Cutts (he of the Google anti-spam team) has search engine optimised Urls. I think he should know.

Clear Urls are more clickable

If your page is lucky enough to appear somewhere in the top search results for a phrase, the person doing the searching is going to be presented with a link to your page. The search engine will present that link, as is. The person looking at a list of links on a search results page then has to make a decision : which link to click on? Now, statistics and studies tell us that most people are going to click on the first result.

But if they don't, which out of the others are they going to try?

My experience with Google Adwords testing tells me that the text of the URL absolutely affects the click-through rate. I've run two ads with the same content but with different 'link text' (in Adwords you can type any old thing in for the display Url text; the actual destination link isn't shown to the visitor). And I'm here to tell you that links that look like they match the search criteria get clicked on more.

Why? I've no idea but I'm guessing the more plain and simple the link, the more people trust what's on the other end of it. Perhaps it's wrapped up in the psychology of labelling and branding - linked to why all pasta sauce tins are red and not white. Personally I don't care, but I use the information to spur me into creating clickable Urls for my sites.

Database Ids in Urls are the result of Lazy Programmers

I'm here to tell you I'm a lazy programmer. Any programmer who tells you they would not rather choose the quicker, easier route to a destination is probably not speaking the truth. That's why so much software has terrible user interfaces. It's because the user is being forced to conform to the program model, rather than the other way around. It's part of the reason why the iPod is so successful : the UI assumes the user knows nothing about the internal file storage mechanism of the music files. You hardly ever see the filename in an iPod : it always picks up the song title / artist and shows that instead.

In the early days of database driven websites and content management systems, the main focus was getting the things to work. To expedite that, programmers took the quickest and easiest route to Urls : exposing the underlying database structure as part of the Url. No interpretation or logic required : send the table's unique id in, get the unique table record out. The really early stuff just put together a query based on the contents of the url, which made it a laughably easy target for Sql Injection attacks. Nowadays platforms like DNN resist Sql Injection attacks very well, but the early days of the architecture are still there : you get a page of a DNN database by giving it the TabId. The TabId is just an auto-generated number that the database provides. It's why most DNN sites have a Home page TabId of 36, because the Home page is the 36th record to be created on a new install.
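The two lookup styles can be sketched side by side in Python. The table and names here are illustrative only, not DNN's actual schema:

```python
# Illustrative page table: ids are auto-generated by the database,
# slugs are keywords chosen by a human.
PAGES = {
    36: {"slug": "home", "title": "Home"},
    65: {"slug": "blog", "title": "Blog"},
}
SLUG_TO_ID = {page["slug"]: tabid for tabid, page in PAGES.items()}

def page_by_tabid(tabid):
    # The lazy route: expose the database key directly in the url.
    return PAGES[tabid]

def page_by_slug(slug):
    # One extra lookup buys a url that means something to people.
    return PAGES[SLUG_TO_ID[slug]]
```

The cost of the second form is one extra dictionary lookup; the benefit is that the visitor sees /home instead of ?tabid=36.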

So the only person that benefits from having database Ids in the Url is the programmer. Not the search engines, not the site owner, nobody else. If the iPod had come out with an interface that showed '01 - song.mp3' in the list, it would never have caught on. So you shouldn't do this with your website either.

Long Urls with Keywords are an SEO Must Have

To recap, if you have a long Url with keywords accurately depicting the page content, you're better off than if you have a Url stuffed with numbers and other database-generated keys. It's better for the users, and what's better for the users helps you in your search engine optimisation efforts.

And I'm not saying this just because I distribute a DotNetNuke extension to help transform the Urls of your DNN site : the whole reason for building that in the first place was to get better results for my own site. That it caught on in popularity was a happy externality.

If you use the Url Master module for your DotNetNuke website, you will get better SEO results. I get emails all the time from happy customers telling me their site is performing better. You won't have the problems that John Dvorak had, because it automatically 301 redirects all your old urls. It's just a pity he chose Wordpress over DNN, I guess.

Your say : have you got an anecdotal story about how putting in human friendly urls helped your site? Please share in the comments below.

Well written post Bruce. As you know, I'm a "Url junkie" and would give my soul for clean, "hackable" Urls any day for all the reasons you've described above and more. And Dvorak was a fool before, but this tops it all off.

Thanks for taking the time to write this. I bought Url Master for a recent project and I have no idea if the ranking will be improved. The old site was indexed on Google and there was no SEO effort, so I am thinking anything I do will be better. I have been on a rant lately because I miss the days when I didn't care what a URL looked like, and I really did not see the point. This article has enlightened me, but I am still a bit grumpy.

@Mike

It can be annoying when changes in technology come along which don't always seem logical or useful. The good thing about concentrating on good urls is that it is actually technically interesting to achieve. The other good thing is that, if done right, it can give you a significant advantage in traffic for only a little more effort. Eventually this arbitrage situation will slowly disappear as all web technology is updated and old numeric urls become a thing of the past, but for people who get in first with better urls, there will always remain a first-come, first-ranked advantage for certain keywords.

A month ago we launched our new website but I'm not happy with it... well, I'm happy with the visual result but not SEO-wise!
But because of Batibouw (www.batibouw.be) - an important event here in Belgium - the site needed to be released as soon as possible...

Our site is running on DotNetNuke 4.9. I tried to upgrade it to DotNetNuke 5 (because it's more XHTML valid) but after several attempts I decided to start a fresh installation...
Now, starting from scratch (and having somewhat more time), I want to do this right and make it as SEO friendly as possible.
My first concern is the URL...
Products/Interior/tabid/77/language/nl-BE/interior.aspx?cid=2&product=d044ac31-3eee-48fc-81f1-1ffdc982fe35

I think you see the problem :o)
I've tried to use Url Master but we're working with multiple languages (7 languages) and I want to use keywords in the corresponding language.
interior.aspx in English, binnenwerk.aspx in Dutch and so on...

Is this possible? Can I make the URL a lot smaller? Something like:
/Interior/Plaster/Granol’color Brillant/

Of course the name of the product (Granol’color Brillant) is another problem because of the quotes and spaces...
It's a difficult issue...

@thomas, you've got some work to do there. I am working on a multi-language version of the Url Master module, but there are a lot of things to cover, so it's not going to be out anytime soon.

You might try looking at the Adequation multi-language module; it works quite well with the Url Master module, but it may or may not suit your product listings.

Bruce,

Why does your Url Master tool not provide dashes (-) where there is a space in the page title? It seems this is the only feature you're missing. What about capitalization - does your tool do anything for that? Google is currently seeing many of our pages as two pages depending on capitalization - we aren't sure why - but it is a major issue right now.

Could your tool help with either of these things?

Thanks

@eric

The tool does both of those things: replacing spaces with hyphens, and forcing capitalization to lower case.
