August 6, 2008

As the web grows older, the standards applied to the quality of your web presence (especially if you are in business) keep getting enhanced and polished. The W3C (World Wide Web Consortium) keeps issuing new standards and tweaking old ones, and keeping up to date with those tweaks is not an easy task in itself.

November 2, 2007

I have been in this industry for a few years now and have had everything that goes with it: good experiences and bad, joy and disappointment, days of frustration and hours of glory. I have worked with all kinds of clients and provided very different packages, always tailored to the situation and to the client’s wishes. Most of those people are remembered for positive reasons, with a few notable exceptions.

September 7, 2007

A few months ago, my dear friend Deb Gell, the owner of the Websavvy Directory and a fellow SuperMod at the IHelpYou forums, offered me help with making my website accessible. Deb knows so much about web accessibility, WAI-AAA, section 508, etc, it’s unbelievable. I definitely accepted her generous offer, and soon the work began.

April 20, 2007

For years and years, SEO has been perceived as the art of acquiring rankings in the search engines for certain keywords. That’s true, of course, but only up to a point. I believe it’s also a big simplification, one that can lead (or should I say “has led”?) the industry into big trouble.

A typical SEO campaign usually starts like this: we carry out keyword research, create a big sheet of keywords to target and then start writing pages of content around those keywords. The client expects us to bring their site to the top of the three major engines (or at least Google), so we try hard.

Sure, an ethical SEO will also try hard to make sure the content makes sense, doesn’t ruin the overall concept of the site, fits in the navigation and provides useful information to one or another category of the potential visitors. But the longer the SEO campaign lasts and the more key phrases we include, the harder it becomes to comply with the rules above. Sooner or later it becomes the “site for keywords” situation; that is, we become slaves to our keywords and start messing with the initial philosophy of the site.

C’mon! It’s the logic of the development of the website that should dictate the choice of keywords, not the other way around!

The engines’ best interest

The engines are interested in delivering quality search results to their customers – the searchers. They have no reason to encourage aggressive optimisers who fight each other for rankings and forget about the quality of the resource they are building. Actually, the engines have every reason to weaken (and, if possible, destroy) the business model that dominates the SEO industry, in which clients want nothing but rankings from their SEO consultants, demand guarantees that the rankings will be delivered and often pay for the rankings actually acquired rather than for the work done by the SEO.

That’s why the engines work hard to make the results of the SEOs’ work unpredictable and search result manipulation as hard as possible. It is extremely easy to create an anti-SEO filter (an optimised page is very detectable), but such filters make the SERPs worthless, so the engines try to find a balance. I’m sure that all the latest updates, starting with Florida in 2003, reflect the work Google has done to curb abuse by SEOs. Other engines are moving in the same direction. For example, the most popular Russian search engine, Yandex, which used to be very easy to manipulate, is following Big Brother Google’s example and applying new changes to its ranking algorithm to make the SEOs’ life harder.

Other measures Google is taking right now include messing with the number of indexed pages (which goes up and down almost at random), the Supplemental Result status all of a sudden assigned to innocent and content-rich pages, and other “practical jokes” that make it impossible to predict how an optimised page will behave in the engine’s results.

The new philosophy of SEO

So, what should be the new approach to SEO as business? What should the clients expect from their SEO consultant?

Well, SEOs should still do their best to maximise the chances of the client’s website ranking well for thoroughly chosen keywords. They should still apply basic SEO to every content page and dedicate a lot of time to research and self-education to keep up with the latest trends in the industry. But I said “maximise the chances”, not “deliver rankings”. We all – SEO practitioners and our clients – should once and for all accept that it’s the engines’ prerogative to decide which sites to put at the top, and we should never again get annoyed about it.

The SEOs should be able to provide detailed advice on how to design websites that are friendly to the engines and how to find balance between the development of the website and the keywords without turning the site into a mess. The SEOs should be able to guide and guard, making sure the site won’t get into trouble after certain SEO-related measures are taken.

SEO is a long, slow process, similar to raising a child and having nothing in common with waving a magic wand. The role of the SEO consultant in this process is the role of a teacher or a doctor. We are here to make sure the child (a website) will grow up healthy and learn the right ways to live and develop.

Of course, if an SEO is paid for acquired rankings, it becomes very hard to follow this philosophy. We all have to agree that this business model is outdated.

The rankings can be earned

The only right way to earn rankings is to develop our websites as logic, common sense and our visitors’ interest dictate, and then apply basic SEO to the pages built on this basis. It will take years, and there is nothing wrong with it. It always takes time to build something worth a reward, something that stands out and deserves special attention.

Sites that use this kind of SEO strategy may not show impressive results at first, but grow steadily and demonstrate an amazing ability to withstand the Google Hurricanes (a.k.a. Major Algorithm Updates), in which sites built to chase rankings often fall from the first pages and out of the Top 1000. I wouldn’t be surprised to find out that some sample sites that aim to earn rankings rather than chase them are used by the search engines’ research teams to fine-tune their algorithms and make sure that good rankings will go to those who genuinely deserve them. Try coming up with such a “sample” website, and watch the engines getting kinder to you with every new update.

A few days ago I received a request from one of my former clients: to find out why exactly his site’s Google rankings had dropped so drastically and had not come back even a month later.

I said I couldn’t help, because nobody except a few people in the Googleplex knows exactly what happens to the ranking algorithm when the rankings drastically change. We SEOs can spot certain patterns and come to conclusions (which are probably, but not necessarily, true), but we can’t give a site owner an accurate list of reasons for ranking drops in each particular case and say, “Just do this and the rankings will be back”.

But I’m naturally curious, so I retrieved the old keyword lists from my archives (I hadn’t touched the site in question in months), along with the saved results of the manual rank checks I do from time to time while working on websites. A brief check showed at once that although rankings for most groups of keywords had indeed dropped dramatically, other keyword groups had, on the contrary, improved over the months. This led me to the following conclusion: the ex-client’s site is not being penalised or permanently filtered out, but is suffering from yet another Florida-like update.

The “Florida” Google update (Google updates receive names under a system very similar to the one used for naming Atlantic hurricanes) struck in November 2003, about the time I entered search engine optimisation. It dropped millions of sites into oblivion – sites that used to sit comfortably in the first positions for their favourite search terms. SEOs who had never seen anything of the sort started making thousands of desperate postings on various SEO forums and blogs, and at first nobody knew what to do. Then people slowly calmed down and decided not to do anything rash, but to wait and see what happened.

In January 2004, another update named Austin dropped still more sites. It looked like the second wave of Florida, but at the same time it brought certain victims of Florida back to where they used to be. A month later, the Brandy update released even more Florida-struck sites.

That happened more than three years ago. We’ve had more updates of a similar kind since (supposedly applying new filters developed by Google engineers, testing them and then rolling back those that prove ineffective). In October–November 2005 we had the Jagger update, which went in three stages, but even more updates have remained unnamed. Right now, in March–April 2007, we are having another one. It has also been noticed that each update affects different niches and SERPs in turn. So, the first piece of advice: if your site gets hit by a major Google update, don’t panic and don’t turn your site upside down.

I’m not saying don’t do anything

If your site suddenly loses all its rankings, it’s a good time to check how clean it is. Check and double-check your code. Even such a small thing as a questionable alt attribute or a misplaced <h1> tag can trigger a filter. Check your linking patterns, and if you are heavily cross-linking 20 or 30 sites belonging to you, remove the unnecessary links and merge the sites themselves where possible. Check your outbound links for bad neighbourhoods.

If you are sure your site is clean in terms of SEO, consider moving to higher website building standards. For example, if you have been considering switching from table-based design to a table-free one, there is no reason not to do so now. You can also add more content or revise your navigation. But be careful and don’t make your site worse than it was.

An aggressive link building campaign won’t help, but a few new solid links you might receive from your friends are a good thing.

Your rankings will probably return slowly over the next few months. If they don’t, the next Google update might bring them back. In certain cases you may have to wait through two consecutive updates.

I know that it hurts when a sudden whim of the #1 search engine harms your business and reduces your income. Unfortunately, it’s useless to get annoyed about it and blame (let alone sue) Google. Google owes you nothing. Another useless thing is to keep bugging your SEO consultant all through the update with endless “Why me?” or “Please do something” emails and phone calls. Even the most knowledgeable SEO expert can’t stop a hurricane.

Comments Off on What to Do if You Suddenly Lose All Your Google Rankings

April 4, 2007

Very often website owners wonder what exactly is wrong with their websites in terms of SEO. They could easily fix the errors themselves, but the lack of specific SEO knowledge makes that impossible. Reading SEO forums and blogs full of conflicting advice doesn’t help; all people get as a result is a headache and a list of 100 or more factors supposedly important for SEO. Too often, things like Google PageRank or Alexa rank are among the first on the list, when in fact they shouldn’t be there at all.

This article is meant to help both webmasters who would like to make an assessment of their websites for SEO factors and SEO practitioners who consider adding an SEO review to the list of their services. Sure, the following is just my personal point of view, but it’s derived from my previous SEO experience and the knowledge accumulated over the years.

The on-site factors

It’s essential to check the following on-site factors (in no particular order).

Content. Does the site have a lot of textual content for visitors to read and for the search engines to index? Is it done in plain text, graphics or Flash (because the last two options make it invisible for the engines)? Is it unique?

Use Copyscape along with Google: search for a long sentence copied at random from the site (with and without quotation marks); this will help you identify any copies on the web of the content you are reviewing. There are three possibilities: the site you are reviewing has stolen the content from another site, the other site has stolen the content from the site you are reviewing, or a legitimate republishing (with credit and a link) has taken place. In the third case there is no reason to worry, but if content stealing is going on, the owner should in either case be informed so that the necessary measures can be taken.
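The “long sentence” check is easy to semi-automate. As a rough sketch (nothing to do with Copyscape’s own tooling – just the Python standard library), the following pulls distinctive sentences out of a page’s visible text so they can be pasted into a search box:

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False
    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def candidate_sentences(html, min_words=10):
    """Return sentences long enough to make distinctive search queries."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(" ".join(parser.chunks).split())
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if len(s.split()) >= min_words]
```

The `min_words` threshold is arbitrary; anything shorter than about ten words tends to produce too many false matches in the engines.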

If the site is for the most part built upon graphic images (like an online photo album), alt and title attributes along with page titles can help a little, but not in a competitive niche. In this case, if the owner of the site is still interested in search engine traffic, the only way to go is to describe the pictures, at least briefly (human visitors will be grateful too).

On the other hand, if the site is built as a single Flash movie, it has no chance in the search engines unless an alternative HTML version of the site is provided in one way or another. You can use the cache:http://www.domainname.com/ Google command to see what Google sees on the home page of the website.

I sometimes retrieve the keywords from the title tag, insert them into the Google toolbar and click on the highlighter to see how often these keywords are used in the visible content of the page, but I never calculate the keyword density. Nobody ever knows the optimal number that will work in every case, so calculating it is a waste of time.

Title tags and other meta information. The importance of the unique title tags on every page of a website is widely known to everyone who has ever studied SEO basics. Apparently, not so many people have, because I constantly see websites that have the same title tag throughout the whole website (typical for template-based sites, as some templates allow website creators to add only one so-called “website title” while configuring the website). The same applies (even more often) to “keywords” and “description” meta tags. Granted, these tags are less important than the title tags for rankings, but a properly optimised website still should have unique meta tags on each page.
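A duplicate-title check along these lines is simple to script. The helper below is a hypothetical sketch – the regex-based title extraction is crude, but good enough for a quick review pass over a set of saved pages:

```python
import re
from collections import defaultdict

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def find_duplicate_titles(pages):
    """pages: mapping of URL (or filename) -> raw HTML.
    Returns {title: [urls]} for every title shared by more than one page."""
    seen = defaultdict(list)
    for url, html in pages.items():
        match = TITLE_RE.search(html)
        title = " ".join(match.group(1).split()) if match else "(missing title)"
        seen[title].append(url)
    return {t: urls for t, urls in seen.items() if len(urls) > 1}
```

Pages reported under “(missing title)” are arguably an even bigger problem than the duplicates.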

I would like to mention two other meta tags I see all the time: <META NAME="robots" CONTENT="INDEX,FOLLOW"> (or <META NAME="robots" CONTENT="ALL">, <META NAME="Googlebot" CONTENT="INDEX,FOLLOW">, <META NAME="Googlebot" CONTENT="All">, sometimes all four at once) and <META NAME="revisit-after" CONTENT="10 days">.

I always recommend eliminating them, as they are nothing but useless code bloat. The only robots meta tags that would make sense – in some special cases – are <META NAME="robots" CONTENT="NOINDEX,NOFOLLOW">, <META NAME="robots" CONTENT="INDEX,NOFOLLOW"> and <META NAME="robots" CONTENT="NOINDEX,FOLLOW">.
(The examples are given for the HTML 4.01 doctype; in XHTML they would look different.)
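For the record, a sketch of the difference between the two doctypes (the XHTML form follows from XHTML’s empty-element rules: lowercase names and a self-closing slash):

```html
<!-- HTML 4.01 -->
<META NAME="robots" CONTENT="NOINDEX,FOLLOW">

<!-- XHTML 1.0 / 1.1 -->
<meta name="robots" content="noindex,follow" />
```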

Navigation means. Usually, websites have a navigation menu bar on the left or at the top and an alternative “mini sitemap” at the bottom. Some websites use only one or two navigation means mentioned above, some use all three, and sometimes the right column is used for navigation as well. Certain pages can be interlinked from within the content (a good thing as long as the new page doesn’t open in a new window), and last but not least a good site always has a sitemap.

When I review websites, I always pay attention to the spiderability of the links in the main navigation menus. Top and left-side menus (especially those with dynamic drop-down parts) are often powered by JavaScript. If the code still contains plain href HTML links with URLs in them, the menu will be spiderable; but if you see JavaScript variables instead of the URLs, or if the location.href command is used, the menu is useless in terms of search engine friendliness and SEO.

Actually, both a simple menu with a rollover effect and a dynamic menu with drop-down submenus can be implemented using CSS rather than JavaScript. This change of technology will certainly make the website better for the search engines; beyond that, it will reduce the download time (JavaScript-powered rollover menus are usually built upon graphic images, while the CSS-powered alternatives use text) and improve usability. I always add this recommendation to my SEO reviews.
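A minimal sketch of the CSS-only approach (class names and colours are illustrative; note that some older browsers, IE6 in particular, only honour :hover on links and need a workaround):

```html
<style>
  .nav li ul { display: none; }          /* hide submenus by default */
  .nav li:hover ul { display: block; }   /* reveal on hover, no JavaScript */
  .nav a:hover { background: #eee; }     /* rollover effect in pure CSS */
</style>

<ul class="nav">
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/seo-review/">SEO review</a></li>
    </ul>
  </li>
</ul>
```

The links remain plain href anchors throughout, so the menu stays fully spiderable.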

If the website has breadcrumbs, it should be pointed out as an additional SEO/usability benefit.

Redirects and other HTTP responses. I always check if the website has a 301 redirect from the non-www version of the URL to the www one. If there is no such redirect, I always recommend one.
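On Apache, this redirect is a few lines of .htaccess. A sketch, assuming mod_rewrite is enabled and using domainname.com as a placeholder:

```apache
# Redirect http://domainname.com/... to http://www.domainname.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domainname\.com$ [NC]
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
```

The R=301 flag is the important part: a permanent redirect tells the engines to consolidate the two versions of every URL.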

It’s also important to check what kind of an HTTP response non-existent pages return. It should always be a 404 (with a custom 404 page shown in the browser; it’s permitted to use the site map or the home page as the custom 404 page). This server header checker is very good for checking HTTP responses: user-friendly and reliable.
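Any server header checker does essentially the following. A minimal sketch in Python (host, port and path are whatever you want to test):

```python
import http.client

def response_status(host, port, path):
    """Fetch `path` from host:port and return the raw HTTP status code.
    A non-existent page should come back as 404, not 200 or 302."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    try:
        conn.request("GET", path)
        return conn.getresponse().status
    finally:
        conn.close()
```

A custom 404 page does not change the status code; only the body shown to the visitor differs, which is exactly why checking the header matters.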

The code. I always check the code for validity using the W3C validator (I don’t trust browser plugins). If the code is not valid, I recommend fixing all the errors. Other code-related issues are these: table-free layouts are better than table-based; for table-based layouts the number of nested tables should be as small as possible; JavaScript functions (where they can’t be eliminated) and CSS styles should be moved to separate .js and .css files to avoid bloating the HTML page itself. XHTML 1.0 Transitional is better than HTML 4.01 Transitional, but XHTML 1.0 Strict and XHTML 1.1 are better still.

And of course, no frames!

More on internal linking. All internal links pointing to the home page should link to the root (http://www.domainname.com/ rather than http://www.domainname.com/index.html). The same rule applies to subfolders (link to http://www.domainname.com/folder/ rather than http://www.domainname.com/folder/index.php). Unfortunately, I have yet to review a site that follows this rule; even some well-known SEO practitioners are guilty of breaking it.
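Fixing this by hand across a large site is tedious, so it is worth scripting. A hypothetical helper (the regex assumes double-quoted href attributes, which covers most hand-written HTML):

```python
import re

# Rewrite internal links so directory indexes are referenced by
# their folder (".../") rather than the index file itself.
INDEX_RE = re.compile(r'(href="[^"]*?/)index\.(?:html?|php)(")', re.IGNORECASE)

def normalise_index_links(html):
    """Turn href=".../index.html" (or index.htm/index.php) into href=".../"."""
    return INDEX_RE.sub(r"\1\2", html)
```

Run over every page before upload, this keeps all internal links pointing at the canonical root and folder URLs.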

URLs. If the site is built upon dynamic URLs, I always recommend at least considering switching to static URLs, or reducing the number of parameters if possible. Granted, dynamic URLs can get crawled, but they turn into Google Supplemental Results more often than static URLs do, are more difficult to control via the robots.txt file and blur the hierarchy of the website. Static URLs with subfolders (or pseudo-subfolders) make the hierarchy of the website more obvious; besides, we can from time to time name a page after an important keyword or two (separated by a hyphen). All these factors help with SEO to some extent.
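As an illustration of pseudo-subfolders, a mod_rewrite sketch (assuming Apache; the URL scheme, script name and parameters are invented for the example):

```apache
# Map a keyword-rich static-looking URL onto the real dynamic script,
# so visitors and spiders never see the query string.
RewriteEngine On
RewriteRule ^widgets/blue-widgets/$ /catalogue.php?cat=widgets&colour=blue [L]
```

The hyphen-separated path segment doubles as the keyword naming mentioned above.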

Last but not least, I look at the overall number of indexed pages in Google, Yahoo! and MSN (Live) and compare these numbers with the size of the site.

The off-site factors

This usually means links. I do look briefly into links, and I use the Yahoo! search engine for this purpose, because Google shows only a tiny fraction of all links, and MSN has disabled this functionality altogether. The linkdomain:http://www.domainname.com/ command gives some idea of the website’s backlinks. The overall number is usually highly exaggerated, but correlating it with the age of the site gives me an idea of the site’s history. The quality of the backlinks is important too. If I see that all the links come from a bunch of forum threads, I know at once that they don’t add much to the site’s authority. One link from the Yahoo! Directory can do more for this purpose than two hundred links from forum signatures.

I look at the age of the site (not in the whois records, because the date of the domain registration and the date of the first upload of the website can be far apart). I look at the copyright notice, and if there is none, just ask the website owner when the site was uploaded.

We know that Google looks at historical data (and has even registered a patent on it). Should we assume they use this data to calculate the average number of links acquired by a site every year? Or every month? It does make sense. My own experience tells me that old websites that have accumulated a decent number of links (mostly natural ones) do quite well, but old domains with too few links perform even worse in Google than relatively new websites. The same applies to websites that haven’t added new content pages in years. Even a quick and aggressive link building campaign won’t improve the situation at once; it will take months before Google realises that the site in question has been revived.

Anyway, quick and aggressive link campaigns are a thing of the past. The engines now look for naturally developed great resources. That’s why the main purpose of a good SEO review is to help the website owner to turn the website into a resource as close to “great” as possible.

It’s important to look into the backlink patterns of the website. If most backlinks come from websites belonging to the same owner and heavily cross-linked with each other, the site is asking for a severe ranking penalty. Such things should be pointed out to the owner – urgently.

What else?

Of course, I briefly review the websites for the most obvious kinds of SEO spam. The procedure is described in detail in this article; the only thing that has changed since 2005 is the Supplemental Results. These days, if a website is showing a lot of Supplemental Results in Google, it doesn’t mean an approaching penalty for spam anymore. It often happens to innocent sites.

Often, in order to discover doorway pages on a site you are reviewing, you will need to ask for FTP access to it.

Things I don’t include

There are certain things I don’t include in the SEO review. One of them is Google PageRank, which has become so weird and deceptive that there is almost no point in looking at it, except in some special cases. It is still somewhat correlated with the overall link authority of the site, but the correlation is very weak, and PR itself has no direct effect on rankings.

Alexa Rank. It is supposed to show which sites have more visitors. Actually, it shows which sites have more visitors with the Alexa Toolbar installed on their computers. Alexa is capable of collecting the data about visitors only when the visitors already have their toolbar. How many people know that this toolbar exists, let alone install it? Not many. This alone makes Alexa Rank a worthless parameter.

Current rankings. No, I’m not saying that rankings are unimportant. I never include them in the SEO report because they can change tomorrow, and also because the client may be served by a different datacentre and see a completely different picture. Rankings are affected by many other factors too, such as the TLD of the search engine (compare what you see on Google.com with what you find on Google.co.uk) or the language of the interface. Besides, the choice of keywords used at the time of the primary SEO review can be faulty, because thorough, deep keyword research is a separate task and is often done after the review. And finally, most of the tools that check rankings violate the search engines’ TOS; those that use legitimate APIs often supply very inaccurate information if you compare their data with the results you actually see in the SERPs.

The review is done to discover (and hopefully, fix) the fundamental problems of every particular website and for this purpose the ranking report does nothing.

Comments Off on How I Make an Assessment of a Website for SEO

March 12, 2007

These days, if you would like to have a quality, professional-looking site, you can’t neglect the SEO factor. Being friendly to the search engines has become a must. But how do you properly distribute your SEO budget to make sure it won’t be wasted?

Time has changed the answer to this question. Advice on spending an SEO budget in 2007 is very different from what SEO experts might have told you in 2003 or even in 2005.

Invest in design

In the past, the search engines had to work on what was available to them. In search of web pages relevant to a query they often had to rank poorly coded and unfriendly sites highly in the SERPs, and invent different ways to bypass unfriendly design. They can still do it, but on the other hand they can afford to be much pickier now. Unless the query is really very obscure, the chances are there will be a lot of sites available to match it, and the engines can put the highest quality sites at the top.

That brings the importance of good quality design to the fore. I don’t claim to know for sure that the engines already give a boost to sites with valid HTML code or table-free layouts, and if they do, there is no way to tell how big that boost is. But it’s obvious that sites with friendly, static URLs are much easier to rank highly than those with obscure URLs carrying several parameters in the query string. Also, Google more often shows pages with dynamic URLs as Supplemental Results, which is not a good thing either.

So, here is the first piece of SEO advice for 2007: invest in design. It’s worth the money to hire a good web designer who knows how to build search engine friendly sites and can provide a search engine friendly CMS. It may also be worth spending a little money in advance on an independent SEO consultant to assess the previous work of the designer you are about to hire. In the long run this investment will pay for itself.

Invest in content

No matter how the rules of the SEO game change, one thing remains: the engines love good content. In most niches (except the most competitive ones) adding a page of properly optimised text with a matching title and meta tags, and incorporating it properly into the structure of a search engine friendly site brings new good rankings fairly soon. Besides, adding good educational content on a regular basis is the only way to build real natural links.

There are a lot of methods of link building, some of them appropriate, others questionable, and many others totally unethical. But the point is that regardless of the method, all links that are built cannot by definition be natural. They can look natural in some cases, but that does not make them natural.

All methods of link building that are known to search engine optimisers are known to search engine engineers as well. And no matter how closely those links resemble natural links, they will sooner or later show a detectable pattern and will be recognised as built links. What to do with this knowledge is for the engines to decide. Lately, they have been known to give a lot more authority to natural links than to links that have been built, which is only logical. Natural links are those that have been earned by a site through its being appreciated by users, and that is what the engines look for.

I’m not saying that it is necessarily a bad thing to build links – it depends entirely on the method. There are a lot of legitimate methods of promoting websites such as press releases and directory submissions. These links bring us visitors, and to some of them the engines still give certain weight. But they are not natural links.

Frankly, I can’t see any webmaster giving a natural link from a content page to a page that has purely business-related information (unless it is a review of a business). It’s a lot more logical to expect a link from within one article to another that was used as a source or has helped the writer to reinforce a point. But if it is a link to a business site, it seems more logical to place it on a resource page. But…

We all know that resource pages have been utterly devalued by both web surfers and the engines due to heavy abuse over the years with what is known as a “reciprocal link exchange”. As a result, Google de-listed most resource pages during 2006, and devalued links coming from most of the remaining pages of this type. Links from content pages are a lot more likely to get clicked and to pass link value to the destination page. This gives us another reason to invest our SEO money in good educational content. Thus we genuinely benefit from giving.

Invest in self-education

If you learn the basics of SEO, you won’t have to pay a consultant to develop titles and meta tags for every page of content you are about to upload. You will be able to do it yourself, along with proper internal linking, validating the code of the new page and other aspects.

You will be able to do directory submissions for your site, which is the best solution anyway, because nobody knows your site better than you do. You will be able to establish contacts with other practitioners in your field, which might bring you fresh opportunities for quality link building that an SEO would surely miss.

However, you need to know how to avoid bad neighbourhoods, and this requires special SEO knowledge that goes far beyond SEO basics. If you invest in developing this knowledge, your self-confidence will grow, and you will feel much better on the web.

Still have some money left?

Submit your site to Yahoo! Directory. Launch a small PPC campaign. If you have time, join one or more paid Business Networking sites, e.g. Ecademy, LinkedIn or Xing. They will give you still more opportunities to publicise your services, as well as some link building opportunities – but paid management teams will make sure you won’t be able to abuse them. Free networks too often get abused by spammers, scammers and other weird people.

Still have more? Invest in more content!

Comments Off on How to Spend Your SEO Budget

March 2, 2007

SEO copywriting is the core of the overall SEO process. If your site contains no copy, search engine spiders will have nothing to work with. If your copy is not properly written for SEO purposes, it will never bring you really good search engine rankings. Conversely, if your copy is written with only SE spiders in mind, it will repel your human visitors and detrimentally affect your business reputation.

How do we find a balance between an optimal keyword density and a good marketing pitch? How do we please spiders without displeasing humans? It already sounds complicated. You might already be thinking of ways to feed one variation of your page to search engines and another to your human audience – if you are, stop here! The point is to make the same copy equally attractive to both types of site visitors, not to deceive or abuse. We are here not to discuss spammy SEO methods.

An optimal keyword density is not 60% or 80% – that would turn your text into unreadable nonsense, and smart search engine algorithms would immediately see that it was artificially stuffed. They know how to tell natural language from unnatural, so the good news is that you won’t lose out by using your normal writing style when copywriting for search engines. Just keep your keywords in mind – it is easy to include them for the simple reason that they are relevant to your topic. SEO copywriting – conveniently enough – becomes your best friend when you wish to make your copy clear and descriptive for human readers.

What is an optimal keyword density? It is widely believed that it is somewhere between 3% and 4%, but that is not a rule. With directories, it is often higher, but that’s good, too. It is not the result of artificial manipulation – it just happens when you include titles and descriptions of all the listed web resources, which, provided the directory is categorised well, should be relevant to the same theme.

But directories are directories, and SEO copywriting is another thing. There is really no need to count the words or calculate the percentage of keywords. If you are not sure that it is optimal, put your keywords into the Google toolbar and click the highlighter to highlight your words in different colours. It helps a lot. With experience, you will soon see if your page is under-optimized – or worse, over-optimized – and make the necessary corrections.
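If you do want a ballpark figure rather than eyeballing the highlighter, a rough density check is easy to script. The sketch below is purely illustrative – the whole-word matching rule and the sample copy are my own assumptions, and real engines analyse text far more subtly than this:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return the percentage of words in `text` accounted for by
    whole-phrase occurrences of `phrase`.

    A naive count; treat the number as a rough sanity check only.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # count non-overlapping-agnostic whole-phrase matches
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

copy = "Our SEO copywriting service blends SEO copywriting with plain good writing."
print(round(keyword_density(copy, "SEO copywriting"), 1))
```

Two occurrences of a two-word phrase in an eleven-word sentence already give roughly 36% – far above the 3–4% range mentioned above, which shows how quickly repetition inflates density.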

SEO copywriting tips

Be subtle. Nobody but a very experienced SE optimizer should be able to guess at a glance which phrases you are targeting.

Be creative. Targeting words for SEO purposes doesn’t mean your copy should become poor. All the general rules of good copywriting still apply. Humans should enjoy reading your copy and easily grasp your ideas.

Be logical. Include your targeted key phrases in link anchor text to send your visitors to related pages, but do it for usability’s sake first and SEO purposes second. Never abuse the method; it looks highly unprofessional when abused.

Be descriptive. Optimize your title tags, headers and sub-headers. Include your key phrases in them – but, again, make sure you do not overuse them. The first rule of good-quality SEO is never to overuse or abuse.

Be expressive and friendly. Yes, spiders are insensitive to human emotions, but remember that your readers are more important to you, and people react emotionally. Then again, are spiders really so insensitive? Sometimes it looks as if they are as emotional as we are – rewarding our spider-friendliness with good search engine rankings and penalising us for being indifferent.

If you think of spiders as your adored pets, your SEO campaign will be successful and your SEO copywriting perfect and skilful.

Comments Off on A Few Secrets of SEO Copywriting

February 2, 2007

How many times have you looked hesitantly at a website you have just come across, wondering where to go next? Basically, if the navigation of the website is confusing, all other efforts directed at improving the usability factor will fail. On the web, navigation is the key to everything.

Websites that are hard to navigate are quickly abandoned by visitors.

How do you navigate without a map?

In real life, navigating without a map is a hopeless task. The virtual world of the WWW has many things in common with the real world; that’s why most websites also have maps.

The main purpose of a sitemap is to provide a clear and easy navigational option for website users who have failed to find what they were looking for using other means of browsing through a website. As soon as your site’s page count exceeds 15 or so pages, it’s time to add a sitemap.

Sometimes I see websites that look very usable and intuitive at first glance. A standard small menu at the top seems to provide clear, one-click access to all the important pages on the site. Assuming the site contains only 8 or 10 pages, I start clicking on the top menu links one by one, and only on the 6th page do I suddenly (and quite by chance) discover a lot of links pointing to important content pages and learn that the site, in fact, contains 60 pages or so. Yet just from looking at the home page, I would have no idea all these content pages existed.

Of course, you can’t link to everything from your home page. It would look cluttered and perhaps even spammy if you did. Having a sitemap is the best way to instantly give your visitors a basic idea of how large your site is, what its most important sections are and what kind of information they are likely to find if they take the time to explore it. Of course, when a site is large and has thousands of pages, you can only add the most important pages to the sitemap, but with a smaller site (up to 100 pages) there is no reason not to list every page on it.

It’s just as important to make the layout of the sitemap intuitive and scannable. No funny tricks that might make your visitor wonder what it’s all about. No grey text on a light-grey background. Links should look like links and stand out clearly, and the colours and the font sizes should ensure good readability. More important pages can be highlighted using larger fonts, and it’s a good idea to reflect the hierarchy of the site using indents.

To maximise the effect of having a sitemap, link to your sitemap from every other page of your website, and make this link easy to find.
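The hierarchy-with-indents idea is straightforward to generate automatically. The sketch below is only an illustration – the page titles, URLs and the nested-tuple structure are invented for the example:

```python
def render_sitemap(nodes, depth=0):
    """Render a nested page list as an indented HTML sitemap.

    Each node is a (title, url, children) tuple. Nested <ul> lists
    mirror the site hierarchy, so visitors can see at a glance how
    the site is organised.
    """
    pad = "  " * depth
    lines = [f"{pad}<ul>"]
    for title, url, children in nodes:
        lines.append(f'{pad}  <li><a href="{url}">{title}</a>')
        if children:
            # recurse one level deeper for sub-pages
            lines.append(render_sitemap(children, depth + 1))
        lines.append(f"{pad}  </li>")
    lines.append(f"{pad}</ul>")
    return "\n".join(lines)

pages = [
    ("Home", "/", []),
    ("Services", "/services/", [
        ("SEO Copywriting", "/services/copywriting.html", []),
        ("Directory Submissions", "/services/directories.html", []),
    ]),
    ("Contact", "/contact.html", []),
]
print(render_sitemap(pages))
```

Styling (larger fonts for important pages, indentation via CSS rather than plain nesting) can then be layered on top without touching the structure.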

Consistency

Consistent navigational patterns are easier for a user to get acquainted with. Different patterns on different pages can create a feeling of “running in circles”. On the other hand, different sections of a large site often require different navigational patterns, to reflect the specifics of each section. As a rule, visitors expect 100% consistency from the left-side and top navigation, but if you have additional navigational links in the right column of your site, a certain diversity is acceptable.

Breadcrumbs

Another standard way of optimising the navigation of websites (I mean optimising for users, though the search engines love it, too) is using so-called “breadcrumbs” – the short-cut links located at the top of the content part of every page and outlining the path back from the current page to the home page of the site. They ensure your visitors will never get “lost” on the website.

Unfortunately, so many websites neglect this very handy navigation element and make us wonder where we are.
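Because breadcrumbs simply retrace the path from the current page back to the home page, they can be derived directly from the URL. A minimal sketch – the `titles` mapping, the URLs and the markup conventions are all assumptions for the example, not a fixed standard:

```python
def breadcrumbs(path: str, titles: dict) -> str:
    """Build a breadcrumb trail for `path`, e.g.
    Home » Services » SEO Copywriting (current page left unlinked).

    `titles` maps each URL prefix to a human-readable label.
    """
    parts = [p for p in path.strip("/").split("/") if p]
    crumbs = ['<a href="/">Home</a>']
    prefix = ""
    for i, part in enumerate(parts):
        prefix += "/" + part
        label = titles.get(prefix, part)
        if i == len(parts) - 1:
            crumbs.append(label)  # current page: plain text
        else:
            crumbs.append(f'<a href="{prefix}/">{label}</a>')
    return " &raquo; ".join(crumbs)

titles = {"/services": "Services",
          "/services/copywriting": "SEO Copywriting"}
print(breadcrumbs("/services/copywriting", titles))
```

Leaving the last crumb unlinked is a common usability convention: it tells the visitor “you are here” rather than offering a link to the page they are already on.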

Categorisation

Articles, tips or resources should be properly categorised. Please include resources in fitting categories and provide several alternative options for navigation. Remember, if you create a mess inside your categories, it will annoy visitors.

Help me find my way home…

The home page should be linked to from every page of the site. The logo is traditionally linked to the home page, but please provide an alternative text link in the main navigation too, for better usability.

Page names

The page names should “speak” to your visitor. Ideally, it should be easy to tell what the page is about by simply looking at the URL of the page. A page name like 4867.html is not very helpful in this regard.

Often I see sites that just frame other sites, so whatever I click, the URL in the browser stays the same. Apart from being confusing, it makes it impossible to bookmark particular pages. That’s yet another reason not to use frames for regular sites.

Love and respect your visitor…

… and the visitor will love and respect you.

Comments Off on Confusing Navigation – the Worst Enemy of Web Usability

January 30, 2007

SEO is believed to be an easy profession. At first glance, it really is, which is probably why the number of various SEO specialists, SEO professionals and SEO experts is growing daily at a tremendous rate. Being a moderator on one of the major SEO forums, though, I know very well how many of these self-proclaimed experts offer their services first and come to forums asking the most basic questions later. For these people, the apparent simplicity of SEO works like a trap.

So, what actually makes a good SE optimiser these days? What knowledge and skills should an ideal SEO possess? What approach to the profession will ensure success?

SEO copywriting – what does it actually mean?

SEO copywriting is not stuffing good web copy with keywords until it becomes bad web copy, often to the point of being unreadable. SEO copywriting is copywriting first, and the SEO part second. This means you need to be fluent in the language of the website you are optimising. You also need to know how to be persuasive, which means being a psychologist, if only on an intuitive level. Only after you have achieved this can you start playing with keywords, but you must never, ever forget that your primary readers are humans, not the engines. So, be subtle when inserting keywords. They need to be prominent, but not annoying. And don’t forget about the titles and other meta information. They need a lot of attention, and certain language skills, too.

Of course, you can always outsource the copywriting part, or at least hire an editor who will take care of your grammar and style, but we are now talking about an ideal SE optimiser, i.e. a hypothetical person capable of handling the whole SEO process all by him/herself.

Directories will keep you busy

Submitting to directories is mostly a routine task, which requires a lot of patience; it suits a hard worker. But, apart from this, certain knowledge and experience is required to tell a good directory from a spammy one, and to craft good, attractive titles/descriptions for the site you are promoting. Of course, if you are acting as a full-scale SEO for this website, you already know everything about its business goals and niche by the time you get to directory submissions, but if you are hired to do submissions only, make sure you have read and understood the site’s content really well prior to starting. A short briefing with the client is a good idea, too.

The techie part

SEO involves a lot of design, programming and server administration issues, which you can’t afford to neglect. An ideal SEO has to be a high-class PHP, ASP and Java programmer, an experienced server administrator and a perfect HTML coder. In real life, it’s often enough to be able to explain the tasks (in depth) to the administrator/programmer, so you have to at least be fluent and convincing in their jargon, because they will often ask you: “Why do I have to do this?”, “Are you sure it’s so necessary?”, and say “Methinks you’re just making it up!”

How do you turn session IDs off for spiders? How do you implement the 301 redirect? What if the redirect has to be done using ASP on IIS? How do you use mod_rewrite? How do you make sure the HTML page with a Flash object embedded into it validates for the W3C standards and displays properly in all browsers? These are just a few questions out of many more you will have to find answers for if you want your SEO work to meet high standards of quality.
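By way of illustration, here is what two of those tasks might look like on an Apache server (a sketch only – the domain and file names are invented; the directives are the standard mod_alias and mod_rewrite ones):

```apache
# Permanent (301) redirect of a single moved page — mod_alias
Redirect permanent /old-page.html http://www.example.com/new-page.html

# mod_rewrite: consolidate the non-www host onto www, again with a 301,
# so engines see one canonical version of every URL
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The ASP-on-IIS equivalent uses `Response.Status = "301 Moved Permanently"` plus a `Location` header instead, which is exactly the kind of platform detail an SEO is expected to explain to the programmer.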

HTML itself presents a lot of problems, and a good optimiser is expected to be able to handle them all. A general code cleanup (removing unnecessary tags, moving JavaScripts and CSS styles to separate files, W3C validation and getting rid of unnecessary nested tables to reduce the code bloat) is a painstaking job; but according to the newest standards it’s not enough any more. If you wish to be respected by your fellow SEOs as a really high class specialist, you need to be able to handle CSS-controlled layouts created without a single <table> tag.

Been there. Done that. If you think it’s easy, I have to disappoint you: it is not. At least when it is your first time, be ready for a lot of sweat and tears, as well as a few moments of total desperation. Cross-browser compatibility is going to be the hardest part.

Marketing considerations

Offering pure old-fashioned SEO as a separate service is not the best idea these days. Your client needs to be able to estimate the ROI of the whole promotion campaign to make sure the start-up budget won’t be wasted while waiting for the first results of the SEO campaign. Expenses on PPC and other marketing efforts should be carefully planned from the start and combined with the SEO expenses, otherwise the business is at risk of being ruined right after being born. So, if you offer SEO, you need to be able to offer SEM, too.

What’s more, you need to be able to measure and analyse the results of the ongoing SEO/SEM efforts using various statistics packages and customised tracking software. You should be able to provide well-grounded usability advice aimed at improving conversion rates. You should be the best.

Marketing your own service will be a never-ending effort you need to be ready for. The Internet is crowded; the competition in the SEO industry has become staggering. You will have to use all the opportunities the Net offers for businesses, to continuously market yourself, and to look for more. Let yourself relax just for a day and you will probably find yourself unemployed, even if you are a well established practitioner.

Professional ethics

There is no such thing as a widely acknowledged professional code of ethics in the SEO industry. All efforts aimed at creating one have so far been in vain. But that doesn’t mean individual SEO practitioners and companies shouldn’t have codes of ethics of their own. A code of ethics is just as important for an SEO as for a representative of any other legitimate industry.

Apart from the importance of an ethical approach to SEO techniques as such (no blackhat techniques or strategies, no matter how tempted you might be), it’s just as important that you never rip your client off. Your services should be worth the money paid for them, so make sure you always estimate the outcome of your campaign correctly. Don’t overestimate. If you underestimate and over-deliver, it will create a much better impression, and your grateful clients will soon start referring their contacts and partners to you.

In regard to blackhat tactics, another issue arises. Even though you should stay away from them at all costs, you absolutely need to know what they are, and be able to spot them and evaluate potential consequences. When you start working on your client’s site, you are often not the first person who has tried to optimise it. Often, your actual SEO work starts with cleaning up the mess created by your predecessor, and like the rest of the work, you have to do this part well.

What else?

A good SEO has to stay on the cutting edge of the industry’s latest trends and changes. When you deal with the search engines and the Net as a whole, everything changes so fast that you have to continuously monitor forums and read newsletters and articles to stay in touch with your profession. There is no other way.

A good SEO has to be patient and diplomatic, to communicate well with clients/partners and keep the relationships mutually beneficial.

And last but not least, a good SEO has to be realistic. Remember, you can spend the rest of your life trying to get #1 in all engines for a tough term like “SEO” – and never get there. Don’t do this. Instead, concentrate on real goals, on something you can – and will – achieve.

Why the best SEOs work in teams

The number of skills an ideal SEO has to possess looks a bit overwhelming, doesn’t it? Actually, finding one person who can do all the things mentioned above (and some others) is close to impossible. That’s why most SEO shops, even small ones, usually consist of several people, not just one person wearing 15 or so hats. It seems reasonable. But as the industry develops and grows, we should expect SEO companies to drift towards specialisation and partnerships. While larger companies will still be able to handle the whole range of SEO-related tasks by themselves, smaller ones will have no choice but to become more and more niche-specific and to form partnerships that fill in the gaps by outsourcing the rest of the work. This way we’re all doing what we do best.