The YouMoz Blog

SEO - 10 Things On My Wish/Gripe List

This entry was written by one of our members and submitted to our YouMoz section. The author's views below are entirely his or her own and may not reflect the views of Moz.

This is my wish list from Google and the other search engines. Rankings should be based more on the content of the site and less on technical aspects (tags, domains, links). The theory is that, as SEO teams, we could devote more time to content, the good stuff that actually benefits the user.

1. Focus on Content. Focus exclusively on content. Best content wins, period. Once an algorithm can do that, it will corner the market. Perhaps a hive model rating system like Digg, but for search.

2. Ignore Keyword Phrases in the Domain. Stop rating domains based on keyword phrases. It just leads to stupid, boring, non-creative domains. Plus, people purchase 100 more domains because they think it helps (that is a separate education issue SEO teams must tackle, but it's related). Ugh. What a mess. Let's try a system where domains don't matter, so people can choose them for branding purposes alone.

3. Ignore Keyword Phrases in the Page Name. Stop rating pages based on keyword phrases in the page name. Ditto, only it's worse with super long URLs. Let’s keep things short, not as short and ugly as TinyURLs, but short nevertheless.

4. Stop Counting Links. Stop counting links entirely, or simply discount them and put more emphasis on content. Oh Noes, the sky is falling. Yes, I FULLY realize this is not possible and the basis of the PageRank algorithm. However, wouldn't it be cool if truly the best content was ranked No. 1 every time? It would probably put a lot of link teams out of business and put more content writers in business. Wow, that would be a huge shift, but something to think about. Of course, with communities like Digg, Twitter and others that simply do rate things based on content & ideas, it’s going to happen naturally.

5. Conversion Tools. Create tools inside Webmaster Tools that make it super easy to track SEO conversions, all the way to purchase or inquiry, alongside rankings. Yes, there is the new Google "conversion" filter, but it's not quite the same. The same people spend money with you anyway; make 'em happy and they will spend more.

6. Google Keyword Suggestion Tool. Allow the Google keyword tool to show all data, even if a phrase only has a few queries. I hate seeing zeros when I know a phrase gets a few real queries. Just tell me the real data, even if it's 4 people all year – that is good data to know.

7. Anchor Text Rating. Stop anchor text rating. Again, it's just painful to read good copy only to have it ruined by a stupid SEO anchor text link put in the middle of a paragraph. Again, a focus on better copy.

8. Choose Your Own Filters. Let searchers choose what they want results rated by (e.g., title, meta, anchor, and other tags). That is an odd idea, but it might work: pick your own rating system and decide which factors should carry more weight.

9. Ignore the Title Tag. Ignore the title tag. OMG, did I just say that? Yes. Ignore it, as it's just as abused as the meta keywords tag at this point. This will allow designers to put company names and nice, short, friendly page names back in, to actually help the user.

10. Roller Coaster. Somehow stop the roller coaster of rankings. It's just hell to see a client go from 1 to 10 to 5 to 1 in a few months, or even days. It's really fun for client relations. This is more of an off-topic rant, as ultimately it's up to the SEO team to keep rankings consistently high, but sometimes that is simply not possible. Perhaps setting client expectations is a better route.

About peteboyd —
Peter Boyd is General Guru of PaperStreet Web Design which does web, print and marketing for law firms and professional organizations.

66 Comments

I actually think many of the items on your list exist for very good reasons, and keeping them is better than not. IMHO.

I thought I should elaborate…

1. Focus on Content – Content can be manipulated because it originates from the site owner. You probably need an offsite factor to counterbalance onsite-only factors – like, say, links given naturally by one website to another. If all the search engines had to go on was content, the web would be stuck back in 2000, when scrapers were stealing content, slightly modifying it for keyword density, and then calling it original.

2. Ignore Keyword Phrases in the Domain – I can kind of see your point here, but I do believe that there is true value in giving ranking weight to “Visible” elements such as URLs and Title tags.

3. Ignore Keyword Phrases in the Page Name – See # 2.

4. Stop Counting Links – See # 1.

5. Conversion Tools – I am not a certified Google Analytics consultant or anything, but I think they offer this service for free and it is labeled “Goals”.

Yes, our main goal is to focus site owners on putting out quality content, ideas and information. If that happens, then usually links and everything else will fall in line. A lot of the gripe list would fall away. :)

One main point is that overall, a lot of day to day SEO tasks are focused on fixing a lot of errors by site owners, versus creating content, ideas and information.

Of course, that keeps us busy. But it's sometimes tedious work compared to creating innovative ideas.

I'm glad I saved you a little typing. Now you have a few extra minutes in your day for link building, keyword research, quality content creation and maybe some other little thing that can make the web a better place. :)

1. I would recommend that Google think long and hard about adopting the SEOmoz Webfluence Algorithm. I have not seen a search engine algorithm with a more dependable set of criteria for matching a human searcher's query with extremely targeted results.

2. Cool, cool.

3. Cool, cool.

4. See # 1.

5. I think that is why they still keep a paid version of Urchin around.

It's definitely part of SEO. If you are not recommending content changes and additions, then you are not doing your job as an SEO consultant. Although day-to-day duties vary by team member at each company, it's definitely part of SEO for a client.

It always bugs me when I hear this kind of thing - ever heard of the concept of "link bait," Darren? In my experience, solid content yields solid links. You can't expect an authoritative website with high PR to link to your site if it has useless, crappy content. Granted, in some instances, good content is not required for a good link - but I agree; updating/adding content is a daily part of SEO. If you don't practice this, I'd say that you're doing your client a disservice.

An "Edible Arrangements" delivery from Google pre-penalty would be lovely. Too bad Google loves being mysterious.

I'm not saying that good content won't help rankings--I'm simply saying that "content creation" is different from "content optimization."

The word "optimization" implies that something already exists.

What if someone came to you with nothing but a domain name and asked you to optimize it for search engines? Would every step from there forward be considered SEO? Hosting, server configuration, installing a CMS, building a framework, writing content, etc., etc., etc...

It just sounds like your definition of SEO is from an in-house SEO's perspective, whereas mine is from a consultant's perspective. If you are working on your company's site or your own site, then yes... you're going to create content to drive traffic.

But what about this example:

Someone comes to me for SEO consulting and wants me to optimize their company's website. Their company sells specialized medical probes that cost tens of thousands of dollars. Their keywords are things like "high energy gamma probe" and "medical nuclear imaging device." Their main competitors are major universities posting technical reports and the results of medical experiments.

If "creating better content" is part of SEO and I offer professional SEO services, then I should be able to produce content about high energy gamma probes.

My point is... content creation is not a service that website owners should expect an SEO consultant to provide.

As for my own SEO services, I charge a flat hourly rate and do whatever needs to be done, starting with the tasks that I think will yield the greatest results (unless the client asks for something specific).

the line, if there is one, is quite grey, and when it really comes down to it, it's all about people's perception and semantics when it comes to the definition.

when building a website from scratch, a lot of people say they want it built "with SEO in mind" [those that have heard about it]. the practices done with this terminology then would be optimizing the title tags to be SEO friendly, meta tags, checking structure and navigation, checking content, etc.

the end result is a website that is optimized - though it was optimized during creation.

or, you can take a website that has been up for a year, that was not thought about from a user-friendly / se-friendly point of view, and optimize it to bring it up to where it should be. since user-friendliness and se-friendliness kinda go hand in hand, the idea to check and update / optimize the content is just standard, imo.

i still see, though in decreasing amounts, a lot of websites developed by the ignorant [or by strict developers who don't want to think about what happens with the website after it is launched...] that incorporate poor tactics with page titles, lack of metas, poor nav flow, etc. but like i said, it's decreasing.

now that the word is out about having your site be search engine friendly, a lot of people are researching, and a lot of people are hiring SEO's, or trying to implement the techniques themselves. i think with this epiphanic movement, you will find that search engine optimization becomes a standardized service with website development [though i still think there will always be sites that aren't friendly to se's, and since things change and the algo's change, etc. there will always be new things for the pro's to implement to stay on top of it all with...]

end point - content still needs to be well written and relevant, as that is the primary source for gaining links. unless of course you are able to implement new applications, ideas, etc. in some other form that will attract links, visitors, etc. but this becomes a semantic thing again, where people would consider those items to be content anyway [visual, at least]. it still needs to be reviewed and is part of the SEO / SEM process.

Unfortunately, everything on your wish list either already exists or would negatively impact the quality of search results.

You see... Google has a "cloud" of computers that can process extremely large sets of data, and every time they make decisions like "let's use the content of titles and inbound anchor text," they can run trillions of calculations to see exactly what effect that decision would have on the quality of their search results.

In other words, Google's ranking algorithms are backed by real data analysis. You're wishing for things that have already been proven to be inferior.

P.S. I have a strong suspicion that this entire post is intentional bullshit/flame bait. I find it hard to believe that someone who writes so meticulously would sincerely believe any of this. Plus, item #10 is a gripe about the ranking "roller coaster," yet your profile says that's your favorite thing about SEO?

Link bait or flame, not really. Just a top 10 list of ideas. Scary ideas.

Google wants to point people to the most relevant article (and make a buck from that). It just happens we are stuck with some algorithm parameters that are affecting the usability / readability of the web.

Architecturally speaking (tags, links, anchor text copy), some elements could be improved. Would that lower search quality? Perhaps. However, the search engines' job is to invent a better mousetrap.

I firmly believe that from a user standpoint, keyword phrases in the domain suck, as do keyword phrases in the URL. So do keyword phrases stuffed inside web copy. Shorter, more memorable domains and URLs are always better; but from an SEO perspective we all know keyword phrases rock in the URL, so we use them.

I do believe you skipped over the fact that I state it's a wish list: "Yes, I FULLY realize this is not possible and the basis of the PageRank algorithm."

So yes, links are here to stay for a while. However, the web-influence factor of social networks is bound to start affecting rankings more. Of course, those are links in and of themselves.

It just happens we are stuck with some algorithm parameters that are affecting the usability / readability of the web.

I think there are examples of this that could be argued fairly well, but "keywords in domain names" and "keywords in URLs" are certainly not among them. When I look at the URL of this post, I see the following:

Well put. The majority of users prefer to see keywords related to their search query in the domains of websites in the search results – why? Because the domain name is seen as more relevant to their query.

If you search for "SEO Tutorials" then most likely a domain name like seotutorials.com will definitely stand out to you as being potentially very relevant to the search query "SEO Tutorials".

Now, I totally agree that branding and domain trust can go a long way, because I am certainly not discounting the fact that a strong SEO brand like SEOmoz or SEO Book will stand out to the searcher as well (if the searcher is familiar with SEO) because of its strong brand.

All that to say, searchers see domain names that match exactly to their search queries as potentially being very authoritative (i.e. why else would there even be type-in traffic if not for this fact) and searchers also see trusted brands as potentially being very authoritative as well.

A winning strategy is some type of combination of both of these elements if possible or even a very strong amount of either of these 2 elements.

I agree with Realicity that maybe this piece is just flamebait, given some of the rather absurd things wished for, but either way it was a well-written piece and opened up some fun discussion – even though Realicity is taking the words right out of my mouth :) I just couldn't resist commenting on the keyword domain issue. :)

PS Sometimes the term "keywords" gets mentioned as if it were a dirty, spammy word, and if you have a site with (Oh No) keywords in the domain name, then you are associated with running a botnet or a massive group of splogs or who knows what else. But in reality, as SEOs, we almost forget: what exactly is a keyword? It is a search query. It is exactly what a searcher wants. So if a keyword (aka search query) is exactly what a user wants, and the name of your website (aka domain name) is arguably the best way to tell what your site is about, then potentially the best website to visit, from a searcher's perspective, is one named after exactly what they are searching for. Simple, I know. :)

Yes, there are definitely examples of abuse of the URL and page name. Of course there are abuses of every SEO tactic, which is why the meta keywords tag is no longer used: it's invisible to the end user (unlike the title, URL, and page name).

SEOMoz Example

SEOmoz, being an SEO team, obviously does a good job of keeping things clean, with a nice short domain name and short page names. So they are a good example.

Usability

From a usability standpoint, I disagree that people remember long page names. A long page name would be better as just /seo-10 or /seo-wish/, or just the article number. Shorter is always better for usability. For URLs, it is sometimes good to have nice short keyword URLs, even for usability (although that limits the brand). There are plenty of sites whose URLs are long and hard to remember.
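The short-page-name idea above can be sketched as a small slug helper. This is purely illustrative – the function name, the stop-word list, and the two-word cutoff are my own assumptions, not any published convention:

```python
import re

def short_slug(title, max_words=2):
    """Reduce a long headline to a short, memorable URL slug.

    Lowercases the title, strips punctuation, drops common stop words
    (hypothetical list), and keeps only the first few meaningful words.
    """
    stop = {"a", "an", "the", "on", "my", "of", "and", "things"}
    words = [w for w in re.findall(r"[a-z0-9]+", title.lower()) if w not in stop]
    return "/" + "-".join(words[:max_words]) + "/"

print(short_slug("SEO - 10 Things On My Wish/Gripe List"))  # → /seo-10/
```

The output matches the /seo-10 style suggested above: short enough to remember, while still carrying a hint of the topic.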

Theory vs. Practice

Practically speaking, I completely understand why keyword phrases are in the domain. It's a good indicator of what the site is about. However, for a brand, it leads to issues.

You don't see SEOmoz calling their website seoteam.com, or Nike making their domain runningshoes.com. However, if keyword phrases in the domain matter, then sites without keywords are in essence penalized: they have to make up the points in other areas. Of course they do that, and do it well, but again it's a factor to consider. I would just rather focus on the content of the website.

Examples

I should have included bad examples in my article. That would have helped show the theory.

A lot of these complaints/gripes are from a user's perspective. People who spam the Search Engines are using techniques that exploit the link counting, the keywords in the page name/domain, etc.

But in the end, this is what we, as SEOites, have to stay on top of. We have to separate black hat and white hat. We have to weigh the grey areas and learn just how to rank well. We need to learn how the search engines rank pages so we can get our own sites ranking well. We are at the whim of the almighty search engines (mostly Google).

In an ideal world, great content is all it takes to rank well, but in reality it doesn't always happen that way. If you have great content, people will scrape it, or you might get beaten by someone spending a lot of money buying links. I find that great, useful content is a lot cheaper than hiring SEO consultants and running a full-blown link building campaign. So from a financial perspective, it is better to produce good, useful content. It will get linked to naturally because it's so awesome and useful.

I like this post, though I disagree with some of the viewpoints. I do agree that links should be cleared out of the algorithm – maybe not entirely, but somewhat. There is just too much manipulation there, even if Google claims to sort good and bad links correctly. Most of the points mentioned should not be removed, though, especially the title tag. It's important to mention that these factors are typically not looked at on an individual basis, but rather together as a whole to determine relevancy, ranking, authority, etc. I don't believe the SEs look at title tags solely. They look at title tags, anchor text, links, and so on together, so removing one would mess it all up. If they looked at title tags as an individual gauge of relevancy, then I would say throw that factor out the window.

Personally, I think Google and most search engines should base much of their algorithms on user experience and behavior on the site: TOSAS (time on site), pageviews, bounces/bounce rate. The user behavior stats should feed directly back to the keyword/query to determine relevancy. If a particular keyword brings visitors to your site with a low bounce rate and high TOSAS, then you're rewarded. This, of course, is subject to manipulation, just like anything else.

I know behavior now counts somewhat, but very little and I think it is an important factor to consider.
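The behavior-signal idea above can be sketched as a toy scoring function. Everything here is hypothetical – the function names, the 50/50 weighting, and the 10-minute cap are my own assumptions for illustration, not anything any search engine has published:

```python
def behavior_score(bounce_rate, avg_time_on_site, max_time=600.0):
    """Combine two engagement signals into a 0-1 relevance score.

    bounce_rate: fraction of single-page visits (0.0 to 1.0).
    avg_time_on_site: mean seconds a visitor spends (the "TOSAS" signal).
    max_time: cap so a few outliers don't dominate (assumed 10 minutes).
    """
    engagement = min(avg_time_on_site, max_time) / max_time
    return 0.5 * (1.0 - bounce_rate) + 0.5 * engagement

def rank_pages(pages):
    """Order candidate pages for a query by their behavior score."""
    return sorted(pages,
                  key=lambda p: behavior_score(p["bounce"], p["tos"]),
                  reverse=True)

pages = [
    {"url": "/sticky-article", "bounce": 0.20, "tos": 300.0},
    {"url": "/thin-page",      "bounce": 0.90, "tos": 15.0},
]
print(rank_pages(pages)[0]["url"])  # → /sticky-article
```

As the paragraph above notes, a signal like this is just as gameable as any other, which is presumably why behavior counts for little today.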

You can end poverty by inventing the replicator. At least that is how they did it in Star Trek.

The ideas put forth would cause the entire current SEO rankings game to change; sure, that is the goal of the article: put forth some ideas for discussion.

Will it work practically? No, of course not. The article stated that it's a wish list and we know it would not work. It would cause rankings to change, probably for the worse.

The real issue is that as SEO teams we are too often focused on technical aspects and what pleases Google. This diverts attention from creating better content for the end user.

That is what started the article: I was reviewing a technical report on a poorly designed site that nevertheless had great content – content that should have ranked it higher.

We have all seen sites that should rank high but, for technical reasons, do not. Those are great clients, as we fix something and poof, it ranks high. SEO helps in removing the technical obstacles, but on a theoretical level, should those obstacles be in place at all?

We have all seen crappy content sites rank really high too. It ruins the search experience, and is what some algorithms are purposely set to curtail.

Why should we care about long page names in the URL, other than to please Google? Why should we care about keyword phrases in the domain? Why should the title tag be full of keyword phrases, instead of just the site name and page headline? Why should we have to link build at all? Why should anchor text matter in the middle of a link / paragraph?

Yes, all those help Google detect a page, but do they help the user? If something does not help the user quickly access information on a site, then it's an obstacle that should be removed. Some of the items on the list are obstacles at this point.

I was just pointing out things that we probably all dread doing, but have to do, to please search. If the only reason we are doing them is to please search, then that raises the question of why they should be there at all.

Bottom line, we all do those tactics because we know they work and that diverts attention / time / money / resources from producing better ideas and content.

Will it work practically? No, of course not. The article stated that it's a wish list and we know it would not work. It would cause rankings to change, probably for the worse.

So you're "putting forth" ideas that you know won't work, hoping that it will stir up some kind of intelligent discussion, and you wish Google would return less-relevant results because it would make your life easier.

The real issue is that as SEO teams we are too often focused on technical aspects and what pleases Google. This diverts attention from creating better content for the end user.

Um... no. "Creating better content" is not SEO. You're basically saying that SEO teams are too often focused on doing their job, which diverts their attention away from doing the copywriter's job. If you're not a fan of technical details... you might want to consider a career change.

Yes, all those help Google detect a page, but do they help the user?

Yes! What you don't seem to understand is that Google programs their search engine to return results that satisfy users. Why does Google prefer pages with unique, relevant, descriptive titles? Because USERS prefer pages with unique, relevant, descriptive titles. Writing unique, relevant, descriptive titles isn't diverting attention away from creating good content--it IS the creation of good content.

Here's my final attempt to make you understand why your post is illogical garbage. If you still don't get it, then please change professions because you're making the rest of us look bad.

Your logic:

1.) Google uses signal xyz to rank pages.

2.) Some websites abuse signal xyz.

3.) peteboyd doesn't think signal xyz is important.

Therefore, Google shouldn't use signal xyz to rank pages.

Reality:

1.) According to the data Google has collected from hundreds of millions of users around the World, most web pages with signal xyz tend to be higher/lower quality.

2.) Google adds signal xyz into their ranking algorithms, observes the results, and verifies that overall, using signal xyz improved the quality of their search results.


Google Algorithm

I said the best content should win. By best, I mean most relevant and thorough. That is Google's goal too, to point users to the most relevant content. How Google accomplishes that is up to them. Right now, they use page names, keyword phrases in domains, and links as some of the signals.

However, if Google's algorithm were 100% accurate with all its signals, SEOmoz would be rated higher. They have some of the best content available online, yet still lag at #9 for "search engine optimization". There are countless examples in each industry. So perhaps those signals need to change.

That is the point of the article, that the best content should win and too often the sites with the best content lag behind sites that use pure link campaigns and tags to rank higher.

To dumb down my explanation any further, I would need some colorful construction paper, paste, macaroni, glitter, and a clown with superb balloon animal skillz. Unfortunately, I don't have any macaroni... so I'm just going to step aside and allow you to continue on your journey.

Have you considered building your own search engine? You certainly have a knack for this sort of thing. Who knows, maybe you'll build a Google-killer!

"The article stated that its a wish list and we know it would not work. It would cause rankings to change, probably for the worse."

You make no sense here at all!! So, let me get this right: based on your above statement, you have a list of things you wish for, but you have a strong hunch it would detrimentally impact the quality of search engine results?

Why would you deliberately wish for something if you thought it was going to make things worse?

"The real issue is that as SEO teams we are too often focused on technical aspects and what pleases Google."

I'll let you in on a secret. We focus on what pleases Google because Google focuses on what people want. That's why it's the #1 search engine.

I honestly can't believe that after all the comments have weighed in and provided clear evidence of Google using tried and tested theories, you STILL try and defend your theory.

@Darren....Just realized my comments duplicate one of your replies. Sorry. Bonus thumbs up for getting there before me.

Theory articles can spark debate, and make you rethink factors in search and why you are doing X.

Offer a study that specifically shows users like link building campaigns, keyword-filled title tags, pages stuffed with odd anchor text, long keyword-phrase domains, and long page URLs full of keyword phrases. To date it has only been stated, not backed.

"Theory articles can spark debate, and make you rethink factors in search and why you are doing X"

Quite! This post has sparked debate, but you don't seem to be rethinking your stance in spite of numerous people weighing in and disagreeing with you. Maybe it's you who needs to re-evaluate your stance?

I am definitely willing to re-evaluate. For the most part, though, it's just people stating the obvious:

"Google search works now and your ideas would cause it to break."

Yes, of course. But the factors on the wish list have been abused. The meta keywords tag was devalued once it was abused.

Sure, Google tries to curtail the abuse. But for a real-world example: why should a site such as SEOmoz, with possibly the best content on SEO, be relegated to a lower overall standing in search for its core term?

If the algorithm rewards other sites for keyword phrases in the domain and massive link building campaigns, then something in the algorithm can change. This is just one industry example.

5. Conversion Tools - I would love to see the Conversion Filter and Goals integrated by default right into the Google Analytics code we insert on all pages. Yes, you can do this manually, but why not make it super simple?

6. Google Keyword Tool - I think most people agree that more data is better.

7. Anchor Text - Again cut down on the abuse.

8. Filters - Just a wild idea. Personalized search came about and basically allows this now.

9. Title Tag - Again, cut down on abuse of the tag. Title tags on YouMoz are so clear and easy, yet on some sites they are super long and definitely not user friendly.

10. Roller Coaster - This is again a wild idea; but I think we have all had those days.

I'm sorry Pete, but I still don't agree in the slightest and don't think any of your suggestions are practical or indeed suggestive of a superior algorithm.

"4. Links - Obviously this has to stay."

I firmly believe in the democratic power of links. They act as the single largest indicator of popularity, and popularity is expressed when sites are relevant: relevant sites are rewarded with more inbound links. In turn, Google values this factor as the single strongest measure of relevance available.
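The "democratic power of links" defended above is the intuition behind PageRank. A toy power-iteration sketch – the damping factor and iteration count are conventional textbook choices, not Google's actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank over {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}  # teleport share
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:  # each outlink passes an equal share of p's rank
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Three pages: A and C both link to B; B links back to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
print(max(ranks, key=ranks.get))  # → B, the most linked-to page
```

The page with the most inbound "votes" wins, which is exactly the popularity-as-relevance argument made above.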

That is cool, let me try to explain further and not dig my relatively new SEOMoz posting grave too deep. :)

No, links are good. The board wins.

A question, though: if we are recommending a course of action specifically to increase search rankings, is that task always helping the client's brand?

Let's take domains for example, which is better for SEO?

WidgetLawyer.com (practice area only)

NewYorkWidgetLawyer.com (practice area + geographic area)

Marbury.com (firm name)

I deal with a lot of law firms daily, so excuse the law references.

If all domains are new registrations, then from a local search perspective, newyorkwidgetlawyer.com is the way to go. From a national search perspective, probably widgetlawyer.com is the way to go.

However, from a brand perspective, perhaps marbury.com. It allows them to not just be a widget lawyer or a New York widget lawyer: they can practice any type of law and brand their name in all collateral, which helps with recognition of the firm's members. Law firms are sometimes hard to brand in one area, as they offer a lot of different services – but so do a lot of major companies.

If we recommend picking the firm name as the domain, it will not help their search ranking (won't hurt, but won't help). So we have to make a decision between SEO and the brand.

If the algorithm did not count, or value domains less, then the choice is a bit easier and most people would lean towards just using the firm name.

The point of the post is that, at times, SEO is at odds with branding. I found that to be the case in choosing domains.

i think the optimal way to attack your domain issue would be to purchase the domain marbury.com [from your example] for branding purposes, and purchase the others [if available, and if they service nationally as well] to target the keywords.

targeting the keyword domain isn't absolutely necessary, however. i would recommend going that route if micro-site and landing pages are in your marketing plan, as it's a great way to target specifics and track your action prompts.

i'm glad that i read through all the posts because yes, i think that the links are most definitely important, and i'm glad that you finally agree.

the idea behind link acquisition is to go out and request links from authoritative websites that are, potentially, in your industry and relevant to your website. granted, you can gain links from a plethora of websites that have nothing to do with you, but it's the process of acquiring those links, building those relationships, etc. that makes link building worthwhile. and those that link to you, as i'm sure you're well aware, show google that they trust and believe in you, your website, and your products/services/information.

obviously there is always room for spam, illegitimacy, etc. but that is why google is constantly changing their algo's to stay up to date and counter it. hence why they devalued meta keywords.

i think your post has definitely garnered much response, both friendly and otherwise... and though i don't know why you'd wish for such things, it was interesting to follow through on.

Solid idea. Yes, we have two tactics depending on the budget of the client.

Low-cost domain tactic: Purchase both the branded domain (marbury.com) and the keyword-phrase domain (widgetlawyer.com). Do a 301 redirect of the branded domain to the keyword-phrase domain. That way the firm can use the branded domain all they want on print material / email, and the site still appears when it's typed in (although with a new name in the URL). Search engines index only the keyword-phrase domain as the dominant domain, and the client benefits from the keywords in the domain. However, clients have to be talked into this tactic, as it goes against the norm. It may still alienate some members of the firm or certain practice groups, since it technically brands them as widgetlawyer.com online if visitors notice the URL. That scares off some people at organizations who do non-widget law.

High-cost domain tactic: Yes, setting up niche sites works. Of course, that costs more time/money. It might not be right for clients whose total initial internet marketing budget is $5,000 to $10,000 (including developing the website, content, maintenance, SEO and everything else for their web presence). We deal with smaller organizations that have smaller budgets. So now you are dividing your time / money across sites. It also forces some of your best content to be off-site and causes you to do double the work in getting two sites ranked high. So that has pros / cons. It's a good strategy if you have a higher budget.

In the Abstract: Realistically, though, if we are doing all that just to compete, it takes time away from producing better, more informative content for the main site, especially for small-budget firms.

That is what the wish/gripe list was about: anything that takes away from producing relevant content (articles, studies, ideas, graphics, blogs, etc.) is what I was trying to discuss. The topics seemed to be attacked as if they were what I recommend to improve search algorithms, which was never the intent.

Anyway, it's been an interesting first post, to say the least. I should have provided specific trade-off examples (monetary, branding and time) for each topic; that would have helped explain the gripes and the reasoning behind them.

Well, I think realistically you can produce great concepts, great designs, and great content on small budgets...

I mean, why should the little guys suffer from a lack of quality just because they don't have a lot of money?

Our firm works with small-budget clients all the time - budgets that average $500 - $2,500 per project [people think that we give ourselves away, but it's really all just part of our business model / plan]. We give them the same quality end product and the same customer service they would get going to a firm that would charge them 5-10 times as much as we do. The result is that they are able to compete and stay alive, just like you and me and everyone else in this seemingly rough economy.

There seems to be a common theme here. You say that the best content should win. If you're good at your job and know your clients, you can develop end results for them in an appropriate amount of time, and at a decent cost, that will benefit both them and yourself.

And as a side note, it doesn't take much to set up a micro-site that will help them both in their SEO / SEM campaigns and in their budget.

If you are working with small client budgets and are going to try to capitalize on the acquisition of two domains [basically creating two websites, one being a micro-site], then why would you stretch yourself thin and try to make up the slack in the end?

"It also forces some of your best content to be off-site and causes you to have to do double work in getting two sites ranked high."

Does this mean that you're going to create sub-par websites, content, and ideas, all because they can't dish the green?

To me, this seems a bit unfair... and perhaps a situation in which you should refer them to someone else who will give them something of quality at a lower rate. That, Mr. peteboyd, is called integrity. And if you reserve what you suggest in these high-cost domain tactics only for those who can afford them, you have compromised yours, it seems.

It's a business of making money, but also a business of providing a quality service to those soliciting your services.

Yeah... and then they should add a feature that lets us request re-inclusion in Google's search results, after we have fixed whatever caused the penalty.

And if there's still time... perhaps they can invent a way for us to submit a whole bunch of URLs at the same time. I have a 100,000 page website, and my fingers hurt from typing in all those URLs one at a time.
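For what it's worth, XML sitemaps already cover the bulk-URL case the comment above jokes about. A minimal sketch of generating one in Python (the URLs here are made up purely for illustration):

```python
# Minimal XML sitemap generator -- a sketch; the URLs are hypothetical.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return an XML sitemap string for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)  # root element per sitemaps.org
    for page in urls:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")      # each page gets a <loc> entry
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["http://example.com/", "http://example.com/about"]))
```

A real deployment would typically add <lastmod> entries and point the search engines at the file via Webmaster Tools or robots.txt, but the point stands: nobody types 100,000 URLs in by hand.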

This should not exist, for one main reason: if you get a penalty, you are doing something wrong/deceptive. You shouldn't get a warning for that. If you do everything right, then you have nothing to worry about, so there's no need for a warning, right?

But what is "right" and "wrong" as deemed by the search engines is constantly changing. What was "right" six years ago is blatant spam and considered black hat now. We are in an industry that is constantly evolving, and as such the "industry standard" doesn't stay a standard forever.

So for site owners who a) can't afford a trained SEO to do their optimization or b) are just learning or getting started with SEO, it would be good to provide a warning or, at the very least, a reason for the penalty. I truly believe there are people who get penalized who aren't spammers trying to game the system... they may just not know all the ins and outs yet.

I would disagree with that. What you could get away with six months ago, you may not be able to get away with now, but that doesn't mean it was ever right. I am struggling to think of any examples where something actively promoted by the search engines as good for your website is now detrimental, rather than just being ignored, as usually happens when something becomes obsolete.

I had a think about this. If I were manipulating the results in a way Google doesn't appreciate and they warned me about it, I would probably then have a chance to find a way around it. I believe penalties are a bit like speed radars... you don't know quite where they are, but you know that if you drive at the speed limit you're safe.

I agree 110%. The main question would be how Google would rank the sites then. Do you just add as much content as possible about your topic? Yeah, links are a pain in the butt. I have a feeling that soon Google will only be counting links that are highly relevant.

"It's only theory. But theory eventually leads to a better mouse trap."

Theory can be defined as a coherent group of general propositions.

You've stated some general propositions, but they certainly aren't coherent.

Personally, I would categorize this post as a collection of mostly bad and uninformed ideas that will have no positive impact on building a better mousetrap.

One specific point I would like to make relates to your opening suggestion of "Best content wins - period". First of all, best anything is highly subjective, so who makes that decision?

You suggest a hive model similar to Digg. So, in essence, let's remove the collective brainpower of an algorithm developed by thousands of unbelievably brilliant PhDs and replace it with a bunch of mostly 20-something goofballs, because that will be a much better indicator of the best content.