Blog Comments & Posts

I wanted to establish whether external links from Twitter hold any weight. It is well known that Twitter uses rel="nofollow" on all external links, but do the search engines still cache these and consider them a vote?

The "be sure it's links" section is golden, a lot of webmasters are very quick to point the blame at external factors than accept their sites shortfalls (weak content, poor design, clumsy UX).

I think point 4 is the only really productive solution. Chasing your tail trying to figure out why Google doesn't like your link portfolio is never going to drive a website forward. Instead, if webmasters think about the long game and keep pushing for highly authoritative, relevant natural links, then they are only ever going to succeed. However, I appreciate the argument that some websites simply can't afford to wait for these things to iron themselves out.

I love the proposal too. As an added benefit for Google/Bing, this feature would provide a really strong spam signal direct from webmasters (i.e. if a large volume of sites are including 'disconnect: www.badsite.com', then it would be a pretty clear signal that badsite.com is dangerous territory).

I have to agree. I'm sure plenty joined for the exact same reason (I sure did). We can see now, as the number of advertisements between songs grows, that the number of users is declining.

Whilst the original post mentioned the Facebook integration, you seem to miss its negative effect on Spotify within its own community. I see it being much like how Netflix saw a dip when they introduced a higher base price.

Whilst getting the launch of a product right is a great help in creating success (and Spotify did this), it is the continuation of that success which really matters.

People's tendency to be unnerved by technology is inversely proportional to their understanding of it. The more I learn about retargeting, the less I mind it, but to the uninitiated it can feel like someone's been rummaging through your bins.

I'd agree completely with your closing statement but lay stress on the first clause -

"Executed in the right way, it could be a very powerful technique for creating high performing, high converting websites."

For one search term (blue tomatoes) your competitor could buy a link and get away with it but if he’s buying links across all of his major terms then there’s a higher chance that his link profile is going to include a known link broker - and that's when he gets into trouble.

I love doing competitor analysis, as it can open your eyes to new opportunities, but the issue I have at the moment is that I have found a competitor walking a very fine line between paid links and good PR.

e.g. giving products in return for a paragraph full of anchor text, targeting charities.

I am trying to establish whether this is bad SEO (whether Google could take action or not), because it's working fantastically for them.

Congratulations, that is brilliant news, and having twins must be extra special, if just that little bit scary.

In the run-up to December it will get very exciting and tiring at the same time.

I know what's to come: my wife is due on the 19th of October, right smack in the middle of the SEOmoz London conference. Hence the reason I am not going. :) :(

I wish you all the best. I would love to see a guest blog from you once in a while, just so you can give all the mozzers an update on your two bundles of joy / horrors (you know you are going to have those days).

You would be surprised how much a user would like to contribute if they know the reason WHY they are spending their time doing so.

Reviews are a logical UGC method if your site sells a service. We recently had a drive to get our users involved in reviewing products, and within 3 months we generated 10,000 reviews with over 500,000 words of unique, awesome content, which is overwhelming.

Why I think it was successful...

Clear messaging of:

- What we want and why

- What is the benefit to them

- What is the benefit to others

And making it easy to do:

- No long login forms

- Obvious usability (big cuddly buttons)

Not only is this content unique, but it's written in the language that other users from your target demographic use for their search terms, which makes it easier to perform on long-tail terms.

And…

Long-tail terms = high propensity to convert, as the page is more relevant to the user's search.

As the content on the lister page is parameter driven, and a user can select filters which build the URL in any order, the result is a lister page with millions of URL combinations duplicating the same content hundreds of times.

Because the URL is built in the order of selection, the end result is multiple URLs duplicating the same content.

So, using an XML file to manage it, each URL is passed through a URL reviewer:

Example:

First it breaks the URL down into its components (parameters) and checks whether each one is a valid attribute; if it is, it rebuilds the URL in the order defined in the XML. This rebuilt URL is then presented in the canonical tag.

This strips out anything you don't want, such as affiliate tags, and also keeps the URL consistent.
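A minimal sketch of that URL reviewer, assuming a hypothetical XML config that simply lists the valid parameters in canonical order; the element names, function names and example URLs are my own illustrations, not the actual implementation:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse
import xml.etree.ElementTree as ET

# Hypothetical XML config, e.g.:
# <canonical-params><param>brand</param><param>colour</param><param>capacity</param></canonical-params>
def load_param_order(xml_path):
    tree = ET.parse(xml_path)
    return [p.text for p in tree.getroot().findall("param")]

def canonical_url(url, param_order):
    parts = urlparse(url)
    # Break the URL down into its component parameters
    params = dict(parse_qsl(parts.query))
    # Keep only valid attributes, rebuilt in the order defined by the XML;
    # anything else (affiliate tags, session ids, ...) is dropped
    kept = [(name, params[name]) for name in param_order if name in params]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Both orderings collapse to the same canonical URL for the canonical tag
order = ["brand", "colour", "capacity"]
print(canonical_url("https://example.com/washers?colour=silver&brand=bosch&aff=123", order))
print(canonical_url("https://example.com/washers?brand=bosch&colour=silver", order))
```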

I totally agree with your comment on the canonical tag. I have implemented it on a pretty big website and have seen very little impact, but one thing I have noticed is that the URLs on the new pages Google is caching are a lot cleaner.

I.e. they match the canonical tag of that page.

If anyone is interested in how I have implemented canonical tags, have a look at Appliances Online co uk.

Also, I am a little disappointed that the H1 doesn't seem to have as much weighting as it used to. It's one of the first examples I use when explaining targeted terms, oh well. :)

I wouldn't worry about your PageRank dropping; you may find the pages you link through to have increased, which may suggest those pages have risen in the SERPs as you are passing trust, rank and juice to them.

Also, Google is always shifting its scores to make it harder to get that perfect 10.

First of all, I am so glad that Rand is having second thoughts on using iframes to stem the flow of juice. When I read his blog the other day it did make me feel a little uneasy, as in my opinion it's very messy.

The theory of the global domain driven by the power pages does make sense to me, and optimising those pages to see them rank should be the focus.

Obviously having every page in the SERPs is everyone's dream, but I feel you need to choose your battles: find the pages you want to perform, ensure they are easy to crawl, and if that means sacrificing juice (evaporation, great analogy) then so be it.

Optimising the link structure within your site, and ensuring that when it passes rank, trust and juice it maximises its effectiveness with good alt text, is a good win, but there is only so much on-site optimisation you can do. Don't forget, guys, the 60:40 split between external and on-site optimisation.

I am a big fan of applying an attribution model where you assign 30% of the revenue generated to the first click, 30% to the last click, and spread the remaining 40% evenly over all of that user's clicks.

So essentially the first and last click get a 30% bonus. This highlights so many RTMs which were thought to be non-profitable (i.e. generic terms).
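A quick sketch of that 30/30/40 split, to show the arithmetic; the function and channel names here are illustrative, not from any particular analytics package:

```python
def attribute_revenue(revenue, clicks):
    """Assign 30% of revenue to the first click, 30% to the last,
    and spread the remaining 40% evenly across every click in the path."""
    if len(clicks) == 1:
        return {clicks[0]: revenue}
    shares = {click: 0.0 for click in clicks}
    base = 0.4 * revenue / len(clicks)      # even share for every click
    for click in clicks:
        shares[click] += base
    shares[clicks[0]] += 0.3 * revenue      # first-click bonus
    shares[clicks[-1]] += 0.3 * revenue     # last-click bonus
    return shares

# Example: a £100 order touched by a generic term, an email and a brand term
print(attribute_revenue(100.0, ["generic PPC", "email", "brand PPC"]))
# -> generic PPC ~43.33, email ~13.33, brand PPC ~43.33
```

On this model a generic term that starts the journey picks up far more credit than it would under pure last-click reporting, which is why channels previously written off as unprofitable start to look worthwhile.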

We had to switch to Coremetrics, who use first-party cookies to ensure around 98% accuracy, because Google, who use third-party cookies, ranges around 60% if that. It also tracks a user's total lifecycle, even across separate machines.

I agree, Rand; it was a difficult leap to start putting outbound links on my retail sites, but the benefits outweigh the cons.

Linking to the charities you support is a great one: not only are you potentially more likely to get reciprocal links, but it can make your online business seem more real and trustworthy to your customers.

I understand where you are coming from regarding the use of subdomains. Interestingly, an SEO company called Big Mouth Media, which provides ad-hoc SEO advice, suggests that subdomains are the best approach when creating targeted landing pages... hmm.

Am I right in saying that you are advising the use of a subdomain to target specific keywords?

I work in the kitchen appliances sector, and if I were looking to target a particular product, e.g. WMD960G, would creating a landing page with rich, targeted content at a URL of wmd960g.appliancesonline.co.uk be the best approach?

Also, are you saying that microsites should not be used? If I had, say, 4,000 microsites, each targeting a specific product and then passing an appliance-type link to the main site, would this create enough diversity?

I understand your point about brand 'poaching', but I still think that a person typing in a brand is doing that for a reason; surely they would click on the brand they are searching for. I have trialled competitor bidding and have found that after a couple of days or so Google increases your minimum bid to an unsustainable level, which in turn increases everyone else's CPC.

But again, I can only draw upon my past experience within my sector, so this may be different for other sectors?

I agree with the guys who say that branded bidding is probably not the best use of advertising spend, as I would expect your site to sit at the top of the natural SERPs anyway, and if you're not there then you need to be.

I see a lot of companies still using PPC while appearing in position 1. In that situation I think you need to turn off the PPC: the searcher is looking for your brand and will click the first time they find it, i.e. they will see the PPC ad first, not the natural listing.

Best to re-invest your advertising spend in other more competitive terms.

I have been running a competitive analysis report using Insights for Search for a month and have found it very useful.

One of the neat things about this is looking at the up-and-coming keywords. Look out for the keywords marked as breakout; they are the next big thing!

If you're interested in Insights for Search, have you come across Google Ad Planner? It's a very good tool for bringing together demographic data, daily unique visitor numbers, where a visitor went before/after, and the sorts of keywords they used, all designed to help exploit their content network. Check out the link below. I've been using this to help find key sites to target for PPC and SEO.

You need to be careful when checking the results yourself, as Google tends to change your results if you are logged in to your Google account and have Google Web History enabled. To get a true result you need to log out, clear your cookies and history from your browser, then close the browser and re-open it; you should then see a more representative result.

I know the last posting for this was back in April, but a really neat feature I would like to see is the ability to bulk upload multiple key terms and then trend my keyword performance over time. Also, having the ability to categorise my keywords would be a worthy addition.

Is this sort of functionality on the horizon?

...As an additional note, current tools such as Firefox's rank checker only report on the first listing they find, irrespective of whether it is the page whose performance you were tracking. Which result does SEOmoz's checker return?