On the Web

Profile Information

I'm the Director of Search at Clubnet Digital, a full-service digital marketing and creative design/development agency based in Plymouth, in the South West of the UK. We create bespoke creative and highly integrated digital marketing campaigns for businesses, utilising a wide range of channels that complement each other appropriately based on each client's specific requirements.

Blog Comments & Posts

Based on the examples provided, should this not then become a case of the expert advising the client on what would be better to target for increased organic visibility, whether specific keywords or semantically related terms/topics?

SEO has evolved into something broader, encompassing many more methodologies with some overlap, and this is what businesses should be educated about. Businesses still need SEO, just perhaps not in the way they may be thinking and how it *might* have been 10+ years ago.

If a business wants to rank for "best headphones" and "restaurants in miami", then there is definite scope for educating and advising accordingly, because that's not the right way of thinking anyhow.

As such, the title is misleading, and this would have got the thumbs down from me if I could still give one.

It's all about context, and in this particular context, linking the text 'best business books for photographers' to the article looks spammy and unnatural.

The context was to briefly mention a website in passing, i.e. [this article] about [this subject]. I agree with Danny: had the paragraph been edited accordingly (like Danny's example above), there would have been additional scope to use a descriptive anchor, but as it was, using the word 'article' as the anchor probably would have made the most sense.

It's exactly the same scenario as how I just linked to Danny's comment here. It wouldn't have made sense to use a different anchor than 'example', the passing term I'm using to reference something.

You said yourself, in your opening paragraph, that guest blogging for links is what Google are taking action on, and that's where the problem lies.

Moz manually screen every piece of content on YouMoz before it makes it to publication, and only the best of it gets through: content that is really useful, valuable, and shows significant research with facts, findings and results. All the boxes that any content on the web should tick.

Seeking the right company should start with those that can see the bigger picture beyond the misconceived term 'SEO' and can demonstrate an understanding of how the industry is evolving and what is required not only for ranking well but, more importantly, for giving your customers a great experience onsite and offsite and driving conversions, aspects that many in the "SEO" field would not consider or attend to.

You should never have to chase the algorithm. With the right designers/developers and creative individuals, just code and build a great website pushing unique, quality content. Always focus on the end user, your customer. That's all Google care about too.

To be fair, if you were solely relying on the weight of exact-match domains to prop you up in the search results and now you are nowhere to be found, then the task of building up an authoritative domain and ensuring the website at that location is well optimised has not been met.

EMDs have long been overrated, and it's about time we saw the value of this metric on its own devalued. That isn't to say that if you have built up a fantastic website that is well coded, has awesome content, is semantic, relevant and trustworthy, and is generating decent social signals, it would all be for nothing... If the positive signals among the 200+ ranking factors that Google use outweigh the negatives, the site will still rank well, regardless of the domain.

It sounds as if you're blaming any change to how EMDs are considered as the sole reason why a domain is no longer ranking, but this is extremely unlikely to be the case; there will likely be plenty of other factors as to why it no longer has such good visibility in the search results.

I would tend to agree. Google are only too aware of credit links, typically in footers, and I don't feel these will play a particularly significant role in being flagged as part of the Penguin update. They could well have implemented a number of checks around these to determine whether they are paid links too, in which case that's another story.

As long as they are nofollowed, though, I wouldn't expect Google to even bat an eyelid at them anyhow.

Here's what we do when designing sites for clients: if we add a credit link in the footer (and often we don't), we write a script to make the link nofollow on all pages apart from the homepage.
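As a minimal sketch of that approach (the helper name and paths here are hypothetical, not our actual script), the credit link's rel value can be derived from the current path so that only the homepage passes a followed link:

```javascript
// Hypothetical sketch: footer credit link is nofollowed everywhere except the homepage.
function creditLinkRel(path) {
  // Treat "/" and "/index.html" as the homepage; everything else gets nofollow
  const isHomepage = path === "/" || path === "/index.html";
  return isHomepage ? "" : "nofollow";
}

// When rendering the footer link, e.g.
// <a href="https://example-agency.com" rel="...">Site by Example Agency</a>,
// the rel value would come from creditLinkRel(location.pathname).
console.log(creditLinkRel("/"));       // followed on the homepage
console.log(creditLinkRel("/about/")); // nofollowed on inner pages
```

The same check could just as easily run server-side when the footer template is rendered.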

Personally, I feel sitewide links are pretty much discounted anyhow beyond the first instance, but at least with the way we're doing it, we're letting Google know that we're not looking to pass link juice from any pages other than the homepage (which will typically be their most authoritative page anyhow).

We run a social network with hundreds of thousands of pages indexed in Google, and we've had a sitewide link pointing to another one of our websites for the past 12-18 months or so; it doesn't hinder the performance of the website in question nor produce any negative results. It is evident that they are both part of the same company group, however.

I've seen a relatively large shift in first-page rankings across a number of industries but haven't seen any hands-on evidence of low-quality garbage ranking well (although I am hearing plenty of other people noticing this).

With regards to your own situation, I would perhaps start by analysing http://www.linkdatabase.dk/ and how this domain has been affected since the Panda/Penguin updates. It accounts for almost 73% of your website's backlinks, almost all with the same anchor text, I believe. The website screams low quality, and the content is close to the article directory / blog network / link network type junk that the Penguin update went after. If the backlinks are followed too, this is exactly the kind of scenario that Google will be checking for when identifying websites to hit.

Although I have to say, there is often more than meets the eye with a lot of websites' link profiles. It only takes a couple of backlinks from other sites that are part of a bigger ring to pass bad association to your website, and if the positives aren't outweighing the negative signals, then it's likely you'll be sucked into the algorithm update hits.

I'm looking forward to the day Google devalue EMDs in their entirety, though. It will be this year, I reckon; this alone will fix many page-one results for thousands of queries.

Thankfully, we've seen no negative effects from the Penguin update across any of the sites we're involved with. It does appear that half the internet has been affected, though.

It's a welcome introduction that I really hope stamps out a lot of the offending contributors of garbage, junk and spam online. I think that soon, many of those who have always adhered to "white-hat" practices will truly start reaping the rewards, and those quick to shun ethical tactics by spouting that the only way to rank is to play dirty will be spending considerable time trying to fix the negative effects that these latest updates will have on their sites.

Anchor text should be used to accurately describe the page you're linking to; it should be relevant and appropriate to the end user, the visitor.

There is much less weight placed on anchor text now, however, and more emphasis on the relevance of the page, the context in which the link is used and the content surrounding it.

I wouldn't worry too much about ensuring keywords you wish the page to rank for are contained within the anchor text, especially with the recent introduction of the over-optimization penalty from Google. Just ensure you are creating quality content that your target audience want.

PageRank / link juice is distributed equally via the external links on a page, but many other factors come into consideration, including authority and relevancy, which could essentially mean one external link passes more weight than another.

This is essentially where a number of other factors come into play. The links certainly won't be ignored (unless the sites are flagged as spam, are in a bad neighbourhood or have conflicting/duplicate content); the value they pass will depend on a number of other signals but will ultimately come down to the authority and relevancy of the linking page. If the link serves little purpose, then don't expect much out of it. If there is a semantic connection between the two pages, and the way the linking has been conducted ticks all the boxes, then expect weight to be passed.

SEO is not "all about cracking the search engine"; it's about adhering to search engine guidelines whilst providing excellent customer experience and value. Anyone trying to 'game' the system will eventually lose out in the long run.

With any business/website, the paramount objective should be to provide value and drive ROI. When this is done correctly (as any business/website owner should do, whether a search engine existed or not), your brand/business/website will naturally perform well in the search engines.

That's all Google are about: relevancy, value, customer satisfaction, quality... Do things right, and you'll naturally reap the rewards. Try to "crack" the search engine, and be prepared for the long haul!

Rarely is there a case for multiple links pointing to the same page from within the same page, in my view; allowing multiple anchor texts to pass weight would (in most cases) only encourage spam and low-quality junk, which is the one thing Google would not want to do.

I've yet to see any evidence from our own tests (albeit very brief and basic) to confirm that multiple links with varying anchor text will pass relevancy for each anchor text and affect ranking. I hope this isn't the case.

Typically, the image alt attribute should be used as alternative text in the event that the browser does not display images; it should be descriptive of the image and not used for stuffing keywords.

The image title attribute should be an accurate title for the image and provides the tooltip-style text that pops up when you hover over an image. Ideally, this should differ from the alt attribute and, again, should not be used to stuff keywords.

The title attribute does not bear any direct connection to ranking signals whereas the alt attribute will be considered more by search engine crawlers to analyse image content and rank accordingly.
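Putting the two attributes together, a minimal sketch (with a hypothetical image and wording) of the distinction looks like this:

```html
<!-- alt: shown when the image can't be displayed; describe the image plainly -->
<!-- title: tooltip text on hover; ideally phrased differently from the alt -->
<img src="/images/red-running-shoes.jpg"
     alt="Red running shoes, side view"
     title="Product photo of red running shoes">
```

Neither value is stuffed with keywords; each simply describes the image in its own role.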

One would presume that a link to any given page within the actual body content would be the link that search engines pay attention to. That would naturally be the place to look for where the linking page is saying, "Hey, look, I'm the link that should be passing the weight through to that page", hence the previous thought I had about the potential devaluation of links in navigational/side menus, which I mention in my comment further down the page.

1. In my experience, Google bombing has often been conducted when a large group of people agree to all link to the same page regarding a particular subject. It is unlikely that all the anchor text would be exactly the same, in which case it would still play a role in ranking said page for the desired target. Even in the event of mass building of links to the same page with the same anchor text, I would expect that page to fly up the rankings, albeit these days more than likely only for a temporary period while the algorithm kicked in and rejigged the rankings accordingly.

2. I wouldn't be surprised to see the devaluation of guest blogging / author byline links this year (if at all possible); it's a process that is (as with anything) becoming more and more abused. It is extremely difficult to obtain natural keyword-rich anchor text backlinks these days, and guest blogging does remain one of the most effective means of acquiring backlinks of any great value. The key is to build relationships with respected, reputable leaders in your field and guest blog on trusted, authoritative websites. It's only a matter of time before the likes of Google really step up a gear in their combat against spam and end up literally devaluing millions of splogs that accept junk guest content purely to manipulate search engines and drive traffic/numbers.

3. The flow of PageRank is distributed evenly between external links, so 25% of the 'juice' goes to "abc", 25% to "def", 25% to "ghi" and 25% to "xyz" (bear in mind this does not take the flow of internal PageRank into consideration). That's probably the easiest way of viewing the process. The more external backlinks (without the nofollow link relationship) on a page, the more PageRank will be released via those links.

Excellent video, Rand. It's exactly what I've been preaching for some time, and I'm glad to hear it clarified and confirmed by SEOmoz. So many will insist on linking multiple times within the same page to the same destination with varied anchor text, adamant that the anchor text from all links will be picked up and used as a signal.

My only thought on this previously was whether search engines might devalue navigation menu linking, or allow in-body anchor text to override navigation menu anchor text / image alt text. It sounds from your video like that isn't the case, which is also good to know. A good 95% of sites online (probably) use keyword-rich anchor text for internal linking, which makes it almost humorous that nearly all of it is discounted(?) for pages that are linked to via navigation/side menus.

With this thought in mind, when we redeveloped our agency website and subsequently developed our own CMS, we built the page hierarchy and set up the internal linkage in such a way that not everything was linked to from the same page, which allows far more scope for passing anchor weight internally between pages.

For example, our services page hierarchy makes no use of drop-downs and only links to top-level services from its page; each top-level service page then links to the sub-sections of that service, always with a link back to its parent item, creating a highly 'relevant' internal linkage structure.

I'm glad we built our system like this now, and it's good to hear that my decision to do so may well pay off based on what you confirm in the video. I only hope the "tiny" amount of internal anchor text weight passed warrants the way we have built our platform - heh.

The whole IP range saga for manipulative link building is highly overrated. It may have been effective 3+ years ago, but the web has progressed, and I'm a pretty firm believer that there is little to no use in opting for 'fake' sites spread over different IP classes.

It's just another hyped jargon term used to sell to people who don't know any different. The only drawback in maximum potential link weight comes from a network of sites hosted in the same IP range that all interlink with one another. That's not to say these links are useless, though; they just won't pass as much weight.

But when link building is done 'right', this shouldn't even be an issue anyhow.

I've been saying for some time now that Google has all the necessary data required to filter out most of these 'splogs' and such on the web. Taking them out of the index would vastly improve everyone's search experience straight away.

Similarly to your thoughts, and in the same manner, it would be possible to downvote blogs that score poorly against many of the metrics that can be used to assess the 'realness' of a blog; failing to reach a certain threshold would simply take them out of the index.

There's too much $hite on the web as it is, so this would be a welcome introduction and one that I hope to see coming into play far more from Google. It would likely render a lot of SEO companies' link building tactics useless in the process, so that would just be a bonus. The more we can prevent complete garbage being spouted across the web, the better.

I certainly don't think you are far off the mark with your thoughts at all Mike.

No-one will be able to trace your answers from the published results. The only organisation that will know what you've answered is SEOmoz, and I hardly think their purpose for this is to 'out' your business model.

Totally. Most won't have a clue about this, but I guess I also see it as another selling point when pitching to the customer: go above and beyond in showcasing your knowledge and explaining the difference between you, your hourly rate and what the client gets (the benefits) from your fees versus other companies.

It comes down to how much you value your service, rather than a comparison to another company offering the same service...

We come across many companies/freelancers that call themselves SEOs when they clearly demonstrate on public forums that they don't know much beyond page titles and metadata, and even then, they get the most basic of basics wrong too.

Fees across the SEO industry and other web agency businesses differ massively, and the only way you can charge more for your time is by proving you're worth your weight in gold and being able to drill down to the most technical elements of SEO, even to the extent of design, coding and development.

The results of the survey won't (I don't feel) really tell you whether your company's pricing structure is correct; you will just be able to gauge what level you're at compared with other companies, some of which may have professional designers/developers, copywriters, or someone who spends all day testing URL canonicalisation aspects and duplicate content but nothing else.

How far can you go? To what level can you implement your services? How much experience have you got? Are you employing juniors, or have your people been around in the industry for a while? How active are you in the SEO scene? Do you often find yourself in a position to correct others about SEO because you know they are talking nonsense? And then there is understanding search engines, knowing everything about the latest algorithm updates, and looking at a web page and knowing where it fails and where it works, along with many other things that all come with experience. Often, a lot of these aspects a "small SEO company startup" will have yet to test extensively and learn from experience, which is subsequently the reason why they are not in a position to charge large monthly retainer fees for their services (for example).

Why would something be wrong if Facebook were their most visited website? The platform boasts the most active members in the world and is used daily for everything from personal use through to marketing and advertising.

For anything to be taken away from this discussion at interview level, it would need to explore the individuals' activity on the network during office hours. But even then, it's not a discussion we would raise when hiring for an SEO/marketing position.

We had an absolute nightmare hiring for our first SEO position at Clubnet Search Marketing. Seeing as we are situated in the South West UK, most people haven't got a clue what SEO even is (really, Plymouth is like ten light years behind the rest of the country) let alone having any experience or worthwhile knowledge in the field.

We tried a number of recruitment agencies and the job centre, but these were close to disastrous (especially the job centre, where 95% of applicants could barely spell or write legibly). There wasn't much mileage in posting the adverts on our website either, as this was before we had our new website developed and launched, so the old site wasn't the most professional of platforms to recruit via.

We're now in a position where at least two people per week send us CVs and covering letters wanting to come and work for us. Granted, most don't come with a background in SEO (mainly IT or other marketing), but still, it's nice to see plenty of people wanting a career in digital marketing/SEO in an area where the sector is so unknown.

I guess I, like many of the guys interviewed, have learnt by trial and error. Getting a feel for the right paths to recruit via over time can certainly save a lot of stress and time in future instances. It took almost 7 months to recruit the first member of our team and just 5 weeks in our second instance. Phew!

Unfortunately, this is not overly valuable for those that already conduct ethical and 'white-hat' SEO, and (also unfortunately) those that don't won't pay a blind bit of attention to how Google determines quality and ratings.

All useful ideas for enhancing your ecommerce website, albeit most should be considered all year round rather than just in the run-up to holiday seasons.

Check out our 105 Killer Ecommerce Tips for Online Retailers too. Whilst it's a little dated now (April 2010), many of the tips still apply and should be paramount to running a successful ecommerce store. We are also planning an updated list, which we hope to publish within the next month.

Where possible, it is always advisable to create unique and accurate product descriptions for all products manually. We have seen over the years that this does make a difference to rankings. As you say, many ecommerce stores will just copy and paste generic text from the supplier's website, and when there are 250 instances of this in Google's index, it would be impossible for them all to rank well anyway, which is where hundreds of other onsite SEO factors come into play.

Ecommerce sites containing the same set of product descriptions will not be penalised in any way; it's just that the rankings (longer-tail, for the product pages in question) will suffer, or the pages may not rank/index at all.

The recommendation from markwrightseo of using the link element will only tell a search engine which page should be treated as the original source, and it is subsequently only of use for your own content (it doesn't make logical sense to refer to a third-party site, shouting "hey, don't pay any attention to my page because that site over there had the same content first").

In some cases it is not possible to create unique product descriptions. We have one client whose ecommerce site/product information is served to his domain by another company, and this company does exactly the same for about 50 other sites. In this instance, we are creating a new website as a portfolio site to uniquely optimise and pass highly relevant weight through to his original site, while at the same time building trust/authority on the new site to rank it too for a wider range of targets.

There is a wide range of potential duplicate content issues when it comes to ecommerce websites, and there is plenty of content online about this. The main cause is query parameters created by product filters in the categories of your ecommerce website. There are a number of ways to resolve these, one being to block search engines from crawling the filter URLs in your robots.txt. For example:

Disallow: /*?price=
Disallow: /*&price=
Disallow: /*?colour=
Disallow: /*&colour=
Disallow: /*?size=
Disallow: /*&size=

Other ways could include the <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> tag or instructions in your .htaccess file.
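On the .htaccess side, one option is to send the same noindex/nofollow instructions via the X-Robots-Tag HTTP header, which is useful for resources where a meta tag can't be added. A sketch, assuming Apache with mod_headers enabled (the PDF pattern is just an illustrative example):

```apache
# Hypothetical .htaccess snippet (requires mod_headers):
# tell crawlers not to index or follow links from matching files
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

The header carries the same instructions as the meta robots tag, just delivered at the HTTP level.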

Thanks for pointing out the 'Sparks' section, despite being on Google+ for a month or so now, I hadn't even noticed this area. Have set up my sparks and will now see what targeted content I can display in my feed, great stuff!

I was getting hundreds of these exact templated emails every week across about 60 of our clients' websites. Incredible to think that so many link building companies are still using these 'old skool' approaches, and that they work...

They are clearly spam despite their efforts to convince you otherwise, and they are clearly templated and in some cases automated (in the sense of a spambot), as I have received loads of the same email from the same person requesting links for a load of sites that weren't even on-topic...

I've got to admit, my contribution on here has been pretty much non-existent compared to my activity on other sites/blogs and social media streams. I'm not really sure why, as I am a regular reader of SEOmoz and am subscribed to the RSS feed.

However, reading this article kinda encourages me to comment/contribute more, which I will aim to do. I have very little spare time to reply on all the blogs I read, but I'm definitely going to try and boost my MozPoints now, hah. :)