Category Archives: SEO

12 months is a long time away from SEO. I’ve been out of the UK for a year backpacking around the world with no phone, no laptop and no SEO. It’s something that I would recommend to anybody; I pretty much had the time of my life (blogs are being posted a year on via www.achinesenomad.co.uk and all the pictures are in sets on Flickr).

I’ve spent the last few weeks catching up on everything SEO-related that’s happened in the last 12 months. A lot has changed: new tools, new Google features, updates to the algorithm etc. However, some things remain the same. I’ve read about 1,200 blog posts that have aggregated in my Google Reader account. Below are the pick of the bunch; if you think any are missing, leave a comment and I’ll add it to the list if it’s worthy.

The Google Webmasters team is getting closer to SEOs/webmasters via the updates on the Inside Search blog, Webmaster Tools and hangouts on Google+.

Google+ was launched. It doesn’t seem that popular, so Google are integrating existing products into it, such as Google Places and Authorship markup validation.

Social linking is important to Google, but they don’t have access to some of the data, so they created Authorship markup to get their hands on it.

Google still has poor results in some sections: the Venice update is serving some really poor results for local searches when it gets your location wrong (e.g. users not signed in), and exact-match domains can still outrank authoritative sites.

Many people are creating new SEO tools thanks to APIs, leading to more ways to analyse data.

Here’s the list of the blogs, videos etc. This is about 10% of all I’ve read.

Liked this post? Why not share:

One of the core services of any SEO campaign is reporting back to the client on search rankings. Over the past five years reporting has shifted more towards sales and conversion tracking, and we now see another shift towards reporting on the touch points of user journeys. No doubt in the next five years we’ll see more changes in reporting on SEO campaigns; however, reporting on key phrase positions will always be a key aspect of SEO reporting.

In my last role I was part of a team that developed a web ranking tool that scraped search engine results, calculated rankings and reported back to a client interface. Having your own tool has its benefits, but for me it will always be restricted by the need for internal resource to create the tool in the first place and the continuous work to maintain it. How many companies are able to pull programmers off paid work for internal projects? It doesn’t happen often!

At most companies, programming time is costed at an hourly or daily rate. A quick estimate for a ranking tool could be calculated as:

£70 per hour * 7.5 hours a day = £525 per day.

You may need around six days of planning, requirement gathering and meetings just to get things started – £3,150.

By contrast, the starting cost of Advanced Web Ranking is minimal: $399, which is about £257, will buy you the Enterprise version. Even with an annual cost it’s still cheaper over many years than building your own ranking software.
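To put those figures side by side, here’s a rough back-of-the-envelope sketch; the 30 development days are my own hypothetical estimate, not a real quote:

```python
# Rough breakeven: building in-house vs. buying a licence.
DAY_RATE = 70 * 7.5        # £525 per day, as above
planning = 6 * DAY_RATE    # ~6 days of planning etc. = £3,150
build_days = 30            # hypothetical development estimate
build_cost = planning + build_days * DAY_RATE
licence_per_year = 257     # ~£257 for the Enterprise version

years_to_breakeven = build_cost / licence_per_year
print(f"Build: £{build_cost:,.0f}, breakeven after {years_to_breakeven:.0f} years of licences")
```

Even with generous assumptions, the annual licence wins for a very long time.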

As for features, it has everything: different types of reports, no limit on key phrases, every search engine you could need to record a site’s ranking, scheduled reports, scheduled ranking checks, uploading reports to FTP, saving locally, emailing to a client, reports in multiple formats, customisable reports and so on. Everything you could think of from an SEO software package.

There’s even a keyword research tool which uses the Webmaster Tools API and SEMrush API to generate key phrase suggestion lists. That list can then be easily imported into a project.

As for scalability, I’ve had it running 24/7 for seven months on a dedicated server without a single crash, with just under 1,000 clients and around 10,000 key phrases.

If you have your own client login centre you can easily feed into it by having CSV or TXT reports sent to a server; your server then grabs the data and adds it into your client centre. This is a setup I’ve used in the past: Advanced Web Ranking grabs the data and sends it off, and your own software displays that data directly in your own reporting suite. There’s even an export rank data feature which will export all the data in a friendly format that can then be uploaded into your own tool.
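As a minimal sketch of that hand-off, assuming the export is CSV with hypothetical column names (match them to whatever your actual report contains):

```python
import csv
import io

def load_rank_report(csv_text):
    """Parse a rank-report CSV export into rows your client centre can
    store. The column names are hypothetical; adjust to your export."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append({
            "phrase": row["Keyword"],
            "engine": row["Search Engine"],
            "position": int(row["Position"]),
        })
    return rows

report = "Keyword,Search Engine,Position\ncheap flights,Google UK,3\n"
print(load_rank_report(report))
```

From there it’s one insert per row into whatever database backs your reporting suite.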

Accuracy is spot on, and most importantly, if Google changes their interface an update will be on its way pronto.

One concern, if you’re planning on changing to Advanced Web Ranking, is the time taken to migrate over from an old system. Features such as importing key phrases from a TXT list make the process quick and painless.

There’s a 30-day trial with no commitment to buy, so give it a try!


Over the past few months I’ve noticed more title tags and meta descriptions using different types of characters to stand out in the search results. PPC has led the way with aeroplane symbols, bullet points and trademark symbols. The problem is that many such adverts get disapproved by Google.

Organically, it’s about testing what can and can’t be indexed. So below I’ve made a list of characters that will get indexed in a title tag and displayed in the search engine results. Before the list, a few interesting points from this experiment.

Using the intitle: command doesn’t work. For example, try “intitle:£” in Google and it returns nothing, yet there are plenty of title tags with the £ symbol.

A symbol can be hard-coded into a title tag, but if you use a CMS it may try to convert it. This means that to get some of these symbols into titles you’ll need to bypass your CMS or change the way it works.
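To illustrate the conversion issue: a CMS that escapes its output will emit numeric character references rather than raw symbols. Both render the same in a browser, as this Python sketch (with a made-up title) shows:

```python
import html

title = "Cheap Flights ✈ from £99"  # hypothetical title with special characters

# A CMS that escapes non-ASCII output emits entity references instead:
encoded = title.encode("ascii", "xmlcharrefreplace").decode("ascii")
print(encoded)

# Browsers decode the references back to the same characters:
print(html.unescape(encoded) == title)
```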

If you have a symbol in the title tag that Google won’t index, they’ll skip that character when displaying the title tag in the search result.

To conduct the test I:

Created 11 pages, linked to sitewide from another site of mine.

Linked each page to every other page.

Put the special characters in each title tag, as well as in the meta description.

Waited for all the pages to be indexed, then viewed them using a site: command with inurl:test, as the file names were test1, test2, etc.
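Those steps could be scripted; here’s a minimal sketch that writes the test pages (the symbol list is a small hypothetical sample, not the full set I tested):

```python
from pathlib import Path

SYMBOLS = ["£", "©", "™", "★", "♥"]  # hypothetical sample

def write_test_pages(out_dir):
    """Write test1.html, test2.html, ... each with a special character
    in the <title> and in the meta description."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i, sym in enumerate(SYMBOLS, start=1):
        page = (
            "<html><head>\n"
            f"<title>Symbol test {sym} page {i}</title>\n"
            f'<meta name="description" content="Testing the {sym} symbol">\n'
            f"</head><body><p>Test page {i}</p></body></html>\n"
        )
        (out / f"test{i}.html").write_text(page, encoding="utf-8")
```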

Here are the valid symbols with screenshots of them in the search results.

There were plenty that didn’t work, including symbols for scissors, aeroplanes, hearts, clubs, spades etc. I’ve noticed some sites already use the trademark and copyright symbols, for example GHD hair straighteners and Sony. What can you do with the symbols above? Here are a few mock-ups that I’ve firebugged.

888.com using the number 8 to stand out on the term online casino.

My own site using characters before and after the text.

Using the half (½) symbol for half-price cheap flights offers.

The Church of England using little crosses.

Using a number for Radio 1, pretty cool

Joking aside, over the past few months Google has made several changes to the first page that have made it difficult for a standard organic ranking to stand out. Product, image, review, local and maps listings etc. all stand out more than an organic listing. If you have an organic listing in a very competitive market you need to work your title tag and description hard to increase the CTR.

If you’ve liked this post and you have a site with a decent level of traffic, why not try using different characters, and use Google Webmaster Tools to see the changes in CTR, positive or negative.

If you find any other characters that work, please email johnpcampbell1985 (@) googlemail (.) com or leave a comment below or send me a message on twitter.


If you’ve checked the BBC News site this morning you’ll notice a new design: the navigation has jumped to the top, with new headings and a new sidebar. Another little change is the URL structure of the news articles. Old URLs were located on the news subdomain and used the folder path the article appears in, e.g.

As we all know, if you change your URLs you need to implement 301 redirects from the old URLs to the new ones. And the BBC have; it’s just that they are all temporary 302 redirects (click the picture below to see the results from HttpFox).

A lot of PageRank could be lost if the redirects stay as 302s.
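You can check the status codes yourself without HttpFox; here’s a minimal sketch using only Python’s standard library (point it at whichever old article URL you want to test):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from silently following redirects, so we can
    see the 301/302 status code itself."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect())

def redirect_status(url):
    """Return (status_code, Location header or None) for a URL."""
    try:
        resp = opener.open(url)
        return resp.status, None            # no redirect at all
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")
```

A 301 tells search engines the move is permanent so link equity should follow; a 302 says the old URL is coming back.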

Not sure why I picked this article???

Edit: this is not affecting older articles, where the URL remains the same; it seems to be recent articles only.


Back in the office for the new year, I noticed a new name advertising on the term “SEO”: BT.

There have been a few posts recently about how you get what you pay for with SEO services, so it doesn’t look great for BT’s new SEO service, known as SearchSmart, which starts at a chirpy £74.99 a month. There’s a £100 set-up fee and also a minimum three-month contract.

For £74.99 I’m guessing that the key phrases selected are long tail and local.

It sounds like a pretty much automated service (bar the audit doc), with tight control on any phone conversations with a consultant. If you need extra consultancy time it’s £100 an hour, a little expensive for the UK.

It’s hard to judge the service. So far they have only got one testimonial, Brand X PR, who haven’t implemented the work and, I’m guessing, haven’t had any link building, as they have no links according to YSE. And a quick check on the BT site shows they don’t rank for their own content.

So they have a duplicate domain, “http://www.latitudeorganic.com”, which is strangely owned by “Scott Sargeant”, who works for the Latitude Group.

In conclusion, is BT SearchSmart any good? Well, it seems it’s just SEO by Latitude. If you’re going to resell, at least do a good job of separating ties from the company who are going to do the work.


I’ve blogged in the past about linking offline TV campaigns with online search; there have been some successes and some failures. This last week I’ve noticed a campaign by Stella Artois (Mother London being the agency behind the ads) who’ve used a slightly different slant on the ‘search online’ technique.

Rather than asking viewers to search online, Stella Artois asked them to search on YouTube. It’s different to the simpler ‘search online’ used in the past, but an idea that could deliver better conversion, with less chance of the user ending up at the wrong result.

Visibility

To gain visibility, Stella have a PPC ad and the number one ranking on the natural SERP for ‘recyclage de luxe’; the PPC ad also shows for related searches such as ‘stella advert’.

At the bottom of the first page is the video from YouTube. That should rank in the main SERPs soon.

On YouTube they have the number one spot for “recyclage de luxe”.

Execution of Natural SERP

So it seems like a well-executed plan. Why? The page was published before the TV ads started to run.

Having the page up early gives it time to achieve the ranking. In this case the search term in question, ‘recyclage de luxe’, had no previous competition, but getting the page up early ensured the number one position. It’s hard to tell if this was planned by Mother London.

The video was placed online later, on November 22, 2009, so just before the TV ads started. Ranking on YouTube is slightly different to the natural SERPs: you don’t need to have the video live for long to get it ranking in the search results. It helps, but as there is no competition on YouTube for ‘recyclage de luxe’, putting it up late worked.

The Results

Nothing so far on Google Insights, although there should be a spike in searches for ‘recyclage de luxe’.

On YouTube the main video has 3,164 views and a 4.5 out of 5 rating from a total of 26 ratings. Of the 25 comments so far, 10 are positive, 5 neutral and 10 negative, so split down the middle.

However one comment did sum up my first reaction to the campaign;

“got told about this but it was real hard to find. Its very cool”

That may be due to having to remember ‘recyclage de luxe’, which is a French phrase. The ‘search online’ query needs to be something memorable and easy to spell. Past campaigns such as the Orange ‘I am’ or the Monsters vs. Aliens ‘MVA’ failed because the phrases were too generic, but this campaign may fail due to the complexity of the ‘search online’ query.


This week Matt Cutts confirmed that natural search will start to look at the speed of your site as a ranking factor. It’s something that’s been in the pipeline, and an obvious change if you’ve been following the “let’s make the web faster” drive from Google.

I think it’s a great move; there’s nothing more frustrating than a slow internet connection or a slow site. If you head over to Google’s “let’s make the web faster” pages there are instructions on how to make your site faster. Some are a little complicated, such as compressing JavaScript and CSS, HTTP caching, minimizing browser reflow etc.

If you know what you’re doing then that’s great, but if you’re an average webmaster, here are six simple tips to speed your site up for Google and users.

1. Optimizing Images

Optimizing images is a simple process; there are two methods. First, reduce white space: rather than giving your image a border or space, crop the image tight and then use CSS to create the border and positioning.

Second, save in the correct format: rather than using JPGs, GIFs can sometimes reduce file sizes for logos and simple images. Using the “save for web” setting in programs such as Paint Shop Pro or Photoshop will let you get the best size without compromising on quality. To get old images into a compressed format use a bulk process; remember to keep the file names the same, and if changing extensions, add redirects to preserve image rankings.
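For the redirect step, here’s a small sketch that generates Apache mod_alias rules for images whose extension changed (the paths and new extension are hypothetical examples):

```python
from pathlib import PurePosixPath

def redirect_rules(old_paths, new_ext=".gif"):
    """For each old image URL path, emit an Apache 'Redirect 301' rule
    pointing at the same filename with the new extension."""
    rules = []
    for old in old_paths:
        new = PurePosixPath(old).with_suffix(new_ext)
        rules.append(f"Redirect 301 {old} {new}")
    return rules

for rule in redirect_rules(["/images/logo.jpg", "/images/header.jpg"]):
    print(rule)
```

Drop the generated lines into your .htaccess (or server config) so the old image URLs pass their rankings on.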

2. Don’t use tables

When building the HTML output of pages, don’t be tempted into using tables to lay out content. With a bit of skill you can recreate the same thing in CSS. A great article gives 13 reasons why CSS is better than tables, number 1 being faster load times.

3. Navigation

There are two points with navigation. a) The number of items in the navigation can affect load time. Listing every single category and subcategory on some sites would create a huge navigation of 100+ items. Only link to the top levels on every page, plus other relevant pages; a good example of this in action is the BBC site, go into the sport sections to see.

b) The coding of the navigation is also important. Using Flash or JavaScript (for all the code) is a no-go; CSS with a little JavaScript can create efficient drop-downs. Even better, don’t use drop-downs at all; a plain navigation will keep load times down. Remember that the navigation is on every page, so improving it improves load time on every page in the site.

4. Reduce loads from external sites

Each HTTP request adds time to the load of your site, and this includes loading items from a different site. Where possible host images on your own site, don’t have multiple tracking codes, and try not to pull Twitter/RSS feeds into every page.
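A quick way to audit a page for this is to count src/href attributes that point at other hosts; here’s a rough sketch (it only looks at markup, so it won’t catch requests made by JavaScript):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalCounter(HTMLParser):
    """Collect src/href values that point at a different host."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.external = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).netloc
                if host and host != self.own_host:
                    self.external.append(value)

def external_requests(html_text, own_host):
    parser = ExternalCounter(own_host)
    parser.feed(html_text)
    return parser.external
```

Run it over your homepage HTML and every URL it returns is an extra request to someone else’s server.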

5. Move JavaScript and CSS to external files

By placing the JavaScript and CSS in external files, users and search engines don’t have to re-download them on each page load, as the files can be cached. You place the code in a .js or .css file and link to it in the head of the page, e.g. <link rel="stylesheet" href="style.css"> or <script src="script.js"></script>.

6. Get a decent server

Yes, it costs money, but if you’ve spent time on the previous points they might have no effect if your server is slow. To get a good indication of speed, ask the company for some high-traffic example sites and visit them at peak times. Ask if any are “Digg-proof”, and for the specifications of the server.

Overall, each of these six points might only make a small difference, but the objective is to make improvements across lots of different areas; added together they will make a real difference to the speed of your site.


Hi! Welcome

My name is John Campbell and I’m an SEO based in the UK. I’ve worked for Just Search and Amaze in Manchester, and I’ll be moving to London soon. I’m a bit of a mad football fan and a keen traveller, and I like to dabble in affiliates too. I went travelling for a year in 2011/2012, hence no posts for a while!