Tom Anthony's Posts

Right now, we are nearing a point where the convergence of several related technologies, combined with their improving accessibility (in both infrastructure and cost), means we are not far away from some big disruptions in local search. People will be expecting search results far more specific to their current context than ever before... and they’ll be getting them. I’ve put together a simple and relatively typical story to illustrate some of the technologies (see the section after the story).

A Search Story

Imagine a woman who needs to pick up a gift for a friend; she is wandering through London and searches for ‘jewellery shop’ via her phone as she walks. She gets a bunch of results for stores nearby, but isn’t happy, so she refines her original voice search by simply speaking ‘show those with 4 stars or more’, and gets a subset of her original results a moment later. She is still unsure, so she jumps on a rental bike and heads towards Oxford Street.

Recently I have found myself fairly frequently wanting to get the links pointing at a certain sub-section of a website (i.e. links to only certain pages on the domain). Reasons why this might come about:

I tend to use a mix of OpenSiteExplorer, Majestic, and Ahrefs when researching backlinks, but currently none of these services allows me to fetch backlinks in such a fashion. OSE does offer a ‘to this subfolder’ option in the advanced reports section, which sometimes does the trick, but otherwise I’m left to download all the links and filter them myself.
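If you do end up downloading the full link export and filtering it yourself, the filtering step is straightforward to script. A minimal sketch, assuming a CSV export with a ‘Target URL’ column (the column name varies between tools, so adjust it to match your export):

```python
import csv
from urllib.parse import urlparse

def links_to_subfolder(csv_path, subfolder, target_column="Target URL"):
    """Return the rows whose target URL path starts with the given subfolder."""
    matches = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Compare only the URL path, so the domain doesn't interfere
            path = urlparse(row[target_column]).path
            if path.startswith(subfolder):
                matches.append(row)
    return matches
```

Calling `links_to_subfolder("export.csv", "/blog/")` would then give you just the links pointing into that sub-section, ready for whatever analysis you had in mind.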

It is relatively standard practice nowadays to do keyword rank checking with tools such as SEOmoz, Authority Labs or Conductor. It just makes sense to us as SEOs to keep an eye on rankings, whether or not you are of the school that you should be reporting them to your clients or boss. However, we know that with rankings there are so many variables at play that reacting to big changes is more of an art than a science.

Rank tracking helps inform us of how our tactics are working, whether competitors are up to something, or if Google has been playing with the dials again. However, I’ve been thinking recently about what other things we should be routinely tracking, and which of these might be helpful in prompting more specific actions.

One thing that I know some SEOs do, on and off, but something I haven’t really done much of until now, is tracking my competitors’ sites (their markup, structure and content). Sure, I look at their rankings, and if there have been interesting changes then I might look at OpenSiteExplorer, Majestic or Ahrefs to establish whether they’ve been doing anything new on the link-building front, but if it is internal changes to their site then I probably won’t spot the exact changes unless it is something in-your-face (like a complete redesign).
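The simplest way to spot that *something* has changed on a competitor’s page, even before diffing the markup in detail, is to store a fingerprint of the page on each crawl and compare it to the last one. A minimal sketch of that idea (the fetching side, e.g. with a crawler or `requests`, is left out and assumed):

```python
import hashlib

def page_fingerprint(html):
    """Return a stable fingerprint of a page's markup."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def has_changed(previous_fingerprint, current_html):
    """Compare a stored fingerprint against a fresh crawl of the same page."""
    return page_fingerprint(current_html) != previous_fingerprint
```

In practice you would store the fingerprint (and probably the raw HTML) per URL per crawl, and only run a full diff on pages where the fingerprint differs, which is what tells you exactly what was changed.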

In the six months since Google gave birth to the Penguin algorithm update, it has had a dramatic effect on the SEO industry. For years Google’s rhetoric had been about quality of content and links, but they’d been unable to back up what they said you should (and perhaps more importantly shouldn’t) do with what the industry saw actually working.

It was a cause of frustration for large numbers of white hat SEOs to be doing the sort of things that Google recommends (give or take), and still be outranked by people spamming anchor text with low-quality paid links. Wil Reynolds complained about exactly this when he spoke at Searchfest just 8 weeks before Penguin, stating that a client of his is “getting killed by a website who is just targeting tons of anchor text only links on GARBAGE sites and is KILLING my client in the rankings” (he later wrote it up for SEOmoz).

Most SEOs have a veritable plethora of tools they use day in and day out for the wide variety of tasks they need to concern themselves with. We have tools for scraping SERPs, crawling sites, analysing log files, checking HTTP headers, and about a million other things.

However, even with all these tools at our disposal I still get asked often about simple ways to check the main social metrics for a page. I don’t want to start a debate on what the ‘main social metrics’ actually are; we are just going to go right ahead and assume that if you know the number of Tweets, Likes and +1s then you can usually draw your conclusions from that.

I’ve broken the need for quick checks of this data down into 3 categories:

An instant check of the metrics for the current page.

A quick check of a short list of URLs, which you can do via an easily accessible tool.