
Welcome to Search Kingdom

Castles, keeps, moats... No, sadly we haven't got any of those, but we do have all the first-hand knowledge you need to help your website rank well in search engine results. No hype, no false promises, just clear advice, training or direct assistance to get your website found.

Archive for the ‘Google’ Category

Now Google have never said they were SEO experts for their own sites. In fact, many years ago they did a ‘drains up’ look at many of their properties with a view to improving the way they interact with, well… with themselves I guess.

This is a good one though!

Google have finally got around to doing some ‘invited’ vanity URLs on Google+ and I think the way they avoided the usual land grab is to be applauded.

However, when you choose to accept their offer of a vanity URL (BTW, why the odd capitals in the URL?), the previous URL has a redirect, but it is a 302 (i.e. temporary) redirect! Check our old one in a header checker – https://plus.google.com/108616727453700388455.
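If you want to try this yourself, all a header checker really looks at is the status line of the response. Here is a rough Python sketch of that classification (the Google+ URL above may no longer respond, so this just interprets the status line a checker would show you):

```python
# A minimal sketch of what a header checker reports: classify the HTTP
# status line as a permanent or temporary redirect.
def redirect_type(status_line: str) -> str:
    code = int(status_line.split()[1])
    if code in (301, 308):
        return "permanent"
    if code in (302, 303, 307):
        return "temporary"
    return "not a redirect"

print(redirect_type("HTTP/1.1 302 Found"))  # temporary
```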

As we have known for some time, a 301 redirect is treated by Google the same as a normal link, and the PageRank/link juice/weight/whatever… flows the same through both.

So when you need to do a 301 redirect, you don’t have to worry about more PageRank dissipating than for a normal link. So why did I underline need? Well, although 301s don’t hurt you any more than a normal link, if you change your URL structure when you don’t really need to, and then redirect the old URLs with 301s, you are losing PageRank when you don’t need to.
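For the record, when you genuinely do need one, a permanent redirect is a one-liner in, say, an Apache .htaccess file (the paths and domain here are hypothetical):

```apache
# .htaccess - permanently (301) redirect an old URL to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```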

Remember, part of the reason PageRank works the way it does is that there is always a percentage of power that never gets transmitted via any link. Some say this is between 10% and 15%. However, it is always worth remembering that, whatever the percentage, a page’s transmittable PageRank is divided between all the links on the page, so the actual loss through any one link depends on that division too.
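As a rough worked example (0.85 is the commonly cited damping figure, not a number Google has confirmed):

```python
# Rough per-link PageRank share: ~15% is never transmitted, and what
# remains is divided between all the links on the page.
DAMPING = 0.85  # commonly cited figure, not confirmed by Google

def share_per_link(page_rank: float, outlinks: int) -> float:
    return page_rank * DAMPING / outlinks

print(share_per_link(1.0, 4))    # 0.2125 through each of 4 links
print(share_per_link(1.0, 100))  # 0.0085 through each of 100 links
```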

If you click the link, go into the ‘Ad Preference Manager’ and delve further, you can find a host of information about Google’s adverts and how they relate to you. Also, you can see what settings are there and what control you have. Some of these settings relate directly to you when you are logged in, and some have an association with the browser you are using and its cookie-tracked web history.

So who does Google think you are? Well, here is me in Google’s eyes based upon my browsing habits.

So, there you go. Google knows more about me than I thought. Not surprising really as I spend a lot of my time with him/her (question: is Google a she or a he in your eyes?). This information can get a bit skewed when you work for different clients (I don’t personally spend a lot of time on cooking sites…).

What’s your Google profile? What does Google think about you and who you are?

It occurred to me the other day that even though PageRank is a far lesser factor in Google’s algorithm than ever before, it is still the most fascinating. Also, it is and will always be the fundamental premise on what has made Google so successful. So even though it has taken a little bit more of a back seat in our thoughts, don’t ever forget this founding father.

So why the Bank of England? Well, the Bank of England regulate the flow and amount of money (well, pounds anyway) in circulation. They also do things like ‘quantitative easing’ (which we now unfortunately know so well), where they inject money into the system by buying bonds, etc.

We know that PageRank dissipates and weakens as it flows from page to page and site to site. This is part of the complex algorithm that regulates the PageRank in the system (you see where I am going with this?). So where does the master PageRank flow regulator sit? How regularly does it pump PageRank into circulation? Is it based upon the number of sites (or, more likely, pages) indexed, the number of links found, etc.? Does Google have a dial that alters the weight of certain links and then pumps some more PageRank into the system to flow to these?

In the same way that every page has a defined PageRank number, there must also be a defined amount of PageRank in the system at any given time. Mervyn King meets Matt Cutts!
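To make the ‘fixed amount in circulation’ idea concrete, here is a toy power-iteration PageRank on a three-page cycle. This is purely my own illustrative sketch, nothing to do with Google’s actual implementation, but note how the total in the system stays at 1.0 no matter how many iterations run:

```python
# Toy power-iteration PageRank on a 3-page cycle (A -> B -> C -> A).
# With the standard formulation, the total PageRank in the system is
# fixed (it sums to 1) - the "defined amount in circulation" idea.
DAMPING = 0.85

def pagerank(links, iterations=50):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        pr = {
            p: (1 - DAMPING) / n
               + DAMPING * sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return pr

ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["A"]})
print(ranks)                 # each page settles at 1/3
print(sum(ranks.values()))   # total stays (approximately) 1.0
```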

If you want to play this as a game (why not?), don’t read any further than this first paragraph and ask yourself the question “for each letter of the alphabet, which word/name do I think Google will come up with first?”. As a clue think company, before person.

Now on with the post…

Oh, to be one of these 26, eh?

If you hadn’t already guessed from the title of this post, here are the words/companies that Google thinks of first when you input a single letter into the Google.co.uk search box with suggest turned on and history cleared.

A is for Argos
B is for BBC
C is for Comet
D is for Debenhams
E is for Ebay
F is for Facebook
G is for Google
H is for Hotmail
I is for Ikea
J is for John Lewis
K is for KLM
L is for Lotto
M is for MSN
N is for Next
O is for O2
P is for Play
Q is for QWOP
R is for Rightmove
S is for Skype
T is for Tesco
U is for UCAS
V is for vue
W is for Weather
X is for X Factor
Y is for YouTube
Z is for Zara

Let’s face it ‘H’ for Hotmail is a lot easier to remember than ‘H’ for hotel.

Good content is king, not ‘fresh’ content or ‘chase the long tail’ content or ‘copied and rehashed’ content. No, just plain old good content. Why? Well, Google says so, and anyone who has ever accessed the web to try to find something out says so too… but mainly Google.

Whichever way you look at it, the ‘Farmer’ or ‘Panda’ update from Google was coming from a long way off. Why? Well, people thought they had found a loophole in getting less-than-great content to rank well. You know the type of sites/pages? No? Well, have you ever looked for something like ‘get ketchup off of a carpet’ (beautifully worded I know, but intentionally worded for search)? If you have, you could have come across a page like this…

How to get Ketchup off a Carpet

1. Ketchup is another name for tomato sauce and is used to complement many dishes.
2. Carpets are used on hard floors and come in a variety of different styles and colours.
3. Spilling ketchup on carpets can cause stains.
4. If you have spilt ketchup on your carpet and want to know ‘how to get ketchup off a carpet’ you should consult a specialist carpet cleaner.

Cue AdSense adverts…

Know what I mean? Did you think that Google would not deal with this type of site at some stage? There are other types of sites that Google have attacked in this recent update too, but they all mainly centre around ‘thin’ content or ‘content free’ content, as I like to call it.

How have Google done it? Well, the update seems to be site specific and not page specific. This seems like a sensible first stab at this as it is a lot easier to create some rules that would apply to a domain with thin content rather than do this by page, which would lead to lots more work and a far deeper algorithm change. However, watch this space as it will come one day.

So, just to revise my initial words, ‘fresh’ content is king (update your website every minute of the day if you can) and ‘long tail’ content is king too (around 30% of Google searches every day are totally ‘new’ to Google). However, if it is not ‘good’ content then you are most definitely on Google’s radar. Be warned…

Don’t worry, Google bans are not as common as you think. However, if you are doing some things that may run close to the wire, then this video from Matt Cutts may help you do what you need to do.

In essence, if you don’t play by Google’s rules, then Google will get you in the end. If you are in your business for the long term then, it makes sense to plan your SEO over the long term too. Google will NEVER stop being attracted by great links and great on-topic content. Anything else is up for debate (no matter what your SEO guru tells you…).

Thank goodness for Google! Even though our business as SEO consultants gets harder, at least Google are constantly mixing things up to enable those who specialise to be… well, specialists… Now, don’t get me wrong, I don’t completely get ‘gooey-eyed’ over everything about the big G, but if you like a boss (and let’s face it, they are the boss) who constantly challenges you, then they do this just fine.

Don’t Twitter and Facebook ‘nofollow’ their links? Yes, they do, but as Danny points out, both Google and Bing get a feed from both without the ‘nofollow’, so in theory they can pass PageRank (or whatever Bing call it).

Aren’t Twitter and Facebook links mainly done through a URL shortener? Yes, but do you really think that both Google and Bing can’t easily circumnavigate this?
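Expanding a shortened URL is, after all, just a matter of following the redirect chain to its end. A toy sketch (the mapping below is entirely made up; a search engine would do the equivalent with real HTTP requests):

```python
# Simulated URL-shortener expansion: follow a chain of redirects until a
# final URL is reached. The mapping is invented for illustration only.
def expand(url, redirects, max_hops=10):
    for _ in range(max_hops):
        if url not in redirects:
            return url
        url = redirects[url]
    raise RuntimeError("possible redirect loop")

chain = {
    "http://t.co/abc": "http://bit.ly/xyz",
    "http://bit.ly/xyz": "https://www.example.com/article",
}
print(expand("http://t.co/abc", chain))  # https://www.example.com/article
```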

Wait a minute, I just realised this article was turning into a mini Q&A with myself… just one more though…

What about Wikipedia links then? They are ‘nofollow’, but will they now help me rank better? Ah ha! Good question. Does the link pass PageRank? Probably not. Does it pass trust or TrustRank (can I trademark this?)? I would say yes. Put it this way: if you had a page that was hugely authoritative in its genre, was well linked to and had a bunch of axe-wielding (not literal, but you know what I mean) custodians who kept it pretty much spam free, wouldn’t you find a way of using this as a signal? Remember, it is your search engine and algorithm.

OK one last question then… Why do these links not pass PageRank then? Because Google has said that if you can’t vouch for a link or it is a paid link then use ‘nofollow’. That doesn’t mean they can’t find some other way to use it as a signal, does it?

I have not really ever written about Google Caffeine. This was deliberate, as I wanted to see how it panned out. And well… it seems to have done exactly what Google said it would, i.e. provide a better and more up-to-date way for them to index. This brings all sorts of advancements, but the main one is speed. Quicker indexing, quicker response, etc.

Did Google Caffeine change things for us greatly? No, not really, but the landscape has changed for Google to build upon, which is the part that does change things a great deal.

Tried Google Instant yet? Now that is a big change for all of us. Instant was in many ways made possible by the Caffeine update and it will greatly affect the way we access Google’s search results. Add this to the personalisation elements that have been happening for the last few years and the way we search and, most importantly, what we choose to access once we have searched has changed a great deal in this time frame.

Try Google Instant for yourself (currently you need to be signed in to your Google account). Did you type less? Did you type more? Did the top results that occurred half way through your query help you? Did it alter the way you typed the rest of the query? Will you turn Google Instant off? Did you use the suggested search less or more? What did you click on? Lots of questions and all of the answers can change each time we search.

Anything that changes the way we search will clearly affect those people who have a view on how best to affect the way Google orders their results. But that is it, we in the SEO world try to affect the way Google orders its results and not how people search. We can never affect that.

So for me Caffeine was and is always something that I had to watch and learn from, not worry about. The May Day update was the big news that happened around the time Caffeine was launched because that directly affected the way I need to think about SEO. As long as Google delivered Caffeine in the way it was meant to happen (with no mess ups that affected search results) then I was and still am prepared to watch how things develop. Sure, quicker results, quicker indexing, more ways to slice and dice the results (Instant), more pages indexed, but nothing that should affect the order of results.

Will Instant change what gets clicked? Yes. Can you affect it from an SEO perspective? No. Can I learn from it over time? Yes.

Oh, and by the way, I am only talking about SEO here. PPC? That is a totally different matter… Head term guys, watch your budgets and ROI! Remember, Google search is all about being the best and getting ads clicked. If you view every change through these eyes, life becomes far less complicated.

There are many (many…) sites on the web that still have a ‘nofollow’ based PageRank sculpting architecture. There is an element of ‘not broke, don’t fix’ about this, but it is worth bearing in mind that if you have this environment you are burning a great deal of PageRank that you could be channelling more wisely.
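For reference, a ‘sculpted’ link is simply one that carries the nofollow attribute in the page markup (a hypothetical login link you might not want to channel weight through):

```html
<!-- 'nofollow' asks search engines not to pass link weight through this link -->
<a href="/login" rel="nofollow">Log in</a>
```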

I am not going to try to cover this huge topic in just one post… What an uninspiring start, eh? Still, now that I have got your attention, let’s see where this leads…

Firstly, what is ‘quality’ in regard to website content? Wholly subjective, isn’t it? Just like the daily newspaper you buy, the ‘quality’ factor is whatever you judge it to be. It is far easier to judge ‘quality’ by standards: a newspaper that is poorly printed, has bad spelling, etc. would be judged as ‘bad quality’. Likewise, a website with copied, badly written, badly formatted content would be judged as ‘bad quality’. Your website’s ‘quality’ is judged by whoever eventually reads it and, crucially, whether they feel it has served them what they wanted or is interesting, inspirational, informative, etc.

Now, frequency and quality with regard to website content can get somewhat blurry when mixed. Is it better to churn out lots of, at best, mediocre content, or deliver something good and insightful whenever you feel it is appropriate? Even if you are passing on information, do you pass on everything, or only that which has real merit to your readership?

These frequency and quality questions mainly come down to what your website is for and what you are trying to achieve. Is it for your own interest? Are you trying to sell something? Are you delivering important information? Etc.

Simple stuff, so far? Well, yes until you bring search engines into the equation. Then these pretty basic assumptions change and break up into a fantastically silly guessing game. Does Google like lots of updates? Should I change my home page regularly? Can Google look at my content and see if it is rubbish? Does it care?

Now is a good time to bring in a recent video from Matt Cutts about this subject.

So, is it any clearer? Now, I don’t think for a moment that Google or Matt Cutts will ever be transparent enough to tell you the whole story. However, I also think that the steers they give us are never too far away from the direction we should be heading. The message above all else that has been communicated over the past couple of years from Google is that ‘producing great content will give you the best chance of getting good links’ (except they don’t always mention the ‘good links’ part). The rider is that the great content needs to be known about in the first place, which is something of a Catch-22.

Does Google know if your content is good? Well, no, not really. It knows if you are on topic, it knows if you have copied your content, it knows if it is link worthy, etc. But, unless they do a hand sort, it does not know if your content is good, and even then it won’t be subjective: it will only look for the ‘bad quality’ standards that I mentioned above, although in a search engine’s case they are looking for ‘bad quality’ that tries to cheat them or us. The algorithms will pick up most of the attempts to ‘cheat’ Google and a great deal of the semantic elements, but they will never pick up whether your post is fantastic. Then again, they don’t need to; the web will tell them if it is.

So will frequency help me rank well? Yes, it will, for all the reasons Matt says and many others. But will frequency on its own help me? Somewhat, but not in real terms, and certainly not without the other ‘trust’ and ‘popularity’ factors that Google puts above all others. More than anything, ‘frequency’, as long as it is aligned with appropriate diversity, will help your ‘long tail’ exposure. For ‘head terms’ there is a much bigger reliance on ‘quality’ mixed with ‘frequency’ to bring link weight to your site as a whole, which will then, in turn, help your site (and its targeted key phrases) rank better. Frequency without quality and diversity will not help you very much, and will also thin out and spread your PageRank/trust weight at the same time.

If you are looking for search engine spiders to visit your site more, then frequency does help, in the same way that individual page improvements help. But frequency will not help if Google is not really that interested in your site, and even though the ‘supplemental index’ has long been forgotten about, its principles still play a part in what Google will and will not index and how it indexes your content.

This post was really meant to look a bit harder at the ‘fresh content’ mantra of SEO, where some people have taken Google’s words and built their own theory. Personally, I agree with certain elements of the theory, but average-at-best content and average-at-best links will only get you so far, and there is still a lot of effort and money involved in taking this path.

So is content king? Not in my opinion, with regard to better search engine exposure. Google’s fundamental principle has never changed: links and citations are king. However, without quality, popular, authoritative or crucial content, links and citations will always be contrived. And in essence, that can only take you so far, and nowhere near far enough in a competitive search engine ranking environment.

Back in the day when Google updates used to cause panic and elation in equal measures, it was always pretty apparent what had happened. In some ways it is a shame that Google are now very much in the ‘law of diminishing returns’ phase where many of their algorithm changes go pretty much unnoticed. However, within the last month we have had the ‘May Day’ update which is a little bit more than one of the usual tweaks and is worth mentioning.

The phrase ‘long tail’ when applied to search engine listings describes the countless phrases that are used by us all that fit outside of the ‘head terms’. A ‘head term’ would be something that is used many, many times by lots of different searchers e.g. ‘pizza delivery’, ‘mortgage quote’, etc. The ‘long tail’ are the less used, but multiple search queries that often use more qualifying words e.g. ‘negative equity mortgage advice company’, ‘wheat free pizza bases delivery’, etc.

The (very) basic premise of the ‘long tail’ is that, roughly speaking, you will get most of your traffic (or sales if you run an ecommerce shop) from the ‘head terms’, but from a diversity perspective these head terms will be far fewer in number than the multitude of different ‘long tail’ queries. For different niches this weighting can be very different, and in fact the ‘long tail’ can be your most important traffic source and the one that leads to most sales.

So how does the ‘May Day’ update relate to the ‘long tail’? Well, Google have decided to tackle this type of query in more of an isolated way and try to more closely match the needs of the searcher in relation to the page(s) that are delivered.

Here is a video from Matt Cutts that talks about this change.

So how does this relate to your site? Well, the best way to evaluate your ‘long tail’ exposure is to run a search query report in your analytics programme for a relevant month and look at all the individual searches that bring traffic to your site that are relatively low in number (but many when all added together) and contain multiple words. Then you can run a report from around mid-May onwards and see if this has changed, either positively or negatively.
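The filtering step described above can be sketched in a few lines of Python. The query data and thresholds here are entirely made up for illustration; in practice you would export the rows from your analytics programme:

```python
# Sketch of the long-tail check: from (query, visits) rows, keep
# multi-word queries with low individual volume and total them up.
rows = [
    ("pizza delivery", 900),
    ("wheat free pizza base delivery london", 4),
    ("negative equity mortgage advice company", 3),
    ("mortgage quote", 650),
    ("how to get ketchup off a carpet", 6),
]

# Arbitrary thresholds: 4+ words and fewer than 20 visits each.
long_tail = [(q, v) for q, v in rows if len(q.split()) >= 4 and v < 20]
print(long_tail)
print("long-tail visits:", sum(v for _, v in long_tail))  # 13
```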

I am still evaluating what I think the triggers are for this change and how Google is making the judgment call on the relevancy and quality of the results it hopes to deliver for ‘long tail’ queries.

More on this in a future post, but in essence this could be a great directional change for Google. However, I am sensitive to those of you out there who have had a real and negative traffic hit from this change.

A nice new match type in Google AdWords called ‘Broad Match Modifier’ was launched in the UK this week.

Basically, it turns ‘broad match’ into something that is a lot more usable and less at the whim of what Google consider a ‘good’ match for your keyword.

With the new(ish) ‘search query’ reporting you can now get, broad match has become a little more usable, with the ability to modify (or turn off) the extent of your broad match keywords after perhaps using them during the initial stages of a campaign. This enables you to get ‘real’ keyword data with the ‘search query’ report and then use this to go after worthwhile AdGroups, etc.

However, some of the ‘broad’ match keyword grouping is somewhat dodgy, and even the most scientific use of the match type always left me feeling a bit like I was donating money, losing relevancy and shedding quality.

Well, the ‘Broad Match Modifier’ really does help in many respects… how does it work? Well, you need to put a ‘+’ sign before one of your broad match words to ‘modify’ it. By modifying it, Google means that it will treat that word somewhat less broadly than it used to. Here’s an example…
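As a made-up illustration of the syntax (how Google actually expands each match type is, of course, their call):

```
wedding cakes       broad match      - may match loosely related queries
+wedding cakes      modified         - "wedding" (or a close variant) must appear
+wedding +cakes     modified         - both words must appear, in any order
```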