The Evolution of Links

To provide relevance to users, it is essential for search engine algorithms to take into account the landscape of the web in order to reflect user behaviour and the wider patterns of what is relevant to people at a particular time. When Google’s PageRank algorithm was first created, the simple fact was that links were not easily given. There were far fewer websites online, and the barrier to publishing was far higher.

SEOs, drunk on the knowledge that links meant ranks, swamped the small number of link building opportunities, and an economy of link buying was created.

Submitting a site to thousands of directories, adding your link to thousands of guest books, swapping reciprocal links with other webmasters, and even buying links was essential SEO. It didn’t matter where your link was as long as it was there, and it didn’t matter what other pages were being linked to, as long as yours was.

Of course, so many links were created that it became important to determine what was a good link and what was a bad link, and once again, Google helped out webmasters. Good links had a lot of PageRank. Good links were on a .edu domain. Good links were one way.

Once again, SEOs went wild for the good stuff, and found companies that would act as brokers. No need to seek out a link, just pay for it. Think about all the juicy number 1 positions that you can get with links. Fed up with links, why not get some blog posts too? Or paid directories, or banners, or blog rolls. Or automated link placement services that would provide you with millions of links without you having to lift a finger. Good times.

Google put out guidelines and created right and wrong, and by and large, the SEO community fell into two distinct camps: those who conceded that their link building activity was against the rules but did it anyway, and those who genuinely believed that their techniques were perfectly within the rules, and did it anyway. Google banned a few websites for breaking their rules, and the SEOs who had believed that they were OK realised that they weren’t.

And the war continued. SEOs found link building techniques that worked, for a while, and then the rules changed, because everyone jumped on board with whatever worked, levelled the playing field, and wiped out any advantage.

So where are we now?

Well, at the risk of queering the pitch, and getting everyone on board the techniques that work now, we’re at a stage where Google can recognise a lot about the demographics of the people who use each website, and can therefore build a much better understanding of who a site’s audience actually is.

I used to talk about how links were like votes, and that you needed to make sure that they were from the right constituency. That used to mean that the sites needed to be on the same theme. Now it means that they need to be used by the same people too.
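The voting analogy is literal in the original PageRank formulation: each link is a vote, and votes cast by pages that themselves hold more rank carry more weight. A minimal sketch of that idea, using a hypothetical three-page toy graph (not Google’s actual implementation, which has many more refinements):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the pages it links to.

    Each iteration, every page splits a damped share of its current
    rank evenly among the pages it links to -- its "votes".
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page starts with a small baseline rank...
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        # ...and collects weighted votes from the pages linking to it.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy web: A and B both vote for C, C votes back for A.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # C collects the most votes
```

Note that C ends up ranked above A even though both receive links, because C is the target of two votes while A receives only one, and B, which nobody links to, stays near the baseline. That asymmetry is why links were worth buying in the first place.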

By using tools like Alexa, you can look at the audience of your website. Mine looks like this:

Alexa Demographic Data for this site

So, my site is viewed by an audience that is largely aged 25-34, is well educated, includes a higher proportion of women than the web average, and has children. Also, compared to the rest of the internet, more people look at this site from work than from home.

If I had a bigger sample audience, I’d probably have more data about it. The chances are that a client’s website has that data.

You can – and should – make use of this data to determine which sites you should request links from. If a website is relevant to your users, and is also a place where your users are likely to be, then it makes sense for that website to have a link to you.

That is the kind of link that makes a difference. Stop worrying so much about links from Tweets, or Facebook likes, and start thinking about social media in terms of demographic relevance across websites. Google don’t need a social network of their own to incorporate information about users into their algorithm – they already know enough about what you like.