The Longevity of Links for SEO

by Jon Cooper

Note: if you just want to skip to my conclusions, you can find them here.

Recently, I had a bit of a breakthrough.

As we’ve expanded the agency, I was finally able to use our internal resources to build out & rank our own projects. I’ve always had the mindset of “drinking our own Kool-Aid”, and as we’ve gone down this path, I recently stumbled into a rabbit hole that gave me a huge burst of excitement and raised my expectations for what we could do in the near future. But it came at a cost: paranoia.

Once the dust settled on the improvements we made, I took a major step back and realized that what we were building was more or less sitting on the fault line of a tectonic plate.

It could all come crashing down in an instant, all because of one critical assumption that I’ve made to date: that links will continue to matter.

I quickly realized that I needed to have a better gauge on the longevity of links beyond the tweets I happened to read that day. I’ve never had much cause for concern over the years regarding this issue (evidence of why is listed later), but if I was going to make a major bet over the next 12-24 months, I needed to know the parameters of what could go wrong, and this was one of the items at the top of the list.

I ended up talking things over with a few trusted colleagues of mine, as well as reaching out to a few other experts whose opinions I trust in regards to the future of SEO. So I wanted to share my thinking with you, along with the overall conclusions I’ve drawn based off the information available.

Separating Facts from Opinions

The main source of “facts” that the industry points to as a whole are statements from Google. Yet, there have been numerous instances where what Google is telling us is, at the very least, misleading.

Here are a few recent examples to illustrate in what way they are misleading:

1. In their “Not Provided” announcement post in October 2011, Google stated that “the change will affect only a minority of your traffic.” Not even two years later, Danny Sullivan was told by Google that they had begun work on encrypting ALL searches. The rest is history.

My thoughts: even when we get the truth from Google, it should be labeled with huge, red letters of the date the statement was made, because things can change very, very quickly. In this case, it was probably their intention all along to gradually roll this out to all searches, in order to not anger people too greatly all at once.

2. Google’s John Mueller made this statement a few weeks ago about 302 redirects passing PageRank. It implies that 302 redirects are OK for SEO. As Mike King quickly pointed out on Twitter, that’s very misleading based off most SEOs’ prior experiences.

My thoughts: is it difficult to believe that 302 redirects pass at least 0.01% of the PageRank of the page? I don’t think so. So really, this statement isn’t saying much. It’s a non-answer: it’s framed in comparison to a 404 (no PR passes) instead of a 301 (~90% of PR passes), the direct alternative in this case. It doesn’t answer anything practical.
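To make the framing concrete, here’s a toy sketch of why the comparison is a non-answer. The pass-through rates below are illustrative assumptions pulled from the discussion above (the ~90% figure for a 301, a token 0.01% for a 302), not confirmed Google numbers:

```python
# Toy model of how much PageRank a redirect might pass along.
# These rates are illustrative assumptions, NOT confirmed Google figures.
PASS_RATE = {
    "301": 0.90,    # permanent redirect: roughly a link's worth (~90% of PR)
    "302": 0.0001,  # "passes PageRank" could technically mean as little as 0.01%
    "404": 0.0,     # dead page: nothing is passed
}

def pagerank_passed(source_pr, status):
    """PageRank the redirect target would receive in this toy model."""
    return source_pr * PASS_RATE[status]

# Framed against a 404, the 302 technically "passes PageRank" (anything > 0).
# Framed against the 301, the practical alternative, it's a different story.
print(pagerank_passed(10.0, "302") > pagerank_passed(10.0, "404"))   # True
print(pagerank_passed(10.0, "302") >= pagerank_passed(10.0, "301"))  # False
```

Both statements are true at once, which is exactly why “302s pass PageRank” can be technically accurate and practically useless.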

Take those two examples & realize that things can change quickly, and that you should try to decipher what is actually, concretely being said.

So, with that in mind, here are some recent statements on the topic of this post:

1. March 24, 2016 – Google lists their top 3 ranking factors as: links, content and RankBrain (although they didn’t state the order of the first two; RankBrain is definitely 3rd, though).

My thoughts: this isn’t anything new. This list lines up with what they indicated in the RankBrain initial news article in Bloomberg when they stated RankBrain was #3. All that was left to speculate, until now, was what #1 and #2 were, although it wasn’t too difficult to guess.

2. Feb 2, 2015 – Google confirms that you don’t necessarily need links to rank. John Mueller cites the example of a friend of his whose local neighborhood website in Zurich was indexed, ranking, and getting search traffic.

My thoughts: this isn’t very surprising, for two reasons. First, the queries they’re ranking for are probably very low competition (because: local + international). Second, Google has gotten a lot better over the years at looking at other signals in areas where the link graph is lacking.

3. May 5, 2014 – Matt Cutts leads off a video with a disclaimer stating “I think backlinks have many, many years left in them”.

My thoughts: as much of an endorsement as that is, a haunting reminder of how quickly things change comes later in the same video, where Matt talks about authorship markup, a project that was eventually abandoned in the following years.

4. Feb 19, 2014 – Google’s Matt Cutts stated that they tried dropping links altogether from their ranking algorithm, and found it to be “much, much worse”.

My thoughts: interestingly enough, Yandex tried this starting in March 2014 for specific niches, and brought links back a year later after finding the experiment unsuccessful. Things change awfully quickly, but if there’s any evidence on this list that can add reassurance, the combination of two different search engines trying & failing at this is probably the best. With that said, our main concern isn’t the complete removal of links, but rather their absolute strength as a ranking factor. So, once again, it’s still not all that reassuring.

Opinions of Others

Let’s now transition to the opinions of others in the industry. It could be argued that these can be a much better gauge on the reality of SEO than whatever Google is telling us (and I’d agree!).

The most substantial opinion piece to start off with is Moz’s Bi-Annual Search Ranking Factors study. Half of the study is based around a survey that was given to 150 experts. In the survey, questions were asked about the most important ranking factors, both for today, and for the future. Here are the results of current ranking factors:

And here are the results for predictions of future algorithmic changes (only linked, not embedded, because it’s quite long). For these, note that zero of the “predicted to increase in impact” factors were link-based. Furthermore, the only two in the “predicted to decrease in impact” list were link-based.

As I mentioned earlier, I decided to touch base with a few specific people in the industry that I place a lot of trust in: AJ Kohn & Justin Briggs. Here’s what their thoughts were when asked about the future of links as a ranking factor:

Links are and will continue to be an important part of SEO for the foreseeable future because they remain a powerful way for Google to measure authority and expertise.

The link graph has been at the heart of Google’s search algorithm from the start. One of the more interesting videos Matt Cutts did related to separating popularity from authority. He makes the point that popular sites might include porn but people don’t often link to porn. On the other hand he says that many government websites aren’t very popular but they do attract a number of links.

In the same video, Cutts also discusses how the anchor text used in those links can help Google to better understand the topic for which it might rank. And there are numerous patents that delve into how much weight to give anchor text and how that might aid in establishing topical relevance.

Now, Google is getting better and better at understanding the meaning of content, but that doesn’t mean that links will suddenly lose value. They might matter slightly less but I generally see these improvements as being synergistic.

But let’s put all of this aside and look at the bigger picture and use some logic. Does Google still police paid links and other manipulative link schemes? Of course they do. And the only reason to do this is because links still matter.

Currently, and within the short-term, links are here to stay (at least in the traditional information retrieval of documents aspect of search, which is shrinking over time). An often undervalued aspect of links, in a very traditional PageRank sense, is that “link equity” is an input for URL discovery, crawl scheduling, crawl budgeting, crawl depth, and likely hundreds of other processes and checks. I see links as the first layer in rank determinations. The net effect is that their “slice of the pie” is getting smaller, but that’s not exactly what’s happening. Results may be put in order based on more traditional ranking processes, then search engines integrate usage data (CTR, bounce, bias), brand affinity, search sessions, query refinement, machine learning, localization, and personalization. The net outcome of these “re-sorts” is that the perceived weight of links goes down, but links are responsible for getting the URLs into the original consideration set for rankings.

The value of links in Universal Search has eroded, because search is about more than retrieving articles. Mobile, voice, entities, structured data, personal search, conversational search, predictive search, and apps have little dependency on links. Some of these technologies never refer to the link graph, with the caveat that many of these rely on the desktop index to run (or at least to “learn”).

When looking at SEO, I’m less concerned about the changing value of links and more focused on the declining importance of traditional, document-based search results in a company’s overall search strategy. However, we think of links in terms of digital PR and promotion. A marketing plan always has room for good promotion.

Will also wrote an article on this topic. In it, he talks about how RankBrain being added to the mix affects the future potential value of links for SEO. I’ll pull out the most relevant bit:

What this means in practice is that even after whatever change is made to dial up the dependence on RankBrain and dial down the dependence on the human-tweaked algorithm, I believe that we will continue to see link metrics be better correlated to rankings than any other metric we have access to.

In other words, RankBrain will be more important than all the individual signals in the human-tweaked algorithm (including links), but links will remain the dominant signal that RankBrain itself uses.

My Own Opinions

Let’s take a step back. Have links stopped being an indicator of the quality & relevance of a website? Has a link from TechCrunch or the National Institutes of Health stopped being relevant to assessing the legitimacy of a website? Has that changed?

I only see two main things that have changed in our understanding of links as a ranking signal:

That some links do a better job than others at indicating the quality & relevance of a website.

That there are things beyond links that can also indicate the quality & relevance of a given website.

Google has done a better job of understanding those two things since they first started. For the first item, that’s why you have Penguin. For the second item, that’s why you hear about things like unlinked brand mentions & social signals.

But the idea of links not being a signal at all in the future is beyond ludicrous.

It would be discounting the foundation of what the algorithm is built upon. And that’s not important because of historical significance, it’s important because it’s based off how the Web fundamentally works. Links are just connections between things, and some of those connections hold more importance than others. Throwing out links altogether as a ranking signal would be the equivalent of disregarding recommendations from people that you trust.

So really, the argument over link-based factors playing a role versus no role at all is a dumb one.

Note: so now that we’ve established this, when I talk about links in the context of the rest of this article, I will be talking about the links that Google WANTS to count, not all links on the Web.

If, so far, we’re on the same page, then the real question is how strong of a ranking factor links will be. There are two main things that will influence this.

1. The introduction of new factors.

2. The relative strength of each factor.

The first is simplest to explain, so let’s start there.

The Introduction of New Factors

As new ranking factors are added to the algorithm, dilution inevitably happens. There are only 100 percentage points that make up the entire decision-making process behind an algorithm. It’s a limited amount of space. So the introduction of something new, even if it’s tiny, inevitably takes space away from all the others.
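To make the arithmetic behind “dilution” concrete, here’s a minimal sketch. The factor names and weights are entirely made up for illustration; nobody outside Google knows the real ones:

```python
# Toy illustration of dilution: factor weights always sum to 100%,
# so carving out a share for a new factor shrinks every existing share.
# All names and numbers here are invented for illustration.

def add_factor(weights, name, share):
    """Give `name` a slice of the pie and scale the rest down to fit."""
    scale = 1.0 - share
    diluted = {k: v * scale for k, v in weights.items()}
    diluted[name] = share
    return diluted

old = {"links": 0.40, "content": 0.40, "other": 0.20}  # sums to 1.0
new = add_factor(old, "new_signal", 0.10)              # still sums to 1.0

# Even a tiny new factor takes space from everything else:
# links and content each drop from 0.40 to 0.36.
print({k: round(v, 2) for k, v in new.items()})
```

Nothing about links got worse in this sketch; their share shrank purely because the pie got re-divided, which is the sense of “dilution” meant here.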

And if the factor does its job and holds meaning, then that’s good. That means a smaller reliance on any one, single factor. That doesn’t mean just links. That also means things like content-based or user experience-based factors.

The concept of new factors being introduced into the algorithm represents an unknown. And I could never claim to have an accurate pulse on what new things Google might be introducing into their ranking algorithm.

The Relative Strength of Each Factor

Note: I will usually use the phrase “the concept of RankBrain” instead of simply “RankBrain”. This is because all I really know is that it uses machine learning, so I’ll describe it from the standpoint of what machine learning models do. I want to extinguish any confusion about me having any real idea of what RankBrain is & does, which I don’t, because not much is publicly known.

People are talking a lot about the concept of RankBrain, and for very good reason. It, without much doubt, dictates the future of the importance of individual ranking factors. But to illustrate why I think that is, I’ll back up a bit.

After reading all of the wild speculation about RankBrain, I noticed that a significant number of people still don’t know the basics of what machine learning (the technology RankBrain is said to be using) does. This is how Wikipedia describes it:

Machine learning explores the study and construction of algorithms that can learn from and make predictions on data.

In essence, machine learning is used to make predictions. It can’t help Google magically figure out what the best result is for a user conducting a specific search. That would involve definitively knowing exactly what “the best” is, which, outside of things like math equations or historical facts, is most likely impossible. So for now, Google can only guess, and get really, really good at guessing.

And that’s why we’re talking about predictions.

So let’s now focus on figuring out how to make an accurate prediction. Predictions are based off a set of factors. The “secret sauce” of machine learning is figuring out which factors are more important than others in determining what you’re predicting. As Will put it in his article, it won’t be humans doing this manually in the future, but rather “the machine tweaking the dials.”
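A toy regression shows what “the machine tweaking the dials” means in practice: given enough examples, a model can recover how much each factor contributes, without a human setting the weights. Everything here (the factor names, the data, the “true” weights) is synthetic and invented purely for illustration:

```python
# Minimal sketch of learning factor weights from data instead of
# setting them by hand. All data and weights below are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Each row: hypothetical [link_score, content_score, ux_score] for a page.
X = rng.random((200, 3))

# Pretend the "true" relevance of a page depends mostly on links.
true_weights = np.array([0.7, 0.2, 0.1])
y = X @ true_weights

# Least squares recovers the dial settings from the examples alone.
learned, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(learned, 2))  # recovers [0.7, 0.2, 0.1]
```

The point is that the learned weights come from the data, not from an engineer’s intuition, so the dial on any given factor can move in either direction.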

That explanation helps to explain why links and the concept of RankBrain are not at odds with each other. They’re apples and oranges. It’s like saying historical forecast data and a weatherman are at odds with each other over predicting the weather. One is information and the other is an interpreter. They are two separate types of things.

So hypothetically, in the case of links, two potential things could happen when machine learning gains more control over the ranking algorithm:

The dial is turned down on links, with the model finding that they’re not as good an indicator of the quality & relevance of a website as Google engineers had previously given them credit for.

The dial is turned up on links, with the model finding that they’re a better indicator of the quality & relevance of a website than Google engineers had previously given them credit for.

To date, I’ve never seen the second hypothetical situation talked about. And while its probability can be (justly) questioned, I think it’s an interesting scenario to discuss.

What if the hysteria around links in our industry has caused Google engineers over time to manually & mistakenly give them less weight than they deserve? What if their new machine learning models indicate that quality links (remember: Penguin is changing the game here) are actually a really good indicator, more so than previously given credit for?

I don’t have a clear idea on the likelihood of each of the two hypothetical scenarios listed above, but let’s be clear: links being dialed up as a ranking signal due to new machine learning models is as real of a potential outcome as links being dialed down as a ranking signal.

Now that we understand the potential outcomes that machine learning in a ranking algorithm can have from the standpoint of links, it’s now time to discuss the probability of each outcome happening. This is where the real discussion begins. There are two main ways that a dramatic change could happen in the given value of links as a ranking signal:

Google engineers, previous to machine learning, had done a poor job in determining the exact importance of links as a ranking signal.

Google engineers, previous to machine learning, had done a poor job in determining the exact importance of other ranking signals.

And as a result of either, when a highly accurate machine learning model is introduced, the correction is made. The limited space within the algorithm would be re-distributed.

So let’s now discuss each of the above two possibilities separately.

Links As A Ranking Factor

The first option sounds improbable.

Links are the oldest signal in their algorithm. These PhDs have had almost two decades to screw around with the dial. It’s very reasonable to think that a machine learning model is not going to significantly alter their importance as a ranking signal, as it would imply that these engineers had been horribly wrong after all this time, in one direction or the other.

But there is a very real scenario to consider. It involves Penguin.

We’ve all seen numerous examples of link spam, even in shockingly recent times. Those examples, coupled with the insane amount of time Google has taken to release the next Penguin update, show that they’re still scratching their heads & don’t quite have it all figured out.

But in the context of this investigation, its importance here is significant. It’s a wild card. I’ll explain why.

Let’s assume, just for a moment, that Google had been right to place its immense trust in links as a ranking signal. Let’s pretend that we could somehow divinely identify what really is the best indicator of a quality search result, and that links sit at the top of that list of indicators. This is important because, the more they use machine learning, the sooner they’re going to find this out.

So, if Google’s views on the importance of links as a predictor of a quality search result do not change, what will happen when they perfect the art of cutting through the noise & only identify and give weight to links that indicate a true endorsement of a website (a quality link)? If Google has given links as much weight as it has in the past, even when they didn’t fully understand which links were good & which ones weren’t, just how much further could the dial be turned up once they’re near-perfect at this?

The conclusion I’m trying to draw here is that, once again, there’s a very legitimate potential outcome that links could INCREASE in importance as a ranking factor as they continue to refine Penguin and their overall analysis of link-based factors.

I think that’s a profound realization, and yet, once again, it’s not even being discussed.

Personally, though, I don’t think that this will happen any time in the near future. Here’s why:

Overall, Google still seems to be far off from correctly classifying links as spam or not 10 times out of 10.

The last Penguin update has taken a while. This could be because they’re not happy with the results, or because they’re putting internal resources elsewhere. Neither is a good sign for links, although other reasons could exist.

There are many other new signals that haven’t been tested & used to the extent of links.

Other Ranking Factors

Now let’s discuss the second scenario. Unfortunately, it’s a much more complex discussion than the first because:

One signal is simpler to discuss than hundreds of other individual ones & all their various combinations.

We’ve gotten information about links publicly from Google. For a lot of other signals, we don’t know much.

As a historically important factor, there have been a lot of studies & opinion articles published about links in the marketing community.

So with that said, here are my main thoughts about this group as a whole.

1. Time. It’s on the side of a lot of new factors Google has been rolling into the algorithm over recent years, at least in comparison to links.

Machine learning aside, even though I’m guessing they’re much more efficient at tweaking these dials in 2016 than they were in 2006, they still haven’t had relatively much time to mess with the dials of each newer signal, as opposed to something like links.

Additionally, for a lot of newer signals, it’s doubtful that they’ve cut through all of the noise for each, in the same way that they’re trying to cut through the noise in regards to links via Penguin. I assume that’s what is holding back a lot of UX signals.

Note: for a more concrete set of timelines around specific factors, check out SEO By The Sea. Bill Slawski has done a great job surfacing Google patents (as they’re granted) that talk about some of these, and they all have a filing date, which is better than nothing.

2. Segmentation of ranking algorithms. The implication of an answer given by Google in an FAQ help doc about the mobile friendliness update is just one piece of evidence signaling a division in SEO, in which the concept of a singular ranking algorithm is dated.

Earlier examples of this concept are found with things like the Payday Loans updates, in which the organic results of certain industries were ranked differently than for other industries.

In most cases, especially with things like mobile, I fail to see much of an opportunity for links to be a beneficiary of these segmentations. I see it more as links being a “fall back” for when Google can’t use factors that do a really good job for specific segmentations of searches (i.e. UX factors for a search done on mobile, dwell times for an investigative search, etc.).

With that said, there are a number of very interesting problems that Google has here. A few of them are noted further down in this write up of a recent Googler’s presentation on search.

3. The increasing complexity of the algorithm. Inevitably, as more signals have been introduced, as the dials of each have been tweaked and re-tweaked hundreds of times, and as those dials are no longer universal but instead segmented for different types of searches, the complexity has grown.

From what’s been said publicly by Google about machine learning, the feeling I’ve gotten is that they’re working on it, but that we shouldn’t expect things to happen quickly, and my guess is that’s because of the level of complexity behind integrating this technology into all of the various parts of organic search.

Overall, it’ll be interesting to see just how quickly Google will move now & in the future as their algorithm becomes increasingly complex, especially when most of it seems to still be driven by humans, not machines.

My Conclusions

Because the above evidence listed in various places throughout this post is far from substantial, I’m only confident in my conclusions from the standpoint of where we are today, not 5 years from now.

Tomorrow is not a guarantee. As we’ve seen, Google can move very quickly. With that said, even if Google decided this very morning to move away from links as a significant factor, I highly doubt they could make a major change within a ~12-18 month timeframe, just because links are so foundational to their search engine.

The concept of RankBrain is not a major threat to links. I even think there’s a very real chance that it’s not even a minor one.

There doesn’t seem to be a golden knight ready to replace links. The most talked-about new set of factors is UX, and I have seen more than a few examples of specific UX signals being easily manipulated, even more so than links.

The real threat is more foundational than links. Justin Briggs explained it best in his response earlier. The aspect of ranking a page organically in Google’s results has slowly declined in value, both because of other SERP features & search ads. There’s still a ton of money to be made, but we should work like we’re living on borrowed time.

Things do change quickly. But for now, I won’t be hopping off the link bandwagon in the near future.

Loved this post. I also work at a link agency, so I’m admittedly biased, but this is about my favorite breakdown to date of why links matter and will (probably) continue to matter.

Glad to hear you’re not hopping off the link bandwagon for the foreseeable future.

I was especially intrigued with the idea that Google engineers had undervalued links as a signal, and that Penguin could actually lead to an increase in link value in the future.

The bit about segmentation and algorithm complexity was also fascinating, and could very well help highlight the issues Google might be facing with machine learning and Penguin.

Although, Google does have access to more data than just about anyone, and machine learning is really only effective with tremendous data sets. So, in theory, they should have more to power their machine learning than anyone else. I remember reading a theory that this was part of why Google was willing to release TensorFlow.

In a recent presentation at SMX West, Paul Haahr, a Google ranking engineer of 14 years, mentioned that it’s extremely rare for a search to not be part of some sort of live experiment. That gives a sense of the scope of how much they test, and how often. Basically always.

There’s no doubt Google is going to continue to work to change for the better. They’re powered by the smartest people, the best technology, and the largest data set.

The question then for me really boils down to something you said about halfway through:

“But the idea of links not being a signal at all in the future is beyond ludicrous.

It would be discounting the foundation of what the algorithm is built upon. And that’s not important because of historical significance, it’s important because it’s based off how the Web fundamentally works.”

Good links have a high barrier to entry. That’s part of why they’re a good signal. But what makes links really important is the same reason content is important online: they’re a fundamental core to the web. Links are why the web is called the web.

As long as links are the primary way people are connecting websites and navigating the web, then I think links will continue to matter in search algorithms.

Although of course Justin Briggs makes some compelling points as well.

Thanks again for the write up Jon. I’m sure it took quite a bit of time, thought, and care. Enjoyed the read.

Agree with Cory Collins on this one: “Links are why the web is called the web”

I often explain it to clients this way when they ask, “Will links keep their value in the future?”: linking documents to each other, beyond traditional rigid hierarchical folder structures, is the unique thing about the web. These new ways to uncover relationships between (parts of) documents are what made the world wide web.

To stop using an attribute that is so fundamental to how this web of information works would feel really strange to me. For me, the only imaginable reason to abandon the usage of links (as a signal of quality, authority, trust & relevancy) would be high levels of noise and spam. But it’s not like there’s more manipulation & spam surrounding links than there is around any other ranking signal Google uses.

It’s hard to imagine the web changing enough that links are no longer the way people navigate and interact across the Internet. Not saying it’s impossible, but it would basically require a complete overhaul of the current established system.

Fantastic post, Jon and thanks for sharing your findings and standpoint with the rest of us!

My thoughts are that since links require a webmaster or site owner to review your content and decide whether or not it’s worth linking to, links are the only *human* component of the algorithm, one which can’t and shouldn’t be replaced by machines.

An algorithm that was purely machine-based would be equivalent to creating software that reviews books and movies, which seems inconceivable. Ultimately, a human’s judgment and discernment is irreplaceable by bots.

With that said, I do believe that Google will get better and better at determining which links are actually earned based on the merit of the content and which links are just camouflaged in a 500 word SEO-optimized article.

Tomorrow is not a guarantee – yep! Even though links remain a very important signal for Google, we all know that there is intent from Google to correct things (as there has been before). Evolve or die, they say! It’s funny how some people claim that they don’t do link building anymore – but they are pushing content to blogs and influencers like everybody needs to see it or link to it.

People who have a very good foundation in link building can survive, while the others who have no clue what they are doing should stay away and find something new.

I do believe that it’s going to be difficult for Google to turn their backs on links as a quality indicator. But we all have to face it: we’ve got a target on our backs, and Google is doing everything to simply “provide the best search result” for their users.

As a fan of The Walking Dead, here’s a note from Season 6: “JSS” – just survive somehow!

Jon – outstanding as always. The thing I always feel needs to be driven home is that regardless of whether or not Google ever finds some other method (other than links) to rank results, links matter, MORE THAN EVER. Google is slowly but surely taking the organic results away from us, and that’s their right and so be it. People that have pursued links for Google rank have been missing the boat for a decade. You should be pursuing business development and physical world connections that make sense for your business, and (FOREHEAD SLAP), you end up creating the exact link profile Google wanted to see in the first place. Not to go all Zen, but the less you worry about your link profile for Google’s bots, and the more you work towards developing your business and links for people, the more likely you are to rank well in Google as a side-effect. You guys all know why I know this. It’s because I was doing the linking thing for 4 years before Google even existed. And holy shi*, when Google arrived, all those sites I worked on in those years before Google arrived ranked in Google. Because my approach was to pursue relationship and industry earned links for traffic, branding and reputation. I’ve watched this happen over and over and over. It will never stop working, because it’s…real.

I want to step back though & play devil’s advocate. More links than not that end up driving the rankings we’re gunning for are not going to be the kinds of links that you described. Some will definitely be, and we’ll plaster that all over the client report like we’re amazing & somehow still only human, but a lot of the other links in a normal campaign are not of this type. Those golden nuggets do exist, but a lot of good link opportunities that still drive rankings would be overlooked in pursuit of just them & them alone.

Who knows, maybe the more they push people away from SEO, the better the quality of SEOs that remain. Link building isn’t scummy if people are doing it for legitimate reasons, and people will run out of link scams. Even PBNs are forcing people to create better content, so Google is winning at the end of the day. Removing links from Google feels impossible. Maybe in 20 years, when they have enough voice search data to trump our concept of what “relevant links” means. The problem is that content is being published too frequently to crawl and index. Realistically, I can see the web moving towards a pay-to-play arena, where things like SSL certs become necessary for links to have merit. I have secret hopes that Apple and Microsoft will wind up building a file search system to compete, using human signals (how the modern person searches for their “files”) and then translating this into a search engine. There are lots of possibilities. Thanks for the inspiring post!

Thanks Justin! IMO you nailed a very important point. What I’m seeing, albeit not concretely, is that people are being pushed away from SEO faster than the opportunity to rank organically (as you described) is declining.

Hi Jon, I love the data-backed theories and planning your writing has been taking on lately. Thank you for composing this. Side note: I’m using Firefox (yes, I still use it), and when one tries to share your post with your G+ button on the right, the page scroll prevents access to the send/post button… it may be costing this great data some visibility, if you care.

We often talk about this signal or that signal. But when you have a machine learning system that can mix and match different signals, you can really cut down on the noise.
Since Google Analytics is installed on almost all websites, and the Google Chrome browser reports back to Google, they know how people arrive at your website. If they just kept track of how many people arrived by link, added the click data, and somehow applied that information to come up with a score, that would be very powerful in determining the authority of a website. And that’s only two pieces of data; they have many more. Considering it’s a machine learning system, which I believe runs throughout their search organization, they can really get a great understanding of the value of a website. If they only factor in link traffic, then they don’t care about link spam. It’s irrelevant.

Let’s never forget the smartest people in the world go to work for Google. Let’s never underestimate them. I think they’re using multiple signals to determine the quality of the website. But just these two would be immensely valuable.
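Editor’s note: Joe’s two-signal idea can be sketched in a few lines. Everything here is a hypothetical illustration; the function name, the weights, and the log scaling are assumptions for the sake of the example, not anything Google has disclosed.

```python
# Hypothetical sketch: blending link-referral volume and click
# engagement into a single "authority" score. Weights and log
# scaling are illustrative assumptions only.
import math

def authority_score(link_referrals, clicks,
                    weight_referrals=0.6, weight_clicks=0.4):
    """Combine two traffic signals into one score.

    link_referrals: visits that arrived via a link from another site
    clicks: engagement events recorded on the page
    """
    # Log scaling keeps one huge site from dwarfing everything else.
    r = math.log1p(link_referrals)
    c = math.log1p(clicks)
    return weight_referrals * r + weight_clicks * c

# A site earning real referral traffic outscores one with many
# links but little actual visitor behavior behind them.
print(authority_score(5000, 1200) > authority_score(50, 10))
```

The point of the sketch is Joe’s last observation: if the score is driven by observed traffic rather than the raw link graph, spam links that nobody clicks simply contribute nothing.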

The crazy thing is, this was a possibility years ago, and now, looking at today, Chrome is dominating the browser market, so they’ve got even MORE data to do the things you talked about. Your head is definitely in the same place as mine, Joe.

Just wait until we’re all using Fiber to connect our Chromebooks to the internet, browsing the web in Chrome on webpages with Google Analytics & Doubleclick tracking, then responding to emails in Gmail, attaching documents from Google Drive…

Great read. My opinion (which is probably not worth a whole lot) is that links will increase in value as the algorithm gets better at identifying what is spam and what is a genuine link, or at the very least not decrease in value.

To assume that links will lose their luster and/or be dropped altogether is to assume that we will have a perfect (or near-perfect) machine capable of identifying relevancy (and as advanced as Google is, they aren’t there yet). I just can’t see any machine being able to judge what is relevant to a human better than a human can, at least not for a very long while. Not to mention the mess of subjective opinions on relevancy, which brings a whole new level of complexity (what may be more relevant to you may be less relevant to me for the same search). The only constant variable I know of that can determine relevancy at this large a scale is the “voting system” that is links. As Google becomes better at sifting out the differences in counting those votes (votes that should be counted, votes that shouldn’t, votes that are worth more than others, social signals, etc.), the more relevance I believe we will see. Linking to someone is just a natural part of participating online, especially with social (which Google is still far from counting accurately, but that is a different animal altogether).

I may be completely off and wrong, but to me this seems a far easier and more achievable endeavor than creating a machine that can perfectly predict relevancy based on the person behind the search query (human-like AI). Links aren’t perfect, but coupled with on-page factors, they are the best alternative at the moment.
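Editor’s note: the “voting system” described above is essentially what PageRank formalized. A minimal power-iteration sketch over a toy link graph (the graph, the iteration count, and the classic 0.85 damping factor are illustrative assumptions):

```python
# Minimal PageRank power iteration over a toy link graph, to
# illustrate the "links as votes" idea. The graph and the 0.85
# damping factor are illustrative textbook values.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its vote evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "a": ["b", "c"],  # a "votes" for b and c
    "b": ["c"],
    "c": ["a"],
}
scores = pagerank(graph)
# c collects votes from both a and b, so it ends up ranked highest
print(max(scores, key=scores.get))
```

Each page’s vote is split among its outgoing links and weighted by the voter’s own rank, which is exactly the “votes that are worth more than others” refinement the comment describes.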

Hi Jon, again a solid post about the future of SEO. I like the way you proceeded with your study about links.

What do you think about the metrics Google has on user behaviour? Won’t those be good data to feed RankBrain? Machines are much better at processing such data, in my opinion, so they might be able to make better use of it.

But if we look at the big picture, I think Google’s goal is clear: offer the best result for each keyword search for every user. Every time the algorithm improves, it obviously gets harder to influence the search results with SEO or any other technique.

Google engineers, previous to machine learning, had done a poor job of determining the exact importance of links as a ranking signal, and the same goes for the other ranking signals.

It’s important to remember that in order to get a link from a reputable site, a webmaster has to read your content and deem it worthy of linking to. “Experts” have always trumpeted the end of links to be controversial and grab some attention, but I agree that links remain one of the most important factors in SEO.