In the Google Inside Search blog, Google's Amit Singhal published a post titled "Search quality highlights: 40 changes for February" that told us about many changes to how Google ranks pages, including the following: "Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page."


Of the 12 link analysis methods that I pointed to in that post, my working candidate at this point (as the method that Google stopped using) is the local inter-connectivity approach. But we don't have access to the data and the testing that Google does, so we can't know whether the authority/hub type approach it uses actually improves the quality of search results, or whether the re-ranking performed by an approach like phrase-based indexing provides more value.

By "local", do you mean linking from and to sites in your geographical area? If so, that could also be extended to international SEO, where "you must receive links mostly, if not only, from sites in the targeted country" is something of a mantra.

I like that Bill actually presented several less talked about methods. We all quickly guessed what the method might be based upon the few link analysis methods we know off the top of our heads - anchor text, nofollow, etc. But that's like a hammer looking for a nail. When you actually see a list of the possibilities, it does make you stop and reflect a bit more. Thanks Bill!

Hi Gianluca,
Local as in the top n (10, 100, 1000) documents that rank for a specific query term. Local inter-connectivity is a concept similar to the HITS algorithm in that it looks at how documents within that local subset of pages might be connected based upon links. So, for instance, a page that's linked to by lots of other pages within that local subset might be boosted in search results.
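To make that concrete, here's a rough sketch in Python of how that kind of local re-ranking might work. The scores, link data, and the blend weight are all made up for illustration; this is my simplified reading of the idea, not the actual patented method:

```python
def rerank_local_interconnectivity(results, links, weight=0.5):
    """Re-rank a local subset of results by inter-connectivity.

    results: list of (url, base_score) pairs from the initial ranking
             (the "local" top-n subset for a query).
    links:   dict mapping a url to the set of urls it links to.
    weight:  hypothetical blend factor for the link-based boost.

    Each page gets boosted by how many *other pages within the subset*
    link to it, normalized against the most-linked-to page.
    """
    subset = {url for url, _ in results}
    inbound = {url: 0 for url in subset}
    for source, targets in links.items():
        if source in subset:
            for target in targets:
                if target in subset and target != source:
                    inbound[target] += 1
    max_in = max(inbound.values()) or 1  # avoid division by zero
    reranked = [
        (url, base + weight * inbound[url] / max_in)
        for url, base in results
    ]
    return sorted(reranked, key=lambda pair: pair[1], reverse=True)
```

So a page that started third in the base ranking, but is linked to by the other top-ranked pages, can move up above pages with no inbound links from the subset. Links from pages outside the subset are ignored entirely, which is what makes the analysis "local".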
The local inter-connectivity patent was granted in 2003, and the book In The Plex notes that the HITS-like technology that Krishna Bharat (the inventor behind the patent) developed was implemented into Google's ranking system in 2003. It's the kind of shift that could potentially account for something like the Florida Update that happened in 2003.
Google's phrase-based indexing also does a similar kind of re-ranking, but based upon co-occurring terms rather than links, and it may do a better job of accounting for terms or phrases that have different meanings by using a clustering approach. For example, a query for [jaguars] might have multiple clusters of co-occurring terms: jaguars as animals, the Jaguars as NFL football players, and Jaguar as an Apple software update. Pages can get re-ranked not only based upon their use of co-occurring terms, but also to provide a diverse set of search results.
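Here's a toy Python sketch of those two steps for the [jaguars] example: assigning each page to the co-occurrence cluster it best matches, then interleaving results across clusters for diversity. The clusters, term sets, and pages are invented, and this is a heavy simplification of the phrase-based indexing patents:

```python
from collections import defaultdict, deque

# Hypothetical clusters of co-occurring terms for the query [jaguars]
CLUSTERS = {
    "animal":   {"cat", "habitat", "prey", "rainforest"},
    "football": {"nfl", "quarterback", "season", "jacksonville"},
    "software": {"apple", "mac", "os", "update"},
}

def best_cluster(page_terms):
    """Assign a page to the cluster whose co-occurring terms it shares most."""
    return max(CLUSTERS, key=lambda c: len(CLUSTERS[c] & page_terms))

def diversify(pages):
    """Interleave an already-ranked result list across clusters.

    pages: list of (url, set_of_page_terms), in initial ranked order.
    Returns urls re-ordered round-robin across clusters, so the top of
    the results covers the different meanings of the query.
    """
    buckets = defaultdict(deque)
    for url, terms in pages:
        buckets[best_cluster(terms)].append(url)
    order, result = list(buckets), []
    while any(buckets.values()):
        for cluster in order:
            if buckets[cluster]:
                result.append(buckets[cluster].popleft())
    return result
```

With a result list dominated by animal pages, the round-robin pass pulls the best football and software pages up into the top results instead of showing three animal pages in a row.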
Would re-ranking based upon both phrase-based-indexing and local inter-connectivity make things better or worse?

Well, Adobe is still ranking for the phrase "click here", which I guess is a sign that anchor text still carries its own value. And if nofollow is dropped or changed, I guess it will shake up a bigger part of the SERPs.