Authority Concepts

SEO Fundamentals:

Understanding Page Rank Authority

Every webpage is assigned some Page Rank according to the original formula devised by Google co-founder Larry Page.

Page Rank flows from page to page through hyperlinks, so a page accumulates authority based on the links it receives from internal and external sources.

This is one of over 200 ranking factors that help create the organic web index. Google turned links into votes for a website's authority, and this became one of the key ranking signals. Industry tests suggest there are variations of this core concept, such as domain authority and page authority, used to understand context. Because a larger brand's content is sometimes treated as more authoritative, measuring authority with external metrics has become a common gauge used in SEO to compare websites.

How Page Rank Works

Page Rank is an ‘Iterative Algorithm’, so it doesn’t stay static; it’s passed back and forth depending on the links on each page (a little like water being carried from page to page, site to site).

The Page Rank formula has been documented as follows:

PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

Here PR(A) is the Page Rank of page A, T1…Tn are the pages that link to A, C(T) is the number of outbound links on page T, and d is a damping factor, originally set to 0.85.
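The iterative nature of the formula can be sketched in a few lines of Python. The link graph, damping factor and iteration count below are illustrative assumptions for a toy three-page web, not values Google uses:

```python
def pagerank(links, d=0.85, iterations=20):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    pr = {page: 1.0 for page in pages}  # start every page at PR = 1
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum PR(T)/C(T) over every page T that links to this page,
            # where C(T) is the number of outbound links on T.
            inbound = sum(pr[t] / len(links[t]) for t in pages if page in links[t])
            new_pr[page] = (1 - d) + d * inbound
        pr = new_pr
    return pr

# Toy example: A and B link to each other and both link to C.
graph = {"A": ["B", "C"], "B": ["A", "C"], "C": []}
ranks = pagerank(graph)
```

Running this, page C ends up with the highest score because it receives links from both other pages while splitting none of its own equity, which is exactly the 'vote' behaviour described above.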

Page Rank changes over time and is recalculated as updates roll out and back-link profiles change. It has become a little less important as other ranking signals have been developed to rank websites as part of the modern search algorithm.

Google historically used Page Rank as a crawling signal on a logarithmic 0–10 scale, assigning each page a score. This scale provided an indication of quality (per page) and an indication of how often the site should be visited by Google’s search bots, which update its index of the web.

Page Rank can be moved around a website to strengthen particular pages, either with the ‘no follow’ link attribute (introduced in 2005) or simply by removing links from a page. This is known as ‘Page Rank sculpting’. A ‘no follow’ link doesn’t pass Page Rank; it is ignored, keeping the link-juice with the remaining links on the page.
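The sculpting idea follows directly from the C(T) term in the formula: the equity passed through each followed link is PR(T)/C(T), so fewer links means more equity per link. A small sketch with made-up numbers:

```python
def equity_per_link(page_rank, outbound_links):
    """Page Rank passed through each followed link on a page: PR(T)/C(T)."""
    return page_rank / len(outbound_links)

# Illustrative only: a page holding PR 4.0 with four followed links
# passes 1.0 through each; trim it to two links and each passes 2.0.
before = equity_per_link(4.0, ["a", "b", "c", "d"])
after = equity_per_link(4.0, ["a", "b"])
```

This is why removing (or historically, no-following) low-value links concentrated the flow onto the pages that mattered.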

The information architecture and hierarchy created as part of the UX or UI can have a significant effect on this at the design level. By shaping the flow of link-juice in the design, we’re able to create a taxonomy that supports the information hierarchy of a website’s structure.

Another way we achieve this is by interlinking content or navigation through the anchor text of a hyperlink.

In recent years we’ve seen changes to how the authority of internal and external links affects a webpage’s positioning, due to relevance within the results page. This has led many SEO practitioners to over-optimise the anchor text of hyperlinks.

Anchor text from links within the back-link profile helps to determine the topical relevance and authority of each website. Creating targeted links based on content has always been an important factor for on-site and off-site SEO.

Many search practitioners have overused this method, targeting the anchor text of external links too heavily to improve keyword-related rankings. Google responded in 2012 with a link-spam penalty, also known as ‘Google Penguin’, which analyses the anchor-text density of a site’s links to judge their quality.
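The kind of anchor-text density check described above can be sketched as a simple frequency count. The backlink profile and the 80% threshold here are hypothetical, chosen only to illustrate why an exact-match-heavy profile looks unnatural:

```python
from collections import Counter

def anchor_text_density(anchors):
    """Return each anchor text's share of the backlink profile."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: count / total for text, count in counts.items()}

# Hypothetical profile: the exact-match phrase dominates the brand
# and URL anchors that a natural link profile would mostly contain.
profile = ["cheap widgets"] * 8 + ["Example Ltd", "https://example.com"]
density = anchor_text_density(profile)
```

Here the exact-match phrase accounts for 80% of the profile; a natural profile is dominated by brand-name and bare-URL anchors instead.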

When we consider the nature of a website’s authority, we must think of trust. But how do you model trust within an algorithm that can return a result in 0.3 seconds?

Partly through the quality of the links and the persona adopted as part of a link-building strategy.

There will always be a need to create accessible, quality content as the foundation of SEO, but quality online is subjective when we really try to define what quality content means.

Accessibility can be defined by the areas we as search professionals can control, such as on-site, page-level SEO, i.e.

Title Tags

Meta Descriptions

Keywords

URL Structure

Internal Linking

Valid XHTML

Server-side

Sitemaps

Unique content

Google is still only able to fully index and understand XHTML 1.0, as it cannot fully index technologies such as Flash, Java, or Ajax.