
Websites that have earned trusted status are often treated differently from those that have not.

SEOs have commented on the double standard that exists for judging big-brand, high-importance sites compared to newer, independent sites. For the search engines, trust most likely has to do with the links your domain has earned. If you publish low-quality, duplicate content on your personal blog, then buy several links from spammy directories, you’re likely to encounter considerable ranking problems. However, if you posted that same content on Wikipedia, even with the same spammy links pointing to the URL, it would likely still rank tremendously well. Such is the power of domain trust and authority.

Trust can also be established through inbound links. A little duplicate content and a few suspicious links are far more likely to be overlooked if your site has earned hundreds of links from high-quality, editorial sources like CNN.com or Cornell.edu.

Google Site Query

Restrict your search to a specific site (e.g., site:webcraft.ws). This is useful for seeing the number and list of pages indexed on a particular domain. You can refine the query by adding additional operators. For example, site:moz.com/blog inurl:tools will show only those pages in Google’s index that are in the blog and contain the word “tools” in the URL. The reported count will fluctuate, but it’s a decent rough measurement.
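Because these operators are just text in the search query, it is easy to script them when you want to spot-check many sites. A minimal sketch, assuming the standard Google search endpoint and its q parameter (the helper name and the moz.com example are illustrative):

```python
from urllib.parse import urlencode

def google_query_url(*operators: str) -> str:
    """Build a Google search URL combining advanced query operators.

    Operators such as site: and inurl: are joined with spaces, exactly
    as you would type them into the search box.
    """
    query = " ".join(operators)
    return "https://www.google.com/search?" + urlencode({"q": query})

# Restrict results to the blog section, to URLs containing "tools".
url = google_query_url("site:moz.com/blog", "inurl:tools")
# url == "https://www.google.com/search?q=site%3Amoz.com%2Fblog+inurl%3Atools"
```

Note that urlencode handles the escaping of colons, slashes, and spaces, so the operators can be written exactly as you would type them.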

Google Trends

At google.com/trends, you can research keyword search volume and popularity over time. Log in to your Google account to get richer data, including specific numbers instead of simple trend lines.

Blog Search Link Query

Search for links to a site within blogs (e.g., link:webcraft.ws/blog). Google’s regular link query operator is not always useful, but its blog search generally yields high-quality results, sortable by date range and relevance.

For Search Engine Success

“Develop great content” may be the most oft-repeated suggestion in the SEO world. Despite its clichéd status, though, this is sound advice. Appealing, useful content is crucial to search engine optimization. Every search performed at the engines comes with an intent—to find, learn, solve, buy, fix, treat, or understand. Search engines place web pages in their results in order to satisfy that intent in the best possible way. Crafting fulfilling, thorough content that addresses searchers’ needs improves your chances of earning top rankings.

As mentioned, original content goes a long way with Google and your visitors. Copying other people’s content can result in a penalty from Google, which can crush your bottom line.

Want proof?

Remember when you used to find EzineArticles pieces in top Google rankings? You don’t see them anymore, and that’s no accident. EzineArticles was one of the sites hit hardest by Google’s algorithm update aimed at preventing low-quality content from ranking highly.

Mahalo was a content farm that published new content every day, but it wasn’t original content. Google penalized the site for it, forcing Mahalo to pivot its business.

But let’s take this a little further. “Original” also means originality. Your ideas should be original! Rehashing the same concepts or other people’s posts over and over again is not original. If your content is played out, no one will link to it, and that defeats the purpose of writing content in the first place.

Here’s the train of thought that gets many website owners into trouble:

“So it says here that we need to create a lot of content…OK…well how can we do this as easily and cheaply as possible?”

“Can we make a bot to scrape content and re-combine it into some form of gibberish that at least the search engines will read?”

Link Signals Used by Search Engines

How do search engines assign value to links? To answer this, we need to explore the individual elements of a link, and look at how the search engines assess these elements. We don’t fully understand the proprietary metrics that search engines use, but through analysis of patent applications, years of experience, and hands-on testing, we can draw some intelligent assumptions that hold up in the real world. Below is a list of notable factors worthy of consideration. These signals, and many more, are considered by professional SEOs when measuring link value and a site’s link profile. You may also enjoy further reading on the Moz Blog about search engine valuation of links.

Global Popularity

The more popular and important a site is, the more links from that site matter. A site like Wikipedia has thousands of diverse sites linking to it, which means it’s probably a popular and important site. To earn trust and authority with the engines, you’ll need the help of other link partners. The more popular, the better.
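The engines’ actual popularity metrics are proprietary, but the published PageRank algorithm captures the basic idea: a page linked to by many pages, and by important pages, accumulates more score. A toy power-iteration sketch (the graph, damping value, and site names are illustrative):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank over a dict mapping page -> outlinks."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page starts each round with the "random jump" base share.
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:  # otherwise, split its rank among the pages it links to
                for out in outlinks:
                    new[out] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# The hub that everyone links to accumulates the most rank.
graph = {"wiki": ["blog"], "blog": ["wiki"], "news": ["wiki"], "shop": ["wiki"]}
ranks = pagerank(graph)
```

The design point to notice is that rank flows *through* links: "wiki" scores highest not because it has the most outlinks, but because the most pages point at it.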

Local/Topic-Specific Popularity

The concept of “local” popularity, first pioneered by the Teoma search engine, suggests that links from sites within a topic-specific community matter more than links from general or off-topic sites. For example, if your website sells dog houses, a link from the Society of Dog Breeders matters much more than one from a site about roller skating.

Anchor Text

One of the strongest signals the engines use in rankings is anchor text. If dozens of links point to a page with the right keywords, that page has a very good probability of ranking well for the targeted phrase in that anchor text. You can see examples of this in action with searches like “click here,” where many results rank solely due to the anchor text of inbound links.

TrustRank

It’s no surprise that the Internet contains massive amounts of spam. Some estimate as much as 60% of the web’s pages are spam. In order to weed out this irrelevant content, search engines use systems for measuring trust, many of which are based on the link graph. Earning links from highly-trusted domains can result in a significant boost to this scoring metric. Universities, government websites and non-profit organizations represent examples of high-trust domains.
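The engines’ trust systems are proprietary, but the published TrustRank paper conveys the flavor: start from a small, hand-picked seed set of trusted pages and let trust flow outward along links, decaying with distance. A simplified sketch under those assumptions (the domains and damping value are illustrative):

```python
def trustrank(links, seeds, damping=0.85, iterations=50):
    """Simplified TrustRank sketch: like PageRank, except the random
    jump lands only on hand-picked trusted seed pages, so trust flows
    outward from the seeds through the link graph."""
    pages = list(links)
    seed_mass = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(seed_mass)
    for _ in range(iterations):
        # Base trust comes only from the seed set, not from all pages.
        new = {p: (1.0 - damping) * seed_mass[p] for p in pages}
        for page, outlinks in links.items():
            for out in outlinks:
                new[out] += damping * trust[page] / len(outlinks)
        trust = new
    return trust

graph = {
    "university.edu": ["library.org", "myblog.com"],
    "library.org": ["myblog.com"],
    "myblog.com": ["spamsite.biz"],
    "spamsite.biz": [],
}
scores = trustrank(graph, seeds={"university.edu"})
# The hand-picked seed retains the highest trust score.
```

The key difference from plain PageRank is the seed-only jump: a page with no link path from a trusted seed can never accumulate trust, no matter how many spam pages link to it.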

Link Neighborhood

Spam links often go both ways. A website that links to spam is likely spam itself, and in turn often has many spam sites linking back to it. By looking at these links in the aggregate, search engines can understand the “link neighborhood” in which your website exists. Thus, it’s wise to choose those sites you link to carefully and be equally selective with the sites you attempt to earn links from.
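As a rough illustration of the aggregate idea (not any engine’s actual method), a hypothetical audit could measure what fraction of a page’s outbound links point to domains already flagged as spam:

```python
def spam_link_ratio(outbound_domains, known_spam_domains):
    """Fraction of outbound links pointing at flagged spam domains.

    A high ratio suggests the page sits in a bad link neighborhood.
    The domain lists here are hypothetical illustrations.
    """
    if not outbound_domains:
        return 0.0
    flagged = sum(1 for d in outbound_domains if d in known_spam_domains)
    return flagged / len(outbound_domains)

ratio = spam_link_ratio(
    ["cnn.com", "cheap-pills.biz", "win-prizes.info", "cornell.edu"],
    known_spam_domains={"cheap-pills.biz", "win-prizes.info"},
)
# ratio == 0.5: half of this page's outbound links point into spam territory.
```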

Freshness

Link signals tend to decay over time. Sites that were once popular often go stale, and eventually fail to earn new links. Thus, it’s important to continue earning additional links over time. Search engines use the freshness signals of links, commonly referred to as “FreshRank,” to judge current popularity and relevance.
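No one outside the engines knows the exact decay function, but the idea is easy to sketch: weight each inbound link by its age, so older links contribute less. A hypothetical exponential half-life model (the half-life value is an assumption for illustration):

```python
def fresh_link_score(link_ages_days, half_life_days=365.0):
    """Sum of inbound-link weights, where each link's weight halves
    every half_life_days. Purely illustrative: the real decay
    functions search engines use are not public."""
    return sum(0.5 ** (age / half_life_days) for age in link_ages_days)

# A brand-new link counts fully; a year-old link counts half as much;
# a two-year-old link counts a quarter as much.
score = fresh_link_score([0, 365, 730])
# score == 1.75  (1.0 + 0.5 + 0.25)
```

Under a model like this, a site that stops earning links watches its score drain away, which is exactly why continued link earning matters.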

Social Sharing

The last few years have seen an explosion in the amount of content shared through social services such as Facebook, Twitter, and Google+. Although search engines treat socially shared links differently than other types of links, they notice them nonetheless. There is much debate among search professionals as to how exactly search engines factor social link signals into their algorithms, but there is no denying the rising importance of social channels.