9 Reasons Your Site Isn't Ranking In Search Results

Jayson DeMers, Contributor
I demystify SEO and online marketing for business owners.

Search rankings are tricky. Your site can rank very well one day, sending you reams of highly qualified traffic. Then, just when you think you’ve cracked the Google code, your rankings plummet.

Or maybe your site has never ranked well in the search engines. Maybe you’ve optimized your site as best you know how, but you’re still not ranking for your desired keywords, or for any keywords at all.

This article covers nine of the most likely reasons your site isn’t ranking in search results. Whether you’ve experienced a sudden drop in traffic or simply never managed to get your site ranking in the first place, it will help you diagnose the problem.

1. Your site is new

I can’t tell you how many times someone has said to me: “My site has been up for a month, and it’s still not ranking in the search engines!” Getting ranked for specific keywords often takes months. After two to three weeks, your site should be indexed, meaning it appears in Google’s search results, but there’s no guarantee it will rank highly within them.

To check whether your site has been indexed, go to Google and type in site:yoursite.com to see all the indexed pages on your domain. If no results are returned, your site hasn’t yet been indexed. You can also submit your site to Google manually just to be sure, but this is generally unnecessary: Google will find your site on its own as long as it is linked to elsewhere on the Web, whether in articles on external publications, local directory listings, or even tweets that include a link to your website.
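If you’d rather script this check than run it by hand, here’s a minimal sketch using Python and Google’s Custom Search JSON API. It assumes you’ve already created a Custom Search Engine that searches the whole web and have an API key and engine ID (both shown as placeholders below):

import requests

API_KEY = "YOUR_API_KEY"      # placeholder: from the Google Cloud console
ENGINE_ID = "YOUR_ENGINE_ID"  # placeholder: the "cx" value of your Custom Search Engine

def count_indexed_pages(domain):
    # Run a site: query through the Custom Search JSON API and return
    # Google's estimated number of results for that domain.
    response = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": ENGINE_ID, "q": "site:" + domain},
        timeout=10,
    )
    response.raise_for_status()
    data = response.json()
    return int(data.get("searchInformation", {}).get("totalResults", 0))

pages = count_indexed_pages("yoursite.com")
print("Indexed pages found:", pages)
if pages == 0:
    print("The site doesn't appear in the index yet.")

Keep in mind this is just a convenience; typing site:yoursite.com into Google tells you the same thing.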

2. You’re targeting high-volume, short-tail keywords

Just 5 years ago, targeting short, high-volume keywords was the norm. Webmasters would create content based on these desired words or phrases, build some keyword-rich links back to their website, and watch their site climb the rankings for those keywords.

However, over the past few years, this strategy has become ineffective due to a combination of factors. Google released its Penguin algorithm in April 2012, which specifically targeted and penalized websites with too many keyword-rich inbound links, since such links are almost certainly acquired unnaturally, a violation of Google’s webmaster guidelines. Additionally, with so many sites vying for these high-volume, short-tail keywords, your chances of ranking anywhere close to the top are practically zero.

And to be honest, ranking for these types of keywords has become far less desirable. Short keywords drive general traffic, whereas longer, more specific keywords drive more focused, targeted traffic. Ranking for these “long-tail” keyword phrases is not only easier; it often results in much higher conversion rates. For more on this, see “The Rise of the Long-Tail Keyword for SEO.”

3. Your content sucks

“Thin” content is content that adds little value for readers. It’s generally published by webmasters who have read or heard that publishing content to their website is important, so they publish content for the sake of publishing content, with little regard for its quality or value.

There are two types of “thin” content: the kind of flimsy, low-value content that can result in a manual action against your site, and the kind that, while not “penalty worthy,” simply offers no value to your website visitors. You may not receive a manual penalty for the latter, but it very likely won’t earn much search traffic, buzz, shares, mentions, or inbound links from other authors. Even worse, if your content sucks, it reflects poorly on your brand with anyone who does happen across it, which could kill your conversion rates. Google’s Panda algorithm, first released in February 2011, was aimed at enforcing higher-quality search results by penalizing websites that publish too much “thin” content.

4. You’re blocking search engines from crawling your site

There are a number of ways you could have unintentionally blocked Google from crawling and indexing your site. By far the most common, however, is an error in your robots.txt file.

This issue most often occurs after launching a new site, or after having moved your site from one domain to another. To ensure Google has full access to your site, go into your robots.txt file and take a look around. If you see something like this:

User-agent: *
Disallow: /

You have blocked Google from accessing your entire site. Fortunately, removing this code from the file should rectify the situation completely. You should see your rankings return to normal in a matter of days or weeks.
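For comparison, a robots.txt file that allows crawlers full access looks like this (an empty Disallow line means nothing is blocked):

User-agent: *
Disallow:

Alternatively, you can delete the Disallow line or the file entirely; with no robots.txt present at all, Google assumes it may crawl everything.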