Every brand and organisation wants its website to achieve the highest possible rankings across Google.

However, not all sites are created equal or maintained in the same way, and it’s up to Google to determine which factors should shape its rankings. The search giant regularly updates its algorithm to ensure users find the sites most relevant to their query, so if your search presence isn’t what you’d like it to be, there could be some common ranking factors you’ve yet to meet.

With this in mind, here is a collection of the most common reasons why your website may be underperforming on Google. Some are tactics that may once have been used to increase search visibility but will now, unfortunately, likely have a negative effect on your rankings. Others are simply standards Google expects your website to meet, and which you’ll need to comply with if you’re looking to maximise your search presence.

Not mobile ready

Google has recently placed huge emphasis on ensuring that the websites it ranks are mobile ready. If your site is not responsive (i.e. it does not ‘fit’ the screens of mobile and tablet devices), your organic search rankings are likely to suffer. Slow loading times on mobile devices will also likely have a negative impact on your positions.

Purchased links

Getting websites to naturally link to your platform is great for improving your search presence. It’s not so great if you’ve ever bought links to try and do this; search engines take a very dim view of this tactic. If Google thinks that you’ve paid people to link to your site, or you’ve used automated link building programmes, your rankings will probably be affected.

Excessive reciprocal links

You may share and swap links with a like-minded organisation whose content your users may find of interest; this is great for improving your own website’s rankings, and theirs. If, however, Google becomes suspicious that links are being swapped solely for ranking purposes, your rankings will likely be hampered.

Duplicate content

Duplicating content was once seen as a simple way to improve and cement a website’s rankings, but the major search engines quickly became aware of this. The more duplicate content your site holds, the less useful a search engine will deem it which, in turn, means it won’t rank as highly.
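If you want a rough sense of how much duplication your own pages contain, you can compare them programmatically. The sketch below is purely illustrative (the page texts and URLs are made up): it hashes each normalised paragraph and reports any paragraph that appears on more than one page.

```python
import hashlib

def find_duplicate_paragraphs(pages):
    """Map each normalised paragraph to the set of pages it appears on.

    `pages` is a dict of {url: page_text}; paragraphs are split on
    blank lines. Returns only paragraphs seen on more than one page,
    keyed by their SHA-256 digest.
    """
    seen = {}
    for url, text in pages.items():
        for para in text.split("\n\n"):
            # Normalise case and whitespace so trivial differences
            # don't hide a duplicate.
            normalised = " ".join(para.lower().split())
            if not normalised:
                continue
            digest = hashlib.sha256(normalised.encode()).hexdigest()
            seen.setdefault(digest, set()).add(url)
    return {d: urls for d, urls in seen.items() if len(urls) > 1}

# Hypothetical example pages: one paragraph is repeated across both.
pages = {
    "/about": "We sell widgets.\n\nContact us today.",
    "/home": "Welcome!\n\nWe  sell widgets.",
}
dupes = find_duplicate_paragraphs(pages)  # one shared paragraph found
```

A real audit would fetch and strip the HTML first, but the same hashing idea scales to a full site crawl.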

Tag overuse

Structuring your content correctly helps search engines better understand what your website is about. H1 tags, for example, are great for this; overusing them to try and get Google’s attention, however, isn’t looked upon too favourably.
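A quick way to spot heading overuse is simply to count the H1 tags on a page; most advice suggests a single H1 per page. This minimal sketch uses Python’s standard-library HTML parser on a made-up snippet:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> opening tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def count_h1_tags(html):
    parser = H1Counter()
    parser.feed(html)
    return parser.h1_count

# Hypothetical page with two H1s -- one more than it should have.
page = "<html><body><h1>Main</h1><h2>Sub</h2><h1>Another</h1></body></html>"
h1s = count_h1_tags(page)  # 2
```

The same parser subclass can be extended to check H2–H6 nesting or flag pages with no H1 at all.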

Overuse of keywords

Keep keyword density in check if you want your website to rank favourably. Sure, keywords are needed to ensure your site appears in the correct searches, but content with an unnaturally high keyword density, written to push rankings, will likely be shunned.
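There is no published density threshold to aim for, but measuring the figure can still flag copy that reads as stuffed. This illustrative sketch (the sample copy is invented) computes a keyword’s share of total words:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100 * hits / len(words)

# Hypothetical over-optimised copy: 4 of its 13 words are the keyword.
copy = "Cheap shoes here. Our shoes are the best shoes for cheap shoes fans."
density = keyword_density(copy, "shoes")  # roughly 30.8%
```

Copy where a single keyword makes up nearly a third of the words, as above, would read unnaturally to any visitor, and that is ultimately the signal Google is trying to capture.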

Hidden content

This one was a much-loved tactic in days gone by. Disguising text within a website to manipulate keyword weighting and rankings is now deemed completely unethical by search engines.

Too much anchor text

Linking certain keywords to reinforce authority used to be a pretty common tactic. The practice is now, however, strongly discouraged, and is one of the main reasons many websites’ rankings have been affected in recent years.

Low quality content

Google’s primary aim is to provide its users with high quality content. If the search engine feels that too many pages of your site have been written for content’s sake, or your content is of low value to visitors, you won’t rank as highly.

High bounce rates

If users aren’t satisfied with the content that appears on your site, they’ll probably leave straight away. Google will assess your website’s bounce rate, as well as the length of time people are spending on your site, to determine whether it is most effectively serving user needs. If not, your rankings will likely be compromised.