The 5 Most Common On-Page SEO Problems (And Why You Might Ignore Them)

On-page SEO is important in a number of ways. Not only do well-optimized pages improve your chances of ranking highly in search results; they also tend to provide a better user experience. With so many on-page SEO factors to consider (meta information, links, and URL structure, to name a few), it's easy to overlook some simple but beneficial practices that could improve your site's presence and usability.

According to a new study from popular SEO tool Raven, there are five clear standouts when it comes to the most widespread on-page SEO issues.

Images Present the Most On-Page SEO Issues

Raven crawled around 200 million different webpages to acquire its data, then analyzed the more than 4 billion on-page SEO issues it found to determine which ones are most prevalent. Interestingly, a whopping 78 percent of all on-page SEO problems had something to do with images. Another prevalent issue was duplicate content, which was found on 29 percent of the crawled pages. The top five on-page SEO issues were:

#1 - Images with Missing Title Tags: The title tag helps search engines understand the nature of an image. It's also the text displayed to users when they hover over the image. Although title tags present a great opportunity to add relevant keywords to a site, Raven found that the average website had 2,487 instances of missing title tags. To learn how to add title tags, check out this article on optimizing images.
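In HTML, the title is simply an attribute on the `<img>` tag. A minimal sketch (the filename and text here are illustrative, not from the study):

```html
<!-- The title attribute is shown as a tooltip when a user hovers over the image -->
<img src="/images/rope-toy.jpg"
     title="Golden retriever playing with a rope toy">
```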

#2 - Images with Missing Alt Tags: Alt tags are similar to title tags in that they help a search engine understand the nature of an image, but they also serve as a text replacement whenever a browser is unable to display the image. Raven found the average webpage had 1,153 missing alt tags. Again, alt tags present a great opportunity for SEO keywords. Adding alt tags is explained in depth in this blog post.
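Alt text is another attribute on the same tag, and it's common to set both together. A sketch with illustrative values:

```html
<!-- alt text replaces the image if it fails to load (and is read by screen readers);
     title text appears on hover -->
<img src="/images/dog-treats.jpg"
     alt="Assorted dog treats on a wooden table"
     title="Our best-selling dog treats">
```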

#3 - Links With No Anchor Text: Anchor text helps a search engine understand what you're linking to, which is incredibly valuable - especially when the links point to other pages on your site. While it may seem unlikely, the average website had 181 links with missing anchor text. A bare link such as "www.example.com/dog-products" is not as informative to a user or a search engine as a link to the same destination URL with descriptive anchor text such as "Dog Treats and Toys."
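In markup, the only difference is what goes between the opening and closing anchor tags (URLs here are illustrative):

```html
<!-- Bare URL as link text: tells users and search engines little -->
<a href="https://www.example.com/dog-products">www.example.com/dog-products</a>

<!-- Descriptive anchor text: describes the destination page -->
<a href="https://www.example.com/dog-products">Dog Treats and Toys</a>
```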

#4 - Meta Descriptions That Are Too Short, Too Long, or Missing: Every page on your site must have a meta description. This is the brief snippet of text that shows up below the page title in search results, helping both users and search engines understand what a page is about. Most content management systems provide a place to enter a meta description for each page. Every meta description should be unique and should be between 150 and 160 characters. A well-crafted meta description on every page of your site can seriously increase your chances of ranking higher and getting users to choose your page in the search results.
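If you're hand-coding pages rather than using a CMS, the meta description lives in the page's `<head>`. A sketch with illustrative content of roughly 150 characters:

```html
<head>
  <title>Dog Treats and Toys | Example Pet Supply</title>
  <!-- Unique per page; aim for roughly 150-160 characters -->
  <meta name="description"
        content="Shop our hand-picked selection of dog treats, chew toys, and training aids. Free shipping on orders over $25 and a 30-day satisfaction guarantee.">
</head>
```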

#5 - Pages With Duplicate Content: Twenty-nine percent of pages had duplicate content issues, in which pieces of content are repeated verbatim on different pages of a site. Every page of your site must have unique content; otherwise, you run the risk of severely dropping in Google's rankings.

Look at Your Data Closely

Anyone who's used SEO audit software likely recognizes these issues. But it's important to thoroughly understand your site, as well as your SEO audit, before panicking at thousands of missing alt tags. Many SEO tools count an error once for every page on which it appears. For example, if a 300-page site uses the same logo on every page and that logo is missing an alt tag, some SEO software will report 300 errors, all of which can be fixed by adding an alt tag to a single image.

Also, SEO audits often report scary duplicate content issues that might not actually be as dire as they seem. While content that's stolen or duplicated for dubious reasons is a serious offense, some SEO software may not understand the types of duplicate content that are actually acceptable. If your site has a blog, for example, some SEO software will tell you that every Archive or Category page contains duplicate content. While this technically may be true, Matt Cutts himself has said that these types of duplicate content issues probably don't affect search rankings.

If you're still concerned about this type of duplicate content, though, many consider it best practice to de-index Category or Archive pages using a noindex robots directive. Alternatively, you could use canonical URLs to clarify these situations for search engines.
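Both options are one-line additions to the page's `<head>`. A sketch (the canonical URL is illustrative):

```html
<!-- On an archive or category page you don't want indexed:
     noindex keeps it out of search results, follow lets link equity flow -->
<meta name="robots" content="noindex, follow">

<!-- Or point search engines at the preferred version of the content -->
<link rel="canonical" href="https://www.example.com/blog/on-page-seo-problems">
```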

Other problems, such as thin content, should be taken on a case-by-case basis. For example, some SEO audits may flag a Sitemap page for having thin content, when it would be unnecessary, or even detrimental, to add paragraphs upon paragraphs of text to such a page. So, whether you're trying to interpret an on-page SEO audit on your own or being pitched one by the pros, you can save a lot of time by having a thorough understanding of when certain SEO errors are actually relevant and when they're not.
