3 Technical SEO Audit Items You Must Consider

To the SEO newcomer, SEO might seem pretty straightforward. You do keyword research, you fix up the existing content on your site by implementing those keywords, you create and share new content that includes those keywords, and you look for link opportunities. That’s a good start, but to do a thorough job of implementing and executing an SEO campaign, there are a few more items you need to review and analyze. Search engine spiders look at all aspects of a website when determining rank. Even if a website looks great on the outside from a user perspective (good content, easy navigation), that doesn’t always mean it looks great on the inside, on the back end of the site. Conduct a technical SEO audit to make sure that things are “looking good” from a search spider’s perspective.

It’s important to be aware of these three technical SEO items:

1. Duplicate Content

Duplicate content on a website is a big search engine “no-no.” To the search engines, it appears that you are trying to dominate the results for specific keywords and keyword phrases by creating multiple pages that are mostly, if not exactly, the same. Usually that isn’t the website owner’s intent. In many cases the owner doesn’t even know the duplicate content exists, since it doesn’t affect the usability of the site and visitors might never notice it. It tends to be something that happens during the web development process, particularly when a website is being redesigned and URLs are changing. If you create a new URL for a page, redirect the old URL to the new one so that there aren’t two URLs out there with the exact same content essentially competing with each other in the search engines. A common error is a duplicated homepage: there is the original version at homepage.com and an identical version at homepage.com/index or homepage.com/default. This is also bad from an SEO perspective because it splits link trust, since people end up linking to the /index and /default versions because they don’t know any better.
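As a rough illustration of the homepage-duplication problem, here is a minimal Python sketch of a hypothetical helper that maps common duplicate homepage variants (like /index.html or /default.aspx) back to a single canonical URL. Every alias that differs from its canonical form is a candidate for a 301 redirect; the filename list and the function itself are assumptions for illustration, not part of any real tool.

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical list of filenames that commonly duplicate the homepage.
DUPLICATE_INDEX_NAMES = {
    "index", "index.html", "index.php", "index.asp",
    "default", "default.html", "default.aspx",
}

def canonical_url(url: str) -> str:
    """Return the canonical form of a URL by stripping duplicate
    homepage filenames and lowercasing the host."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    last_segment = path.rstrip("/").rsplit("/", 1)[-1].lower()
    if last_segment in DUPLICATE_INDEX_NAMES:
        # Drop the duplicate filename, keeping at least the root path.
        path = path.rstrip("/")[: -len(last_segment)] or "/"
    return urlunsplit((scheme, netloc.lower(), path or "/", query, fragment))

# Any URL whose canonical form differs should 301-redirect there:
print(canonical_url("http://Example.com/index.html"))  # http://example.com/
```

In a real audit you would crawl the site, compute the canonical form for each discovered URL, and set up a 301 redirect wherever the two differ.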

2. Broken Links

It’s important to keep track of your links and ensure that they aren’t broken. Broken links are bad for two reasons. First, they hurt the usability of the site for visitors. It’s very frustrating to click on a link that you think will answer your question or solve your problem, only to land on an error page. Second, they’re bad from a search engine spider perspective. The search spiders crawl a site link to link, and a broken link is a dead end. Too many broken links tell the search spiders that the site doesn’t have good usability, which is taken into account when ranking sites. Generate a list of broken links and implement 301 redirects to send visitors and search spiders to an active page.
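The link-audit step above can be sketched in Python using only the standard library. This hypothetical example extracts every href from a page and flags the ones that fall outside a known-good set; a real audit would issue HTTP requests (or use a dedicated crawler) rather than check against a hard-coded set.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(html: str, live_pages: set) -> list:
    """Return links in the page that point outside the known-good set.
    Illustrative only: substitute real HTTP status checks in practice."""
    parser = LinkExtractor()
    parser.feed(html)
    return [href for href in parser.links if href not in live_pages]

page = '<a href="/about">About</a> <a href="/old-page">Old</a>'
print(find_broken_links(page, {"/about", "/contact"}))  # ['/old-page']
```

Each link the report flags is a candidate for either fixing in place or 301-redirecting to a live page, as described above.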

3. Anchor Text

Keyword anchor text linking was once an SEO “must,” but it has lost some of its power recently. Keyword anchor text linking is still relevant, but on a much smaller scale. If the search engines see that you are using the same keyword anchor text over and over, it raises a red flag, since it appears that you are trying to manipulate the results for those keywords. Run a report to look at your anchor text list and make changes going forward. Start linking with the brand name, or including the full URL as the link text within content, to keep the inbound link portfolio more natural.
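The anchor text report mentioned above could look something like this minimal Python sketch: count how often each anchor text appears in the inbound link list and flag any text that dominates the portfolio. The 30% threshold is an arbitrary assumption for illustration; real tools and sensible thresholds vary.

```python
from collections import Counter

def anchor_text_report(anchors, threshold=0.30):
    """Return anchor texts whose share of all inbound links exceeds
    `threshold`. A rough heuristic, not an official metric."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()
            if n / total > threshold}

inbound = ["best widgets", "best widgets", "best widgets",
           "Acme Co", "https://acme.example", "click here"]
print(anchor_text_report(inbound))  # {'best widgets': 0.5}
```

A flagged anchor like “best widgets” at 50% of the portfolio is the kind of pattern the article warns about; diluting it with branded and bare-URL links going forward keeps the profile more natural.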

About the Author

Nick Stamoulis is the President and Founder of http://www.BrickMarketing.com/. With over 12 years of experience, Nick has worked with hundreds of companies, small, large, and every size in between. Through his vast and diverse SEO, search engine marketing, and internet marketing experience, Nick has successfully increased the online visibility and sales of clients in all industries. He spends his time working with clients, writing in his blog, publishing the Brick Marketing SEO newsletter (read by over 130,000 opt-in subscribers!), and also finds time to write about SEO in some of the other top online publications.

2 Comments

It is also important to check your sitemap, robots.txt, internal linking, and image alt attributes to make sure that the on-page optimization of a website is properly established and will not prevent the search engine bots from crawling your website.
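The commenter’s point about robots.txt can be checked with Python’s standard library. This small sketch parses a hypothetical robots.txt and verifies which URLs a well-behaved bot may fetch; the rules shown are made up for the example.

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A page under /private/ is blocked; a blog post is crawlable.
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))     # True
```

Running this kind of check against your real robots.txt helps confirm that you aren’t accidentally blocking search engine bots from pages you want indexed.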