Half-yearly or yearly website audits are now history: ranking a website on the first page means keeping track of hundreds of factors, and Google keeps getting firmer about penalties, from link spam to code-to-content ratio. If your website does not meet Google's specifications, you may see a sharp drop in keyword rankings and, eventually, traffic. To make sure everything on the website is in line with Google's standards, you should carry out an SEO audit.

But let's first understand what an SEO audit is, what its types are, and how frequently you should audit your website so it doesn't lose its spot on the SERPs.

What is an SEO Audit?

An SEO audit is the analysis of a website against the factors that make it visible on search engines. It covers everything from content optimization down to details as small as canonical tags. The goal of every SEO audit is to find the reasons affecting the website's performance and to identify the fixes, from high priority to low, that will optimize the site for search engines.

With a good SEO strategy, the website becomes visible on search engines, which helps you generate traffic. Higher traffic results in more lead conversions and, ultimately, sales. Undoubtedly, in today's era online traffic is one of the most important drivers of business growth.

No SEO marketer can predict what's next when it comes to Google's algorithms. Change is good when it's about growing, so there should be no complaints when Google introduces something new in its ranking factors. Ultimately, it's good for the people who use the dominant search engine for all their needs.

But as business owners or marketers, you have to be ready, or at least stay updated with what's happening in the SEO world. That's why an SEO audit of your website should be on your to-do list every week, fortnight, or month, depending on how much your business relies on Google.

At a minimum, you may need a quarterly audit to check against the 200 ranking factors outlined by Brian Dean.

These SEO ranking factors are easy to understand for people who have been practicing SEO for the last 2-3 years. Depending on the fixes your website needs, you may also need a couple of extra resources: issues related to indexation, redirection, or the .htaccess file may require a web developer.

Now let's look at the benefits of an SEO audit.

Benefits of an SEO Audit

Know where you lack: An SEO audit helps identify the weak spots that keep the website from ranking high on search engines. For example, while carrying out an audit you may come across duplicate content or low page speed. Finding such issues is the first step toward implementing a good SEO strategy.

Know what competitors are doing: An audit helps you find out which strategies and activities your competitors are implementing, and which of them are helping them rank on Google.

Get a road map: A detailed report explaining the complete SEO strategy will help you map out the way forward.

So now you have a complete understanding of what an SEO audit is and what it includes. Here is a comprehensive checklist to consider while doing an SEO audit. The analysis of the website is broken into three parts:

On-page Ranking Factors

Accessibility

Off-page

On-page SEO Audit

The characteristics of a page influence its search engine ranking. For each page you audit, you check page-level characteristics, and for the website as a whole you check domain-level characteristics. The page-level analysis identifies the places that need optimization, while the domain-level analysis helps estimate the effort needed to make corrections throughout the website.

URL Structure

URLs are the entry points to your pages. There are a few things to keep in mind while analyzing them.

The URL should contain the relevant keyword; it helps define the content of the page properly.

The URL of each page should be unique and formatted properly. A bad URL format has unnecessary folders, underscores (_), question marks (?), etc.

For example: http://www.example.com/index/folder/?badURL/Format_SEO/

The URL format that search engines prefer has keywords separated by hyphens (-) and no unwanted folders, for example: https://www.example.com/url-format-seo/

Here are some more examples for you to understand.
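As a rough sketch, the URL checks above could be automated with a small Python helper (the function name and the two-folder depth limit are my own assumptions, not a fixed rule):

```python
import re

def is_clean_url(url: str) -> bool:
    """Return True if the URL path uses hyphen-separated lowercase
    keywords with no underscores, query strings, or deep folder nesting."""
    # Reject query strings and underscores outright.
    if "?" in url or "_" in url:
        return False
    # Everything after the domain is the path.
    path = url.split("://", 1)[-1].partition("/")[2]
    segments = [s for s in path.split("/") if s]
    # More than two folder levels usually signals unnecessary nesting.
    if len(segments) > 2:
        return False
    # Each segment should be lowercase words separated by hyphens.
    return all(re.fullmatch(r"[a-z0-9]+(-[a-z0-9]+)*", s) for s in segments)
```

Running it against the two examples above, the hyphenated URL passes and the one with a query string and underscore fails.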

Content Duplication based on URL

Once you have analyzed the URLs and fixed any issues, the next step is to check for content duplication based on URLs. URLs are the unique entry points to the website, and they are often responsible for content duplication. It happens when two distinct URLs point to the same page, which search engines then treat as separate pages with identical content.

For example:

https://www.example.com/ and https://www.example.com/index point to the same page; this leads to content duplication.
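One way to spot such duplicates in a crawl is to normalize every URL and look for collisions. A minimal sketch, assuming the only variants are trailing slashes and default index documents:

```python
def normalize(url: str) -> str:
    """Map URL variants that serve the same page to one canonical form
    (strip a trailing slash and a default 'index' document)."""
    url = url.rstrip("/")
    for suffix in ("/index", "/index.html", "/index.php"):
        if url.endswith(suffix):
            url = url[: -len(suffix)]
    return url.lower()
```

Two URLs that normalize to the same string are candidates for a duplicate content fix (a redirect or a canonical tag).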

Titles

The identifying characteristic of any page is its title. It's the first thing noticed on search engines and social media, so evaluating titles is important in any website audit. There are a few things to consider while evaluating page titles:

The title should be at most around 70 characters. If it is longer, the title is truncated in search engine listings.

The title tag should describe the content of the page. That way you are not fooling the user in any way, and it will also help you achieve higher CTRs.

The title tag is one of the important on-page ranking factors, so make sure you use the targeted keyword right at the start of the title.

For example, the target keyword for Minds Metricks is "Web Design Dubai", and we have made sure to use it in our title.

Make sure you have unique titles throughout the website. You can check for duplicate meta tags in your webmaster account by clicking HTML Improvements under Search Appearance.
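The title checks above can be sketched in a few lines of Python (the function name, the 70-character cutoff, and the "first half of the title" heuristic for keyword placement are illustrative assumptions):

```python
MAX_TITLE = 70  # characters shown before truncation in search listings

def audit_title(title: str, keyword: str) -> list[str]:
    """Flag common title-tag problems: truncation risk and a target
    keyword that is missing or buried at the end."""
    issues = []
    if len(title) > MAX_TITLE:
        issues.append("title exceeds 70 characters and may be truncated")
    pos = title.lower().find(keyword.lower())
    if pos == -1:
        issues.append("target keyword missing from title")
    elif pos > len(title) // 2:
        issues.append("target keyword appears late in the title")
    return issues
```

A title like "Web Design Dubai | Minds Metricks" passes all three checks because the keyword leads and the length stays under the limit.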

Meta Description

The best practices for meta descriptions are similar to those for title tags. The meta description not only acts as a ranking signal; it also affects the page's click-through rate in search results.

The ideal length for a meta description is up to about 230 characters. It should be relevant to the page and should not be over-optimized with keywords.

For example, the meta description of the Minds Metricks homepage talks about the company and the services it offers, with all the important keywords used properly within the character limit.
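Both meta description rules (length and over-optimization) can be checked mechanically. A minimal sketch, where the "more than three repeats of the same word" test is my own crude stand-in for real keyword-stuffing detection:

```python
from collections import Counter

def audit_meta_description(desc: str, limit: int = 230) -> list[str]:
    """Check a meta description for truncation risk and crude
    keyword stuffing (any non-trivial word repeated more than 3 times)."""
    issues = []
    if len(desc) > limit:
        issues.append("description may be truncated in results")
    words = Counter(w.lower().strip(".,") for w in desc.split())
    if any(n > 3 for w, n in words.items() if len(w) > 3):
        issues.append("possible keyword stuffing")
    return issues
```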

Canonical Tags

A canonical tag declares the source URL of a given page. It is used either to mark a page as its own source or to point duplicate pages to their originating page. The duplicate content issue that arises from multiple URLs pointing to the same page is tackled with the help of canonical tags.
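During an audit you typically extract the canonical URL from each page and compare it against the URL you crawled. A sketch using Python's standard html.parser (the class and function names are my own):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Pull the rel="canonical" href out of a page's HTML."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(page_html: str):
    """Return the canonical URL declared in the page, or None."""
    parser = CanonicalFinder()
    parser.feed(page_html)
    return parser.canonical
```

A page whose canonical points somewhere other than its own URL is telling search engines it is a duplicate of that other page.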

Images Optimization

It is generally said that a picture is worth a thousand words to users, but search engines are different: they cannot read images. Therefore, it is important to give images metadata, such as alt text, that search engines can read.
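A quick way to audit this is to scan a page's HTML for images with empty or missing alt attributes. A minimal sketch with the standard html.parser (class name is my own):

```python
from html.parser import HTMLParser

class ImgAltAuditor(HTMLParser):
    """Collect the src of every image that lacks descriptive alt text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and not a.get("alt", "").strip():
            self.missing_alt.append(a.get("src", "(no src)"))
```

Feed it a page and the `missing_alt` list tells you exactly which images need metadata.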

Header Tags

Header tags are the HTML markup used to distinguish headings and subheadings on a page. They range from h1 to h6 and are generally keyword-rich.
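To review a page's heading hierarchy during an audit, you can list every h1-h6 in order. A regex-based sketch (fine for an audit script, though a real parser is more robust; the function name is my own):

```python
import re

def extract_headers(page_html: str) -> list[tuple[int, str]]:
    """Return (level, text) for every h1-h6, in document order,
    so the heading hierarchy can be reviewed at a glance."""
    return [(int(m.group(1)), m.group(2).strip())
            for m in re.finditer(r"<h([1-6])[^>]*>(.*?)</h\1>",
                                 page_html, re.I | re.S)]
```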

Accessibility

Your website doesn't exist if it isn't accessible to search engine bots and users. Below are a few things to keep in mind to make sure your website is accessible.

Robots.txt

The robots.txt file instructs search engine crawlers to crawl the pages that are necessary and blocks the ones that are not important. Below is the robots.txt file of Minds Metricks.

It is advisable to manually check the robots.txt file to make sure it does not restrict any important pages from being crawled.
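That manual check can be scripted against a list of your important URLs using Python's standard urllib.robotparser (the wrapper function name is my own):

```python
from urllib.robotparser import RobotFileParser

def blocked_pages(robots_txt: str, urls: list[str],
                  agent: str = "Googlebot") -> list[str]:
    """Return the URLs from `urls` that this robots.txt would block
    for the given user agent."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(agent, u)]
```

If any money page shows up in the result, the robots.txt rules need fixing before crawlers are locked out.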

An incorrectly created or submitted robots.txt file can give you nightmares with traffic and rankings. It has the power to throw your website from the first page to the tenth page of Google.

Bonus Tip

Always check webmaster for blocked resources that your website may have. Googlebot needs access to many resources on your page to render and index the page optimally. For example, JavaScript, CSS, and image files should be available to Googlebot so that it can see pages like an average user.

If a site’s robots.txt file disallows crawling these resources, it can affect how well Google renders and indexes the page, which can affect the page’s ranking in Google search.

We at Minds Metricks faced a similar problem when our blocked resource count suddenly increased to 44.

Once the blocked resources were reported on 8th May 2018, we saw a considerable drop in ranking: a 4% fall in keyword rankings, which impacted us hugely.

On recognizing the issue, we changed the robots.txt file to make sure Googlebot could access all the resources. The number of blocked resources gradually decreased, helping us regain our rankings.

Always keep an eye on your blocked resources to avoid complications like rendering and indexation issues, which lead to a drop in rankings.

Robots Meta Tags

The robots meta tag informs crawlers whether they can index a specific page and follow its links. While auditing the website, make sure you check for pages blocked by robots meta tags, as this can impact the page's rankings.
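A simple audit script can flag pages whose robots meta tag carries a noindex directive. A regex-based sketch (the function name is my own assumption):

```python
import re

def is_noindexed(page_html: str) -> bool:
    """Detect a robots meta tag that blocks indexing of the page."""
    m = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', page_html, re.I)
    # Only a "noindex" directive inside that tag blocks indexing.
    return bool(m and re.search(r"noindex", m.group(0), re.I))
```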

HTTP Status Codes

It becomes difficult for users and search engines to access the website if its URLs return errors such as 4xx (client) and 5xx (server) errors.

Screaming Frog is a tool that helps you identify the HTTP status of every page on your site. Pages showing 4xx and 5xx errors should be fixed immediately. If a broken URL's corresponding page is no longer available on your site, redirect the URL to a relevant replacement.

Also, while analyzing the website with Screaming Frog, ensure all your redirects are 301s and not 302s, because a 301 (permanent) redirect passes most of the link juice to its destination page.
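Once you have a URL-to-status-code export (for example from a Screaming Frog crawl), triaging it is mechanical. A sketch, where the bucket names and the input format are my own assumptions:

```python
def triage_status_codes(crawl: dict[str, int]) -> dict[str, list[str]]:
    """Bucket crawled URLs by the action their HTTP status calls for:
    fix 4xx/5xx errors now, convert 302s to 301s, leave the rest."""
    report = {"fix_now": [], "make_permanent": [], "ok": []}
    for url, status in crawl.items():
        if 400 <= status < 600:      # 4xx client / 5xx server errors
            report["fix_now"].append(url)
        elif status == 302:          # temporary redirect: passes less equity
            report["make_permanent"].append(url)
        else:
            report["ok"].append(url)
    return report
```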

XML Sitemap

An XML sitemap is a roadmap for search engine crawlers to find all the pages on the website. There are a few things to keep in mind while creating one.

There is a specific sitemap format that search engines expect. If the website's sitemap is not in that format, crawlers may not crawl the website correctly.

Submitting the sitemap to your webmaster account is another important step; it notifies crawlers of the sitemap's location.
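The expected format is the sitemaps.org protocol: a urlset element in its namespace containing one url/loc entry per page. A minimal generator using Python's standard xml.etree (the function name is my own; real sitemaps often also carry lastmod and other optional tags):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> str:
    """Emit a minimal XML sitemap in the sitemaps.org format."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")
```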

If pages listed in the sitemap exist on the site but receive no internal links, they are orphaned. Find an appropriate location for them in the site architecture, and make sure each receives at least one internal link.

Below is the example of the sitemap of Minds Metricks

Off-Page Factors

Off-page factors support the on-page activities through external sources and help generate rankings from them. Building backlinks is one of the most important factors in ranking your website. Here's a quick idea of how much each factor accounts for.

Backlink Profile

It is highly important to analyze the backlink profile of the website, as the site's authority is determined by the quality of the links associated with it. Different tools are available for backlink data, such as Ahrefs, Backlinko, Majestic SEO, Blekko, and Open Site Explorer.

Here's a quick snapshot of the Ahrefs dashboard, an excellent tool that helps you analyze many aspects of your website: new, old, and lost backlinks; referring domains; the anchor text profile; organic traffic; and much more.

Authority

A website's authority is determined by a combination of factors. To evaluate it, SEOmoz has provided us with two important metrics: Domain Authority and Page Authority.

Page Authority predicts how well a page will perform on search engines, whereas Domain Authority predicts the performance of an entire domain. Both metrics aggregate features such as MozRank, MozTrust, link quality and quantity, and trustworthiness to give an easy way to compare the relative strength of pages and domains.

Social Engagement

With the web getting social, a website's success also depends on its ability to attract mentions across social platforms. Each social network has a different form of social currency: likes, follows, retweets, +1s, etc. The more social currency your website has, the better. But you should also keep a check on the quality of the profiles sharing your website's content on social platforms.

Conclusion

So, these are the entities that make an entire SEO audit, and the benchmarks set against each parameter vary as per the auditor, the nature of website, and industry. Now that you have understood what an SEO audit is, why not audit your website, and find the shortcomings that are keeping you from ranking higher in Google?

We at Minds Metricks help you to understand the shortcomings of your website by carrying out an extensive SEO audit for your website.