On-Page Optimization Tips and Techniques in SEO

On-page optimization is the most important part of SEO; optimizing these factors is what allows a website to rank well on search engines. On-page SEO essentially covers the rules and guidelines of search engines that websites have to follow in order to be eligible to rank for their business-related keywords.

The terms below describe what has to be done to make a website search engine friendly.

Index

This shows how many pages of your website are indexed by a search engine (especially Google). You can check this by simply typing "site:" followed by the URL of the website into the search engine.

For example: site:http://www.abc.com — after you enter this in the search bar, you will see the number of pages of the website "abc" that are indexed, and which ones they are.

From this you will learn how many pages are indexed and how many remain. You also need to get those non-indexed pages indexed so that they can appear in search results as well.

Backlinks

This term belongs mainly to off-page activities, but during site analysis we need to check how many backlinks a website has, since backlinks are one of the most important factors for good ranking.

There are many free backlink checker tools available, such as SEO Profiler and Backlink Checker. They are not very thorough, but they at least give you an idea of the latest links.

Google Cache Status

It shows the last time a web crawler crawled a webpage or website. From the last crawl time, we can see how frequently the search engine has been crawling the site, which matters greatly for ranking. The crawl rate describes how often a website is crawled; without crawling, the search engine will be unable to find any new content on the site.

Domain Age

Domain age is an important factor because it shows how old a domain is. If a domain is new, it may not rank at first, the reason being the Google Sandbox Effect.

Domain age records show the Created on, Updated on, and Expires on dates of a website's domain.

Duplicate Content

As per the latest algorithm updates, websites with fresh and unique content are ranked higher than others. So checking content is a much-needed step for a website, and many tools provide this facility.

There are two ways of checking duplicate content –

One is to check content before posting; a good tool for this is a Plagiarism Checker.

The other is to check already hosted content for duplication, which you can do via the Copyscape tool.

URL Optimization

Many times the URL that is generated for a webpage in a dynamic site looks something like this:

http://www.abc.com/blogs/thread.php?threadid=12345&sort=date

A static URL, on the other hand, is a URL that doesn't change and doesn't contain parameters or characters that can't be understood. It looks like this:

http://www.abc.com/forums/the-challenges-of-dynamic-urls.htm

Static URLs typically rank better in search engine results pages, and they are indexed more quickly than dynamic URLs.
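On an Apache server, a dynamic URL like the one above can be served under a static-looking address with mod_rewrite. The sketch below is a hypothetical example (the paths, the `threadid` parameter, and the URL pattern are all assumptions modeled on the sample URLs above, not a drop-in rule):

```apache
# .htaccess sketch (Apache mod_rewrite) — hypothetical paths:
# serve /blogs/12345/the-challenges-of-dynamic-urls.htm
# from the underlying dynamic script thread.php
RewriteEngine On
RewriteRule ^blogs/([0-9]+)/[a-z0-9-]+\.htm$ /blogs/thread.php?threadid=$1 [L]
```

The numeric ID is captured from the friendly URL and passed to the script internally, so visitors and crawlers only ever see the static form.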

Canonical Issue

This is an issue in which your website and its URLs open both with "www" and without "www". If a website opens both ways and the URL is different each time, that is a canonical issue.

For example: the website "abc.com" opens as "www.abc.com" when you type "www.abc.com", but it also opens as "abc.com" (without "www") when you type "abc.com".

In both cases the search engine will treat these URLs as different websites and index them separately.
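A common fix, assuming an Apache server and using the hypothetical domain abc.com from the example above, is a 301 redirect from the non-www host to the www host in the site's .htaccess file:

```apache
# .htaccess sketch — send abc.com traffic to www.abc.com (hypothetical domain)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^abc\.com$ [NC]
RewriteRule ^(.*)$ http://www.abc.com/$1 [R=301,L]
```

Alternatively (or additionally), each page can declare its preferred URL with a `<link rel="canonical" href="http://www.abc.com/page.html">` tag in its head section.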

Broken Link

Broken links are links that no longer work because the destination page has been deleted or its path has changed. These links should be removed from the website (and from the search engine's index), or redirected to another webpage, to keep the website both search engine and user friendly.

Meta Tags

Meta tags carry the important information that we want to convey to search engines. The most important tags used in a website are the Meta Title, Meta Description, Meta Keywords, and Meta Robots tags. To get the best results from search engines, we need to keep these tags complete and up to date.
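The four tags mentioned above live in the page's head section. A sketch for a hypothetical page (the titles and text are invented placeholders):

```html
<!-- Example <head> for a hypothetical page on www.abc.com -->
<head>
  <title>Handmade Leather Bags | ABC Store</title>
  <meta name="description" content="Shop handmade leather bags with free shipping across India.">
  <meta name="keywords" content="leather bags, handmade bags">
  <meta name="robots" content="index, follow">
</head>
```

The title and description are what search engines typically display in the results snippet, so they should be written for both crawlers and human readers.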

H1/H2 Tag

If a webpage's content is divided into a headline and sub-headlines, these should be marked up with header tags, i.e. H1, H2, … H6. We can use six levels of header tags, from <H1> to <H6>, depending on the kinds of headlines we have on a webpage.
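A minimal sketch of such a hierarchy (the headings themselves are hypothetical):

```html
<!-- One main headline, with sub-headlines nested beneath it -->
<h1>On-Page Optimization Guide</h1>
  <h2>Meta Tags</h2>
  <h2>Header Tags</h2>
    <h3>When to use lower levels</h3>
```

The levels mirror the outline of the content: one main topic in the H1, sections in H2s, and sub-sections in H3 and below.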

Alt Tags

The alt attribute (often called the alt tag) gives an image on a webpage a textual name. Search engines cannot interpret the pixels of an image, so an image can't be judged by a search engine for what it actually shows. To have an image stored meaningfully in a search engine's index, we need to give it an alt attribute so that it can be found under that name.
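In markup this is a single attribute on the img element; the file name and description below are hypothetical:

```html
<!-- alt text describes the image for crawlers and screen readers -->
<img src="/images/red-leather-bag.jpg" alt="Red handmade leather bag">
```

A short, descriptive phrase works best; stuffing the attribute with keywords is counterproductive.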

Sitemap

A sitemap is a kind of index of a website: it lists how many pages the website has and their URLs. It is used so that a search engine can easily learn about all the pages and their URLs.

Normally we use 2 types of sitemap in a website –

XML Sitemap – a type that is easily understood and crawled by search engines.

HTML Sitemap – a sitemap embedded in the website itself (usually in the footer), so that users can find all the webpages.
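The XML variety follows the sitemaps.org protocol. A minimal sketch, with hypothetical URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal XML sitemap — URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.abc.com/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.abc.com/about-us.html</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints for the crawler.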

Robots.txt

Robots.txt is a simple text file used to tell search engines not to crawl or index a particular page or set of pages.

That means, if your website has pages that are not ready to be crawled and indexed, you can signal this by creating a robots.txt file. The search engine will then not store those low-quality pages in its index, so your overall website will not be harmed by them.
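The file sits at the root of the site. A small sketch blocking a hypothetical folder and page, and also pointing crawlers at the sitemap:

```text
# robots.txt — the blocked paths here are hypothetical examples
User-agent: *
Disallow: /temp/
Disallow: /draft-page.html

Sitemap: http://www.abc.com/sitemap.xml
```

`User-agent: *` applies the rules to all crawlers; a specific crawler name can be used instead to target one search engine.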

Anchor Text Optimization

A website should contain anchor tags targeting its internal pages. This is an ideal practice: it encourages a web crawler to follow all the links and pages mentioned in anchor tags on a webpage. It is one of the best ways to build strong interlinking within a website.
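The anchor text itself should describe the target page; the URL below is hypothetical:

```html
<!-- Descriptive anchor text tells crawlers what the linked page is about -->
<a href="/services/seo-audit.html">SEO audit services</a>

<!-- Vague anchor text like this carries no information about the target -->
<a href="/services/seo-audit.html">click here</a>
```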

404 Errors

404 errors are also known as broken link or "page not found" errors. They show which pages of your website cannot be found. These errors should be fixed in order to maintain the quality of your website in search engines; they can be resolved with 301 (permanent) or 302 (temporary) redirection techniques.
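On an Apache server, a deleted page can be redirected permanently with a single directive; the paths below are hypothetical:

```apache
# .htaccess sketch — permanently redirect a removed page to its replacement
Redirect 301 /old-article.html http://www.abc.com/new-article.html
```

Use 302 instead of 301 only when the move is temporary, since a 301 tells search engines to transfer the old URL's ranking to the new one.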

RSS Feed

If you want your visitors to receive all the latest updates about your website, you should create an RSS feed for it. This also lets you enable an e-mail subscription option on your website. To increase your percentage of returning visitors, you should take advantage of an RSS feed.
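An RSS 2.0 feed is itself an XML file. A minimal sketch with one item (titles, URLs, and the date are invented placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal RSS 2.0 feed — all values here are hypothetical -->
<rss version="2.0">
  <channel>
    <title>ABC Blog</title>
    <link>http://www.abc.com/</link>
    <description>Latest posts from ABC</description>
    <item>
      <title>On-Page Optimization Tips</title>
      <link>http://www.abc.com/on-page-optimization-tips.html</link>
      <pubDate>Mon, 05 Jan 2015 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Each new post becomes an `<item>`, and subscribers' feed readers or e-mail services pick it up automatically.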


ABOUT DIGITALTRACKKER

DigitalTrackker is a blog for those looking for tips and solutions for online promotion and revenue-generation techniques.
Here at DigitalTrackker you can find ways to start and manage a blog, and learn about SEO, SMO, SEM/PPC, and e-mail marketing.
You can read more about this blog on the About Us page.
