
An algorithm is defined as: in mathematics, computing, and related subjects, an effective method for solving a problem using a finite sequence of instructions. Algorithms are used for calculation, data processing, and many other tasks.

Obviously, Google’s algorithm (or algo, as I will refer to it from this point on) is a secret and highly protected, though some basic information has been released publicly since the release of their patent.

The purpose of this article is to give you a basic idea of what Google is doing on their end when reviewing each website they spider. Although the algo undergoes slight, frequent changes, the basics have remained the same for years and seem to be a permanent structure. Reminder: this article is not going to put you at #1 for your keyword; there is obviously a lot more to SEO than knowing the algo, but using the algo to your benefit will definitely get you on the right track to earning your rankings.

Google’s Considerations when a Spider or Robot finds your site:

Image Content – Originality and Quality

Note: Google cannot “see” images through their robots and spiders, so renaming images and including proper, keyword-rich, descriptive alt tags gives you the flexibility to be original. If all your images are named ####.jpg with no alt tags, then all spiders/robots are blind to how relevant this content is to your site.
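To see which images a spider would be blind to, you can audit a page yourself. Here is a minimal sketch using Python’s standard html.parser; the page snippet and filenames are invented for illustration:

```python
from html.parser import HTMLParser

class AltTagAuditor(HTMLParser):
    """Collects <img> tags whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # missing or empty alt text
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Hypothetical page: one unnamed image, one properly described image.
page = '<img src="1234.jpg"><img src="red-widget.jpg" alt="red widget">'
auditor = AltTagAuditor()
auditor.feed(page)
print(auditor.missing_alt)  # images a spider is "blind" to
```

Anything the audit prints is invisible content as far as a spider is concerned.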

Know the keywords that best fit your niche or content and use them frequently without over-saturating. Simple <b>keyword</b> tags used randomly throughout your site are also a good thing, and one frequently overlooked by many site owners. This tells Google that you’re emphasizing this word for a reason, and they instantly know whether or not that reason is valid and why you did it. Again, do not overuse this method or you will be considered “spammy,” which you obviously don’t want to happen. 2 to 3 <b> texts are plenty.

Outbound links – It is critical that if you are going to have numerous outbound links to sponsors, hosted image pages, etc., you use rel="nofollow" on them. Internal links to other pages of your site should be followed unless they lead to a page you don’t want indexed. The most frequent error I see webmasters making is linking to hundreds of sponsors and giving them all your link juice while seeing nothing in return. These days, it’s safe to have 75 or fewer outbound links, which may seem like a lot to many, but it quickly adds up in a short period of time.
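You can check your own pages for external links that still pass juice before Google does. A rough sketch using only Python’s standard library; the domains below are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkJuiceAuditor(HTMLParser):
    """Flags external links that pass link juice (no rel="nofollow")."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.leaking = []  # followed external links

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        external = bool(host) and host != self.own_domain
        if external and "nofollow" not in attrs.get("rel", ""):
            self.leaking.append(href)

# Hypothetical page: a followed sponsor link, a nofollowed one, an internal link.
page = ('<a href="http://sponsor.example/join">Sponsor</a>'
        '<a href="http://sponsor2.example/" rel="nofollow">Sponsor 2</a>'
        '<a href="/gallery/1">Gallery</a>')
auditor = LinkJuiceAuditor("mysite.example")
auditor.feed(page)
print(auditor.leaking)  # only the followed sponsor link leaks juice
```

Internal links and nofollowed sponsors are ignored; anything printed is a place your juice goes with nothing in return.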

Cataloguing Keywords – Google can instantly determine your keywords, their frequency and relevancy upon spidering your site. Frequent changes in these can make the site look inconsistent and unreliable so for all tgp/blog/tube owners out there, you need to get some good solid “Static” keyword rich text on your site.

Meta Tags – For years I have heard that meta tags are dead, and I not only highly disagree; the proof is in the pudding. Meta tags pull a lot of weight and are the first to get spidered if your site is properly optimized.

Title – This is the most important tag, yet the most commonly abused or improperly written. Google has a standard for the number of characters as well as the type of characters used. Keep it under 66 characters (including spaces) and relevant to your content while using the keywords you’re targeting. In addition, though Google will still list you with characters such as “&”, Google’s spiders do not see the character itself; they read it as the entity (“&amp;”), which they are smart enough to decode, but it still affects your title relevancy score, lowering your overall SEO score. It’s also recommended to use a hyphen (-) in place of an underscore (_) for spidering purposes. Keep the title from looking spammy: too many keywords, misspelled words, too many characters, and non-recommended characters are all going to affect your rankings. Google loves quality, not crap.
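Python’s standard html module shows what a spider actually reads when “&” appears in a title, and len() gives the character count to compare against the 66-character guideline (the title below is made up):

```python
import html

title = "Cats & Dogs - Pet Videos"        # hypothetical title
print(html.escape(title))                  # what the entity-encoded form looks like
print(len(title))                          # character count, including spaces
```

Here the ampersand becomes “&amp;”, and at 24 characters the title sits comfortably under the limit.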

Description – Google will place this info under your title in their listings. Make it dynamic, descriptive, and keyword-rich while staying under 150 characters. Do NOT use the same keyword more than 3 times, to avoid looking spammy.

Keywords – Google scans these, and though they are not critically important for Google, they do pull weight if kept under 800 characters with no more than 3 uses per keyword; other search engines still rely heavily on them. I never overlook them, and when done properly they do assist in your rankings.
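The three character limits above (66 for the title, 150 for the description, 800 for keywords) are easy to check automatically. A minimal sketch, with made-up tag contents:

```python
# Limits as suggested in this article: title, description, keywords.
LIMITS = {"title": 66, "description": 150, "keywords": 800}

def check_meta(tags):
    """Return the tags that exceed the article's suggested character limits."""
    return {name: len(text) for name, text in tags.items()
            if len(text) > LIMITS.get(name, float("inf"))}

# Hypothetical meta contents for an example site.
tags = {
    "title": "Red Widgets - Buy Quality Red Widgets Online",
    "description": "Shop our hand-picked selection of red widgets. "
                   "Fast shipping, easy returns, and expert support.",
    "keywords": "red widgets, buy widgets, widget store",
}
print(check_meta(tags))  # empty dict means everything is within limits
```

An empty result means all three tags fit; otherwise you get back the offending tag and its length.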

Inbound Links – Google wants to see that other related, quality sites are linking to you. I cannot stress quality enough. Do some research into every site you consider buying a link on, as well as network-wide links. Stay away from “FFA” (free-for-all) link pages, from which thousands of sites link. These can be a huge blow to your rankings and cause detrimental damage to your listings. Make sure you apply slight changes to your titles and anchors as you build your links; using the same phrases/anchors over and over again looks spammy and is quickly picked up by Google. Below are a few things you don’t want to overlook:

Content Relevancy

Age

Pages Indexed

Server side stats – how many server changes have occurred, how many DNS changes, how many whois records, etc. Too many of any of these make the site look unstable.

IPs and multi C-class servers – Anytime you’re getting links on multiple sites, look into their IPs and C-class server setup. If sites are sharing an IP and you still want them to link to you, be sure to mix up your anchors and titles. Using the same anchor/title on multiple C-class IPs is fine, but don’t overdo it or Google will flag you for spam. Deep link where you can and as often as you can. To determine bulk PR, one of our go-to sources is http://www.authoritydomains.com/bulk-ip-checker.php.
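A C-class here is simply the first three octets of an IPv4 address. A quick sketch of grouping prospective link partners by C-class so you know which ones need varied anchors (the site names and IPs below are invented):

```python
from collections import defaultdict

def c_class(ip):
    """Return the C-class block of an IPv4 address (first three octets)."""
    return ".".join(ip.split(".")[:3])

# Hypothetical sites you're considering links from, with their IPs.
link_sites = {
    "siteA.example": "64.12.88.10",
    "siteB.example": "64.12.88.57",   # same C-class block as siteA
    "siteC.example": "208.97.14.3",
}

groups = defaultdict(list)
for site, ip in link_sites.items():
    groups[c_class(ip)].append(site)

for block, sites in groups.items():
    print(block, sites)  # sites sharing a block should get varied anchors/titles
```

Any group with more than one site is where you want to mix up your anchors and titles.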

Pagerank – PageRank is not as important as many people think, though when valid it gives you a quick idea of whether the site is properly managed. PR zero sites can rank #1 on Google just as fast as a PR3+, but anytime a site has PR, it tells you the owners have taken the time and effort to build their own links, so chances are the site has some quality to it. Not in all cases, but you will have to determine this on a site-by-site basis. A common question I get is how to tell if a site has fake PR. Simply Google “info:sitename.com” and you should see the listing for the given site. If a different site pulls up, they are pulling the PR from that site. To check PR and bulk PR we use http://checkbulkpagerank.com/

Check the HTML to ensure they don’t have a nofollow attribute on all links, or you will get 0 juice from them. This is the most commonly overlooked item, especially when getting links from other blogs, which can use nofollow plugins.

Whois Information – yes, believe it or not, Google will follow your site into the whois information provided by your registrar, which obviously includes your website, name, telephone number, email, physical address, how long your site is registered, etc. Since this is the case, I actually use meta tags that match my whois information and have found that it pays off.

Headings - <h1> and <h2> tags are the most important, and <h3> is always a good solid bonus. Your <h1> should pretty much be the main keyword you’re going after, <h2> should describe the site in 3 to 5 words, and most video titles, section titles, post titles, etc. can be your <h3> tags. You can use CSS to control the look of the headings to ensure they fit the theme of your site.
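A quick way to sanity-check that structure is to count the headings on a page. A minimal sketch with Python’s html.parser, using an invented page snippet:

```python
from html.parser import HTMLParser

class HeadingAuditor(HTMLParser):
    """Counts h1-h3 headings to check the structure described above."""
    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "h2": 0, "h3": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

# Hypothetical page: one main keyword, one short descriptive line, section titles.
page = ("<h1>Red Widgets</h1>"
        "<h2>Quality widgets shipped fast</h2>"
        "<h3>New arrivals</h3><h3>Best sellers</h3>")
auditor = HeadingAuditor()
auditor.feed(page)
print(auditor.counts)
```

You want to see exactly one <h1>, a single short <h2>, and as many <h3> section titles as the page needs.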

Title and alt tags – plain and simple: all large images need alt tags and all links need title tags.

As you can see, and as many of you know, Google takes an extensive look into your site each time it’s spidered. SEO is a full-time job for any site, and understanding the things Google looks for each time they visit is a good starting point to ranking your site.

About The Author:
Bobby Taylor, also known as 2bet, has spent nearly 11 years in the adult industry. In 2004, he successfully combined gaming and adult through Webmaster Poker Tournaments on 2bet.com. He credits the rapid growth of 2bet.com to successful search engine optimization, and moved solely into SEO in 2007. In 2008 SEO AP was publicly launched and recently in 2009, a sister company site, X RATED SEO emerged.

That's PageRank. The definition hasn't changed since Google was born. If you can remember AltaVista before Google, it indexed based on keyword density alone; no weight was given to backlinks. Google revolutionized search by factoring in backlinks. Essentially, each page has a global rank, which is a number from 1 to the total number of pages indexed. What people usually refer to as PR is a number from 1 to 10 that tells you where the page's rank falls on a logarithmic scale (a PR2 is twice as good as a PR1, a PR3 is twice as good as a PR2, etc.).
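To illustrate that logarithmic scale, here is a toy mapping from a raw score to the 1-10 toolbar number. The base of 2 follows the doubling claim above; Google never published the real base, so treat this purely as an illustration:

```python
import math

def toolbar_pr(score):
    """Map a raw global-rank score onto the 1-10 toolbar scale.

    Base 2 follows the article's doubling claim; the real base
    Google used was never published.
    """
    return min(10, max(1, int(math.log2(score)) + 1))

# Each doubling of the underlying score raises toolbar PR by one step.
for score in (1, 2, 4, 8, 16):
    print(score, toolbar_pr(score))
```

The point of the log scale is that the jump from PR4 to PR5 requires as much again as everything it took to reach PR4.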

Now in the early days PR and keyword density were all that mattered. It was so easy to manipulate that I could spend $30 to catch an expiring PR7 domain about ceramic clowns, put gibberish seeded with porn keywords on thousands of pages and turn it into $10,000 in a month without doing anything else. If you wanted to outrank a site for a keyword, you could buy any domain with a higher PR than the #1 and match the keyword density. It would work. Every time.

Now obviously the modern ranking algorithm is much more advanced, and the original PR concept counts for so little that it's statistically insignificant. It basically means nothing. For whatever reason, it's still calculated and you can still look it up. Because of this, people tend to get the impression that either it's still a ranking factor or it reflects a more modernized metric. This is not true.

There is no single metric revealed by Google that could be considered the modern equivalent of visible PageRank, but a few third parties offer their own metrics that are actually very useful. These companies basically crawl the web just like Google, gathering and calculating metrics as closely as they can to the actual Google algorithm. Instead of creating a search engine, however, they use the data to build a database of SEO metrics that are actually useful to us.