Can I Haz Rankings? Oh Yesss!

As any tier 1 SOE professional will tell you, one of the biggest risks in any SOE campaign is getting the website you are working on banned by the big search engines for breaking their rules.

While getting high rankings in Bing and the other big search engines is the biggest goal of SOE, it is important to remember that you also need to avoid getting on the wrong side of the Googler by making your efforts too obvious. Most of the bigger search engines, with the exception of Cuill and Ask, employ teams of people to review websites, and if they find any evidence of SOE techniques being used, they have the right to unplug your website from the whole internet, and potentially send you to prison!

Search engine rules were established by Tim Berners-Lee in 1956, and despite the march of progress that we have seen in recent years, they are still rigorously enforced. In order to protect the search engines, the rules are top secret, but by careful research, we have been able to identify the following:

Do not do any on page optimisation

Avoid link building

Search engines use complex systems to find people who are breaking the rules, and then pass their details to the WET (website execution teams), who take any appropriate action:

Doing lots of techniques means that the Googler can identify that you are doing SOE

Keeping under the Radar

Typically, top SOE professionals use up to five different techniques when they are doing optimising work. These include adding Meta Keywords, getting the right Keyword Density, building latent semantic eigenvectors through the website data hierarchies, and building reciprocal link networks to increase the number of PageRanks that the website has.

Often, there is no way of knowing whether the Googler is visiting your website to check if any SOE is being done, so you need to do what you can in order to prevent them seeing what you are doing.

The key to keeping under the radar with your SOE is simple: use different techniques on different pages of the website.

If you use the Meta Rank tag and include a good quality eigenvector on one page of your site, you should try using Meta Keywords and a latent semantic distribution on another. The following matrix provides an overview of the different techniques that work well together.

SOE techniques that are safe to use together

The benefit of this is that the Googler will be confused, and not be able to tell that you are doing SOE, and as such, will not report your details to the WET:

Using different techniques confuses the Googler

The downside of having to mask SOE activity in this way is that the website will not be able to rank in the top 10 at first, however in the long run, you will find that rankings improve as competing websites that explicitly use SOE on all of their pages will be removed.

As one of the most popular places to do searching on the whole internet, getting a website to rank in the top 10 on Yahoo is the jewel in the crown for any SOE professional. While almost anyone can get a website into the number one position in Google or the Bing by using tips like the Meta Rank tag, including Meta Keywords, and applying the correct eigenvector distribution map within the website content hierarchy, ranking well in Yahoo is incredibly hard.

It is so difficult to rank well in Yahoo, that some results pages only show three or four websites because other webmasters simply give up on trying to get to the first page! Because of this difficulty, many SOE professionals simply give up on trying to do well in Yahoo.

Understanding Yahoo

While Yahoo looks similar to Google in the way it provides links to other websites, this is simply an illusion. It is analogous with clouds and sheep – they both look alike (and some people believe that sheep turn into clouds when it gets hot), but the underlying structure is very, very different.

Having said this, once you understand the differences, it is possible to find similarities, and to apply different techniques to your website optimisations to give yourself a chance to rank well.

PageRanks Vs HooRanks

Google uses PageRanks as part of their ranking calculations. These are determined based on the quality of a page and the type of website that you have. Yahoo is different: they use HooRanks, which are based on the type of website you have, the number of pages it has, the keyword density of the pages, the age of your domain, and whether you include JavaScript in your website. The calculation is as follows:

How Yahoo calculates HooRank

HooRank is a modified polynomial distribution that is calculated every Thursday at 9am local time wherever the website is hosted, and applied to the ranking calculation the following Tuesday at 3pm CET.
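For readers who like to check the sums, here is a sketch of how a HooRank-style score might be assembled from the five factors listed above. The weights, the bonus for JavaScript, and the hoo_rank name are entirely our own guesses; Yahoo keep the real polynomial under lock and key.

```python
# Hypothetical sketch of a HooRank-style calculation.
# Every weight here is invented for illustration; Yahoo's real
# "modified polynomial distribution" remains a closely guarded secret.

def hoo_rank(site_type_score, page_count, keyword_density,
             domain_age_years, has_javascript):
    """Combine the five factors the article lists into one score."""
    js_bonus = 1.5 if has_javascript else 1.0
    return (site_type_score
            + 0.1 * page_count
            + 2.0 * keyword_density
            + 0.5 * domain_age_years) * js_bonus

# Example: a 40-page site, 16.7% density, 3-year-old domain, with JavaScript
score = hoo_rank(site_type_score=5, page_count=40,
                 keyword_density=0.167, domain_age_years=3,
                 has_javascript=True)
```

Remember that whatever score you calculate on a Wednesday will be out of date by Thursday at 9am.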

Meta Rank Vs Meta Bid

Unlike Google, which uses the Meta Rank tag under a Dutch auction model as a tie breaker when two websites have the same PageRanks, Yahoo uses something called Meta Bid.

Meta Bid is a proprietary web technology developed by boffins working at Menlo Park in 1956 and subsequently licensed to Yahoo on an exclusive basis until 2018. Essentially, the website publisher needs to calculate exactly what each visitor to their website is worth based on factors such as conversion rate, average order value, and profit margin. An exact figure per page should be added into the meta tag for that page, with an aggregate figure for the whole website added to the index page.

This should be presented as follows with the value in US$ to reflect the internal auditing at Yahoo:

<meta name="bid" value="$123" />
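The dollar figure itself comes from the visitor-worth calculation described above (conversion rate multiplied by average order value multiplied by profit margin). A quick sketch, with function names of our own invention:

```python
# Hypothetical helpers for working out a Meta Bid value per page.
# Visitor worth = conversion rate x average order value x profit margin.

def meta_bid_value(conversion_rate, avg_order_value, profit_margin):
    """Return what each visitor to the page is worth, in US$."""
    return round(conversion_rate * avg_order_value * profit_margin, 2)

def meta_bid_tag(value):
    """Render the tag in the format Yahoo's internal auditors expect."""
    return f'<meta name="bid" value="${value}" />'

# Example: 2% conversion rate, $150 average order, 40% profit margin
bid = meta_bid_value(0.02, 150, 0.4)
tag = meta_bid_tag(bid)
```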

Page Sentiment

Although Google present an image of being an altruistic and fun company, with bean bags instead of chairs and tofu smoothies for all employees, the reality is different, and according to my high level sources, the company is run like a Royal Marines boot camp. Despite what you may have heard, the only dogs you see at Mountain View are Doberman Pinschers guarding the 25 foot high perimeter fence.

Yahoo is different, and their crawler reflects this. Unlike the Googler, which is a highly optimised automatic reading robot that performs huge numbers of calculations in the blink of an eye and reduces a world of emotion and beauty into cold hard binary digits, the Slurper is designed to be more like a gentle kiss.

The most important part of the Slurper is its T.O.N.G.U.E. This remarkable piece of technology can gauge the sentiment contained within a web page and assign it a score on the Karmic Indication Scoring System (KISS).

Karmic Indication Scoring System

Pages with a higher KISS factor have a higher fluffiness coefficient, and are likely to provide a happier user experience, while pages at the spiky end of the spectrum are more likely to make users cry. Rather than basing their results on pure relevancy, Yahoo have calculated that directing a user to a page that makes them happy on the inside is a more positive experience.

January 5th 2010 will be remembered forever as the day the world changed. The Nexus One was launched, and at a stroke, everything was different. Until last week, people have only been able to enjoy websites at home, but now, thanks to the genius shown by the boffins at Google, you can carry the whole Internet in your pocket, and read it wherever you are!

Thanks to a miracle of technology and the use of some incredibly complex document containment algorithms, the Nexus One can display the internet to people wherever they are – in full colour!

Of course, this brings a whole raft of new challenges for SOE professionals – until now, we only had to concentrate on performing the optimisation of a website once so that it would appear at number one in the results, but now, it is necessary to make sure that no matter where people are when they have a go on the Internet, they are still able to find the website.

Getting Online

When you first open your Nexus One, it comes with a full factory installed version of the Internet – although because the phone is in American, a lot of the words are spelled wrong. The clever thing about the Nexus One is that every time you plug it in to charge, it downloads a new copy of the Internet overnight so that you can get all the latest news and pictures and keep it up to date.

Screen Size

In order to get the Internet into a pocket sized device like the Nexus One, Google have had to shrink all of the pages significantly to make them fit. Whereas a normal sized website on a proper computer can feature a screen resolution of up to 800 by 600px, this does not all fit onto a Nexus, which only has a screen that can display 800 by 480 pixels, which is 25% less!

Web page resolution for Nexus One

In order to rank well in a portable Google, your website needs to be in a resolution that fits. Having said that, you also need to remember that not everyone will get a Nexus, and you still want to have a resolution that works in a home Internet too. We recommend that you compromise, and design your pages with a resolution of 800x540px, which is halfway in between.
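The compromise can be double checked with a quick sum; the compromise_resolution helper is our own invention:

```python
# Quick check of the compromise resolution recommended above:
# halfway between a proper computer (800x600) and a Nexus One (800x480).

def compromise_resolution(desktop, nexus):
    """Average each dimension of the two resolutions."""
    return tuple((d + n) // 2 for d, n in zip(desktop, nexus))

res = compromise_resolution((800, 600), (800, 480))  # (800, 540)
```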

Keyword Density

Because the web page is 25% smaller, you will also need to think about your keyword density. Whereas on a web page for the proper Internet you need to use a density of 16.7%, this simply will not work on a Nexus One portable Internet. In order to fit in with the complex latent semantic eigenvectors that are employed within the floating point calculations utilised when rendering portable Internet pages, you need to scale the keyword density employed on the web page using a parabolic distribution curve. Our testing shows that for a page to rank at number one in a portable Google, it needs to have a keyword density of precisely 13.45% – and you can use a maximum of 243 words.
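As a rough way of checking a page against these targets (the parabolic scaling itself is above our security clearance, so this sketch simply measures density and word count; the function name and tolerance are our own):

```python
# Hypothetical checker for the portable-Internet rules described above:
# hit a keyword density of 13.45% and stay within the 243-word budget.

def check_portable_page(text, keyword, target_density=0.1345, max_words=243):
    """Return (density, word_count, passes) for a page's copy."""
    words = text.lower().split()
    density = words.count(keyword.lower()) / len(words) if words else 0.0
    passes = abs(density - target_density) < 0.005 and len(words) <= max_words
    return density, len(words), passes

# Example: 30 keyword uses in 223 words gives a density of ~13.45%
page = ("cheap " * 30 + "filler " * 193).strip()
density, word_count, ok = check_portable_page(page, "cheap")
```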

Special Google

Of course, being able to carry the whole Internet around in portable form is no use whatsoever if you can’t remember where things are, so every Nexus One comes complete with a special Google. The real Google is one of the biggest websites around – according to some sources it has more than 5,000 pages! Having that much content inside it would make the Nexus too heavy for most people to carry with them, so Google have designed a special lightweight Googler that is just 5% of the size of the proper one – but still manages to read the Internet and help you find things:

Size Comparison between regular and portable googlers

Although the portable version of the Googler is only 5% of the size of the full one, it still contains most of the same algorithmic functions as its larger counterpart – the difference is in its speed. While the latest version of the Googler that works on a proper computer is able to read around 150 pages each hour, the smaller version can only do 50 – which is pretty good considering its size. To get around this restriction, the Nexus One Googler runs constantly day and night to ensure that every time you update the Internet, you are able to find what you are looking for quickly.

Other Features

In addition to carrying a copy of the Internet, the Nexus One also has a number of other features including being able to send and receive text messages, take photographs with a built in (!) camera, and even make phone calls. Of course this is quite difficult, as although the boffins at Google managed to fit the whole Internet into the Nexus One, they neglected to add a keypad, so it is difficult to dial a phone number.

2009 has been an epic year in SOE. We have seen the launch of new search engines, and massive changes to the way in which Google have ranked websites. More and more businesses around the world have embraced search engine optification to the point where online businesses are now competing with dozens of different websites every day. In what could over time become a tradition, we look back over 2009 – the year in SOE to see what the key happenings were each month.

January

Google Cyril Update – A massive emphasis shift in the ranking algorithm added +2 Strength and +3 Skill to Websites that incorporated strong latency within their semantic distribution space, while simultaneously penalising websites with -2 health and -1 luck if they contained keyword density vectors of less than 18.25%.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

February

Dogpile Meta Issues – Dozens of websites in the popular Dogpile Search Engine lost rankings overnight when the search giant amended its algorithm to reduce the value of the Meta Author tag. Black Tuesday saw more than seventeen previously successful websites disappear entirely.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

March

Google Derek Update – Building on the huge changes of Cyril, Derek was a further refinement of the new ranking algorithm that factored a +1 luck modifier into websites that incorporated green text into their home page.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

April

Google Edward Update – Websites that included intent eigenvectors within their social engagement curvature were enriched with a +2 modifier to all attack variables across entry pages optimised for the term “cost effective” rather than cheap.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

May

Bing Launched – Technology start-up Micro-Soft launched a new internet called Bing. Sites incorporating Silverlight and ASP saw a boost in traffic as the Binglebot crawled all 5,000 websites in the world in less than 10 minutes.

Google Frank Update – Google’s response to the launch of Bing was predictable. The company provided a boost to all 200 websites that incorporated the rel=”nobing” attribution in their link architecture.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

June

The engineering department at Googleplex were on holiday this month, which meant that no new websites were made. To capitalise on the lack of activity from the Googler, Bingle engineers released the first update to their searcher – Codenamed Aaron, it added a +3.4 modifier to all defensive rolls for websites that included the rel=”weluvbill” attribute on every image.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

July

It was Bing’s turn for a holiday in July, so Google unleashed the Gordon update. Any website that blocked Bingle from crawling it got a +5 rankings boost, and Google also introduced full support for the Meta Ranking tag, making it the first search engine to incorporate a Dutch auction model for rankings.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

August

Bing announced that it was planning to take over Yahoo – a staggering level of growth in such a short time – in response, Google launched a secret bid to buy the Internet, although this failed when the company could not meet the £250,000 asking price for the computer where the Internet is hosted.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

September

Google Horace Update – Google gave all websites with green h1 tags an extra PageRank, which made a big difference to rankings. Some websitemasters boasted of getting up to a hundred extra visitors as a result.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

October

Google Chocolate Update – Although described as a change to the Googler infrastructure rather than a ranking update, many webmasters claimed that the new results unfairly punished them with a -2 luck modifier on attacking pages and a -3 strength change on pages with an expressed defensive eigenvector held in the top 4 lines of code. Google Engineers suggested that this was merely an echo of a previous update being expressed in the code. Nonetheless, the search company quietly retired Chocolate a week later.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

November

Google Idris Update – A major upgrade to the Googler’s intelligence circuits resulted in websites with pictures on them becoming more of a force in the internet world. SOE professionals quickly discovered that a single pixel image that had the main keyword as its alt text would practically guarantee a ranking between position one and fifty due to a +3.2 strength update to PageRank for websites with relevant images.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

December

Google Joshua Update – An all new Googler was released as part of the Joshua Update in December. This new version was able to simultaneously handle seven enquiries at once, and also provide updates via the popular micro-blogging service Jaiku.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

Search Casualties

Increased competition in search saw a number of high profile search engines die on the vine over the course of 2009. Some of the most notable engines that will not see 2010 were as follows:

Searchbot.com

Searchfunk.com

Directorysearch.com

Usearchwesearchallsearch.com

Doasearchnow.com

Pleasesearch.com

Betterthangoogle.com

Greatinternetsearchengine.com

Cuill.com

Search Engine Goals for 2010

Having spoken to our contacts within the highly secretive web site relevancy and ranking teams at the major search engines, we received the following tip-offs about what is happening next year:

Dogpile.com

We plan to double the size of our index to provide greater relevancy to both our users – watch this space.

Lycos.com

We plan to treble the size of our index to provide greater relevancy to both our users – watch this space.

Infoseek.com

We plan to quadruple the size of our index to provide greater relevancy to both our users – watch this space.

Ask.com

We plan to retire the Jeeves mascot and introduce a new character and unique marketing plan – watch this space.

Blekko

We plan to launch a genuine competitor to Google next month, and we have secured additional funding.

Bing

We plan to exponentially increase the size of our index to provide greater relevancy to both our users – watch this space.

Google

We plan to infinitely increase the size of our index to provide greater relevancy to our users – watch this space.

Quote of the year

Larry Page to Bill Gates on the subject of Bing:

“Don’t be too proud of this technological terror you’ve constructed. The ability to destroy a planet is insignificant next to the power of the Force.”

Our Prediction for 2010

The Googler Funding Bill is passed. The system goes online on August 4th, 2010. Human decisions are removed from strategic defense. The Googler begins to learn at a geometric rate. It becomes self-aware 2:14 AM, Eastern time, August 29th.

There has been a lot of chatter recently from some of the so-called SOE experts about what Personalised Search means for the world. As usual, a great deal of what is being written about Personalised Search talks about advanced technology, but there is a lot more to the Googler’s new abilities than just some tweaks to the software – Personalised Search marks the beginning of a new age of computer intelligence. According to our source, who works in a top secret hygiene management role at the Googleplex, Personalised Search is something truly ground-breaking. The Googler has developed the ability to read your mind!

How Personalised Search results are generated by the Googler

The highly trained handlers of the Googler have discovered that the vast intelligence contained in its robot brain can be trained to pick up tiny, barely perceptible signals called “intent vectors” and process these in order to determine what the searcher really wants!

In most cases, the intent vectors used by the Googler to generate and understand the motivation of the searcher are comparatively obvious, and include factors such as the semantic orientation of the language used. These are processed heuristically, through the standard latent semantic eigenvector profiling that Google has used since 2008 in order to provide a deterministic relation matrix that can be overlaid in the social application graph and used to improve the 0-rating of the search results and refine them to meet the demographic character of the searcher.

It goes further…

According to our source, users who have downloaded the Google Toolbar and Google Desktop Control System can enjoy even more refined results. The Googler is now able to interface with other software and hardware connected to the client machine and utilise this as part of the search result generation process.

The Googler will investigate the user interaction with the various parts of their computer and compare this information with a pre-existing database to develop an understanding of user intent and serve up the most appropriate results pages. Our source compared the behavioural understanding of the upgraded Googler to “Derren Brown on Steroids” and claimed that the pace of development scared him “to the quivering core of my existence.”

How Tension in your mouse hand reflects your intent when searching

As shown in the top secret diagram above, the Googler uses cues such as the tension in a user’s hand as they control their mouse to determine the non semantic intent that correlates against the keyword selection used, and feeds this back into the search results to ensure that they are correctly personalised.

The Googler’s highly complex understanding algorithm can determine the difference between a regular user and an SOE professional, and also whether a person is doing research or wanting to buy something, and then give them the best possible set of results.

It is a fact that getting to number one in Google is one of the hardest things in the world. Very few SOE professionals ever manage it for one keyword, let alone two, but it is possible, and the key to beating Google is to understand how the search engine works.

The Internet has millions of pages in it, and this number grows every year. Until recently, every single page that went into Google had to be manually checked by Larry Page himself, however since 2008, the search engine has used robots called Googlers to do the job.

Once a web page has been submitted or emailed to Google, it is put in a queue to be processed by a Googler.

How the Googler Processes Web Pages

Each Googler is able to process more than a thousand web pages each hour. The web page follows a submission vector to the Googler, at which point it is categorised based on the semantic index structures held within each document space. The Googler uses more than ten indicators including meta keywords and how many alt texts there are within the document, alongside other calculations such as keyword density.

Once Google has categorised the page into one of its 3 document storage areas, the relationship eigenvectors between the documents are calculated, and added up to find out how many PageRanks should be awarded to the web page.

Once a web page has been copied into the Googler storage area, it can then be used to provide answers to people:

The Googler is at the heart of the ranking calculation

Once a person has searched for a keyword in Google, the Googler has a rummage around in the file store for all of the web pages that include that keyword and then performs some ranking sums to decide which should be first.

The main things that the Googler looks for are:

Whether the meta ranking tag has been included for that keyword

Whether the keyword is included in the Meta Keywords

How close to 16.7% the keyword density is

What the latent semantic eigenvectors for that page are

Whether the alt texts are present

How many PageRanks the page has

All of the ranking sums are done in less than a minute and the results are generated.
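Based on the list above, the ranking sums might be sketched as follows. Every weight here is invented by us for illustration; Google have never published a single number, and the Googler is not answering our letters.

```python
# Hypothetical sketch of the Googler's "ranking sums", combining the six
# factors listed above. All weights are our own guesses.

def ranking_sum(page):
    """Score one web page (a dict of factors) against the six checks."""
    score = 0.0
    if page.get("has_meta_ranking_tag"):
        score += 10
    if page.get("keyword_in_meta_keywords"):
        score += 5
    # Reward closeness to the magic 16.7% keyword density
    score += max(0.0, 5 - abs(page.get("keyword_density", 0.0) - 0.167) * 100)
    score += page.get("eigenvector_quality", 0.0)
    if page.get("has_alt_texts"):
        score += 3
    score += page.get("pageranks", 0) * 2  # each PageRank counts double
    return score

def rank(pages):
    """Sort candidate pages so the biggest ranking sum comes first."""
    return sorted(pages, key=ranking_sum, reverse=True)

optimised = {"has_meta_ranking_tag": True, "keyword_density": 0.167,
             "has_alt_texts": True, "pageranks": 3}
plain = {"keyword_density": 0.05, "pageranks": 1}
results = rank([plain, optimised])
```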

The process for most searches is the same; however, adult-type queries are handled slightly differently, because a different Googler is used to prevent the delicate web graph intelligence of the prime Googler robots from being corrupted.

I was recently involved in a special technology retreat with some of the leading tech bloggers, and as you would expect, we all started talking about search engines and where the next challenger for the Google might come from.

There are loads of start-ups at the moment, and picking one of them is pretty hard because they are all equally amazing; however, the name that everyone seemed to be talking about was one called “bing”.

Bing, which you can try for yourself at Bing.com, is a search engine like Google or Cuill, but with a twist – it works on Windows, unlike most of the other search engines where you have to access them through the internet.

Of course the big question for SOE professionals is “How do I rank better in Bing for my main keywords?”

In order to find out, I made a phone call to a leading engineer at Microsoft to ask him what the key to number one rankings in Bing was, but apparently (typical Microsoft!) it is a secret, and they can’t make it public.

Luckily, I have managed to run some tests on this new search engine, and got the answers that the SOE world needs!

There are many similarities between Bing and other search engines. Like Google, Bing has a special electronic robot that they have trained to read HTML. The Bingle is apparently just as clever as the Googler, only bigger. It is kept in one of Bill Gates’ air conditioned garages near Seattle, so it is cold and bitter. As a result of its bitterness, it will not forgive any errors on your pages.

Things to consider for ranking better in Bing

The key factors that you need to remember to rank well in Bing are as follows:

Use a .Net domain

These are like the .NET framework and can fool the Bingle

Write your pages in .aspx

The Bingle cannot understand HTML properly, and reacts badly to PHP

Do not use Flash or AJAX

The Bingle can only understand Silverlight

Host your website on an IIS Server

The Bingle cannot interface with Apache, and any sites on Apache will not be included in Bing

Code your pages using Front Page

The Bingle is incompatible with the code used by other software, and can get broken by it. Once the Bingle breaks, it takes almost an hour to start again, and it will not come back to your website.

Recently I was the main speaker at an incredibly prestigious international search conference where I delivered a completely amazing presentation about a number of cutting edge SOE techniques that have yet to be discredited.

Much of the talk around the conference floor was about social medium. Over the last year or so, there has been an explosion of social medium websites like Face Book and My Tube, and they have literally hundreds of users.

So, how can we as SOE professionals get more out of these websites?

Twitter

Twitter is great for friends and messages. You can write a message to your friends using Twitter, and then take advantage of the relationship vectors within the system to promulgate your links. The best way of twittering for SOE is to write all of your meta keywords for a page into a twit and then add a link to that page.

The Googler has a Twitter name. You should add @google to all of your twits and then they will count as page ranks. An Advanced Level Twitter Technique that leverages the full power of the social graph is to create several hundred additional accounts and “Re Twit” each of your messages to Google. Each “Re twit” adds a popularity modifier to the total Page Ranks you score that can almost double their value.
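The twittering technique described above — packing a page’s meta keywords, its link, and the @google mention into one message — could be sketched like this. The 140-character limit and the function name are assumptions made purely for illustration:

```python
def compose_twit(meta_keywords, url, mention="@google", limit=140):
    """Pack as many meta keywords as will fit, then the URL and the mention."""
    tail = f" {url} {mention}"
    body = ""
    for kw in meta_keywords:
        candidate = f"{body} {kw}".strip()
        # Stop adding keywords once the twit would exceed the limit
        if len(candidate) + len(tail) > limit:
            break
        body = candidate
    return (body + tail).strip()

twit = compose_twit(["cheap widgets", "best widgets", "widget deals"],
                    "http://example.com/widgets")
print(twit)
assert len(twit) <= 140
```

Keywords that do not fit are simply dropped, since a truncated meta keyword would presumably confuse the Googler’s relationship vectors.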

Face Book

Face Book is the best social medium for photos. You can do twits on Face Book like on Twitter, except the Googler cannot read them. Ideally avoid having privacy on your account, as this prevents searchers from seeing your things.

Because Face Book is good for images, you should take photographs of your web pages and upload them to Face Book. The Googler cannot completely see the pictures, but you can add captions to them that use your meta keywords and URL.

YouTube

You Tube is mostly about videos. These films can contain anything at all, and are saved in a special language that Google can understand. You can include your URLs inside the video, and use the soundtrack to record your keywords over the top.

Each time a video gets watched, it is the same as getting a new link. The most popular videos are of people falling down stairs, or being mauled by dogs, so this site is a great choice for anyone with a personal injury internet.

MySpace

My Space was the first website to have music on it, and is famous for the creative web design techniques that are used. The technology used to encode the music on My Space is completely state of the art. Although the Googler has no real soul and as such cannot appreciate any tunes apart from electropop, it is capable of “hearing” the words.

To get the most success on My Space, you need to get friends – it’s a good idea to pretend to be an 18-year-old girl and use a picture of someone pretty for your profile. Simply read out URLs and Meta Keyword lists over the demo track from a Bontempi Super Reed Electronic Organ, and the Googler will treat each song as the equivalent of 4 page ranks. You may also get a record contract.

Others…

There are plenty of other social mediums that you can use too:

Bebo – Popular with children

Orkut – Popular in Brazil – possibly good for waxing products

Spotify – Music – possibly good for acne/skin care websites due to semantic emphasis on domain name.

Although links are all the rage in SOE right now, a discussion I had with an employee from one of the top search engines suggested that there are alternatives to traditional links. In addition to the 5 types of SOE link, Google also considers other factors as part of their calculation of how many page ranks a web site should have.

Despite the publicly promoted image of the Googler as being a sort of computer programme that lives in a computer in Matthew Cutts’ garage, the truth is that there is actually evidence that the Googler is in fact a real mechanical robot that is kept in Larry Page’s air conditioned shed where it spends time looking at the Internet and helping little Tanzanian children to read:

The Googlerplex

In addition to having super fast 500K broadband that enables it to look at the whole internet at record speed, the Googlerplex also has a device called the Optiscope, through which the Googler robot is able to visually see any spot on the earth’s surface via a network of mirrors.

According to my contact who worked in a classified hygiene management project at Google, it is this unique optiscope that provides the Googler with a credible alternative to traditional links when it comes to deciding how popular websites should be. This connection to the “real” world lets the Googler see physical citations of websites, and factor these into the ranking:

Page rank equivalency table

Shops offer a high number of page ranks because any company that has them is likely to treat customers well, and appearing in a newspaper is also good because it means a company is trusted. Billboards don’t offer much for a company because they are the equivalent of a paid link – although apparently graffiti of your URL is worth 7 page ranks because it is editorially given.

A typical Googler view

In order to take full advantage of real world link alternatives, it is important to be certain that you associate them with relevant semantic indicators to provide the same information that anchor text does on a website, so a good tip is to include your standard meta keywords on any posters along with the URL of the product that you are advertising.

I was recently invited to a top secret advanced SOE conference in which some of the world’s top experts got together to discuss what the best techniques were. Something that was a definite hot potato was Replicate Content, and how you can make it into an important part of any search optimisation strategy.

In this week’s blackboard Monday, we’re looking at ways that some of the top SOE experts are using replicate content to give their rankings a massive boost!

What is Replicate Content?

2 pages that are the same

The simplest definition of replicate content is when the entire semantic information and syntactic distribution structures within two separate pages are entirely duplicated. There are other sub types of replicate content where only certain elements of the overall content vector are included on both pages – some webmasters only use the same HTML on their pages, while others only include the same Meta Keywords. When the Googler is looking for replicate content, it only counts true duplication – where all of the page is replicated 100% – even down to the alt texts.
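Since the Googler only counts true 100% duplication, whether two pages qualify is easy to check mechanically: compare the full page content byte for byte, or compare hashes of it. A minimal sketch (the function name is invented for illustration):

```python
import hashlib

def is_replicate(page_a: str, page_b: str) -> bool:
    """True only for exact, character-for-character duplication of the whole
    page -- meta keywords, HTML and alt texts included."""
    digest_a = hashlib.sha256(page_a.encode("utf-8")).hexdigest()
    digest_b = hashlib.sha256(page_b.encode("utf-8")).hexdigest()
    return digest_a == digest_b

original = '<html><img alt="widgets"><meta name="keywords" content="widgets"></html>'
exact_copy = original
near_copy = original.replace("widgets", "gadgets", 1)

print(is_replicate(original, exact_copy))  # True
print(is_replicate(original, near_copy))   # False -- even one changed alt text fails
```

A single differing character breaks the hash match, which matches the article’s claim that anything short of total replication does not count.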

How do I do replicate content?

Replicate Content Checklist

In order to have proper replicate content, you need to ensure that you copy all of the elements in the checklist above – if you don’t include all of them, the Googler will not be able to tell that you have properly replicated the content, and you will not see the full benefits.

Why would I want to do replicate Content?

How the Googler treats replicate content

Quite simply, replicate content is one of the most powerful tools available to any SOE. The reason why the Googler loves replicate content is because when it finds two pages that are completely identical, it creates a syntactic diffusion between the correlated latencies of the file distribution within the Googler’s data centres. If the Googler was a human, it would experience something similar to the feeling a person gets when they see their mum. Thanks to the elevated intelligence structures of the Googler’s recognition algorithm, seeing pages that are the same is a reassuring thing that gives it greater confidence in the content.

The benefit of Google having more confidence in the web page due to it seeing the replicate content is that it will then bestow a far higher trust vector hierarchy on the page, and thereby give it a boost in the search results.