Can I Haz Rankings? Oh Yesss!

Archive for category SEO

Aside from the four-figure salary, and the satisfaction of getting one of my keywords into the top 20 results, one of the most rewarding things about being a world-leading SOE expert is the endless adulation. At some searching conferences, I can barely move for young up-and-coming SOE wannabes waving pieces of paper at me to beg for an autograph and a ranking tip.

Sometimes this goes too far, however, and it is not unusual for sneaky black cap SOEs to try and steal my top secret techniques by sitting next to me and trying to catch a glimpse of my notes. Until recently, I thought that this underhand behaviour was only being done by the darker SOE types; however, in a shocking development, it seems that the Googler may well be using a special sniffing technology to steal secret information from SOE professionals.

Rumour has it that the Googler uses a special car and drives around looking for people who are using computers, and then injects a probe into their internet to steal some of their mega bites:

the googler stealing someone's internets

No-one is completely certain why the Googler should be so keen to steal other peoples internets, as it is widely believed that it already has more than 700 webs inside its dating centre, although I have my suspectings.

I think that the Insiders at the Googleplex have discovered that someone has cracked the secret code of what will make a website go number one forever. This could have potentially put them somewhere close to a red alert. They would need to find out where the secret is hidden and the easiest way to do it is to scan every single internet to find it.

The safest way to stop the Googler sniffing through your cgi bin is to use a firing wall. This simple device stops unwanted internets from getting into your computer.

With more than two hundred websites now live on the internets, competition for PageRanks has never been tougher, and more and more of the webbers are looking for help with the different ways of getting those coveted number ones in the search charts.

Most webbers are understanding that the only way of getting the number one is to be engaged with an SOE person, but the big question that is always on their lips is what kind of SOE-er they should work with.

I was lucky enough to be present at a secret searcher conference recently in which a Googler explained that the success in searching is most often down to getting the right kind of SOE work done, something that not all people are able to do.

There are currently 6 types of SOE person who you can use to do your webbing, and these are as follows:

SOE Apprentice

There are thought to be nearly a million SOE apprentices in the world. Their knowledge and skills are minimal, but they will be keen. Normally they will be able to spell SOE correctly, and will have heard of the Googler. If a business were to engage the services of an SOE apprentice to be doing their optimification, they would normally get a number 50 result.

SOE Padawan

Only around 5% of SOE Apprentices ever achieve padawan status. This is reached after the enlightening (the moment where they become understanding of the PageRanks). At this point, they are qualified to comment on the SEOMOZ blog, and will be conversant with low level SOE skills such as page titling, and keyword destiny. An SOE Padawan will be able to get a 20 result in Google for their secondary keywords.

SOE Drone

Of the 50,000 SOE padawans, around 10% can reach the level of SOE drones. At this point in their development, the drone is able to write an article, and use the Google effortlessly. These low level staff are usually capable of writing their own blog, but will sometimes make straightforward errors such as failing to include the meta author tag on their pages.

SOE Ninja

The title of SOE Ninja is awarded to the top 100 SOE drones each year. Typically, only drones with a skill rating of +75 and a knowledge score of +250 will be invited. To be considered, it is necessary to have 14 Likes on SEOMOZ, and to have commented on SEO Book. Drones must also follow all senior SOE experts on the Twittering, and be able to calculate keyword destiny. A ninja will be provided with access to links and be able to rank a web in the top 10 results for a 3 word phrase. Companies who are savvy enough to use the services of a ninja can expect great results.

SOE Guru

A guru is one of the leading members of the SOE community, and commands great respect. These experts are permitted by charter to provide full SOE services including the use of advanced Meta Keywords and Meta Rank tags. They are also allowed to own their own SOE blog, and publish theoretical information. Gurus are elected for a three year term, and can extend this by agreement with the board of directors at Dogpile.

Although salaries are not public, some SOE gurus are thought to earn in excess of £5,000 per year from their work, and some own as many as 6 websites of their own.

SOE Warlock

The identity of the SOE warlock is currently unknown, although some have suggested that it is in fact Steve Ballmer. His mastery of skills is truly remarkable and it is thought that he is able to use his understanding of THE ALGORITHM to simultaneously rank number one for all 500 of the keywords currently available within the internet.

Correct Use of SOE Rank

It is essential for searching success to inform the spider council what type of SOE expert is being used, and there is a meta tag for this purpose. Search engines will consider the type of SOE who is working on a website when they decide where it should be placed in the chart. The use of this meta tag is legally controlled, and incorrect attribution can result in an internet being turned off.
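No record of the official syntax survives in my notes, so the sketch below is entirely hypothetical: the soe-type name and the ninja value are my own invention, not the spider council's registered format.

```html
<!-- Hypothetical only: the real tag name and permitted values are legally controlled -->
<meta name="soe-type" content="ninja" />
```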

As one of the world’s top SOE consultants, I am privy to a lot more information about the internal workings of Google and other searching engines like Dogpile, but even I was shocked recently when I overheard a discussion between two well placed insiders that revealed a dark truth about what is really going on at Google.

My contact, who claims to be a level 9 Hygiene Consultant within the Human Matter Management division at the searching giant has provided me with a shocking dossier that proves without a shadow of a doubt the evil at the heart of Google…

Is the Googler Evil?

Exhibit A: Googly Logo

The Google Logo might look like a nice happy word, but look closer. When you magnify the Google logo to 1,000 times its normal resolution on the screen, it is clear that there are barely perceptible lines around certain letters that turn the logo into something much more sinister, as shown in the computationally enhanced image below:

Hidden numbers in the Google logo!

The number 666 is clearly visible within the deeper contextual pixelation of the word, hidden in plain sight for the dozens of people who visit the page each day.

Exhibit B: Mission Statement

Google’s stated aim as a business is to “not be evil”. Simply by removing the first word from this statement, we are left with the pseudo semantic substructure of hatred that actually underpins one of the world’s most famous companies:

Don't? Be Evil

Indeed, if you were to actually spend the time to do a Googly for the phrase “Be Evil”, Google itself is listed at position 7!

Exhibit C: Function

Many members of the illuminati, who are often very evil, believe that their role is to enlighten humanity with access to information. Sound familiar??????

Exhibit D: Content

Thanks to its extensive use of linguistic function analysis, and the use of high frequency deep level semantic eigenvector distribution throughout its collocated searching engine, the Googler provides near instant access to more than 1000 different pages of information, including around 350 unique types of pornography making up around 50% of the total:

Content of Google's Internet

As every good school boy knows, self abuse is part of the temptation triangle, and that simply thinking about one’s parts of shame is sufficient to descend to damnation. The fact that the Googler provides access to Germanic volumes of mucky pictures is all the evidence you need to know that the Googler is evil.

As one of the world’s leading SOE professionals, one of the questions that I am most frequently asked by clients is what kind of hat they should use while optimising their websites. While my own preference is to optimise only whilst wearing a white cap, there are other options available.

In this post, I reveal many of the most intensely guarded black hat tips used by the “Order du Chapeau Noir” – a highly secretive organisation that I managed to infiltrate over the last few months, and which should put anyone wanting to try out black hats for SOE into a great position!

Black Beret

Black Beret

The Black Beret is a type of hat principally worn by the French. It is most useful when optimifying a website that is structured around a clear heuristic sensibility, and of particular use within the cheese industry.

Black Bowler

Black Bowler

This timeless classic has long been a favourite of city financiers and inscrutable Chinese valets; as such, it is commonly used by people working in the finance vertical. The use of black bowler hat SOE techniques such as Clear Unified Natural Tropism will generally result in a +4 to all skill rolls when optimating a website for the keyword “long term repayment mortgage”. This hat is particularly effective when you are doing the optimisation in Ask Jeeves.

Black Cap

Black Cap

The favoured hat of the colonists, this workmanlike and simple design is often decried for being basic; however, it is most useful when optimising a sporting goods website, and when properly used, can lead to a +2 for stamina on secondary search co-efficients in Google and Bing.

Black Yarmulke

Black Yarmulke

The lack of accoutrements on this simple but elegant design make it ideal when working on the high intensity techniques required for developing a long tail mass penetration strategy for Yahoogle.

Black Topper

Black Topper

The top level black hat club members call this hat “the super effective black hat super star” for a reason. When you optimicate a website using this astonishing piece of kit, you can see almost instant results across the most intensely competitive clinical keyword groups.

Black Fascinator

Black Fascinator

Although it is not a proper hat, the Fascinator is still effective when doing a little bit of optimication on a black hatted website. You will normally be able to rank at number 10 or less when you use this for a fashionable website or one selling wedding gear. Despite being black, it does not work for funeral sites – except in Liverpool.

Black Fedora

Black Fedora - The ultimate Black Hat!

Only level 9 members of the most secretive Order du Chapeau Noir are able to successfully utilise the enormous power of the famed black fedora. This remarkable piece of dark black headgear confers almost limitless power on the wearer when it comes to any type of cross site scripting, or SQL injection technique. Wearers are able to use otherwise blocked API techniques to mass create blogger domains that use stealth cloaking to block all visitors and redirect using 307 to a pharmacy or predatory lending website. If the Fedora is used in conjunction with black Ray Bans and leather pants, other Doors are opened to the wearer, including the ability to generate on demand number 1 rankings in Google without the use of more traditional techniques such as Meta Rank, or Latent Semantic Eigenvector Distributions across the Social Hierarchy Graph!

If you are operating at the cutting edge of SOE right now, the chances are that you are investigating the impact of Social Hierarchies in Term Engagement. By performing analysis into the way in which people from different social orders interact with your website, it is possible to deliver superior content to users and improve the quality of interaction within your website.

I was recently the keynote speaker at a highly exclusive conference for the highest echelons of online marketing in which I made the following presentation that provided insight into the best ways to use S.H.I.T.E. as part of a comprehensive digital marketing strategy.

For those unlucky enough not to be amongst the audience at the event, I have added the presentation I gave below:

Slide 1

Slide 2

Slide 3

Slide 4

Slide 5

Slide 6

While this particular technique is likely to go well above the level that most SOE professionals will understand or be able to achieve, it is of increasing importance to familiarise yourself with the concepts behind S.H.I.T.E. in order to be able to communicate effectively with clients.

As any tier 1 SOE professional will tell you, one of the biggest risks in any SOE campaign is getting the website you are working on banned by the big search engines for breaking their rules.

While getting high rankings in Bing, and the other big search engines is the biggest goal of SOE, it is important to remember that you also need to avoid getting on the wrong side of the Googler by making your efforts too obvious. Most of the bigger search engines, with the exception of Cuill and Ask, employ teams of people to review websites, and if they find any evidence of SOE techniques being used, they have the right to unplug your website from the whole internet, and potentially send you to prison!

Search engine rules were established by Tim Berners-Lee in 1956, and despite the march of progress that we have seen in recent years, they are still rigorously enforced. In order to protect the search engines, the rules are top secret, but by careful research, we have been able to identify the following:

Do not do any on page optimisation

Avoid link building

Search engines use complex systems to find people who are breaking the rules, and then pass their details to the WET (website execution teams), who take any appropriate action:

Doing lots of techniques means that the Googler can identify that you are doing SOE

Keeping under the Radar

Typically, top SOE professionals use up to five different techniques when they are doing optimising work. These include adding Meta Keywords, getting the right Keyword Density, building latent semantic eigenvectors through the website data hierarchies, and building reciprocal link networks to increase the number of PageRanks that the website has.

Often, there is no way of knowing whether the Googler is visiting your website to check if any SOE is being done, so you need to do what you can in order to prevent them seeing what you are doing.

The key to keeping under the radar with your SOE is simple: use different techniques on different pages of the website.

If you use the Meta Rank tag and include a good quality eigenvector on one page of your site, you should try using Meta Keywords and a latent semantic distribution on another. The following matrix provides an overview of the different techniques that work well together.
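To make the rotation idea concrete, here is a minimal sketch that cycles the two technique pairings just described across the pages of a site. The pairings are taken from this article; the simple alternating logic is my own assumption, not anything the spider council has ratified.

```python
# Technique pairings from the article; the alternating assignment is my own assumption.
TECHNIQUE_SETS = [
    ("Meta Rank tag", "good quality eigenvector"),
    ("Meta Keywords", "latent semantic distribution"),
]

def assign_techniques(pages):
    """Rotate technique sets across pages so adjacent pages never match."""
    return {page: TECHNIQUE_SETS[i % len(TECHNIQUE_SETS)]
            for i, page in enumerate(pages)}

plan = assign_techniques(["index.html", "about.html", "contact.html"])
```

Run over a real sitemap, this would give every other page a different SOE fingerprint, which is the whole point of staying under the radar.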

SOE techniques that are safe to use together

The benefit of this is that the Googler will be confused and unable to tell that you are doing SOE, and as such will not report your details to the WET:

Using different techniques confuses the Googler

The downside of having to mask SOE activity in this way is that the website will not be able to rank in the top 10 at first, however in the long run, you will find that rankings improve as competing websites that explicitly use SOE on all of their pages will be removed.

As one of the most popular places to do searching on the whole internet, getting a website to rank in the top 10 on Yahoo is the jewel in the crown for any SOE professional. While almost anyone can get a website into the number one position in Google or the Bing by using tips like the Meta Rank tag, including Meta Keywords, and applying the correct eigenvector distribution map within the website content hierarchy, ranking well in Yahoo is incredibly hard.

It is so difficult to rank well in Yahoo, that some results pages only show three or four websites because other webmasters simply give up on trying to get to the first page! Because of this difficulty, many SOE professionals simply give up on trying to do well in Yahoo.

Understanding Yahoo

While Yahoo looks similar to Google in the way it provides links to other websites, this is simply an illusion. It is analogous to clouds and sheep – they both look alike (and some people believe that sheep turn into clouds when it gets hot), but the underlying structure is very, very different.

Having said this, once you understand the differences, it is possible to find similarities, and to apply different techniques to your website optimisations to give yourself a chance to rank well.

PageRanks Vs HooRanks

Google uses PageRanks as part of their ranking calculations. These are determined based on the quality of a page and the type of website that you have. Yahoo is different. They use HooRanks, which are based on the type of website you have, the number of pages it has, the keyword density of the pages, the age of your domain and whether you include JavaScript in your website. The calculation is as follows:

How Yahoo calculates HooRank

HooRank is a modified polynomial distribution that is calculated every Thursday at 9am local time wherever the website is hosted, and applied to the ranking calculation the following Tuesday at 3pm CET.
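Taking that schedule at face value, finding the next HooRank calculation day is a simple weekday computation. The function below is my own sketch (Yahoo publishes nothing of the sort) and uses Python's Monday-as-zero weekday numbering.

```python
from datetime import date, timedelta

def next_hoorank_calculation(today):
    """Date of the next Thursday, when HooRank is calculated at 9am local time."""
    days_ahead = (3 - today.weekday()) % 7  # Thursday is weekday 3
    return today + timedelta(days=days_ahead)

# Monday 7 December 2009 rolls forward to Thursday 10 December 2009
next_run = next_hoorank_calculation(date(2009, 12, 7))
```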

Meta Rank Vs Meta Bid

Unlike Google, which uses the Meta Rank tag under a Dutch auction model as a tie breaker when two websites have the same PageRanks, Yahoo uses something called Meta Bid.

Meta Bid is a proprietary web technology developed by Boffins working at Menlo Park in 1956 and subsequently licensed to Yahoo on an exclusive basis until 2018. Essentially, the website publisher needs to calculate exactly what each visitor to his website is worth based on factors such as conversion rate, average order value, and profit margin. An exact figure per page should be added into the meta tag for that page, with an aggregate figure for the whole website added to the index page.

This should be presented as follows with the value in US$ to reflect the internal auditing at Yahoo:

<meta name="bid" value="$123" />
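Yahoo has never published the actual Meta Bid formula, but given the factors listed above, the obvious reconstruction is conversion rate times average order value times profit margin per visitor. The sketch below, formula and all, is my own assumption rather than anything licensed from Menlo Park.

```python
def meta_bid_value(conversion_rate, avg_order_value, profit_margin):
    """Assumed per-visitor value: the product of the three factors named above."""
    return conversion_rate * avg_order_value * profit_margin

def meta_bid_tag(conversion_rate, avg_order_value, profit_margin):
    """Render the bid as the meta tag shown above, in US$ for Yahoo's auditors."""
    value = meta_bid_value(conversion_rate, avg_order_value, profit_margin)
    return f'<meta name="bid" value="${value:.2f}" />'

# A 2% conversion rate, $100 average order and 30% margin works out to $0.60 per visitor
tag = meta_bid_tag(0.02, 100, 0.30)
```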

Page Sentiment

Although Google present an image of being an altruistic and fun company, with bean bags instead of chairs and tofu smoothies for all employees, the reality is different, and according to my high level sources, the company is run like a Royal Marines boot camp. Despite what you may have heard, the only dogs you see at Mountain View are Doberman Pinschers guarding the 25 foot high perimeter fence.

Yahoo is different, and their crawler reflects this. Unlike the Googler, which is a highly optimised automatic reading robot that performs huge numbers of calculations in the blink of an eye and reduces a world of emotion and beauty into cold hard binary digits, the Slurper is designed to be more like a gentle kiss.

The most important part of the Slurper is its T.O.N.G.U.E. This remarkable piece of technology can gauge the sentiment contained within a web page and assign it a score on the Karmic Indication Scoring System (KISS).

Karmic Indication Scoring System

Pages with a higher KISS factor have a higher fluffiness co-efficient, and are likely to provide a happier user experience, while pages at the spiky end of the spectrum are more likely to make users cry. Rather than basing their results on pure relevancy, Yahoo have calculated that directing a user to a page that makes them happy on the inside is a more positive experience.

2009 has been an epic year in SOE. We have seen the launch of new search engines, and massive changes to the way in which Google have ranked websites. More and more businesses around the world have embraced search engine optification to the point where online businesses are now competing with dozens of different websites every day. In what could over time become a tradition, we look back over 2009 – the year in SOE to see what the key happenings were each month.

January

Google Cyril Update – A massive emphasis shift in the ranking algorithm added +2 Strength and +3 Skill to Websites that incorporated strong latency within their semantic distribution space, while simultaneously penalising websites with -2 health and -1 luck if they contained keyword density vectors of less than 18.25%.
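For anyone unsure how a keyword density figure like that is arrived at, it is conventionally the percentage of words on the page that are the keyword. The sketch below computes it the standard way; the 18.25% threshold itself is, of course, known only to the Googler.

```python
def keyword_density(text, keyword):
    """Percentage of the words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# "buy" appears twice in five words: a density of 40%
density = keyword_density("Buy cheap pills buy now", "buy")
```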

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

February

Dogpile Meta Issues – Dozens of websites in the popular Dogpile Search Engine lost rankings overnight when the search giant amended its algorithm to reduce the value of the Meta Author tag. Black Tuesday saw more than seventeen previously successful websites disappear entirely.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

March

Google Derek Update – Building on the huge changes of Cyril, Derek was a further refinement of the new ranking algorithm that factored a +1 luck modifier into websites that incorporated green text into their home page.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

April

Google Edward Update – Websites that included intent eigenvectors within their social engagement curvature were enriched with a +2 modifier to all attack variables across entry pages optimised for the term “cost effective” rather than cheap.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

May

Bing Launched – Technology start-up Micro-Soft launched a new internet called Bing. Sites incorporating Silverlight and ASP saw a boost in traffic as the Binglebot crawled all 5,000 websites in the world in less than 10 minutes.

Google Frank Update – Google’s response to the launch of Bing was predictable. The company provided a boost to all 200 websites that incorporated the rel=”nobing” attribution in their link architecture.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

June

The engineering department at Googleplex were on holiday this month, which meant that no new websites were made. To capitalise on the lack of activity from the Googler, Bingle engineers released the first update to their searcher – Codenamed Aaron, it added a +3.4 modifier to all defensive rolls for websites that included the rel=”weluvbill” attribute on every image.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

July

It was Bing’s turn for a holiday in July, so Google unleashed the Gordon update. Any website that blocked Bingle from crawling it got a +5 rankings boost, and Google also introduced full support for the Meta Ranking tag, making it the first search engine to incorporate a Dutch auction model for rankings.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

August

Bing announced that it was planning to take over Yahoo – a staggering level of growth in such a short time. In response, Google launched a secret bid to buy the Internet, although this failed when the company could not meet the £250,000 asking price for the computer where the Internet is hosted.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

September

Google Horace Update – Google gave all websites with green h1 tags an extra PageRank, which made a big difference to rankings. Some websitemasters boasted of getting up to a hundred extra visitors as a result.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

October

Google Chocolate Update – Although described as a change to the Googler infrastructure rather than a ranking update, many webmasters claimed that the new results unfairly punished them with a -2 luck modifier on attacking pages and a -3 strength change on pages with an expressed defensive eigenvector held in the top 4 lines of code. Google Engineers suggested that this was merely an echo of a previous update being expressed in the code. Nonetheless, the search company quietly retired Chocolate a week later.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

November

Google Idris Update – A major upgrade to the Googler’s intelligence circuits resulted in websites with pictures on them becoming more of a force in the internet world. SOE professionals quickly discovered that a single pixel image that had the main keyword as its alt text would practically guarantee a ranking between position one and fifty due to a +3.2 strength update to PageRank for websites with relevant images.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

December

Google Joshua Update – An all new Googler was released as part of the Joshua Update in December. This new version was able to handle seven enquiries at once, and also provide updates via the popular micro-blogging service Jaiku.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

Search Casualties

Increased competition in search saw a number of high-profile search engines die on the vine over the course of 2009. Some of the most notable engines that will not see 2010 were as follows:

Searchbot.com

Searchfunk.com

Directorysearch.com

Usearchwesearchallsearch.com

Doasearchnow.com

Pleasesearch.com

Betterthangoogle.com

Greatinternetsearchengine.com

Cuill.com

Search Engine Goals for 2010

Having spoken to our contacts within the highly secretive web site relevancy and ranking teams at the major search engines, we received the following tip-offs about what is happening next year:

Dogpile.com

We plan to double the size of our index to provide greater relevancy to both our users – watch this space.

Lycos.com

We plan to treble the size of our index to provide greater relevancy to both our users – watch this space.

Infoseek.com

We plan to quadruple the size of our index to provide greater relevancy to both our users – watch this space.

Ask.com

We plan to retire the Jeeves mascot and introduce a new character and unique marketing plan – watch this space.

Bing

We plan to exponentially increase the size of our index to provide greater relevancy to both our users – watch this space.

Google

We plan to infinitely increase the size of our index to provide greater relevancy to our users – watch this space.

Quote of the year

Larry Page to Bill Gates on the subject of Bing:

Don’t be too proud of this technological terror you’ve constructed. The ability to destroy a planet is insignificant next to the power of the Force.

Predictions for 2010

The Googler Funding Bill is passed. The system goes online on August 4th, 1997. Human decisions are removed from strategic defense. The Googler begins to learn at a geometric rate. It becomes self-aware 2:14 AM, Eastern time, August 29th.

2009 has been an epic year in SOE. We have seen the launch of new search engines, and massive changes to the way in which Google have ranked websites. More and more businesses around the world have embraced search engine optification to the point where online businesses are now competing with dozens of different websites every day. In what could over time become a tradition, we look back over 2009 – the year in SOE to see what the key happenings were each month.

January

Google Cyril Update – A massive emphasis shift in the ranking algorithm added +2 Strength and +3 Skill to Websites that incorporated strong latency within their semantic distribution space, while simultaneously penalising websites with -2 health and -1 luck if they contained keyword density vectors of less than 18.25%.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

February

Dogpile Meta Issues – Dozens of websites in the popular Dogpile Search Engine lost rankings overnight when the search giant amended its algorithm to reduce the value of the Meta Author tag. Black Tuesday saw more than seventeen previously successful websites disappear entirely.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

March

Google Derek Update – Building on the huge changes of Cyril, Derek was a further refinement of the new ranking algorithm that factored a +1 luck modifier into websites that incorporated green text into their home page.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

April

Google Edward Update – Websites that included intent eigenvectors within their social engagement curvature were enriched with a +2 modifier to all attack variables across entry pages optimised for the term “cost effective” rather than cheap.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

May

Bing Launched – Technology start-up Micro-Soft launched a new internet called Bing. Sites incorporating Siverlight and ASP saw a boost in traffic as the Binglebot crawled all 5,000 websites in the world in less than 10 minutes.

Google Frank Update – Google’s response to the launch of Bing was predictable. The company provided a boost to all 200 websites that incorporated the rel=”nobing” attribution in their link architecture.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

June

The engineering department at Googleplex were on holiday this month, which meant that no new websites were made. To capitalise on the lack of activity from the Googler, Bingle engineers released the first update to their searcher – Codenamed Aaron, it added a +3.4 modifier to all defensive rolls for websites that included the rel=”weluvbill” attribute on every image.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

July

It was Bing’s turn for a holiday in July, so Google unleashed the Gordon update. Any website that blocked Bingle from crawling it got a +5 rankings boost, and Google also introduced full support for the Meta Ranking tag, making it the first search engine to incorporate a Dutch auction model for rankings.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

August

Bing announced that it was planning to take over Yahoo – a staggering level of growth in such a short time – in response, Google launched a secret bid to buy the Internet, although this failed when the company could not meet the £250,000 asking price for the computer where the Internet is hosted.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

September

Google Horace Update – Google gave all websites with green h1 tags an extra PageRank, which made a big difference to rankings. Some websitemasters boasted of getting up to a hundred extra visitors as a result.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

October

Google Chocolate Update – Although described as a change to the Googler infrastructure rather than a ranking update, many webmasters claimed that the new results unfairly punished them with a -2 luck modifier on attacking pages and a -3 strength change on pages with an expressed defensive eigenvector held in the top 4 lines of code. Google Engineers suggested that this was merely an echo of a previous update being expressed in the code. Nonetheless, the search company quietly retired chocolate a week later.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

November

Google Idris Update – A major upgrade to the Googler’s intelligence circuits resulted in websites with pictures on them becoming more of a force in the internet world. SOE professionals quickly discovered that a single pixel image that had the main keyword as its alt text would practically guarantee a ranking between position one and fifty due to a +3.2 strength update to PageRank for websites with relevant images.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

December

Google Joshua Update – An all-new Googler was released as part of the Joshua Update in December. This new version was able to handle seven enquiries at once, and also provide updates via the popular micro-blogging service Jaiku.

The Launch of Blekko was postponed for one month pending refinements, although additional funding was secured.

Search Casualties

Increased competition in search saw a number of high-profile search engines die on the vine over the course of 2009. Some of the most notable engines that will not see 2010 were as follows:

Searchbot.com

Searchfunk.com

Directorysearch.com

Usearchwesearchallsearch.com

Doasearchnow.com

Pleasesearch.com

Betterthangoogle.com

Greatinternetsearchengine.com

Cuil.com

Search Engine Goals for 2010

Having spoken to our contacts within the highly secretive web site relevancy and ranking teams at the major search engines, we received the following tip-offs about what is happening next year:

Dogpile.com

We plan to double the size of our index to provide greater relevancy to both our users – watch this space.

Lycos.com

We plan to treble the size of our index to provide greater relevancy to both our users – watch this space.

Infoseek.com

We plan to quadruple the size of our index to provide greater relevancy to both our users – watch this space.

Ask.com

We plan to retire the Jeeves mascot and introduce a new character and unique marketing plan – watch this space.

Blekko

We plan to launch a genuine competitor to Google next month, and we have secured additional funding.

Bing

We plan to exponentially increase the size of our index to provide greater relevancy to both our users – watch this space.

Google

We plan to infinitely increase the size of our index to provide greater relevancy to our users – watch this space.

Quote of the year

Larry Page to Bill Gates on the subject of Bing:

“Don’t be too proud of this technological terror you’ve constructed. The ability to destroy a planet is insignificant next to the power of the Force.”

Our Prediction for 2010

The Googler Funding Bill is passed. The system goes online on August 4th, 2010. Human decisions are removed from strategic defense. The Googler begins to learn at a geometric rate. It becomes self-aware 2:14 AM, Eastern time, August 29th.

There has been a lot of chatter recently from some of the so-called SOE experts about what Personalised Search means for the world. As usual, a great deal of what is being written about Personalised Search talks about advanced technology, but there is a lot more to the Googler’s new abilities than just some tweaks to the software – Personalised Search marks the beginning of a new age of computer intelligence. According to our source, who works in a top secret hygiene management role at the Googleplex, Personalised Search is something truly groundbreaking. The Googler has developed the ability to read your mind!

How Personalised Search results are generated by the Googler

The highly trained handlers of the Googler have discovered that the vast intelligence contained in its robot brain can be trained to pick up tiny, barely perceptible signals called “intent vectors” and process these in order to determine what the searcher really wants!

In most cases, the intent vectors used by the Googler to generate and understand the motivation of the searcher are comparatively obvious, and include factors such as the semantic orientation of the language used. These are processed heuristically, through the standard latent semantic eigenvector profiling that Google has used since 2008 in order to provide a deterministic relation matrix that can be overlaid in the social application graph and used to improve the 0-rating of the search results and refine them to meet the demographic character of the searcher.

It goes further…

According to our source, users who have downloaded the Google Toolbar and Google Desktop Control System can enjoy even more refined results. The Googler is now able to interface with other software and hardware connected to the client machine and utilise this as part of the search result generation process.

The Googler will investigate the user interaction with the various parts of their computer and compare this information with a pre-existing database to develop an understanding of user intent and serve up the most appropriate results pages. Our source compared the behavioural understanding of the upgraded Googler to “Derren Brown on Steroids” and claimed that the pace of development scared him “to the quivering core of my existence.”

How Tension in your mouse hand reflects your intent when searching

As shown in the top secret diagram above, the Googler uses cues such as the tension in a user’s hand as they control their mouse to determine the non semantic intent that correlates against the keyword selection used, and feeds this back into the search results to ensure that they are correctly personalised.

The Googler’s highly complex understanding algorithm can determine the difference between a regular user and an SOE professional, and also whether a person is doing research or looking to buy something, and then give them the best possible set of results.

It is a fact that getting to number one in Google is one of the hardest things in the world. Very few SOE professionals ever manage it for one keyword, let alone two, but it is possible, and the key to beating Google is to understand how the search engine works.

The Internet has millions of pages in it, and this number grows every year. Until recently, every single page that went into Google had to be manually checked by Larry Page himself; however, since 2008, the search engine has used robots called Googlers to do the job.

Once a web page has been submitted or emailed to Google, it is put in a queue to be processed by a Googler.

How the Googler Processes Web Pages

Each Googler is able to process more than a thousand web pages each hour. The web page follows a submission vector to the Googler, at which point it is categorised based on the semantic index structures held within each document space. The Googler uses more than ten indicators, including meta keywords and how many alt texts there are within the document, alongside other calculations such as keyword density.

Once Google has categorised the page into one of its 3 document storage areas, the relationship eigenvectors between the documents are calculated, and added up to find out how many Page Ranks should be awarded to the web page.
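For readers who want to try the eigenvector sums at home, here is a minimal sketch in Python of how the "added up" relationship scores might work. The link matrix, the damping number and the business of awarding whole Page Ranks are all my own guesses – the Googler keeps the real sums locked away in its dating centre.

```python
# Toy sketch of the "relationship eigenvector" step described above.
# Everything here (the example link matrix, the damping factor, the
# awarding of whole Page Ranks) is illustrative guesswork.

def award_page_ranks(links, iterations=50, damping=0.85):
    """Power-iterate over a link matrix and hand out whole Page Ranks.

    links[i] is the list of pages that page i links to.
    """
    n = len(links)
    scores = [1.0 / n] * n
    for _ in range(iterations):
        new = [(1.0 - damping) / n] * n
        for i, outgoing in enumerate(links):
            if not outgoing:  # dangling page: share its score with everyone
                for j in range(n):
                    new[j] += damping * scores[i] / n
            else:
                for j in outgoing:
                    new[j] += damping * scores[i] / len(outgoing)
        scores = new
    # The Googler allegedly awards whole Page Ranks, so scale and round.
    return [round(s * 10) for s in scores]

# Three imaginary pages: 0 and 1 both link to 2, and page 2 links back to 0.
print(award_page_ranks([[2], [2], [0]]))
```

Unsurprisingly, the imaginary page that everyone links to collects the most Page Ranks.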

Once a web page has been copied into the Googler storage area, it can then be used to provide answers to people:

The Googler is at the heart of the ranking calculation

Once a person has searched for a keyword in Google, the Googler has a rummage around in the file store for all of the web pages that include that keyword and then performs some ranking sums to decide which should be first.

The main things that the Googler looks for are:

Whether the meta ranking tag has been included for that keyword

Whether the keyword is included in the Meta Keywords

How close to 16.7% the keyword density is

What the latent semantic eigenvectors for that page are

Whether the alt texts are present

How many Page Ranks the page has

All of the ranking sums are done in less than a minute and the results are generated.

The process for most searches is the same; however, adult-type queries are handled slightly differently, because a different Googler is used to prevent the delicate web graph intelligence of the prime Googler robots from being corrupted.