Search Engine Algorithms

Despite the huge sums invested in search engines, and their ubiquity in front of Internet users, their algorithms remain rather primitive, based on keywords and various signals. Some engines are considering a move to semantic search, i.e. a better understanding of the content of sites in order to match it to queries in context. Could this work? To what extent can we understand what the user wants without a deep intrusion into his private life?
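To make the idea of "keywords and various signals" concrete, here is a minimal, purely illustrative sketch of such a ranking scheme. It is not any engine's actual formula; the blend of a keyword-match score with a single quality signal, and the `signal_weight` parameter, are assumptions for demonstration only.

```python
# Illustrative only: blend a keyword-match score with a generic quality
# signal in [0, 1]. Real engines combine hundreds of signals.

def keyword_score(query, page_text):
    """Fraction of the query's terms that appear in the page text."""
    terms = query.lower().split()
    words = set(page_text.lower().split())
    return sum(1 for t in terms if t in words) / len(terms)

def rank(query, pages, signal_weight=0.3):
    """pages: list of (url, text, signal) tuples, signal in [0, 1].
    Returns the URLs sorted by a blended score, best first."""
    scored = [
        (url,
         (1 - signal_weight) * keyword_score(query, text)
         + signal_weight * signal)
        for url, text, signal in pages
    ]
    return [url for url, score in sorted(scored, key=lambda x: -x[1])]

pages = [
    ("a.example", "search engines rank pages by keywords", 0.2),
    ("b.example", "cooking recipes for pasta", 0.9),
]
print(rank("search keywords", pages))  # a.example matches both terms
```

Even with a strong quality signal, b.example loses here because it matches none of the query terms, which is exactly the keyword-centric behavior the paragraph above describes.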

This study of search engines and SEO summarizes current knowledge about their algorithms and the effect on ranking in search results pages. Google, Bing, Yandex, Yahoo and DuckDuckGo share common rules but also have their own peculiarities.
Knowledge of these algorithms is essential for the visibility of a site. Google evaluates sites on the basis of statistics: if a site, however interesting and useful, matches criteria generally attributed to spam sites, it will be penalized.

Ranking according to a Google patent
A Google patent about techniques for calculating search results shows why a website goes to the sandbox. Google attributes to each page a score used to rank it in the list of results relevant to a search. PageRank is only one component of a page's score.
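Since PageRank is named as one component of the score, a short sketch may help. The following is a minimal power-iteration implementation over a toy link graph; the graph, damping factor and iteration count are illustrative assumptions, not values from the patent.

```python
# Minimal PageRank sketch (power iteration) over a tiny link graph.
# PageRank is only one signal among many in a page's overall score.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform rank
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # A page shares its rank equally among its outgoing links.
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page: spread its rank over all pages.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

In this toy graph, C ends up with the highest rank because it is linked to by both A and B, which matches the intuition that incoming links confer authority.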

Criteria of ranking
The criteria below are based on the safest sources, avoiding misconceptions about how results are determined by search engines.

Google has published several tip sheets for the attention of webmasters, and has held roundtables to inform them. All this information is listed below. Since the algorithm becomes more and more aggressive towards spam and uses approximate signals to identify it, we must know how to cross this minefield.