Tuesday, December 07, 2004

Human editors and web search

Danny Sullivan (Founder of Search Engine Watch) says the solution to manipulation of search result rankings is to:

... involve human editors as part of the search equation. At one time, several search engines allowed human beings to make editorial choices about what would be shown in response to a query, to complement technological selections. Today, all the major services have sadly followed Google's lead in assuming all things can be solved through automation and search algorithms.

I assume Danny doesn't literally mean human editors hardcoding which results are returned for each query. How would human editors scale to billions of web pages? How do you do this efficiently and effectively, at low cost and with high quality?

You might imagine that humans could provide canned responses to the most frequent queries. But canned responses would cover only a small fraction of queries, and even that subset would be prohibitively expensive to maintain.

A more scalable and more common form of this is shortcuts where a search engine will detect particular categories of queries and return some results from a specialized data source. This is automated, of course, but humans are involved in identifying and creating the shortcuts.
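The shortcut idea can be sketched as simple pattern routing: a human-curated table maps query categories to specialized data sources, and anything that doesn't match falls through to the general web index. This is a minimal illustration, not any actual search engine's implementation; the category patterns and source names below are hypothetical.

```python
import re

# Hypothetical shortcut table, curated by human editors: a pattern that
# identifies a query category, mapped to the specialized data source
# that should answer queries in that category.
SHORTCUTS = [
    (re.compile(r"^weather\b", re.IGNORECASE), "weather_source"),
    (re.compile(r"^define\b", re.IGNORECASE), "dictionary_source"),
    (re.compile(r"\bstock quote\b", re.IGNORECASE), "finance_source"),
]

def route_query(query):
    """Return the specialized source for a matching query,
    or fall back to the general web index."""
    for pattern, source in SHORTCUTS:
        if pattern.search(query):
            return source
    return "web_index"
```

The automation here is only in the matching; the editorial judgment lives in the table, which is why shortcuts scale better than hand-picking results per query.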

I do wonder how much this humans vs. robots debate is a real issue. Truth be told, search engines have teams of good ol' humans analyzing data behind the scenes. These humans discover patterns in the data that lower the quality of the relevance rank, such as search engine spam, and change the algorithms to adapt.

Is this any different from using "human editors as part of the search equation"?