A Single Search Index Would Speed Up The Entire Internet - A Zero Carbon Speed Boost

Posted by Tom Foremski - November 24, 2009

Yesterday I argued that a single search index administered by a non-profit could address issues around de-listing and indexing out-of-print books.

Google's founders supported the idea when they were at Stanford University.

A single index would allow Google and others to apply their analysis and their algorithms to the same data set, creating a level playing field.

A single index would also speed up the entire Internet. For example, my server logs for SVW show 16 robots/spiders visiting the site. Together they account for 37% of the hits and 45% of the bandwidth.

These spiderbots are each building their own index of the Internet, and in the process they consume nearly half of my bandwidth and slow down my server. SVW publishes fresh content every day, so it gets extra attention from spiderbots, but the same thing is happening to tens of millions of other web sites.
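You can run this kind of tally yourself. Here is a rough sketch that counts crawler hits and bandwidth from a web server access log, assuming the common Apache/nginx "combined" log format and a short, illustrative (not exhaustive) list of bot user-agent markers:

```python
import re

# Regex for the Apache/nginx "combined" log format -- an assumption;
# actual formats vary with server configuration.
LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (?P<bytes>\d+|-) '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

# Substrings that commonly appear in crawler user-agent strings
# (illustrative only -- real logs will need a longer list).
BOT_MARKERS = ("googlebot", "bingbot", "slurp", "baiduspider",
               "bot", "spider", "crawl")

def summarize(lines):
    """Return (bot_hits, total_hits, bot_bytes, total_bytes)
    for an iterable of access-log lines."""
    bot_hits = total_hits = bot_bytes = total_bytes = 0
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # skip lines that don't parse
        size = 0 if m.group("bytes") == "-" else int(m.group("bytes"))
        total_hits += 1
        total_bytes += size
        agent = m.group("agent").lower()
        if any(marker in agent for marker in BOT_MARKERS):
            bot_hits += 1
            bot_bytes += size
    return bot_hits, total_hits, bot_bytes, total_bytes
```

Dividing `bot_hits / total_hits` and `bot_bytes / total_bytes` gives the percentages quoted above for your own site.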

Spiderbots are a huge drain on Internet resources

If there were a single search index, held in the public domain and administered by a non-profit, it would cut down on spiderbot traffic and speed up the entire Internet -- all without installing a single new router or server, or laying any new fiber optic lines!
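Today, the only lever a site owner has is to throttle each crawler individually via robots.txt -- a per-site workaround for the duplication a shared index would remove at the source. A minimal example (note that the non-standard `Crawl-delay` directive is honored by some crawlers, such as Bing's, but ignored by others, including Googlebot):

```text
# robots.txt -- illustrative; paths and delays are hypothetical
User-agent: *
Crawl-delay: 10

User-agent: Googlebot
Disallow: /private/
```

Even with such rules in place, every well-behaved crawler still has to fetch every page it wants to index, so the aggregate load remains.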

Google would still be Google, as would Microsoft's Bing, because their value lies in how they analyze and rank the index. A shared index could also spur innovation: startups wouldn't need to crawl the web to build their own, freeing them to compete on algorithms and applications.

Larry Page and Sergey Brin were once strong supporters of the idea that the search index should be run by a non-profit.