
Google Favored Over Other Search Engines By Webmasters

Date:

November 16, 2007

Source:

Penn State

Summary:

Web site policy makers who use robots.txt files as gatekeepers to specify what is open and what is off limits to Web crawlers have a preference for Google over other search engines, say researchers whose study of more than 7,500 Web sites revealed Google's advantage. "With the preference, Google can index some information which other search engines can't," according to one of the researchers.


Web site policy makers who use robots.txt files as gatekeepers to specify what is open and what is off limits to Web crawlers favor Google over other search engines, say Penn State researchers whose study of more than 7,500 Web sites revealed Google's advantage.


That finding was surprising, said C. Lee Giles, the David Reese Professor of Information Sciences and Technology who led the research team that developed a new search engine--BotSeer--for the study.

"We expected that robots.txt files would treat all search engines equally or maybe disfavor certain obnoxious bots, so we were surprised to discover a strong correlation between the robots favored and the search engines' market share," said Giles of Penn State's College of Information Sciences and Technology (IST).

Robots.txt files are not an official standard, but by informal agreement they regulate Web crawlers -- also known as "spiders" and "bots" -- which mine the Web 24/7 for everything from the latest news to e-mail addresses. Web policy makers use the files, placed in a Web site's root directory, to restrict crawler access to non-public information. Robots.txt files also are used to reduce server load, which otherwise can result in denial of service and shut down Web sites. But some Web policy makers and administrators are writing robots.txt files that do not block access uniformly.

Instead, those robots.txt files give access to Google, Yahoo and MSN while restricting other search engines, the researchers learned.

As an example, some U.S. government sites favor Google's bot--Googlebot--followed by Yahoo and MSN, according to the researchers.
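As a hypothetical illustration (not a file taken from the study), a robots.txt file that grants Googlebot full access while shutting out every other crawler looks like this:

```
# Allow Google's crawler everywhere
User-agent: Googlebot
Disallow:

# Bar all other crawlers from the whole site
User-agent: *
Disallow: /
```

An empty Disallow line tells the named bot that nothing is off limits, while "Disallow: /" blocks the entire site for everyone else.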

While the study doesn't include explanations for why Web policy makers have opted to favor Google, the researchers know the choice was deliberate, since not using a robots.txt file at all gives every robot equal access to a Web site.

"Robots.txt files are written by Web policy makers and administrators who have to intentionally specify Google as the favored search engine," Giles said.
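The effect of such intentionally written rules can be checked with Python's standard urllib.robotparser module. The rules below are a hypothetical example of the pattern the researchers describe, not text from any of the studied sites:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules that admit Googlebot everywhere while barring
# every other crawler from the entire site.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may fetch the page; an arbitrary other bot may not.
print(parser.can_fetch("Googlebot", "/reports/index.html"))      # True
print(parser.can_fetch("SomeOtherBot", "/reports/index.html"))   # False
```

A well-behaved crawler runs exactly this kind of check before fetching a page, which is how a few lines in robots.txt translate into a real indexing advantage.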

Not every site has a robots.txt file, although the number is growing. Of the 7,500 sites analyzed by the researchers, about four in 10 had a robots.txt file--up from fewer than one in 10 in 1996.

That growth, which the researchers anticipate will continue, was one reason for the study.

The researchers didn't know what they would find when they set BotSeer loose to index the content of the robots.txt files of Web sites spanning several market segments, including government, newspaper, university and Fortune 1000 sites.
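BotSeer's internals are not described in the article, but the core of such a survey can be sketched in a few lines of Python: for each collected robots.txt file, count which crawlers the rules single out by name. The function below is a simplified illustration, not the researchers' actual code:

```python
import re
from collections import Counter

def tally_named_bots(robots_txt: str) -> Counter:
    """Count how often each User-agent name appears in a robots.txt file."""
    counts = Counter()
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        match = re.match(r"user-agent:\s*(.+)", line, re.IGNORECASE)
        if match:
            counts[match.group(1).strip()] += 1
    return counts

sample = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /private/
"""
print(tally_named_bots(sample))  # Counter({'Googlebot': 1, '*': 1})
```

Aggregated over thousands of files, counts like these reveal which crawlers site administrators name -- and, combined with the Allow/Disallow rules attached to each name, which ones they favor.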

"Our intent was exploratory--to see if there was anything interesting," said Isaac Councill, an IST post-doctoral scholar and member of the research team.

Consumers with a soft spot for Google aren't affected by the bias. But consumers who prefer other search engines may be at a disadvantage.

"With the preference, Google can index some information which other search engines can't," Giles said.

This finding is described in a paper, "Determining Bias to Search Engines from Robots.txt," given at the recent 2007 IEEE/WIC/ACM International Conference on Web Intelligence in Silicon Valley. Besides Giles, the authors include Yang Sun and Ziming Zhuang, IST graduate students, and Isaac Councill, an IST post-doctoral scholar.

Story Source:

The above story is based on materials provided by Penn State. Note: Materials may be edited for content and length.
