Cyber crooks are using one of Google Inc.’s own tools to poison search results with links that promote fake security software, a researcher said today.

“Malware distributors have abused Google Trends before,” said Craig Schmugar, a senior threat researcher at McAfee Inc. “But I’ve never seen them use it as aggressively as they are now.”

Google Trends, a tool the search giant rolled out last June, measures and compares public Web site traffic by geography and shows related Web sites and searches. It also highlights the most popular searches of the past hour.

At midday Thursday, for instance, the No. 1 search phrase, according to Trends, was “Obama budget.”

Scammers and malware makers are closely monitoring Google Trends to guide them in selecting search phrases and legitimate news content, which they then integrate into their own fly-by-night sites, said Schmugar.

The idea is to “game” Google into ranking their malware-hosting sites near the top on scores of high-profile, current events-related search results.

“I’m not talking about just a few sites,” Schmugar said. “I’ve collected a lot of them, with poisoned links [in Google search results] that are pretty high up, almost always in the top 10.”

News accounts recently abused by hackers have ranged from this weekend’s stories about a worm spreading on Facebook to the attack last week by a chimpanzee that left a Connecticut woman in critical condition, said Schmugar.

“They’re grabbing content from pages that are already popular,” he said. “They grab content from those pages and put it on their own site.”

More recently, Schmugar has monitored poisoned links ranked high on searches for “Gmail down,” a reference to the two-and-a-half-hour outage at Google’s Web-based e-mail service on Tuesday.

Because the malicious sites share the same content as legitimate pages currently of great interest, Google’s ranking algorithms push those sites toward the top when people search for that news item or use those search strings.

Another reason for the high ranking is that scammers name those pages with the popular search phrases they pull from Trends.

“It looks like they’re following Trends, which refreshes every hour, and then reacting very quickly to produce their own sites,” said Schmugar.

The only common element he’s found so far among those sites is that they are all hosted on free site-hosting services.

“Some portion of this must be automated,” he added, to account for the quick reaction time to the hot searches and content touted by Trends.
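The loop Schmugar describes is simple enough to sketch in a few lines. The snippet below is a rough illustration rather than the scammers’ actual code: it parses a sample hourly hot-searches feed (the XML layout shown is assumed for the example, and no real feed URL is used) and stamps each trending phrase into a bait page title.

```python
# Rough sketch of the automation Schmugar describes: read the hourly
# hot-searches list, pull out the top phrases, and name throwaway pages
# after them. Feed structure below is hypothetical, for illustration only.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss><channel>
  <item><title>obama budget</title></item>
  <item><title>gmail down</title></item>
</channel></rss>"""

def hot_phrases(feed_xml: str) -> list[str]:
    """Extract the trending search phrases from an RSS-style feed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

def bait_title(phrase: str) -> str:
    """Name a throwaway page after the hot phrase to game the ranker."""
    return f"{phrase.title()} - Breaking News Video"

for phrase in hot_phrases(SAMPLE_FEED):
    print(bait_title(phrase))
```

Run hourly against the live list, a script like this would keep a crop of freshly named pages in step with whatever Trends surfaces, which matches the fast reaction time Schmugar observed.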

All the poisoned links lead to sites that hit users with phony security warnings; those alerts then try to trick users into downloading a free antivirus program.

The download, however, is actually a Trojan horse that continues to dun the user with fake warnings. The only way users can stop the messages, and supposedly clean their PCs of infection, is to pay for the worthless software.

Distributing “scareware,” as the category is sometimes called, can be very lucrative. Last year, Joe Stewart, director of malware research at SecureWorks Inc., said he had found evidence that some hackers were making as much as $5 million a year from the practice.

Schmugar’s advice? “Look carefully before you click,” he said.

When it was unveiled last June, the Google Trends for Websites tool came under fire from early reviewers because it didn’t allow traffic on many of the company’s own properties, including Google.com, YouTube, Blogger and Picasa, to be measured.

In a blog post announcing the tool, R.J. Pittman, Google’s director of product management for consumer search properties, acknowledged one issue — that results compiled by the tool may not match other data sources tracking Web site traffic because the data is estimated and aggregated from a variety of sources.

But technology blogger Michael Arrington suggested that the inability to measure traffic on its own sites is a much bigger problem for Google Trends.

“Google simply isn’t able to use its own tools for estimating traffic — since by definition all the data being gathered by Google for the product is from Google users (their toolbar, for example), the data for Google’s sites would be skewed to 100 per cent of all Internet users,” Arrington noted.

“It points out an inherent flaw in the product, and I’m not sure Google can easily solve it.”

Mashable blogger Adam Ostrow questioned whether the new Google tool will fare better than established traffic-measurement services like Alexa and Compete, which have been criticized for inaccuracies.

“Will Google, with its mounds of search and clickthru data, be able to do better?” Ostrow asked. “Maybe, or maybe not.”

Ostrow also pointed to Google’s acknowledgment that all results from the new tool will be estimated and will include aggregated data from anonymous opt-in data-sharing settings in Google Analytics.

“In other words, this means that sites that opt in to share their Google Analytics will have more accurate results than those that don’t,” he added. “Personally, I can accept the inaccuracy of third-party stat tracking services, but with some sites opting in to share their data and others not, it strikes me that the comparisons won’t be completely apples-to-apples.”

However, he added that Google’s information on geography, other sites visited and similar searches in the new tool is likely to be very accurate because of the huge volume of data gathered from the search engine.

The Google tool will only add to confusion in comparing Web site traffic, agreed Nathania Johnson, a blogger at Search Engine Watch.

Johnson used Compete, Alexa and Google Trends for Websites to compare three different search-engine blogs and got three different graphs from the tools.

“With all three, there are definite seasonal dips,” Johnson noted. “But these graphs may speak more to the popularity of Google, Alexa and Compete than they do of the Web sites you may search. Alexa makes the sites look like they’ve seen traffic decline, and Compete makes the sites look like the traffic has increased, beginning with a big jump last June.

“Furthermore, Google Trends for Websites does not offer numerical values to give you a ballpark figure of how a site is performing. Alexa and Compete do.”

Still, Johnson acknowledged that Google’s tool likely will become the “most authoritative source for comparison data,” since it has access to far more data than either Alexa or Compete.