
coondoggie writes "The folks at Google are taking issue over spam and the quality of Google searches, which some claim has gone down in recent months. Today on Google's official blog, Principal Engineer Matt Cutts said, 'January brought a spate of stories about Google’s search quality. Reading through some of these recent articles, you might ask whether our search quality has gotten worse. The short answer is that according to the evaluation metrics that we’ve refined over more than a decade, Google’s search quality is better than it has ever been in terms of relevance, freshness and comprehensiveness. Today, English-language spam in Google’s results is less than half what it was five years ago, and spam in most other languages is even lower than in English.' Cutts also explained that the company has made a few significant changes to their method of indexing."

It would, if Google included an option to filter entire domains out of search results. Google could then simply monitor those domains and try to figure out why people take the trouble to filter them out.

If you add -site:example.com, it removes all hits from that site. I have noticed recently that I no longer have to exclude sites that I used to, such as those that just copy excerpts from other message boards.

Would be nice if you could set something in http://www.google.ca/preferences?hl=en [google.ca] once and then never see these sites again, though. If there's a setting I'm missing that will let you do this, please let me know here.

Not enough, damned Experts Exchange! I had to download a Greasemonkey script and everything; seriously, screw those guys.
And have you tried searching for the name of an executable? "HURRR CLICK HERE TO FIX XYZ.EXE RELATED ERRORS" NO! I just want relevant information on the damn executable, not traps for computer-illiterate people and the shovelware those sites are peddling.

Oh please, you make it sound as if Google had vans full of surveillance equipment roaming the streets, spying on everyone! Just more of the usual tin-foil haberdashery from the conspiracy-theory crowd.

Yeah really - and next some paranoid crackpot will be saying that Google is *scanning the air* recording electromagnetic signals from inside of people's houses. We need to get this population on better meds.

I use a Google Customized Search Engine (CSE) configured to promote StackOverflow and block ExpertSexchange. Here, you can try it out: www.google.com/cse/home?cx=007350804174195462206:7etfz1pyl-s . I've set it as my default search engine in Chrome and never have to think about it again.

They used to, but they took it out in favor of only letting people star items. Why they did this is beyond me. Maybe it gave too much of a chance to game the system. But honestly, it was the only thing that made my searches relevant again. It was the only truly useful feature they've added since, well, ever.

When I search for a few coding terms I don't want 10 different sites that have scraped their content from StackOverflow, I don't want 10 representations of the same unanswered email, or 10 experts-

Would love to see this come back - I too was happy with the feature they introduced to downvote results.

There should be some way for users to down-vote these scammer parasites like efreedom.com which just serve content from other sites with adverts. Just as we collectively downvote spam, which ensures most people never see it, we should be able to do the same with search results.

You've got me thinking about what an "adblock" for Google searches would look like. Hmm. Suppose every Google search POST were intercepted in the browser by this adblock tool, which then stuffs a bunch of additional commands onto it:
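Something like this, maybe. Pure sketch: the blocklist, helper names, and the query-string rewrite are all my invention, and a real tool would hook the browser's request pipeline rather than call a Python function.

```python
from urllib.parse import urlencode

# Hypothetical user-maintained blocklist of scraper/spam domains.
BLOCKLIST = ["efreedom.com", "experts-exchange.com"]

def rewrite_query(query: str) -> str:
    """Stuff a -site: exclusion onto the query for every blocklisted domain."""
    exclusions = " ".join(f"-site:{domain}" for domain in BLOCKLIST)
    return f"{query} {exclusions}"

def search_url(query: str) -> str:
    """Build the search URL the intercepted request would be rewritten to."""
    return "https://www.google.com/search?" + urlencode({"q": rewrite_query(query)})

print(search_url("TIPC layer3"))
```

One catch: Google limits how many terms a query can contain, so a long blocklist would eventually stop fitting.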

Speaking of anecdotal evidence, just now I was going to google something about the Nyquist Limit, but I couldn't remember the "Nyquist" part. CD limit, CD 22kHz limit, digital sound limit... I finally gave up. Without the word "Nyquist" or knowledge of the Nyquist Limit, you just can't google to find out why you can't record a frequency higher than 22kHz on a CD.
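For the record, the fact being groped for above is just the sampling theorem: a signal sampled at rate fs can only represent frequencies below fs/2, the Nyquist frequency. A quick Python sketch (the folding formula is standard; the numbers are CD audio's):

```python
fs = 44100           # CD audio sampling rate, Hz
nyquist = fs / 2     # 22050 Hz -- the "22kHz limit" in question

def alias(f, fs=44100):
    """Frequency a pure tone at f appears as after sampling at fs.
    Tones above the Nyquist frequency fold back down ("aliasing")."""
    f = f % fs
    return fs - f if f > fs / 2 else f

print(nyquist)        # 22050.0
print(alias(30000))   # 14100 -- a 30 kHz tone masquerades as 14.1 kHz
print(alias(14100))   # 14100 -- ...indistinguishable from a real one
```

Hence a CD at 44.1 kHz can't carry anything above 22.05 kHz -- which is by design, since that's above human hearing.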

In this case, it does. I don't care how well Google does by some relevance metric. I don't even care how good it is at returning useful results to you, let alone their virtual search user. I care about how good Google is at returning relevant results to the things that I search for. If it returns 100% accurate and useful results to the 100,000 most common search queries, but useless results to things I search for, I won't use it.

They are not necessarily reaching their goal of better searches, they are simply meeting their design specification. Their metrics are a model, a guess, at what better search results *may* be. If the metrics are off, then their results will probably also be off.

Yep, Google's search results are totally fine and relevant, fresh, yadda yadda. In fact, they are even better than they were years ago!

Oh, BTW: "... we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content."

A typical problem companies have is measuring the quality of their products: by their metric, it's great! But per the user experience, it's not. The users must be wrong.

The metric doesn't always capture the things that the users care about. Also, expectations can change: better than five years ago may not be good enough.

Based on my experience, Google's search quality is insufficient to make it useful for most purposes. It's plan B now. No search engine is much better, but plan A is to use better resources: Wikipedia, knowledge written or compiled by an expert, etc.

The metric doesn't always capture the things that the users care about.

Google’s search quality is better than it has ever been in terms of relevance, freshness and comprehensiveness.

I hate how searches with a word that has recently been in the news get flooded with crap from the newscycle. I wish I knew a way to tell google's filters that stale results are fine and/or better than fresh ones.

Actually, if you read the blog post from Google linked in TFS, they aren't saying that "there is no problem" (as parent post's title suggested) or that "it's great" (as parent post's text suggested.)

They did say that their own metrics don't show the trend that various, mostly anecdotal, critics have claimed. But they also said that they view the spam that does exist as a problem, and they announced several steps to address it:

As we’ve increased both our size and freshness in recent months, we’ve naturally indexed a lot of good content and some spam as well. To respond to that challenge, we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments. We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites.

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception. However, we can and should do better.

This is not a company denying that there is a problem because their internal metrics don't match the problems being reported. It is a company acknowledging that there is a problem and committing to take action on it, even though their own internal metrics don't agree with their critics on the size of or trend in the problem.

Bottom line is that their 'metrics' are faulty. Who gives a damn about freshness when the content is irrelevant? The fact is that in recent memory it's actually become more difficult to find good results using Google.

PS: No one cares about forum postings that barely scratch the surface of a subject, contain incomprehensible grammar, or just contain questions about your topic rather than relevant information. But if Google doesn't even want to recognize that it is doing things that customers don't like, they will

Have you TRIED any other search engine? These guys have been working hard to claw a 0.1% from Google. And along the way they have actually managed to produce some pretty nifty search algorithms. I have stopped using Google for 2 years now and have seldom been let down by my new search engine.

Naming your new one would have been useful, especially if it's so great. By not naming it you just invite the assumption that you're an anti-Google troll. Sorry if that's wrong, but we all know there's an anti-Google campaign paid for by AT&T and MS.

I can't speak for the OP, but when Bing first came out I found a website that would perform your search on Yahoo, Google, and Bing, displaying the results in an identical format on the left, center, and right of your screen.

The catch was: you didn't know which was which until after you clicked, indicating which had the best results.

I used it for a good three weeks. Prior to that, I was a Google fanboy. After that, I realized Google's search results weren't any better for me. They were all remarkably

Consider yourself lucky if Experts Exchange isn't showing up like a plague in your search results.

That crap where they show the Googlebot one thing and regular visitors something very different (and awful) makes me wish StackOverflow, etc., will end up putting the final nail in their coffin. In a pinch I've used the Google cache to get at the information, but what they're doing is a shitty Google cheat, and they should have gotten the ban hammer a long time ago.

And thus begins the downfall of Google. Once you start drinking your own lemonade and stop listening to the people who use your product, you're on a greased downhill slope.

Discounting claims in the press is not the same as stopping listening to the people who use your product. I've seen no evidence presented that google has stopped listening to the people who use their product. AFAICT, your argument proceeds from an unjustified assumption.

See, this is where Google goes off the rails and starts to believe its own press. Cutts said, in effect, "Our search engine tells us that our search engine is doing just fine." Yeah, well, ultimately Google's search engine isn't the center of the universe and the ultimate authority on everything. The users are. If the users say that the quality of search results is going down, then it's going down. Period. Google had better figure out how to change their evaluation metrics to reflect what users are se

Every time I search for something these days I get some ridiculous set of non-results due to the fuzzy matching. I search for "TIPC layer3" and Google nicely finds me results about TCP Layer 3, because Google thinks I must have typo'd something. This happens constantly with searches that are one or two letters away from something else: the results I get are adjusted because the alternative ranks higher.

Google's search is not getting better, it's getting more and more 'Clippy' every year.

And I'm out of moderator points. Between the "oh, you're looking for something obscure... here's something that's spelled similarly" mentality and constantly returning pages from 2003 about technical subjects, it's pretty hard to find anything on Google that I care about. Except for using them to find large corporate sites.

Add the fact that spam copies consistently rank higher than the original, and I see no solution.

There is a classic story told to students of statistics about a drunk man looking for his car keys under the street lights. When a passer-by asks him if he dropped his keys by the street lights, the drunk answers, "No, but the light is better here."

Seems to me that you need to refine your queries. I think people have come to expect Google to find relevant stuff with an almost magical and eerie accuracy. Now that spammers have caught on, it's time to understand how Google can help you refine queries. Use +, -, ", site:, intitle:, etc.
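To make that concrete, here's a toy helper (my own invention, not any real API) that just pastes those operators together:

```python
def refine(terms, exact=None, site=None, exclude=(), intitle=None):
    """Compose a query string from the operators mentioned above.
    Parameter names are illustrative only."""
    parts = list(terms)
    if exact:
        parts.append(f'"{exact}"')          # "..." forces an exact phrase
    if site:
        parts.append(f"site:{site}")        # restrict to one domain
    for domain in exclude:
        parts.append(f"-site:{domain}")     # drop a domain entirely
    if intitle:
        parts.append(f"intitle:{intitle}")  # term must appear in the title
    return " ".join(parts)

print(refine(["sharepoint"], exact="best practices",
             exclude=["experts-exchange.com"], intitle="pitfalls"))
# sharepoint "best practices" -site:experts-exchange.com intitle:pitfalls
```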

The other thing is that while Google does have some issues with spam (specifically around rebroadcast content), I'm not sure the other search engines are better. Bing has its own set of issues, Yahoo is Bing, and none of t

I search for "TIPC Layer 3" and get your post as result 1, something about tips on a 3 layer cake second, and the wikipedia entry for TIPC third.

I search for "scrotwm" and the home page is result 1 and the man page result 2.

If I search for "TPC Layer 3" I get stuff about "closed loop transmit power control" with a "Did you mean: TCP Layer 3" at the top acting as a link to that search. So no changes in the actual search done, but a hint that I might have spelled something wrong.

Typing things in correctly is apparently harder. Why should I play on hard mode because I know how to search correctly?

You can't argue that Google wants to limit 'bad' searches -- the search-as-you-type feature obliterates that argument. They don't care about the number of searches you do, and seemingly care less and less about the quality.

What about an "Elite" search engine? "Made by geeks/nerds for geeks/nerds."(I lost track of the political correctness, pick either or your own.)

The guy who wants drivers, the guy who wants the KDE results, the guy who wants the scrotwm, my advanced search examples, on and on. We don't want to buy things. We're out to search for ruthless hard info.

In this context, spam means web sites that don't actually contain any real content, just junk text, lists of keywords, etc., together with paid links or banner ads and the like. They won't answer any question you may have, unless you are asking to see more spam. There is more and more of this crap, and it dominates some web search queries.

I think that, according to Google these days, spam is defined as advertising that doesn't profit Google itself. :(

expertsexchange results are actually pretty good (or, well, they used to be... I have not used them in a while).

The key is realizing that if you scroll to the very bottom of the page, all of the answers they are asking you to sign up and pay for are already there. Maybe they have changed it, but you used to be able to get the full text of the answers just by scrolling down, or by using the Google cache (or a user-agent switcher to pretend you are Google).

I mean, I'm sure I'd like to have an "expert" perform my "sexchange" if I want one, but I was just looking for help solving a programming bug.

I also appreciate that some sites try to help me with my searching, but I'd rather have them provide answers instead of giving yet more search results.

Yeah, back when you could customize searches, I always removed Experts Exchange results from mine, but I don't think Google got the hint. They still come up all the f'ing time, and I don't think you can customize searches anymore. Never mattered anyway. -Taylor

Google has a dilemma. If their search engine takes you directly to the place you want to go, they don't make any money. For a good analysis of this, see "Google Sucks All the Way to the Bank" [isedb.com], by Jill Whalen. She is, unfortunately, right. It's essential for Google's success that some of their own ads be more relevant than their search results. Part of their revenue comes from sending users on a side trip to AdWords-heavy pages. We've measured this [sitetruth.net], using a browser plug-in which reports AdWords appearances to us. About 36% of domains with AdWords (counting domain names, not traffic) are what we consider "bottom feeders": junk sites with a commercial purpose but no identifiable business behind them.

On the local-search front, spam in Google Places is even worse than in their main search results. This, though, appears to be due to ineptitude, not malice. Google added a business-search system to Google Maps a year or two ago; that's what Google Places really is. You've been able to go to a Google Maps page and search for businesses for some time now. Few people knew this.

Then, in October 2010, Google merged the map search results into their main search results. "Places" results suddenly got top billing in Google. The "search engine optimization" (SEO) industry swung into action and began spamming Google Places on a massive scale. (We have a paper on this [sitetruth.com], which has been mentioned by Techdirt, the New York Observer, etc. It's an amusing read.) Recommendation spamming, which had been going on for a while at a low level, grew substantially once recommendations started affecting Google search results.

This, incidentally, is why Blekko won't work. If they get enough market share to matter, techniques will be developed to spam them into meaninglessness.

Stopping web spam is technically quite possible. We do it by finding the business behind the web site and doing some automated due diligence. We check business records, SEC filings, BBB ratings, and Dun and Bradstreet to verify business legitimacy. We down-rate most of the junk. We try to err in the down-rating direction, taking the position that it's the job of a company to demonstrate its legitimacy by using its real name and address on its web site, which has to match real-world business records. Our demo site [sitetruth.com] shows what search is like if you take a hard line on spam.
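For the curious, the gist of that policy -- down-rate unless the business demonstrates legitimacy -- could be sketched like this. The fields and tiers here are my own invention for illustration; SiteTruth's actual checks are obviously far more involved.

```python
from dataclasses import dataclass

@dataclass
class SiteEvidence:
    has_name_and_address: bool    # site publishes a real name and address
    matches_public_records: bool  # and it matches business records, SEC, BBB, D&B

def rate(evidence: SiteEvidence) -> str:
    """Err in the down-rating direction: the burden of proof is on the site."""
    if evidence.has_name_and_address and evidence.matches_public_records:
        return "keep"
    if evidence.has_name_and_address:
        return "down-rate"        # identity claimed but not verified
    return "down-rate hard"       # anonymous commercial site: likely bottom feeder

print(rate(SiteEvidence(True, True)))    # keep
print(rate(SiteEvidence(False, False)))  # down-rate hard
```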

Our approach requires more of a hard-ass attitude than Google's business model can perhaps afford. With Blekko making Google look foolish, though, and Bing slowly improving, Google may have to actually do something that works, even if it cuts into revenue from the spam.

Blekko is more about indexing non-spam sites: the slashtag feature is set by users. A site becomes relevant to a slashtag through a hybrid of user feedback and a bit of automation. That said, there is still a lot of work to be done on Blekko's current model and on how it will scale efficiently and without Facebook.

Stopping web spam is technically quite possible. We do it by finding the business behind the web site, and doing some automated due diligence. We check business records, SEC filings, BBB ratings, and Dun and Bradstreet to verify business legitimacy.

I've switched to other search engines; from my experience, Google provides too many tangential and corporate references when I do research.

Also, how does Google "know" that their search results were valid? I'll often do a Google search, click a couple of links, and after being disappointed, I'll go to another search engine where I get more useful results.

What bugs me the most are searches on technical or medical topics, where Google gives me a dozen "harvester" results -- e.g., sites that have stolen conversations from other message boards and reposted them along with tons of ads. Yuck! There must be hundreds of these sites, all with broken answers to questions about JavaScript and/or medicines.

Just because evidence is anecdotal doesn't mean it should be blithely discounted. If I say "Ouch" at being cut, that means the injury hurt me; the pain is quite real even if no one else has felt it.

I think that the solid consensus among the people I know that track such things is that the spammers are winning and the quality of search is going down. I know that this is my own experience. That may or may not mean that Google is slacking off, but I don't think that perception comes from thin air.

I don't think it is. I (and apparently quite a few responders here) am seeing worse results now than ever before. Anything remotely close to what I search for tends to start around the third or fourth result (not including sponsored results).

I did have a conversation here on /. recently about how I search. I refuse to type full-text questions, and instead search using keywords I know will get me the results I want. Maybe that's the way Google has been improving lately?

FWIW, when I search for something very specific (an error message with quotes around it), Google is spot-on for me.

I (and your mom) have typically found what I am looking for in the first page of results. I've posted two such recent searches here. I am not a guru of search terms, although I have some small skill in this area. Perhaps you could post some example searches?

A quick example is googling the following: "fender 5 string american jazz bass". The first link is one for sale on Musician's Friend; the second is a link to Fender's site. This one doesn't shine as an example of me having to sort through several responses, but it's the first I tried.

"sharepoint implementation" gets a little more interesting. The first link, to synergyonline, is for consulting services. The second is a blog post with the top 10 pitfalls. The third is a best-practices article from TechTarget. The fourth, a pdf fro

I've certainly noticed the quality of searches going down recently, at least for less common searches. I regularly search for oddball system files, software, drivers, etc., and the first few pages of results are often very scammy-looking sites devoid of actual content, while what I am looking for is a dozen pages in. Often these results trump even official big-company web sites. Heck, while half asleep I used Google to search for OpenOffice, clicked the first link, clicked a big download button, and when trying to install it later I realized whatever I downloaded was certainly *NOT* OpenOffice. (I don't know what it was; I deleted it quickly.)

In the last few years, I've found search results have been dominated more and more by content mills like Associated Content, eHow, HubPages, About, and others, or by some low-quality Q&A page like Yahoo Answers. The pages are hastily written, barely edited, and thin on content. The articles are also typically written by someone without any relevant knowledge or experience, so the information is either common knowledge or wrong.

If Google's metrics say quality is up but their users think quality is down, then Google's metrics need to be revised to match user experience more closely. I've started using Duck Duck Go [duckduckgo.com] because they block content mills, and thus I think their results are as good as or better than Google's, even without the complicated algorithms and all the data Google has accumulated.

In my own experience, spam on Google is constantly getting worse and more frustrating to deal with... I expect it for searches where there are not likely to be any hits, but it is also starting to creep into top spots in situations where denser information is available.

I remember back in the day, people working logistics used to run algorithms to maximize profits for store supply chains, but their efforts actually lost a great deal of revenue because the algorithms did not understand human factors and how peop

If I were at Google, the very first thing I would implement would be a double robot:
- the classic one, identified as Googlebot
- another, discreet one, identified as IE7 (or whatever the most common browser is at the time), with the page rendered by IE7, blurred a bit, and then OCRed.
The two are then compared, and if they are far from matching, dump the PageRank in the bit bucket. This way you eliminate hidden text, white-on-white text, and text in GIFs.
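A rough sketch of that comparison step in Python. The threshold and the sample texts are made up, and difflib stands in for the render-blur-OCR pipeline proposed above; the point is just "fetch twice, compare, punish mismatches."

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude text-similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

def looks_cloaked(bot_text: str, browser_text: str,
                  threshold: float = 0.5) -> bool:
    """Flag pages where the crawler's view barely resembles the browser's."""
    return similarity(bot_text, browser_text) < threshold

browser_view = "Instructions for configuring the frobnicator service on Windows 7."
cloaked_view = "buy cheap pills casino poker keygen crack serial free download now"

print(looks_cloaked(browser_view, browser_view))  # False: identical views
print(looks_cloaked(cloaked_view, browser_view))  # True: dump its rank
```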

Ah. I am not a Windows user and don't help with that many problems on Windows, but I remember that garbage site from the last time I had to rebuild a Windows computer. IIRC I wound up finding the manufacturer's website and using "site:devicemanufacturer.com my device" as the search string. Otherwise I got nothing but malware and spam.

I knew people were not making up their stories about the horrible results, I just haven't been hitting them.

Linux system admin has gotten a lot better over the last couple of years, and mys