I read this article with interest. I do feel that this racist/fascist content is for losers, but it is an interesting debate.

Quote:

"Google is making an editorial decision on who it carries and who it doesn't," Kaplar said. "News organizations have editorial discretion over what they run and don't run. No one can force them to run something if they don't feel like it."

I feel that the more Google does this kind of thing, however, the more it undercuts its goal of providing as much information as possible. Let's say someone is writing a paper on racism; shouldn't this content be available to them at Google? Perhaps labeling the content as "offensive in nature to many" would be a better policy?

I don't think this has anything to do with freedom of the press. Google is not a government entity, it's a business. It's not censorship if you or I make a website and choose not to publish material we don't agree with, and it's not censorship if Walmart chooses not to sell music it deems offensive.

Just because Google is a huge company and the biggest search engine doesn't obligate them to include anything on the basis of free speech. Let them publish their material elsewhere. I wouldn't publish garbage like this or link to sites that did. I fully support G in this move.

Can't say I'm a fan of the neo-Nazi lit, but it does seem that a lot of bloggers think that any Google News policy should conform to the bloggers' own political/personal biases. That is what underlies this argument, IMO.

The problem is not whether Google should include or expel neo-Nazi material. Rather, the problem is Google's persistent, geeky, stupid assumption that by using machines, the humans at Google are no longer responsible for the behavior exhibited by their machines.

If the machines at Google rank neo-Nazi material the same as inoffensive mainstream material, assuming the keyword density and everything else is equal, then this is a sin of omission and a value judgment by humans at Google who program their machines. It's not the fault of the machines, it's the fault of those who are too lazy to exercise control over the machines.

It's true that human editing, or programming around every new problem, doesn't scale as well as some catch-all robot algo spun out by a Google PhD. Nevertheless, every employee and shareholder at Google is partially responsible for what is produced by their machines. If Google cannot make human editing, labeling, and judgment scale at a minimally acceptable level, and cannot improve their software to be more selective, despite a $49 billion market capitalization, then Google should turn over their assets to someone who can.

Humans make mistakes. So do algorithms programmed by humans.

Any gateway to the web will bring up some rubbish.

The very act of creating value and having users means people will try to exploit you.

The man who thinks he knows something does not yet know as he ought to know.

My opinion is that Google is free to do what they want with their search results, including eliminating sources. The idea of a disclaimer is not bad, but this is a society that likes to get offended: if you put an "offensive" disclaimer on this, somebody will be yelling for such a disclaimer to be put on other search results that *they* want to be offended by, such as religious content, sites about fat chicks, political sites, etc.

>Rather, the problem is Google's persistent, geeky, stupid assumption that by using machines, the humans at Google are no longer responsible for the behavior exhibited by their machines.

Actually, by using machines they move away from dumb subjective bias, and try to create a platform of objectivity.

For some reason I'm only just beginning to really appreciate the approach, and I really respect Google for it. Especially when there is a constant - perhaps growing - pressure on Google to conform to one subjective ideology or another.

Having said that, and liability aside, don't fascists have a right to be represented in the news?

Sure. No one is stopping them from appearing in any news outlet that wants to run stories on them. One such outlet has chosen not to.

Maybe this is a dumb question but: What possible benefit can an "enlightened" culture gain from divisive, destructive ideas like this? Why would we ever, under any circumstances, want to disseminate this information?

>Why would we ever, under any circumstances, want to disseminate this information?

I don't disagree; it's not a culture that I have any love for. On the other hand, are you saying it is something that should never be seen?

I really don't want to drift into politics (it is not the job of a webmaster), but are you saying that back in the day [if we had a time machine] Google News shouldn't have shown any "Free Nelson Mandela" news? Or maybe any Gandhi stuff [great film; never seen the guy since, weird]? Maybe they should have blocked anything from Martin Luther King too?

Want me to go on? I can, ad infinitum if you wish.

I don't see the correlation between the ideas expressed by these people and racist ideas that have basically no function but to create an atmosphere of division, and perhaps even to advocate violence. I'm as much an advocate of free speech as anyone; information wants to be free and you really can't stop it. But sometimes noble ideas don't pan out the way we hope they will (take your pick of any religion for an example). Common sense should always win out over rules carved in stone.

The weakness in my argument is evident even to me: who decides what is too destructive? I wouldn't give that power to the government. Maybe leave it in the hands of citizens? I don't see too much blatantly racist, hateful material in popular media... so maybe they aren't doing such a bad job of it.

>Maybe this is a dumb question but: What possible benefit can an "enlightened" culture gain from divisive, destructive ideas like this? Why would we ever, under any circumstances, want to disseminate this information?

You can throw that charge at most political/religious ideologies, though.

The question is not whether Google should make political editorial judgments; they already do! The question is: what do they include and what do they not include?

If "my" newspaper starts to report directly from neo-Nazis and the like, I find another newspaper. It's that simple. Google can do what they want, and so can I. Except that in some countries there are laws about this, and Google, like any other global player, has got to learn that.