
KiloByte joins the ranks of accepted submitters, writing
"To appease 'morality' watchdogs, Wikipedia is contemplating the introduction of a censorship feature, where images would be flagged for containing sexual references, nudity, 'mass graves,' and so on. At least in the initial implementation, it is supposed to be 'opt-in.' However, with such precedents as the UK censoring artistic nudity, Turkey censoring references to the Armenian genocide or China's stance on information about the Tiananmen massacre (note that any sensitive photos, like the Tank Man, are already absent!), I find it quite hard to believe this feature won't be mandatory for some groups of readers — whether it's thanks to an oppressive government, an ISP or a school."

I remember working at my university's CS help desk, deleting email after email of Usenet abuse reports. There was some masterful trolling going on in those days, and I was happy to have a position where I could not only be an audience to it, but help it continue.

To think future generations will be denied that by oppressive governments makes me sad.

My main concern with implementing this, even if it always remains "opt-in", is that you're basically creating an API that censorship tools can exploit.

It would be possible for a proxy or national firewall to redirect all requests for Wikipedia through a particular Wikipedia account with the "hide obscenity" flag set. All users within that country, by default, wouldn't see the "offensive" stuff. The hard work of tagging and categorizing images and rewriting the HTML would already have been done by Wikipedia.
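The redirection trick described above can be sketched in a few lines. This is a hypothetical illustration only; `hideObscenity` is an invented cookie name, not a real Wikipedia preference key:

```javascript
// Hypothetical sketch of the attack described above: a transparent
// proxy injects a filter-enabled preference cookie into every
// outbound Wikipedia request, so the "opt-in" choice is made for
// every reader behind it. The cookie name is invented.
function forceFilterCookie(headers) {
  const existing = headers['cookie'] || '';
  // Leave the request alone if the preference is already set.
  if (/\bhideObscenity=/.test(existing)) return headers;
  return {
    ...headers,
    cookie: existing ? existing + '; hideObscenity=1' : 'hideObscenity=1',
  };
}
```

Applied to every request crossing a national firewall, a dozen lines like this would turn an opt-in preference into an unavoidable default for everyone behind the proxy.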

Wikipedia's image filter would just hide images per default, you're still able to see them with just one click, at any time.

This in no way helps oppressive governments. It is about a client-side cookie and that way the client can control everything at all times. (There's not even a way for a school to hide all images, since you can always override your filter settings by clicking on the image placeholder)

If an evil government tried to filter images, they'd have to prevent pictures from actually being sent over the internet.

I agree. This post seems to be an inflammatory 'chicken little' knee-jerk reaction (phrased in keeping with the OP). It isn't censorship if you are doing it to yourself. It's more like a service for people who want to narrow their view.

Yes, that assumption is common among those who lack the kind of paranoia necessary to turn a simple opt-in filtering system into a plot to forever prevent users from fapping to the fine quality close-ups of vaginas Jimbo has assembled for our lonely pleasure.

That this article got posted leads me to make three assumptions.

1) You trolled Soulskill good. Now, Soulskill being a bit of a plank means that you don't get many points for this.

2) Soulskill was hired to post paranoid speculative shit as "news". If th

On the flip side, this is enabling censorship groups (or whatever you want to label them). While humans are generally good on a personal level, the whole of humanity can do some pretty horrific things. Censoring these ugly facts about ourselves certainly doesn't do anything to help bridge the wealth/income gap.

This in no way helps oppressive governments. It is about a client-side cookie

The important thing is that the database of "bad images" exists and that requests for an image can be checked against the DB relatively easily (otherwise the blocking feature wouldn't work). All other details of how the blocking system works for opt-in users are irrelevant to those who plan to use it as a data source for censorship.

If an evil government tried to filter images, they'd have to prevent pictures from actually being sent over the internet.

Once the "evil government" has the ability to check an image against the list of "bad images" it is easy for them to build a proxy that blocks any image on the "bad images" list and redire
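The membership test really is the whole trick: once the tag database exists, a censoring proxy only needs a set lookup per request. A toy sketch, with invented paths and list contents:

```javascript
// Hypothetical sketch: a censoring proxy reusing Wikipedia's image
// tags as a blocklist. The paths below are invented examples, not
// real Wikipedia image paths.
const badImages = new Set([
  '/images/example-flagged-1.jpg',
  '/images/example-flagged-2.jpg',
]);

// Decide whether to block the request (or redirect it to a
// placeholder) instead of passing the image through to the reader.
function shouldBlock(imagePath) {
  return badImages.has(imagePath);
}
```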

It's better to have a ReallyPedia than one that's censored, or that needs opt-in or an "I'm 18, show me the image" gate. To get around the age-of-majority problem, it ought to be vetted for access. Images are, and history is, what it was. Those who fear history don't learn from it.

BT, the main ISP in the UK, recently opened everyone's home wireless routers up to allow any BT customer free access to any other BT router. When I queried this I was told we'd "been opted in" to the scheme, but could opt out if we wanted to. So for BT at least, "opt-in" means anyone who hasn't specifically opted out, and you aren't allowed to opt out until you've already been opted in.

The way I understand the problem is that some articles show explicit pictures, which may offend some people. Honestly, it has sometimes happened to me to see pictures of illnesses or war crimes which upset me (granted, I have a very low threshold for these things). I don't see how it would be bad to hide these pictures by default, with a little "view" button next to the caption. Of course, if the goal is to delete these pictures altogether, then I'm all against it.

The way I understand the problem is that some articles show explicit pictures, which may offend some people. Honestly, it has sometimes happened to me to see pictures of illnesses or war crimes which upset me (granted, I have a very low threshold for these things).

Good, anyone who can see pictures like that without getting upset is most likely a psychopath. Censoring those pictures is not the answer, however; stopping the actions that are depicted in those pictures is.

No matter how much you censor, bad things will still happen and covering your eyes and ears just because you don't want bad things to happen is irresponsible.

This isn't just about morality or oversensitivity. Those images can be a problem if they pop up on your screen at work. I have a legitimate reason to be using Wikipedia at work, but the last thing I want is some obscene image displaying on my screen just as a supervisor, or someone with nothing better to do than complain, walks by.

This is actually a good idea. The image is still there and fully available, but it gives you a one click warning to decide if it's appropriate to have it open a

Oh noes, you're offended by reality. Get over it. Just like everyone else. It's supposed to be an encyclopedia of FACTS. Fact is, war, famine, sex, drugs, and vulgarity are part of the life we live in. Should we start adding censors to articles on "Intelligent Design" because it's offensive to the people who know it's not factual?

Oh noes, you're offended by reality. Get over it. Just like everyone else. It's supposed to be an encyclopedia of FACTS. Fact is, war, famine, sex, drugs, and vulgarity are part of the life we live in. Should we start adding censors to articles on "Intelligent Design" because it's offensive to the people who know it's not factual?

Getting upset or offended by stuff is human nature. I bet seeing pictures of the little girl ripped to pieces by the neighbor's pet dog in Melbourne the other day would be upsetting, maybe even more so if you are a parent (or maybe that's just me), and maybe even more so if you've ever been attacked by a dog yourself.

People aren't robots, and whether rational by your standards or not, some of reality scares people. Don't you wish there was a pixelation mask over goatse with a caption "wash your eyes after cl

I really don't know how else to say it, but seriously - real life is horrid sometimes. IMHO, if you want to know about the subject, you should have to see it, unvarnished and as it really is. If you can get a grasp as to how horrible something truly is, maybe it'll motivate you to help try and prevent it from happening - be it a disease or a war crime. I could understand not having it shoved into your face when you're not looking it up, but at the same time... you're looking it up. If you want to know about

I support it -- as long as a third party does it. It could be done browser-side with GreaseMonkey. Someone can host a mirrored, censored Wikipedia. Multiple people can host censored Wikipedias. Different groups can censor whatever they don't want their members to see. For example, a fundamentalist Christian/Muslim Wikipedia could remove all references to biological evolution and astrophysics (the whole Big Bang thing).

A system whose purpose is to curate what amounts to a record of the entire record of kno

If the goal is opt-in, that means the censoring information is added to the image's metadata, and a client-side JavaScript application dynamically masks the image based on settings stored in a local cookie. There is no need to store any client information server-side.
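A minimal sketch of that client-side scheme, assuming each image carries a list of filter categories in its metadata and the reader's opt-in choices live in a `filters` cookie. The cookie format and category names are assumptions for illustration, not Wikipedia's actual design:

```javascript
// Parse the reader's opt-in filter categories out of a cookie string,
// e.g. "filters=nudity|violence". The format is invented.
function parseFilters(cookie) {
  const m = /(?:^|;\s*)filters=([^;]*)/.exec(cookie || '');
  return m ? new Set(m[1].split('|').filter(Boolean)) : new Set();
}

// Decide, entirely client-side, whether to swap an image for a
// click-to-show placeholder. Nothing is reported to the server, so
// the reader stays in control at all times.
function shouldMask(imageCategories, cookie) {
  const filters = parseFilters(cookie);
  return imageCategories.some(cat => filters.has(cat));
}
```

With no cookie set, nothing is ever masked, which is what "opt-in" means in practice.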

What makes you think that? As far as I can tell, such an opt-in censorship tool will involve the categories associated with each file's description page. Militant anti-non-free-content users who think no picture at all is better than a non-free picture, such as the one who wrote this essay [wikipedia.org], will likely push for non-free to be the sixth (you think) category.

We go through this with Google and China every so often. What I worry about isn't having stuff blocked, with a nice big notice that something was removed, but about having content replaced, so that when you go look at stuff about Chinese unrest from the USA it says "chinese know all about this stuff" and when you look it up from China it says "everything is wonderful".

Choice is not censorship. As far as I can tell, Wikimedia is considering adding an option for users to block images that have been flagged as potentially offensive. Since Wikipedia covers many aspects of humanity, some of them scary, it makes sense to enable users to filter some of the more graphic aspects of this. Remember, the articles themselves will not be blocked. I think this would make Wikipedia more useful for kids that might not have the tools to deal with looking straight into another person's guts just because they're reading up on surgery.

Many other sites, such as DeviantArt, block nudity by default, and to view it you must register an account and turn the filter off. Even though this is opt-out and a bit extreme, calling the practice censorship is ridiculous.

Choice is not currently censorship. As far as I can tell, Wikimedia is considering adding a future requirement for users to block images that have been flagged as potentially offensive. Since Wikipedia covers many aspects of humanity, some of them scary to the ignorant, it makes sense to enable ignorant users to filter some of the more graphic aspects of this. Remember, the articles themselves will not be blocked for now. I think this would make Wikipedia more useful for adults that might not have the tools to deal with looking straight into another person's guts just because they're reading up on exactly that.

Many other sites, such as DeviantArt, block nudity by default, and to view it you must register an account and turn the filter off. Even though this is opt-out and a bit extreme, calling the practice censorship is predictive.

Children usually don't mind seeing new things, until they are taught by adults that the imagery/context is "dirty" or bad from an adult reaction to seeing it.

Although some are saying let's not call this a "slippery slope" yet, when do you? History has shown that as soon as tools such as this are created, a government or organization somewhere requires their application. This is true to such a degree that a local library often has patrons shocked that its internet computers are not filtered.

I was talking about the other issues raised and even discussed in the summary, i.e. "I find it quite hard to believe this feature won't be mandatory for some groups of readers — whether it's thanks to an oppressive government, an ISP or a school." I don't expect you to RTFA, but you could at least browse TFS.

We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable readers to easily hide images hosted on the projects that they do not wish to view

The foundation wants a filter, the community has no way to stop such a feature.

The Wikimedia Foundation is totally corrupted, as the Open XML [wikipedia.org] article has demonstrated. Furthermore, Wikipedia is completely biased towards the United States. When you have an article on a 19th-century parlor song from France, the Wikipedia article will concentrate on the American entertainer who covered it in the '40s.
Now the Thai King gets control over our speech, censors pictures for us, and it's so easy for the rogues because you could actually corrupt Wikipedia admins and they would put your enemi

Furthermore, Wikipedia is completely biased towards the United States.

Are you talking about Wikipedia in general, or the English Wikipedia specifically? Wikipedia acknowledges its systemic bias [wikipedia.org] as a problem. The bias you speak of is caused by the tendency of people to contribute to the Wikipedia for their native language and to cite sources written in their native language. And by far, the biggest concentration of English speakers on the Internet is in the United States, and sources written in English tend to cover the views of people in anglophone countries more than others.

I thought, based on my interpretation of English Wikipedia's policy on verifiability [wikipedia.org] and on writing about works [wikipedia.org], the production history and reception history were the most important things on which to focus because they are the most verifiable things about a work.

Actually, the first question on the referendum is basically "how important would this feature be?" If you are eligible to vote and you are against the feature, you can certainly vote "0" on that item, "?" on the rest, and include any specific comments in the free-text field at the end. Enough "0" votes and the board may decide to postpone or rethink the feature.

I got the email for the referendum. Let's not say "OMG slippery slope!" quite yet, ok? If this continues to be voluntary, I have absolutely no problem with it. I won't personally turn it on because very little offends me, but if someone else doesn't want to view pictures of genitalia on their respective articles, I can understand that.

The problem is that the moment such a filter is implemented, organizations that already use other methods to get rid of images they don't approve of will demand that the filter be enabled at their whim, without allowing people to opt out.

Wikipedia's image filter would just hide images per default, you're still able to see them with just one click, at any time.

This in no way helps oppressive governments. It is about a client-side cookie and that way the client can control everything at all times. (There's not even a way for a school to hide all images, since you can always override your filter settings by clicking on the image placeholder)

If an evil government tried to filter images, they'd have to prevent pictures from actually being sent over the internet.

The problem is that the moment such a filter is implemented, organizations that already use other methods to get rid of images they don't approve of will demand that the filter be enabled at their whim, without allowing people to opt out.

As long as the organizations you are referring to are not ISPs, then I'm fine with this too. If you're on someone else's equipment (e.g., work), you are subject to their rules.

Just thought I should point out it's not China that's responsible for the Tank Man photo being missing from the Tiananmen Square massacre article. It's good old western copyright law. The Tank Man photo is copyrighted and not freely licensed, so Wikipedia can only include it as fair use. Fair use on Wikipedia is held to very strict standards; fair-use images can only be used on articles where the image is otherwise indispensable. So you can find it over at Tank Man [wikipedia.org], which is specifically about the photo.

Wikipedia is only as accurate as the scholarly and mainstream media sources that each article cites. Feel free to remove anything inaccurate that lacks a citation, as long as you mention in the edit summary that what you remove lacks a citation. If your problem is with inaccuracies in the sources themselves, as I've been told is the case with Wikipedia's article about PSP homebrew, then I guess that's a fundamental problem with Wikipedia policy.

Yes, I know the artist claims the painting is not a representation of what actually happened at Tiananmen (it's just "inspired" by Tiananmen), but it's pretty clear the men are being executed, I don't care if they're laughing or not. That artist is incredibly brave.

This isn't "censorship" or "blocking" of images. If you read the text and look at the mockup, this is an opt-in feature to keep a person from accidentally viewing controversial content when simply clicking around on Wikipedia. There's a "show content" button right where the photo would normally be! Nothing is being kept from anybody.
Say you're on break at work and you run across a word you don't know. If you type that word into Wikipedia and it ends up being some sort of genital mutilation or something, y

I took several minutes to read this 2 days ago when I first saw the news (2 days... slashdot, what's happened to you?) and it actually looked damned uncontroversial and careful.

First, I'd say calling this censorship is a red herring.

Censorship = removal of information without recourse or alternative.

Opt-in filtering = giving parents and the squeamish a way to preemptively hide images, with user-controlled overrides.

The categories sought for filtering are also intended to be peer-managed within Wikipedia, which should prevent this from becoming a tool for governmental/corporate/ISP censorship. IOW, if users guide the categorization of data (tagging images as sexually explicit, violent, etc.) then a gov/corp/ISP can't 'sneak in' the censorship of an article on Turkey, Israel, Net Neutrality, Codomo, China-vs-Taiwan, China-vs-Tibet, Egyptian unrest or whatever.

The call for comments generated by Wiki* also discussed their desire to make whatever they do overridable.

(disclaimer: I think I've edited wiki* a few dozen times, but doubt it was anything censor-worthy).

I'm thinking this should be sort of like the spoiler text used on many image boards, where initially all you see is black, but if you mouse over it you can see what it actually is. I think this would work perfectly for Wikipedia: if you didn't want to see it, you wouldn't have to. It would be censored for anyone who didn't want to see it, and anyone who did want to see it would just have to hover over it with their mouse to make it visible.

AdBlock is done on the client side, this means the government can't issue a law or make a deal with Wikipedia to force it upon you. They'd have to disable SSL access and run man-in-the-middle proxies on all traffic.

So what? All this is is a tool, to allow people to control what they do or do not want to see. That is a perfectly legitimate thing. The fact that maybe some government could abuse the tool is completely immaterial. Ever since man first sharpened a stick every tool created could potentially be abused by someone. If that happens, complain about the abuse and the abusers, not the tool.

I don't want to see hi-res photos of Wikipedia editors' genitalia or nasty skin diseases at the top of an article (when an illustration would suffice), for the same reason I don't want Wikipedia to change over to magenta text on a lime-green background. There's an issue of aesthetics and readability here.

Since stylized depictions of the editors' genitalia or skin diseases would not be horrifying, they would also clearly not be accurate. I don't see why Wikipedia should design levels of filters to obscure knowledge when users can just instruct their web browsers not to load images from the site.

Aesthetics are totally subjective. People respond completely differently to the same sets of images, and that has to be taken into account. Wikipedia for the most part operates by working towards a consensus rather than explicit rules, and should strive to be inclusive without compromising the information contained within its articles. If you flippantly ignore the fact that a large number of readers are offended or revolted by certain images prominently displayed in Wikipedia articles (whether you agree

Cut it out with the reactionary rhetoric already. It's an opt-in filter that allows people who so choose to read about "controversial" subjects without being confronted with graphic images of hardcore blood, gore, pornography, etc. - and there will be categories of filters, so it may even allow Muslims to read about their prophet without having to see depictions of him, without depriving others of access to those images. This seems like a good thing.

Your Honor, the client's defense crumbles in light of this new evidence. He clearly enjoys seeing photographs of lewd nature. If he did not, why would he have set the cookie on Wikipedia that allowed them to be viewed on his screen?

(totally ignoring the possibility that maybe the guy blocked images of dismemberment while viewing an article about war one time, and didn't block other things because they weren't onscreen)

I object to using Wikimedia Foundation funds to develop and implement what should be a commercial operation. Sure, no one would object to choice, but also no one should feel entitled to be handed the entire Web nicely categorized into "naughty" and "nice" without compensation. Please feel free to host abridged or censored mirrors of Wikipedia - as long as you don't demand Wikimedia Foundation to fund it.

This is not bad, because the system, as proposed, is basically going to be just an improved and integrated reimplementation of an already existing feature.

You may have heard of it. It's called "AdBlock Plus".

That's essentially what the Wikipedia community has been telling people to use. Offended by pictures of prophets? Ask your local friendly religious WikiProject if they have a handy ABP list or user CSS file for you to use. Offended by sexual content? Yeah, it's a lot of blocking, but it can be done.
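That do-it-yourself approach boils down to URL matching, which is roughly what an AdBlock Plus filter list does. A toy matcher, where the rules below are made-up examples rather than entries from any real WikiProject list:

```javascript
// Toy AdBlock-Plus-style matcher: a rule blocks a URL if the URL
// contains the rule text; "||" anchors the rule to the host name.
// These example rules are invented, not from a real filter list.
const rules = ['||upload.example.org/depictions/', '/explicit/'];

function blockedByRules(url, ruleList) {
  return ruleList.some(rule =>
    rule.startsWith('||')
      ? url.replace(/^https?:\/\//, '').startsWith(rule.slice(2))
      : url.includes(rule)
  );
}
```

Real ABP syntax has more operators (wildcards, element hiding, exception rules), but the principle is the same: the blocking decision is made entirely in the reader's own browser.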

If pictures of the topic offended them, why were they on the topic in the first place? If you don't want to see pictures of vaginas, maybe you shouldn't look up vaginas?

Indeed. Because clearly there is no better way of teaching someone biology* than to simply show them a bunch of pictures of different people's vaginas. </sarcasm>

The main problem is that even though medical illustrations are both prevalent and often better at highlighting small details (artists can control contrast of areas very easily to show off such details), there are many exhibitionists out there who add self-made images just because they can. Most people don't post these images for their encyclo

Be really blunt about it to chase off the sensitive fucks who start this shit in the first place. The way to react to PC bullshit is with scorn and open hatred because courtesy is wasted on such people.

"This site may offend you. If it is possible to offend you then go the fuck away and never come back because no one here needs you as a viewer."

The Tank Man is still there on the wiki page linked in the summary.
A simple click-through would have revealed that.
While I think this would be awful, and make Wikipedia an even less desirable destination, it is rather in line with their "notability" censorship. That said, there's no need to inflame the issue by claiming the article on the Tiananmen massacre has already had the Tank Man picture removed.

While I think Wikipedia has strayed greatly from its original goals and principles, I thought one of the most important ones was maintaining a "neutral" point-of-view in articles. How is marking certain images as "offensive" showing neutrality? If an image is illustrative to the content of an article and it is legal to be used, then whether or not some people find it offensive ought to have no bearing on its inclusion. I think that the inevitable debates over whether or not an image is offensive will ser

How about not show the image until someone clicks on something that makes it visible? This way, if someone goes to the page on "ejaculation", they don't, by default, see an animated gif of some dude ejaculating. But if they REALLY WANT TO, they can click to view it. Either way, they still have access to the text that describes the biology.

They should have a "notability filter": instead of deleting so-called "non-notable" articles, they get added to the filter so that deletionists can see the nice, clean, austere Wikipedia they've always dreamed of, while the rest of us get the real thing.

I hate to sound like a "think of the children" type of person. But Wikipedia is a great tool to start your research with, and really good for education. However, I can expect kids using Wikipedia in schools to view imagery that they are not allowed to see and that is inappropriate for a school.

It isn't so much that they are forced to view the data, but kids, being both curious and wanting to show their peers that they are cool, will use such imagery to cause trouble and get cool points because they can gather a bunch of curiou

It's your responsibility to avoid what you don't want to see. The other way round makes no sense, because the world is full of pussies, each with unique sets of "crap they don't want to see". The sum of all those sets is EVERYTHING that exists.

Of course, what gets censored is decided on your behalf by some corrupt authority or other, which is why censorship is always evil.