ISTANBUL, Turkey — I’m pretty sure that most attendees at the Internet Governance Forum (IGF), which took place here last week, would agree that the Internet should be open to virtually anyone who wants to share or access information. But there are plenty of people at the IGF who would also argue that blocking certain types of content is appropriate.

I suspect most people who attended, for example, support efforts to block child sex abuse images (child pornography), which are illegal in most countries, but some — including, apparently, the government of Turkey — would go a lot further.

The Turkish government has blocked thousands of websites under a law aimed at "protecting children" from harmful content. But some of the blocked sites carry content that appears to harm not children but the political reputations of government officials, including former Turkish prime minister and current President Recep Tayyip Erdogan, whose government cut off access to YouTube and Twitter earlier this year after those sites reportedly exposed alleged government corruption.

Twitter and YouTube are back online in Turkey but, according to Turkish Internet rights activists Yaman Akdeniz and Kerem Altiparmak, “Between May 2007 and July 2014 Turkey blocked access to approximately 48,000 websites, including some that were critical of the Turkish government.”

A lot of those banned sites do contain pornography, acknowledged Nate Schenkkan of Freedom House, a nonprofit watchdog group that advocates for political and civil liberties around the world, “but many also contain political speech or focus on such issues as sexual orientation that the government might disapprove of.” Even pornography, said Schenkkan, shouldn’t be blocked for the entire adult population in the name of protecting children.

Whether it's a good or bad idea, the blocking of porn sites, hate sites or even sites critical of a government is usually deliberate. But there are also cases of sites, pages or messages being blocked accidentally.

One of the workshops I attended at IGF was titled "Internet blocking: When well-intentioned measures go too far." Its panelists pointed out that filters designed to protect users from spam, malicious software, unscrupulous advertising and other undesirable content have also inadvertently blocked legitimate content.

Spam filters are an example. Filters designed to hide junk mail are far from perfect.

Even Gmail's filters are capable of false positives, as I discover every time I check my spam folder, only to find some legitimate messages — sometimes from friends, family and colleagues — among the thousands of junk messages they catch on a daily basis.

Overblocking can also happen on a mass scale. The session's moderator, Paul Vixie, pointed to a case earlier this year in which Spamhaus, a nonprofit that maintains databases of known spammers, wound up accidentally blocking the entire country of Sweden.

There are plenty of examples of filters designed to protect children from porn and other "harmful content" unintentionally blocking legitimate sites. In July, the Open Rights Group tested 100,000 sites and found that 19 percent were blocked by British Internet service providers' filters, including some sites that don't contain pornography, violent images or advocacy of hate, self-harm or other criteria typically cited as being harmful to children.

I operate SafeKids.com, a website designed to help parents protect their children online, but some articles on the site have been blocked by filters because they contained words like "pornography" or "xxx," even though the site itself doesn't contain porn but instead advises parents on how to help their kids avoid it. Filters have also been known to block sites about breast cancer, sexual health and sexual orientation.

Vixie and other panelists pointed to numerous other examples of the unintended impact of blacklists maintained by non-governmental organizations, which compile lists of websites, pages, servers and domains believed to house or facilitate spam, malicious software or anything else deemed harmful to the Internet.

The panelists at this IGF workshop were not arguing against all policies that block certain types of content. For example, there was no objection to ISPs blocking access to pages known to contain illegal child pornography based on lists maintained in the U.S. by the National Center for Missing and Exploited Children (whose board I sit on) and in the United Kingdom by the Internet Watch Foundation.

Although neither of these organizations publishes its list of sites with illegal content, both say they are extremely careful about checking their lists frequently for false positives and have mechanisms through which site operators can appeal if they feel their content is being blocked by mistake.

What the IGF panel did call for is greater transparency about what is being blocked, and for the organizations that create these lists to be extremely careful and judicious about which sites they add. They also urged Internet service providers to be cautious about taking action against sites simply because they appear on such lists.

Like a lot of my fellow IGF attendees, I want a well-lighted Internet that’s friendly to all its users, including children. But I also want an Internet that is free from censorship, government control and arbitrary or unintentional blocking of content, even if I find it distasteful. Freedom can be messy but the alternative is unacceptable.
