Tavares veterinarian fights child porn on Facebook

11:33 p.m. EST, November 6, 2012|Lauren Ritchie, COMMENTARY

Child pornography is not the usual fare that users see on Facebook, the social-media giant used by more than 1 billion people each month. Most people with profiles on Facebook are connected only to other people they know, and pictures such as grandchildren in their Halloween costumes are the norm.

But Facebook has a darker side, and about 465 people who were originally organized loosely by the international hacker organization called Anonymous have made it their mission to track down and eradicate child pornography on the site.

Until recently, D'ara Klein, a Tavares veterinarian, was one of them. However, both Florida law and Facebook make it difficult for the average citizen to help abused children by searching for and reporting profiles on which the illicit material is posted. After learning that her search-and-report mission may violate the law, Klein has stopped.

But her experience, she said, has been eye-opening and troubling, and she wants parents to know what their children might find while innocently looking for pages on Facebook about their favorite cartoon characters or teddy bears.

Let's take a look at how this works.

Klein, 45, said she first got involved when a local friend told her about child porn available on Facebook. She went to a page for activists called Let's Out Pedo's on FB and saw that its organizers were seeking volunteers to help search for the exploitative material and report the pages to authorities and to Facebook itself.

Working through one of four groups associated with the activist Facebook page, the mother of three daughters, ages 8, 6 and 19 months, signed up and began learning how to trawl through Facebook to find child pornography.

It didn't take long.

Child porn isn't obvious on Facebook, but it is there, and one of the most troubling aspects is that it can be accessed through pages for which a child might naturally search. For example, some pages use the names of popular cartoon characters. So, a child looking for pages about teddy bears or favorite characters could suddenly be staring at something a parent really doesn't want him or her to see.

"At first, I didn't think it was a big deal," Klein said. "Then I saw the evidence. As I delved deeper, I found videos of children being sexually abused."

Klein said a video of a man having sex with an infant sent her spinning into full search mode and robbed her of her sleep.

Using a fake profile to protect herself from harassment, Klein quickly figured out the technique of the perverts: They post a photograph or video of a child and leave it to be seen by any Facebook user for, say, 10 minutes or so. Word spreads fast on Facebook, and the sickos who want to view child porn flock to the page and download the image quickly.

Minutes later — poof! — it's gone.

Still, she and her colleagues reported the pages to the Association of Sites Advocating Child Protection and to the National Center for Missing & Exploited Children. They also let Facebook know about the pages so that computer experts could dig back in time to find what was posted.

What these porn monitors are doing, however, isn't as simple as it sounds.

During an email exchange between Klein and Lake sheriff's Lt. Linda Green about the photograph of the little girl being abused, the lieutenant wrote:

"We know your group has good intentions, but Florida law makes it illegal to intentionally view child pornography. The law can be found in Florida Statute 827.071(5). These laws are, of course, strictly enforced and do not offer exceptions for citizens wishing to assist law enforcement. The only exception to these laws pertains to law-enforcement officers engaged in official investigations. We appreciate your efforts, but want to make sure you're operating within the law."

That's what prompted Klein, who has no desire to violate the law, to stop searching.

So where does Facebook stand on all this?

A Facebook spokesman said the company has a "Safety Team" made up of hundreds of employees worldwide who review inappropriate material around the clock. The company has set a goal of examining reported material within 48 hours.

It also uses a program called PhotoDNA, which is designed to recognize abusive pictures of children and prevent them from being uploaded, said Fredric Wolens, a Facebook spokesman. The company reports instances of exploitation of children to the same authorities as the 96-member group called LOP1 for which Klein volunteered.

But Tonya Morrow, administrator of the group, said Facebook has "declared war" on LOP1. Last week, after being asked about the reporting activities of some group members, Facebook shut down the accounts of about five members.

Wolens said the company will close any account that is fake if the holder declines to provide proof of identity. In addition, some of those searching for the porn share links of the illegal material with other members of the group, another Facebook no-no, Wolens said.

"While I'm sure intentions are good, it is a violation of our terms [of service] to use a fake name and to share this content for any reason, which caused the block," Wolens wrote in an email. "We want all users to use the report links to notify us of suspicious or abusive content.

"The danger comes when people use fake accounts and distribute this material among themselves, as it becomes increasingly difficult for us to identify the real perpetrators and adds an additional level of complexity to our investigations."

Klein and Morrow said Facebook could create a sort of hot button to report child exploitation, which then could be examined immediately by members of its safety team.

It would be an excellent way to more firmly clamp down on the people around the world who treat children like disposable items.

Lritchie@tribune.com. Her blog is online at http://www.orlandosentinel.com/laurenonlake. Lauren invites you to connect with her on Facebook at http://www.facebook.com/laurenonlake.