YouTube targets terrorist videos

Google deleted roughly 640 YouTube videos flagged for promoting terrorism in the last half of 2011, but that may be only the tip of the iceberg.

Google deleted approximately 640 YouTube videos flagged for promoting terrorism in the second half of 2011, according to the company's biannual Transparency Report.

The company removed the videos after facing mounting pressure from law enforcement officials in the United Kingdom, who argued that the videos violated YouTube's community guidelines, which prohibit dangerous or illegal activities such as bomb-making, hate speech, and explicit violence.

While the removal is large in scale, the deleted videos are likely just the tip of the iceberg.

According to Aaron Zelin of the Washington Institute for Near East Policy, alleged terrorists and moderators of jihadist websites have a tendency to respond to such regulations by posting more videos from new, unmonitored accounts.

"[It's] a whack-a-mole type of thing where especially activists in the West create 20 or 30 YouTube accounts, and they primarily use one and then if somebody flags it, they just take it down and go to the next one," Zelin told CNN. "It's sort of this cat-and-mouse game that you're playing."

Sometimes, according to New America Foundation counterterrorism expert Brian Fishman, the very same videos get reposted on new channels.

YouTube tries to counter the influx of newly uploaded videos with a flag-it-yourself system that allows users to report any video they believe violates the community guidelines.

Flagged videos are reviewed by a YouTube moderation team, which decides whether the video is suitable to remain on the site.

Every minute, users upload 72 hours of video to YouTube. That's three days' worth of content every 60 seconds, or 103,680 hours every single day. How much of that actually violates the site's embattled community guidelines? It would take an army of judges to find out.