(Newser) –
Few people watched the live broadcast of helmet cam footage showing the Friday shooting at one of two mosques in Christchurch, New Zealand, according to Facebook. The 17-minute livestream had fewer than 200 views by its end, and no one reported it during that time. Facebook says the first user report came in 12 minutes later, by which time the stream had about 4,000 views. Though Facebook deleted it within minutes, copies quickly spread, reports USA Today. Within 24 hours, Facebook had removed 300,000 videos and blocked another 1.2 million at upload. YouTube was in a similar boat. A team of "incident commanders" worked through Friday night removing tens of thousands of videos containing original shooting footage as they were uploaded, sometimes as quickly as "one per second," reports the Washington Post.

Many of the videos were altered visually and in length "to defeat YouTube's ability to detect" them so that "for many hours, video of the attack could be easily found using such simple basic terms as 'New Zealand,'" per the Post. "This incident has shown that, especially in the case of more viral videos like this one, there's more work to be done," says YouTube's chief product officer, Neal Mohan. "We continue to work around the clock to prevent this content from appearing on our site, using a combination of technology and people," adds Facebook VP Chris Sonderby. That's apparently not good enough for some advertisers. The New Zealand Herald reports major New Zealand brands including Burger King are set to pull their ads from Facebook and Google. (The role of social media in the attack will be investigated.)

Identifying a problem immediately is not possible given the volume of legal gun-shooting videos online, but more needs to be done to halt live streaming of this type of violence. Live streams featuring guns or extreme violence need to be pulled in all cases, with only pre-screened, after-the-fact gun video allowed. No one needs to see killing in real time for any reason.