David Meerman Scott, a well-known marketing strategist, coined the term "newsjacking," which he describes as "the process by which you inject your ideas or angles into breaking news, in real-time, in order to generate media coverage for yourself or your business." The concept makes sense, and we all know that a great way to gain relevance online is by leveraging hot topics and news items that are beginning to trend, but it's a competitive, fast-moving field. How do content marketers stay on top of the relevant trends and news in their industries to ensure they're curating and communicating fresh, engaging content?

Social Censorship

Leave it to NPR to get me thinking. I recently heard a story that asked whether some images or videos, like those of the recent execution of journalist James Foley, should be censored by media outlets and social media sites. The New York Post was vilified for running a photo of Foley taken just before he was murdered, while Twitter and YouTube both scrambled to remove videos of the event and suspend user accounts. Should social media sites have the right to censor content like this, or any content at all?

I started thinking about the implications of a censored social media landscape. The old journalist in me says these sites should stay hands-off. The First Amendment guarantees us freedom of speech, no matter how depraved that speech is. But a bigger part of me says Twitter and YouTube did the right thing, and they probably should have done more.

Think about all the internet trolls out there ruining blogs and forums one terrible comment at a time. Remember what happened to Zelda Williams when her father, Robin Williams, passed away? Trolls bombarded her with horrific tweets until she suspended her account. People like this aren't necessarily saying or doing terrible things because they really believe it (I hope); they do it to get a reaction out of others. It makes them feel important.

Is it the social media sites' responsibility to shut them down? The debate surrounding social media censorship is confusing at best; after all, users are ultimately subject to the sites' Terms of Service. A June article in the Harvard Law Review called "The Brave New World of Social Media" explains the core issue. Marjorie Heins writes about Section 230 of the Communications Decency Act (a commonly cited piece of legislation), noting that "social media sites like Facebook and search engines like Google do not have to censor anything. In fact, one major aim of section 230 is to discourage private-industry censorship, so that free speech can prevail on the Internet, and those actually responsible for criminal or tortious speech, rather than the pipelines through which they communicate, can be prosecuted or sued. But section 230 does not prohibit private censorship."

So if these sites aren't responsible for policing what we see and read, what happens when something someone says online harms another person or is just flat-out inappropriate? Should these sites censor comments, and if so, where do we draw the line?

Content companies have a special stake in this debate. Not only do we not want social media sites censoring our posts, but the issue raises a different question for those of us who do business on the web. Working in publishing, I've come across many instances where a consumer leaves multiple poor reviews just because they can. Sure, it's their prerogative, but saying a book is terrible in seven different ways isn't exactly necessary. Even so, censoring those unsatisfied customers would be a mistake on our part.

Take, for example, the hotel in Hudson, NY, that charged guests $500 for every bad review they left on Yelp. The hotel wasn't just getting rid of bad reviews; it was penalizing the reviewers. The policy backfired when the media got hold of the story, and what would have been a few bad reviews turned into national news.

When one of the books my company publishes gets a negative review on Amazon, we certainly don't delete it. Instead, we work on the core problem and hope that other consumers aren't swayed by a few grumpy reviews. Maybe that means responding to the customer's comment directly, maybe it means changing what the book promises, or maybe it means refunding the reader's money. This is a better approach than simply shutting down disgruntled customers. You're not going to be able to please everyone; that is the nature of the internet and of business. But sometimes just engaging with your customers and addressing their concerns is enough to turn angry customers into brand advocates.

Fonolo's customer service blog shows just how much companies can lose by not taking their customer service obligations seriously. One statistic notes that "reducing your customer defection rate by 5% can increase your profitability by 25 to 125%." And while you may not be able to fix each customer's problem right then and there, engaging in a dialogue with customers can lessen the frustration they are feeling. After all, many customers just want to be heard, to know their perspective means something to the company.

Social media sites find themselves in a different conundrum. For them, censorship isn't about silencing critics but about protecting their users. Should sites like Twitter and Facebook avoid banning content and hope that people can decide for themselves what is worth reading or viewing? Not as far as I'm concerned. Sure, we can click the "I don't want to see this" option in our newsfeed to hide posts we don't like, but some things you just can't un-see. There are too many terrible things being shared, and too many people with bad judgment, not to allow a little social censorship. Sometimes, shielding your customers from objectionable material is the best way to build their loyalty and trust.