The growing stream of reporting on and data about fake news, misinformation, partisan content, and news literacy is hard to keep up with. This weekly roundup offers the highlights of what you might have missed.

“Humans can be successfully manipulated through social bots.” Chengcheng Shao, Giovanni Luca Ciampaglia, and others at Indiana University, Bloomington, analyzed 14 million tweets spreading 400,000 claims during and after the U.S. presidential campaign and election and found that “accounts that actively spread misinformation are significantly more likely to be bots.” Also, “humans do most of the retweeting, and they retweet claims posted by bots as much as by other humans. This suggests that humans can be successfully manipulated through social bots.” The paper offers a couple of ideas for reducing bot activity; here’s one:

An alternative strategy would be to employ CAPTCHAs, challenge-response tests to determine whether a user is human… Their use to limit automatic posting or resharing of news links could stem bot abuse, but also add undesirable friction to benign applications of automation by legitimate entities, such as news media and emergency response coordinators.

It’s not just Twitter and Facebook! Not surprisingly, WhatsApp is a conduit for fake news too — at least in Kenya ahead of its upcoming general election, writes Abdi Latif Dahir for Quartz.

Analysts have labeled the spread of information on these messaging apps as ‘dark social,’ given that their effect cannot be measured or questioned publicly. Government officials in Kenya are also closely watching chatter on these apps, recently accusing the managers of 21 WhatsApp groups of spreading hate.

“The media are a strategic asset, just like oil and gas.” Dana Priest looks at lessons from Europe’s fight against Russian disinformation for The New Yorker.

In most of Europe, where hoax news stories and Web sites with bogus articles are muddying the digital pipeline of reliable information, political leaders have publicly reaffirmed their faith in the mainstream media and urged them to do a better job exposing imposters. With the help of journalists and researchers, the European Union’s East Stratcom Task Force has published thousands of examples of false or twisted stories in its weekly Disinformation Review, available in eighteen languages.

Can Russia Today’s fact-checking be taken seriously? Poynter looked at RT’s “FakeCheck.” Four months in, it has only fact-checked 16 stories, most of which had to do with “Russia’s image abroad or its foreign policy.” But, Alexios Mantzarlis and Anastasia Valeeva write, the selection bias isn’t the biggest problem: “The bigger problem is that it mixes dubious fact checks among the legitimate ones, leading to unproven or poorly sourced conclusions.” For instance, “Rumors about alleged Russian meddling in the Maltese elections were addressed by referring to the Russian Embassy’s statement on the matter. Allegations that Wikileaks had ties to Russia were ‘debunked’ by pointing to a quote by Julian Assange. In both these cases the evidence comes from self-interested sources.”

How stories become true. Professor of sociology Francesca Polletta and Jessica Callahan at the University of California, Irvine look at how “the rise of right-wing media outlets and the profusion of user-shared digital news” have changed storytelling (paywall), asking, “How was a story of middle-class whites pushed aside by a parade of minority groups, abandoned by the government, and treated with disdain by liberals made real?” They write:

Conservative media commentators often styled a personal relationship with the viewer or listener, in which allusive stories reinforced the bond between speaker and audience. The growth of user-shared digital “news” stories also worked to reinforce bonds of political partisanship. However, here, what was important was a style of storytelling. By sharing, liking, and commenting on outrageous stories — and by determinedly not questioning their factual accuracy — people signaled that they were savvy, scrappy, and clearly on one side of the partisan divide…

We miss the fact that people often interpret outrageous stories as evidence of a broader phenomenon; that stories about the way the world used to be often conflate history and nostalgia; that people’s relationship to media commentators affects what they take from the stories they hear; and that stories may have political impact less by persuading than by reminding people which side they are on.

Owen, Laura Hazard. “Stories may have political impact less by persuading than by reminding people which side they are on.” Nieman Journalism Lab. Nieman Foundation for Journalism at Harvard, 28 Jul. 2017.
