Russian operatives used Twitter and Facebook to target veterans and military personnel, study says

Russian trolls and others aligned with the Kremlin are injecting disinformation into streams of online content flowing to American military personnel and veterans on Twitter and Facebook, according to an Oxford University study released Monday.

The researchers found that fake or slanted news from Russian-controlled accounts is mixing with a wide range of legitimate content consumed by veterans and active-duty personnel in their Facebook and Twitter news feeds. These groups were found to be reading and sharing articles on conservative political thought, articles on right-wing politics in Europe and writings touting various conspiracy theories.

In some cases, the disinformation reached the friends and families of military personnel and veterans as well, the researchers said. But it was not always clear who was creating the content. Twitter, for example, makes it easy for users to hide their true identities.

"The social networks mapped over Twitter and Facebook include both genuine accounts created by U.S. military organizations, by service personnel and veterans themselves, and by groups seeking to influence those users," the report says. "Some of the accounts are managed by known trolls or are pro-Putin accounts sharing significant amounts of Russian-oriented content."

The report by Oxford's Project on Computational Propaganda, which has been studying ways that fake news and propaganda reached Americans during the 2016 election and its aftermath, is the first in which the group sought to explore the spread of disinformation on both Twitter and Facebook, and also how links are shared back and forth across these platforms.

Facebook and Twitter both declined to comment on the report, advanced copies of which were shared with the companies by the researchers.

To examine the kind of information that reaches military personnel and veterans, the researchers analyzed Twitter content posted between April 2 and May 2 that included popular hashtags such as #GoArmy or #Iraq, to determine what users of these hashtags posted. In some cases, said Philip N. Howard, an Oxford professor who co-authored the report, known Russian trolls were using those hashtags to draw attention to content they were promoting.

The researchers also tracked information on several military-themed websites and used the traffic to these sites, along with the Twitter data, to determine what Facebook accounts promoted similar content on publicly available pages. That yielded maps of online interaction showing, for example, that accounts that linked frequently to veterans and military issues also in many cases linked to content related to Russia.

The kind of information shared by and with veterans and active-duty personnel spans a wide range, with liberal political content also common, though not as common as conservative political content. The online military community, the researchers found, also shared links about sustainable agriculture, mental health issues such as addiction, and conspiracy theories.

No one subject dominated the online content flowing among these communities, but the largest individual categories dealt with military or veteran matters. Russian disinformation was a smaller but significant and persistent part of the overall information flow.

"The very idea that there's aggressive campaigns to target military personnel with misleading content on national security issues is surprising. It's disappointing," Howard said. "Because they're opinion leaders, they get more attention from governments and people who spread misinformation."

The other authors of the report, titled "What is the Audience for VetOps: Social Media Operations Against U.S. Military Personnel and Veterans," were John D. Gallacher, also of Oxford, and Vlad Barash and John Kelley of Graphika, which uses social media data to analyze online relationships and influence.