
Trolls left forensic fingerprints that cybersecurity experts used to find other disinformation campaigns both in the US and elsewhere.

Human activity leaves all kinds of traces, some more obvious than others. Messages posted to services such as Twitter, for example, are there for anyone to see. But the pattern of tweets a user produces over time is far less obvious.

Various researchers have begun to study these patterns and found that they can identify certain types of accounts, particularly those that post in high volume. For example, accounts that post continuously, 24 hours a day, are unlikely to be operated by humans. Instead, this is a clear signal that a bot of some kind is at work.
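The round-the-clock heuristic can be sketched in a few lines. This is not the researchers' code; the account names, timestamps, and the 24-hour threshold are invented for illustration:

```python
from datetime import datetime

# Hypothetical posting timestamps for two accounts (invented data)
timestamps = {
    "steady_bot": [datetime(2016, 9, 1, h, 0) for h in range(24)],
    "human_user": [datetime(2016, 9, 1, h, 0) for h in (8, 9, 12, 18, 21)],
}

def active_hours(ts):
    """Return the set of distinct hours-of-day an account posted in."""
    return {t.hour for t in ts}

def looks_automated(ts, threshold=24):
    # An account active in every hour of the day is unlikely to be human.
    return len(active_hours(ts)) >= threshold

for name, ts in timestamps.items():
    print(name, looks_automated(ts))
```

A real detector would look at activity over many days, not one, but the principle is the same: humans sleep, and their posting histories show it.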

Humans also generate specific patterns, albeit less obvious ones than bots. In particular, accounts that post high volumes of tweets often do so in patterns with unique signatures that forensic analysis can identify.

One corpus of interesting tweets encompasses the messages posted by Russian trolls attempting to influence the 2016 US presidential election. Now researchers have analyzed these to search for any unique fingerprints they might contain. The idea is to use these fingerprints to identify other disinformation campaigns by the same trolls that have gone unnoticed. But is this possible?

Today we get an answer thanks to the work of Christopher Griffin and Brady Bickel at Pennsylvania State University. These guys’ forensic analysis has identified a unique signature in these tweets and used it to find evidence of other disinformation campaigns. “We identify an operation that includes not only the 2016 US election, but also the French national and both local and national German elections,” say Griffin and Bickel.

Unique behavioral fingerprints are hard to identify because of the sheer volume of data on Twitter. A vast number of human users share similar behavioral characteristics and so cannot be easily distinguished. However, the behavioral signature becomes more distinctive as the volume of messages increases.

That’s why the Russian trolls are identifiable in this way. Griffin and Bickel downloaded a database of 200,000 Russian troll tweets gathered by Twitter and obtained by NBC News. They then analyzed the tweets by the most prolific users—those who posted more than 500 times during the election period.
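Filtering a dump down to its most prolific posters is straightforward. A minimal sketch, with invented account names and placeholder records standing in for the actual NBC News dataset:

```python
from collections import Counter

# Hypothetical (user, text) records standing in for the real tweet dump
tweets = [("troll_a", "...")] * 700 + [("casual_b", "...")] * 40

counts = Counter(user for user, _ in tweets)

# Keep only accounts that posted more than 500 times in the period
prolific = {user for user, n in counts.items() if n > 500}
print(prolific)
```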

The researchers examined the way these users tweeted over time and how they differed from other Twitter users. They also looked for communities within the database and then created word clouds of their tweets showing the most commonly used words.
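The word-cloud step boils down to counting word frequencies within each community. A sketch with invented example tweets (the real analysis would presumably also strip stop words and handle Cyrillic text):

```python
from collections import Counter
import re

# Hypothetical tweets from one detected community (invented text)
community_tweets = [
    "vote now for real change",
    "change is coming vote today",
    "real patriots vote",
]

# Tokenize, lowercase, and tally word frequencies
words = Counter(
    w for t in community_tweets for w in re.findall(r"\w+", t.lower())
)
print(words.most_common(3))
```

The most frequent words are what a word cloud visualizes; comparing these lists across communities is what lets topics be told apart.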

This threw up a surprise. The analysis revealed seven communities, each with a distinctive word cloud. Four of these communities clearly focused on topics such as the US Tea Party movement and African-Americans.

But two of these word clouds consisted entirely of words in Russian and German. Griffin and Bickel analyzed these further to show that the timing of the tweets spiked in the run-up to the German national election in 2017 and the local Berlin election in 2016. “The Berlin state election was significant because Chancellor Merkel’s party was beaten by right-wing populists,” say the researchers.
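Spotting a pre-election spike amounts to binning tweet timestamps and looking for the peak. A sketch with invented dates, clustered before a hypothetical September 2017 election:

```python
from collections import Counter
from datetime import date

# Hypothetical tweet dates (invented), clustering toward September 2017
dates = [date(2017, m, d) for m, d in
         [(6, 3), (7, 12), (8, 2), (8, 19), (8, 30),
          (9, 1), (9, 8), (9, 15), (9, 22)]]

# Bin by (year, month) and find the busiest month
by_month = Counter((d.year, d.month) for d in dates)
peak_month = max(by_month, key=by_month.get)
print(peak_month, by_month[peak_month])
```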

The team also found a similar spike in activity in the build-up to the French national election in 2017, although this involved only 588 messages. That’s too small for detailed analysis, but Griffin and Bickel speculate that it points to the existence of another group of trolls, as yet unidentified, who targeted France.

That’s interesting work suggesting that Russian troll activity was significantly more ambitious on an international scale than previously thought. It also suggests a way of spotting this kind of meddling as it is happening by looking for the kind of forensic fingerprint the team identified.

Of course, finding trolls is a cat-and-mouse game. For the organizations responsible for Russian troll activity, it ought to be a straightforward matter to change the pattern of activity in a way that does not create the same signature.

And yet, if this malicious activity is to be significant and effective, it will inevitably take place on a relatively large scale and so generate a different signature. The question is how to spot it in time to take action. And so the game continues.

