#TrollTracker: How To Spot Russian Trolls

Some clues which identify probable Kremlin accounts


Since Russia’s attempt to polarize American society and interfere in the 2016 election was exposed, and even more since the poisoning of former spy Sergei Skripal in the United Kingdom, accusations of being a “Russian troll” have proliferated online, often with the flimsiest of evidence.

Such accusations do more harm than good, obscuring the ways in which trolls can really be identified and increasing online polarization still further.

In the interests of transparency and reasoned debate, this post lists some of the factors which can be used to identify possible Russian trolls masquerading as citizens of other countries.

These clues are indicative, rather than conclusive. It is seldom possible to say with 100 percent certainty that a given account belongs to a Russian troll operation, rather than merely supporting Russian narratives.

However, these factors do allow pro-Kremlin accounts to be identified with reasonable reliability. If an account shares most of the factors, but claims to be a patriotic citizen of another country (especially the United States or United Kingdom), it may well be a covert influence account, like the hundreds which Russian journalists exposed working out of the St. Petersburg “troll factory.”

As with any open source investigation, the combination of multiple factors is key. A single indicator is seldom enough to confirm identification. What is important is the approach: assessing a suspect account from all angles and across as long a timespan as possible.

One small caveat: with the increase in accusations of Russian trolling since October 2016, a number of Western users have claimed to be Russian, in an apparent attempt to “trigger” opponents and discredit researchers. It is therefore important to review the full lifespan of an account, and to exercise caution with more recent posts.

Partisan is not enough

Before examining the factors which reliably indicate a pro-Kremlin troll, it is important to look at one factor which does not: hyper-partisan content.

Russia’s information operation against America, which ran from 2014 to 2017, did involve a large number of accounts posing as hyper-partisan Americans on both sides of many divides, notably the #TakeAKnee controversy.

At left, a post from Russian troll account “PanAfricanRootsMove.” At right, a post from Russian troll account “MericanFury.” Both images are from the repository created by UsHadrons on Medium. (Source: Facebook)

As a result, the already heated atmosphere of online political debates has been further envenomed by mutual accusations of being “Russian trolls” (or even “Russian bots”).

However, hyper-partisan content is never enough, on its own, to qualify an account as pro-Kremlin or Russian. More evidence is always needed.

A or The?

One of the linguistic signs which is characteristic of many known Russian accounts is the inability to use the grammatical articles — “a” and “the” — appropriately. The Russian language has neither.

This post, for example, was made by Russian troll Instagram account Muslim_Voice in May 2016; note the phrases, “I don’t want my kids to walk on streets with the sign like this,” and “to stop the Islamophobia and Xenophobia.”

Post from @Muslim_Voice, recovered by @UsHadrons. (Source: Instagram / Muslim_Voice)

Russian Twitter troll @USA_Gunslinger appeared to have made the same mistake in this post and claimed that Hillary Clinton will never have “an honor.” In British English, an “honour” is an award from the Queen, but this is unlikely to be the U.S. account’s context; it appears to mean that Clinton will never have honor.

Tweet by @USA_Gunslinger, recovered by @UsHadrons. (Source: Twitter / USA_Gunslinger)

Confirmed Russian Twitter trolls made many similar mistakes in their posts, according to a repository of over 200,000 posts recovered by NBC News.

“We need fight the terrorism to fight the terrorism! But not track phone calls to fight the terrorism.” — @michellearry, June 2015.

“I want chaos to be over! We need #GOP president to stop it! #GOPDebateSC” — @heyits_toby, January 2016.

“With Hillary in charge, America will burn in flames of a shame. So don’t let this happen!” — @tpartynews, February 2016.

This linguistic telltale is one of the most common indicators of a Russian speaker (although other languages, including Polish and Latvian, also lack articles). Being common, it can be faked by those who wish to masquerade as Russian trolls in order to deceive and discredit researchers; however, an account which poses as an English speaker, but consistently makes the same mistake, should be investigated further.
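For researchers triaging large numbers of posts, this telltale can be screened for mechanically. The sketch below is a minimal, illustrative heuristic of our own, not part of any formal methodology: the noun list is hand-picked from the examples above, and legitimate constructions such as “the terrorism of the 1970s” will produce false positives, so any hit needs human review.

```python
import re

# Hand-picked abstract/mass nouns from the examples above. Generic uses of
# these rarely take "the" in idiomatic English. Illustrative only: a phrase
# like "the terrorism of the 1970s" is legitimate, so expect false positives.
MASS_NOUNS = {"terrorism", "islamophobia", "xenophobia", "chaos", "shame"}

def flag_article_misuse(text):
    """Return 'the <mass noun>' constructions such as 'the terrorism'."""
    hits = []
    for match in re.finditer(r"\bthe\s+([A-Za-z]+)", text, re.IGNORECASE):
        if match.group(1).lower() in MASS_NOUNS:
            hits.append(match.group(0))
    return hits

# Example from the NBC News archive quoted above:
print(flag_article_misuse("We need fight the terrorism to fight the terrorism!"))
# -> ['the terrorism', 'the terrorism']
```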

What the question is?

Another common linguistic indicator is the inability to phrase a question. In Russian, the word order for questions does not change, unlike in English, German, and formal French. Many known Russian troll accounts have posted questions which kept the word order of statements.

Post from Russian troll Instagram account Anonymous_News; note the mangled word order in the question, and the addition of other highly-charged terms in the text such as “NoDAPL” (a reference to the contested Dakota Access Pipeline), “policebrutality,” and “amerikkka,” a reference to the Ku Klux Klan. Image recovered by @UsHadrons. (Source: Instagram / Anonymous_News)

Again, other confirmed Russian Twitter trolls made the same error, as these examples from the NBC News archive show:

“Why our government doesn’t send us some help?! Phosphorus leak in Pocatello #phosphorusdisaster” — @ryanmaxwell_1, March 10, 2015.

“Why I need a credit card? What you think this cash for?” — @logan_whatsup, February 2015.

The same error shines through in this YouTube video, part of a package of measures the Russian troll operation put together in September 2014 to allege a fictitious chemical leak in the United States. The short clip consisted of an unseen man with an indeterminate accent yelling at a television screen which purported to show ISIS claiming responsibility for the blast. His phrasing was not quite English:

“Now I wonder, what do you watching, guys?”

(The phrase comes at timestamp 00:16.)

https://www.youtube.com/watch?v=E2J6RvajSaA

Again, an ostensibly English-language account which makes this mistake should be viewed with skepticism.
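Because English questions invert their word order, this telltale is also simple to screen for in bulk. The sketch below is an illustrative heuristic of our own devising: in an inverted question, the wh-word is normally followed by an auxiliary verb, so a wh-word followed by anything else is flagged. Subject questions (“Who won?”) and modified wh-words (“What time is it?”) produce false positives, so a hit is a prompt for human review, not a verdict.

```python
import re

WH_WORDS = {"what", "why", "how", "when", "where", "who", "whose", "which"}
# Auxiliaries that normally follow the wh-word in an inverted question.
AUXILIARIES = {
    "do", "does", "did", "is", "are", "was", "were", "am",
    "have", "has", "had", "can", "could", "will", "would",
    "shall", "should", "may", "might", "must",
}

def keeps_statement_order(text):
    """Flag questions that keep statement word order, e.g.
    'Why our government doesn't send us some help?!'"""
    for sentence in re.split(r"(?<=[.?!])\s+", text):
        if "?" not in sentence:
            continue
        tokens = re.findall(r"[a-z']+", sentence.lower())
        if len(tokens) >= 2 and tokens[0] in WH_WORDS:
            second = tokens[1].replace("n't", "")  # "doesn't" -> "does"
            if second not in AUXILIARIES:
                return True  # wh-word not followed by an auxiliary
    return False

# Examples from the NBC News archive quoted above:
print(keeps_statement_order("Why our government doesn't send us some help?!"))  # True
print(keeps_statement_order("Why doesn't our government send us some help?"))   # False
```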

Searching for narrative clues

There is, of course, a difference between a Russian account and a pro-Kremlin one. Linguistic telltales are, therefore, generally insufficient to expose the troll.

Rather, the linguistic clues should be combined with narrative ones. The Russian government has developed a distinctive narrative on key geopolitical events of the last five years. These events cover different countries and continents, and are separated across time.

An account which repeatedly shares Russian government talking points on most or all of these events can justifiably be considered pro-Kremlin. The next step in identifying a pro-Kremlin account is, therefore, to search its timeline for these narrative telltales.

For example, to search for original posts mentioning Crimea from a notorious troll account like @TEN_GOP, the formula on Twitter is:

from:TEN_GOP Crimea

To search for posts mentioning the phrase, “Ukrainian Nazi coup,” from @TEN_GOP, the formula on Twitter is:

from:TEN_GOP “Ukrainian Nazi coup”

To search for posts from @TEN_GOP mentioning the word “Crimea” between February 28 and March 18, 2014 — one of the times when the Kremlin propaganda machine as a whole launched a sustained disinformation campaign to justify Russia’s attack on Ukraine — the formula is:

from:TEN_GOP Crimea since:2014-02-28 until:2014-03-18

These searches return only original tweets, not retweets: as such, they give an indication of what the account has posted itself, but not its entire pattern of behavior. The technique also works only on live accounts; @TEN_GOP serves here purely as an example, as it has since been taken down by Twitter Public Policy.
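For repeated investigations, such queries can be assembled programmatically. The helper below is a minimal sketch of our own; it simply URL-encodes the same operators shown above into a Twitter search link.

```python
from urllib.parse import quote_plus

def twitter_search_url(handle, terms, since=None, until=None):
    """Build a Twitter search URL for original posts from one account.

    `terms` may be a bare word (Crimea) or a quoted phrase
    ('"Ukrainian Nazi coup"'); dates are YYYY-MM-DD strings.
    """
    query = f"from:{handle} {terms}"
    if since:
        query += f" since:{since}"
    if until:
        query += f" until:{until}"
    return "https://twitter.com/search?q=" + quote_plus(query)

# The date-bounded Crimea example from the text above:
print(twitter_search_url("TEN_GOP", "Crimea",
                         since="2014-02-28", until="2014-03-18"))
```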

Key moments and themes to search for include:

  • The Russian annexation of Crimea, February 28 — March 18, 2014, and especially the narrative that Ukraine was the aggressor, or a Nazi state;
  • The shooting-down of Malaysian Airlines flight MH17, July 17, 2014, and the subsequent investigations which demonstrated that Russia provided the fatal missile;
  • The heavy fighting in Ukraine in January-February 2015, and the deployment of Russian tank reinforcements, as exposed by the Atlantic Council’s report, “Hiding in Plain Sight;”
  • The assassination of opposition leader Boris Nemtsov, February 27, 2015;
  • Turkey’s shooting down of a Russian Su-24, November 24, 2015;
  • The siege of Aleppo, December 2016;
  • In the broader Syrian context, the White Helmets rescue group and Syrian girl Bana Alabed;
  • The sarin attack on Khan Sheikhoun in April 2017, and the resulting U.S. missile strike.

When assessing a potential troll account, it is important to look at what, if anything, it posted on all these themes. An account which posted on the majority of them, sharing Kremlin narratives, can safely be classified as a pro-Kremlin one.

If the account shares most or all of the Kremlin narratives, makes the characteristic linguistic errors and poses as an American or British user, it may be a Russian-operated troll.
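This decision logic can be made explicit as a checklist. The sketch below is our own illustrative codification; the theme labels and the “more than half” threshold are assumptions drawn from this post, not a formal scoring model.

```python
from dataclasses import dataclass, field

# Shorthand labels for the narrative themes listed above.
KREMLIN_THEMES = {
    "crimea_annexation", "mh17", "ukraine_jan_2015", "nemtsov",
    "su24_downing", "aleppo", "white_helmets_bana", "khan_sheikhoun",
}

@dataclass
class AccountAssessment:
    claims_us_or_uk_persona: bool = False
    misuses_articles: bool = False            # "a"/"the" errors
    statement_order_questions: bool = False   # mangled question syntax
    themes_matched: set = field(default_factory=set)

    def looks_pro_kremlin(self):
        # Shares Kremlin narratives on most (more than half) of the themes.
        return len(self.themes_matched & KREMLIN_THEMES) > len(KREMLIN_THEMES) / 2

    def may_be_russian_troll(self):
        # Pro-Kremlin narratives + linguistic telltales + foreign persona.
        return (self.looks_pro_kremlin()
                and self.claims_us_or_uk_persona
                and (self.misuses_articles or self.statement_order_questions))

suspect = AccountAssessment(
    claims_us_or_uk_persona=True,
    misuses_articles=True,
    themes_matched={"crimea_annexation", "mh17", "nemtsov",
                    "aleppo", "khan_sheikhoun"},
)
print(suspect.may_be_russian_troll())  # True: 5 of 8 themes plus telltales
```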

Crimea and MH17, 2014 and onwards

Russian government propaganda focused especially heavily on the annexation of Crimea and the shooting-down of MH17 in 2014. A number of themes characterize Kremlin propaganda of this period, and reliably indicate pro-Kremlin trolls.

For example, Russian government outlets consistently broadcast claims that MH17 was brought down by Ukraine or the CIA, despite the weight of evidence that showed the aircraft was downed with a Buk-M1 missile made in, and brought from, Russia.

Posts on MH17 from RT. (Source: Twitter / RT_com)

Later messaging broadcast attacks on the Joint Investigation Team (JIT), a multinational group which concluded that MH17 was indeed downed with a Russian missile.

RT posts on the Joint Investigation Team. (Source: Twitter / RT_com)

They also targeted the Bellingcat team of investigative journalists, who gathered much of the open source evidence around the crash.

RT post on Bellingcat. @DFRLab demonstrated that the group of “citizen-journalists” to which the article refers was, in fact, a group of Kremlin employees. (Source: Twitter / RT_com)

Other significant claims included that the Russian annexation of Crimea was the result of a “coup” in Kyiv; that the Ukrainian government is dominated by neo-Nazis and fascists; and that the Russian special operations forces which occupied Crimea in February-March 2014 were “local self-defense forces.”

RT posts on the Ukrainian “coup-appointed” government, the “self-defense squads” and “fascist” Ukraine. (Source: Twitter / RT_com)

In each of these cases, the Kremlin’s claims did not stand up to the test of the evidence. The Russian special forces were exposed as such as early as March 2, 2014; the JIT concluded that the Buk-M1 missile system which downed MH17 entered Ukraine from Russia, and returned there; and far-right parties never held more than a small fraction of seats in the Ukrainian parliament, a far smaller share than that of the Russia-backed far right in countries such as France.

@Jenn_Abrams, one of the Russian operation’s most influential accounts, also appears to have attacked Ukrainians as “savages,” to judge by this reply to a deleted post.

(Source: Twitter / clochette31000)

Accounts which regularly post such claims can be considered pro-Kremlin. It would be too much of a leap to conclude on the basis of such posts alone that they are part of a Russian government operation: more evidence would be needed to prove that point.

Ukraine battles, 2015

Many of the known Russian troll accounts paid particular attention to Ukraine in January 2015, as Russian Army tank units from the Russian Far East entered the conflict there.

As one example, under the hashtags #SomeoneWhoKillsChildren and #WorldWakeUpUkraineKillsItself, confirmed Kremlin trolls attacked the Ukrainian government, and especially Ukrainian President Petro Poroshenko.

Tweets on Ukraine posted by ostensibly American accounts run by the Russian information operation in January 2015. (Source: NBC News)

These were primarily Russian-language campaigns; searching for the hashtags revealed a number of still-extant pro-Kremlin, Russian-language accounts from the period.

Post from @SplashNm on the troll hashtag. Archived on February 28, 2018. (Source: Twitter / @SplashNm)

Accounts which showed a sudden focus on Ukraine in general, and Poroshenko (often mocked as “Porky”) in particular, in the first two months of 2015, therefore fit the pattern of known Russian trolls.

Nemtsov assassination

The assassination of Boris Nemtsov was an especially important moment in the study of pro-Kremlin information operations, because we have confirmation from both internal and external sources that the “troll farm” in St. Petersburg was ordered to treat the event as an emergency, and post massively on it.

The internal confirmation comes from Russian journalist Andrei Soshnikov, who published a list of orders given to the troll farm in early 2015:

When Nemtsov was killed, the work of the Kremlin robots changed: they stopped pouring on Ukraine (this is their routine everyday work) and were transferred to the murder … They said the same thing in different words: the murder is a provocation, the Kremlin has nothing to do with it, the opposition killed its own man to attract more people to the march.

The external confirmation comes from British researcher Lawrence Alexander, who exposed a network of thousands of Twitter bots after they suddenly began posting about Nemtsov.

Image by Lawrence Alexander of interconnected botnets exposed as part of his investigation into automated posts about Nemtsov. (Source: Global Voices / Lawrence Alexander)

Both independent sources confirmed that the Nemtsov murder triggered a major troll farm operation. Accounts which posted attacks on Nemtsov, or shared a wide range of conflicting theories about his death, in the days immediately after February 27, 2015, should therefore be examined closely.

Su-24 downing, 2015

On November 24, 2015, Turkey shot down a Russian fighter jet on its border. In the aftermath, Russian state outlets launched a full-scale campaign against Turkey and its President Recep Tayyip Erdoğan.

The campaign included a number of interlocking themes. One was that Turkey’s behavior was reckless; another was that it was pre-planned.

RT posts in the aftermath of the downing. (Source: Twitter / RT)

A third was that Turkey was both enabling and profiting from the ISIS terrorist group, by acting as an illegal exporter of oil from Syria.

Yet another was that Turkey was guilty of genocide against the Kurds; RT’s treatment of the issue was so one-sided that two of its interviews were found in breach of basic impartiality rules by the UK’s telecoms regulator, which enforces standards of broadcast journalism.

RT even launched a hashtag, #JusticeforKurds, demanding a United Nations investigation into alleged mass killings of Kurds by Turkey.

International questions were, of course, raised at the time over Turkey’s treatment of the Kurds, and its relationship with ISIS; mentioning them is not, therefore, enough on its own to qualify an account as pro-Kremlin.

However, posting on such issues immediately after the Turkish shootdown, and especially sharing posts from RT and Sputnik, can be one indicator of a potential pro-Kremlin account, when cross-referenced with other factors.

Aleppo, 2016

The siege of Aleppo in December 2016 saw numerous atrocities and attacks on civilians committed by all sides, including Russia; for example, Kremlin propaganda outlet RT accidentally revealed that Russian aircraft were using indiscriminate cluster munitions over the city. @DFRLab and partners chronicled such atrocities in “Breaking Aleppo.”

Russia’s propaganda operation attempted to push back in a number of ways, including by claiming that the siege of Aleppo was a “liberation.” Russian troll accounts supported the claim by repeatedly retweeting posts from other users which amplified the narrative.

In the following examples, the original tweets were posted by independent users, but then retweeted by confirmed Russian troll accounts, as shown by the NBC News archive. We include them here to illustrate the narrative themes which the troll farm promoted.

(Source: Twitter / Maytham956)

Troll account @judelambertusa retweeted the post below.

(Source: Twitter / @VanessaBeeley)

Russian government messaging focused particularly on the White Helmets rescue group, one of the main sources of evidence for war crimes committed in the Syrian conflict.

Another confirmed Russian troll account, @LauraBaeley, retweeted this post. The tweet was from website 21st Century Wire, but it shared an RT program which featured three vocal critics of the White Helmets, and suggested that the rescuers were “an elaborate and cynical Western PR stunt promoting illegal regime change in Syria.”

(Source: Twitter / @21stcenturywire)

Known Kremlin accounts also targeted Bana Alabed, a young girl who tweeted about daily life in Aleppo under siege, and drew public attention to the suffering of civilians.

Sputnik tweet about Bana. (Source: Twitter / @SputnikInt)

It is important to note that Syrian government propaganda also targeted these issues. The presence of such posts, without further evidence, is therefore insufficient to prove whether an account is pro-Kremlin. However, when other indicators are present, the presence of posts on Aleppo provides a valuable cross-reference.

Sarin, April 2017

The sarin attack on civilians in Khan Sheikhoun, Syria, on April 4, 2017, and the retaliatory U.S. missile strike on April 7, were another trigger moment for the troll farm. A number of known Russian trolls reacted to the U.S. strike by demanding that President Trump fire his son-in-law, Jared Kushner, who was seen as the motivating force behind the strike.

Post by Russian troll @TEN_GOP after the U.S. strike. Recovered by @UsHadrons. (Source: Twitter / @TEN_GOP)

Russian troll @baobaeham retweeted an angry post calling the sarin attack a “CIA false flag.” In fact, a UN investigation attributed it to the regime of Bashar Al-Assad, Russia’s ally.

Post retweeted by Russian troll @baobaeham. (Source: Twitter / @Fixer_Guy)

This was an occasion on which Russian trolls amplified attacks on Trump, rather than defending him; one example is this post, retweeted by confirmed Russian troll @mrclydepratt.

(Source: Twitter / AfroStateOfMind)

As with the posts on Aleppo, posts such as these are not diagnostic on their own: many genuine members of Trump’s support base turned on him after the air strikes. However, taken in combination with other evidence, they can reinforce the picture of an account’s activity.

Case study: a probable troll

In December 2017, @DFRLab identified a probable Russian troll account, @IamJohnSmith, which had been targeting British MP Tom Brake. The account has since been suspended; we review the identification here as a case study in the combination of evidence which can expose the troll.

The account came to @DFRLab’s attention after it targeted Brake for a tweet he had posted, saying that Russia has a “history” of election interference.

The post by Tom Brake MP, and the reply from @iamjohnsmith, who is almost certainly not John Smith. Archived on December 20, 2017. (Source: Twitter)

The account claimed to be English and based in London. It was effectively anonymous, and prolific in its tweets, both indicators of a potential professional troll. Created in March 2017, it had posted some 11,000 times by the end of the year, or roughly 37 times a day.
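The posting rate is easy to verify with a back-of-the-envelope check; in the sketch below, the exact creation day is an assumption, since the profile gives only “March 2017.”

```python
from datetime import date

created = date(2017, 3, 1)      # assumed: the profile says only "March 2017"
observed = date(2017, 12, 20)   # archive date from the caption below
tweets = 11_000

per_day = tweets / (observed - created).days
print(f"{per_day:.0f} tweets per day")  # ~37, matching "roughly 37 times a day"
```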

@iamjohnsmith’s profile page. Archived on December 20, 2017. (Source: Twitter / @iamjohnsmith)

Although it was a recent creation, the account posted on several of the known Kremlin themes, notably the “Nazis in Ukraine” narrative, attacks on Bellingcat and the “sarin false flag” theory.

Tweets from @iamjohnsmith on Ukraine, Bellingcat and sarin. Tweets archived on December 20, 2017. (Source for posts: Twitter / @iamjohnsmith)

It repeatedly shared articles from RT on a range of themes.

Shares of RT stories by @iamjohnsmith on Syria (right and left) and the EU. Note the time of the right-hand post, 3.43 am UTC, which is an unusual time for a UK user to be online. Tweets archived on December 20, 2017. (Source for posts: Twitter / @iamjohnsmith)

The account also interacted repeatedly with confirmed Russian troll accounts @TEN_GOP and @Pamela_Moore13.

@iamjohnsmith’s tagging of @TEN_GOP (left) and @Pamela_Moore13 (right). Twitter searches archived on December 20, 2017. (Source: Twitter / @iamjohnsmith)

Taken individually, none of these points would be sufficient for identification. Based on all of them, and others, however, we concluded that @IamJohnSmith was likely to be a Russian troll masquerading as an English account, not merely a pro-Kremlin one.

Conclusion

These factors can be used to indicate the probability that a given account is a pro-Kremlin troll, or a Russian-operated one. They are most useful with older accounts, but even in relatively recent accounts, such as @IamJohnSmith, they can provide useful clues.

The list is not exhaustive. The Kremlin’s information operations have focused on many themes in recent years; the ones mentioned above are only the most common we have observed. Nor should the list be considered as a guaranteed way to find Russian trolls. It indicates a likelihood, rather than a certainty. Some accounts which post on all the above subjects may be genuine users; some known Russian trolls posted relatively little on these subjects.

However, the list does provide a methodology for identifying possible trolls, and (as the case of @IamJohnSmith shows) it can be used to expose them. In such studies, it is the approach which is important: assessing multiple aspects of an account’s activity before coming to a conclusion.

Given the heated nature of the “Russian trolls” debate online, it is important to approach the evidence systematically and transparently. The method outlined above is designed to do so.


Follow along for more in-depth analysis from our #DigitalSherlocks.