#TrollTracker: Facebook Uncovers Iranian Influence Operation
Iranian narratives buried in divisive content target United States and United Kingdom
On October 26, Facebook announced that it removed 82 pages, accounts and groups for “coordinated inauthentic behavior that originated in Iran and targeted people in the US and UK.”
According to Facebook, while these pages and accounts originated in Iran, there are currently no ties to the Iranian government. It was the second time in two months that Facebook shuttered an Iranian network, after taking down over 600 accounts in August. Last week, Twitter published over 1 million tweets from that same network; @DFRLab analyzed them here.
Facebook shared 10 pages and 14 Instagram accounts with @DFRLab 12 hours before the takedown. These accounts masqueraded primarily as American liberals, interspersing small amounts of anti-Saudi and anti-Israeli content within large volumes of divisive political content on topics such as race relations, police brutality, and U.S. President Donald Trump. This evolution away from the more blatant pro-Iranian messaging of previous operations suggests the operators had learned from earlier takedowns.
These assets were designed to engage in, rather than around, the political dialogue in the United States. Their behavior showed how much they had adapted from earlier operations, focusing more on social media than on third-party websites and becoming much more engaging.
This post sets out the most important findings, giving an initial description of the assets’ open-source, publicly visible features and focusing on those which suggested a lack of authenticity or a resemblance to earlier troll operations. It is vital to understand the evolution of this threat, to ensure that responses to it also evolve.
@DFRLab intends to make every aspect of our research widely available. The effort is part of our #ElectionWatch work and a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.
@DFRLab will analyze the accounts in further detail in the coming days.
1. Election Targeting
Some of the pages posted directly about American political processes, either discussing their own votes or calling on others to vote.
Given how close the U.S. midterm elections are, this is the most immediate and consequential component of the accounts’ output.
2. Big Hitters
Some of the Facebook pages had very large followings, and impressive numbers of shares. One, called I Need Justice Now (@INeedJusticeNow), had over 13 million video views; another, No Racism No War (@nornowar), had over 412,000 likes and almost half a million followers.
3. Recent Creations
The accounts and pages were recently created, unlike the Twitter and Facebook assets and the websites taken down in August, many of which dated back years. This is significant: the dates imply either a separate Iranian effort or an evolution of the earlier campaign toward a heavier focus on divisive content.
Two of the most-followed pages, @VoiceofChangee (with 113,155 followers) and @INeedJusticeNow (with 61,507 followers), were created this year, on February 3 and April 1 respectively.
The earliest, @nornowar, dated back to January 2016.
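As a purely illustrative sketch (not part of the research workflow described here), the creation dates above can be turned into account lifetimes at the time of the October 26 takedown; the exact day for @nornowar is an assumption, since only the month was reported.

```python
from datetime import date

# Illustrative only: page creation dates cited above, compared with the
# October 26, 2018 takedown to show how young most of the network was.
TAKEDOWN = date(2018, 10, 26)

creation_dates = {
    "@VoiceofChangee": date(2018, 2, 3),
    "@INeedJusticeNow": date(2018, 4, 1),
    "@nornowar": date(2016, 1, 1),  # "January 2016"; exact day assumed
}

for page, created in creation_dates.items():
    age_days = (TAKEDOWN - created).days
    print(f"{page}: created {created:%B %d, %Y}, roughly {age_days} days old at takedown")
```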
4. More Engaging, More Engagement
The accounts posted a much more engaging range of content than the earlier operation, which focused on using social media to drive users toward websites laundering pro-Iranian regime messaging. The latest batch of accounts sought to drive engagement on the platforms, rather than off them, with a mixture of memes, videos, and authored comments.
The approach appears to have worked, with posts on both Instagram and Facebook receiving large numbers of shares and replies.
5. Divisive Content
The great majority of posts by these accounts consisted of divisive and polarizing content, especially attacks on President Trump and the Republican Party. This is similar to the approach practiced by Russia’s troll operation, although the Russian operation targeted both sides of America’s most painful debates.
One Instagram account — @RepublicansUnited2 — masqueraded as a conservative Christian user. Its messaging was not explicitly divisive, and it ceased posting in September 2017; nevertheless, its identification as part of the network may indicate an initial intention to target both sides.
Most of the others posed as left-wing accounts, attacking Trump and the Republicans, and praising the Democrats.
Many of the attacks targeted President Trump personally.
Other posts focused on further divisive political issues, notably race relations, and especially police violence against the African American community.
Some of them hinted at the need for violence, or called for it.
This divisive posting constituted the majority of content, suggesting that one main aim of the Iranian group of accounts was to inflame America’s partisan divides. The tone of the comments added to the posts suggests that this had some success.
6. Anti-Israel, Anti-Saudi, Pro-Yemen
In between these posts, the accounts repeatedly attacked Iran’s regional rivals, Israel and Saudi Arabia. This is the strongest external indication that they were part of an organized Iranian network designed to amplify the regime’s chosen narratives, in the way that earlier networks did.
Some opposed the United States’ policy in the Middle East more generally.
7. Recycled Content
Some of the content posted by these accounts appeared original, but much more appeared to have been taken from authentic websites. This appears to have been an attempt to blend in with the authentic communities, and also, perhaps, to attract the attention and endorsement of genuine users.
On September 10, 2018, for example, the page @TMag shared a video of a “kinetic door” which had been posted on YouTube the month before.
On July 8, 2018, the same page shared a cartoon attacking France for hosting a meeting of the controversial MEK group, regarded by Iran as a terrorist organization, and listed as such by the United States until 2012. The original cartoon was dated to 2015 and posted on a blogspot page called Latuff Cartoons.
A post by @INeedJusticeNow on August 28, 2018, copied a Twitter appeal by former cricket star Kevin Pietersen to find two lion hunters, posted two days before.
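Spotting recycled imagery of this kind generally comes down to matching visuals and comparing timestamps. The sketch below shows one possible, simplified approach using perceptual hashing; the Pillow and ImageHash libraries and the file names are assumptions for illustration, not the method used in this research.

```python
# Illustrative sketch: flag near-duplicate images by comparing perceptual
# hashes. Requires the Pillow and ImageHash packages; file paths are
# hypothetical placeholders.
from PIL import Image
import imagehash

def looks_recycled(suspect_path: str, original_path: str, threshold: int = 8) -> bool:
    """Return True if two images are perceptually near-identical."""
    suspect_hash = imagehash.phash(Image.open(suspect_path))
    original_hash = imagehash.phash(Image.open(original_path))
    # The difference is a Hamming distance; small values indicate the same
    # image, possibly re-encoded, resized, or re-captioned.
    return suspect_hash - original_hash <= threshold

print(looks_recycled("tmag_door_video_frame.png", "youtube_original_frame.png"))
```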
8. Many Memes
All these accounts were heavy on meme content and light on text. This may have been a way of driving engagement, but it may also have reduced the need to write original text, and thus the chance of language errors which would have betrayed the operators.
Some of the memes were remarkable for the errors they made — errors which appear unlikely to have been made by Americans.
One post contrasted the deaths of American soldiers in WWII with modern neo-Nazi marches in America, but the image it used was of Soviet soldiers, not U.S. GIs.
Another was structured as a meme with two parallel images, but only made sense if read from right to left, as if translated from Arabic.
9. Mostly Negative
While some of the pages made positive comments, as noted above, the great majority of their posts were negative. This approach is very similar to that adopted by the Russian “troll farm” in its attacks on America from 2014 to 2018, and may indicate that the Iranian account managers were drawing on Russia’s experience.
10. Artificial Amplification
Some of the accounts’ amplification statistics, especially their shares, were so disproportionate to their followings that they suggested artificial amplification.
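A minimal sketch of the kind of check that surfaces such disproportion is shown below, assuming a hypothetical CSV export of post-level metrics; the column names and the threshold are illustrative, not a forensic standard.

```python
import pandas as pd

# Hypothetical export of post-level metrics: one row per post, with the
# page name, the page's follower count, and the post's share count.
posts = pd.read_csv("page_posts.csv")  # columns: page, followers, shares

# Shares relative to audience size; organic pages rarely sustain very
# high ratios across many posts.
posts["shares_per_follower"] = posts["shares"] / posts["followers"]
page_ratio = posts.groupby("page")["shares_per_follower"].median()

# Flag pages whose typical ratio is far above the rest of the network.
suspect = page_ratio[page_ratio > 5 * page_ratio.median()]
print(suspect.sort_values(ascending=False))
```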
Conclusions
The shuttered assets appeared to focus on promoting divisive content in America (one account also boosted left-leaning posts in the UK). In between that content, they amplified posts attacking Israel for its behavior toward the Palestinians, and Saudi Arabia for its treatment of Yemen and of its own citizens.
The foreign-policy messaging was in keeping with earlier Iranian networks, which primarily amplified Iranian government narratives on the Middle East. The focus on divisive content is much closer to the behavior of the Russian information operation, although the Iranian operation does not appear to have targeted conservatives as much as liberals.
Taken with Facebook’s own attribution, this suggests that the accounts were indeed part of an information operation supporting the Iranian regime, but that they adapted and evolved in light of earlier Iranian and Russian operations. This confirms earlier assessments that such troll operations have moved on since 2016, but are still active in evolving permutations, underscoring the importance of ongoing analysis to keep up with them.
@DFRLab team members Donara Barojan, Lukas Andriukaitis, Kanishk Karan, Aric Toler, Michael Sheldon, and Nick Yap made this report possible with their research.
@DFRLab is a non-partisan team dedicated to exposing disinformation in all its forms. Follow along for more from the #DigitalSherlocks.
DISCLOSURE: @DFRLab announced that we are partnering with Facebook to expand our #ElectionWatch program to identify, expose, and explain disinformation during elections around the world. The effort is part of a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.