#ElectionWatch: Scottish Vote, Pro-Kremlin Trolls

How pro-Russian accounts boosted claims of election fraud at Scotland’s independence referendum

Pro-Russian internet trolls fueled claims that Scotland’s independence referendum in 2014 was rigged, and amplified demands for a revote.

The behavior of these accounts is pro-Kremlin, and consistent with the behavior of accounts known to be run by the so-called “troll factory” in St. Petersburg, Russia, during the 2016 U.S. presidential election. However, it is not possible to determine from open sources whether some or all of the accounts are independent actors or linked to Russian information operations.

Given the concerns expressed in the United Kingdom over Russian trolls’ support for Brexit, and in the U.S. over Russian interference in the 2016 election, much more research is needed into the activity of pro-Kremlin trolls around Scottish independence, and much more investment is needed in building Britain’s resilience against online disinformation.

The referendum and the results

Scotland’s independence referendum was held on September 18, 2014. The referendum asked a single question:

Should Scotland be an independent country?

With a turnout of 84.59 percent, the “No” campaign (which, with a very British touch, campaigned as “No thanks”) won 55.3 percent of the vote, to the “Yes” campaign’s 44.7 percent.

Scottish National Party leader Alex Salmond accepted the vote in the early hours of September 19. He said, “I call on all of Scotland to follow suit in accepting the democratic verdict of the people of Scotland.”

https://www.bbc.co.uk/news/av/uk-scotland-29272422/alex-salmond-admits-defeat-in-referendum

Claims of fraud

However, even as Salmond was speaking, claims of fraud began to circulate online. The claims included a number of videos which purported to show vote-rigging. The most influential was a video posted by “Elite NWO agenda”, which was viewed over 800,000 times. NWO stands for the conspiratorial “New World Order”, a theory that a secret elite is plotting to dominate the world.

https://www.youtube.com/watch?v=LbJif7vISQg

The post used video and still photos to claim the vote was rigged. The claims were debunked, but links to the video continued to circulate online.

Most of the accounts that shared it appear to be ordinary, Scotland-focused users; however, a significant minority, especially among the earliest accounts to post, look more like pro-Kremlin trolls. These accounts were among the most vocal amplifiers of the video, posting it repeatedly and tagging different users.

For example, @w_nicht posted the link several times on the morning of September 19, 2014.

Tweets from @w_nicht on September 19, 2014. Both posts archived on December 5, 2017. (Source: Twitter / @w_nicht)

According to a machine scan of tweets sharing the YouTube clip, @w_nicht posted it six times in 90 minutes, each time tagging other users in an apparent attempt to spread it more widely.

The six times @w_nicht shared the video on September 19. (Source: Machine Scan)
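The scan itself is simple to reproduce in principle. The sketch below is a minimal Python illustration, assuming the collected tweets are available as records with hypothetical "author", "created_at", "text", and "mentions" fields (the machine scan’s actual data format is not public); it counts how many times each account posted the link, over what time span, and how many distinct users it tagged.

```python
from collections import defaultdict
from datetime import datetime

# URL of the "Elite NWO agenda" clip cited above.
VIDEO_URL = "youtube.com/watch?v=LbJif7vISQg"

def repeat_sharers(tweets, url=VIDEO_URL):
    """Group tweets containing `url` by author and summarize repeat posting.

    `tweets` is assumed to be a list of dicts with "author", "created_at"
    (ISO 8601 string), "text", and "mentions" keys -- a hypothetical layout,
    since the machine scan's actual data format is not published.
    """
    shares = defaultdict(list)
    for tw in tweets:
        if url in tw["text"]:
            shares[tw["author"]].append(tw)

    report = {}
    for author, posts in shares.items():
        times = sorted(datetime.fromisoformat(tw["created_at"]) for tw in posts)
        span_minutes = (times[-1] - times[0]).total_seconds() / 60
        tagged = {m for tw in posts for m in tw.get("mentions", [])}
        report[author] = {
            "times_posted": len(posts),
            "span_minutes": round(span_minutes, 1),
            "distinct_users_tagged": len(tagged),
        }
    # Accounts that post the same link many times in a short window, tagging
    # different users each time, stand out the way @w_nicht does here.
    return report
```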

@w_nicht is a strongly pro-Kremlin account. It regularly takes positions consistent with known Russian propaganda narratives. Its use of English is sometimes erratic in ways which are consistent with the errors made by native Russian speakers.

Tweet from @w_nicht. Note the inaccurate word order (“Why nobody in EU wanted”) and the lack of the word “the”, characteristic of Russian speakers. Archived on December 5, 2017. (Source: Twitter.)

@w_nicht’s posts include repeated attacks on Ukraine, and defenses (and denials) of Russia’s actions there — common themes of the Kremlin’s propaganda outlets and of the troll factory.

Two posts on Ukraine by @w_nicht. Note, again, the non-native English (“Kiev keep shooting”). Both posts archived on December 5, 2017. (Source: Twitter / @w_nicht)

Its posts accuse the West of hypocrisy or falsification over the shooting down of Malaysia Airlines flight MH17 over Ukraine, which criminal investigators concluded was carried out with an anti-aircraft missile brought into Ukraine from Russia. The account’s shares also include opinion pieces by Kremlin propaganda outlet RT, which recently registered as a foreign agent in the U.S.

Tweets from @w_nicht sharing RT opinion pieces on the shooting down of MH17, and the West’s reaction. Both posts archived on December 5, 2017. (Source: Twitter / @w_nicht)

Other posts lambast Turkey for shooting down a Russian plane on November 24, 2015; attack UK Prime Minister Theresa May and her Conservative party for warning against Russian interference in UK domestic politics; defend Russia against claims of systematic Olympic doping; and denigrate murdered Russian opposition leader Boris Nemtsov.

Tweets from @w_nicht on the Tories, doping, Turkey and Nemtsov. Note again the incorrect English (“Why not all Russians are using doping?”). All four posts archived on December 5, 2017. (Source: Twitter / @w_nicht)

All these are characteristic of pro-Kremlin trolls, in general, and the “troll factory” in St. Petersburg, in particular.

@w_nicht was by no means the only pro-Kremlin account to spread the claim of vote-rigging. Another particularly striking case was @ArianeDaladier. This account appears to have been inactive since September 2015, but, when active, shared pro-Kremlin content in English…

Retweets by @ArianeDaladier in English, including self-proclaimed “pro-Russia media sniper” Marcel Sardo, and a conspiracy theory blaming the West for Nemtsov’s death. Archived on December 5, 2017. (Source: Twitter / @ArianeDaladier)

… and in Russian.

Retweets by @ArianeDaladier in Russian, sharing an attack on then-State Department spokeswoman Jen Psaki (left) and an RT article about a Brazilian volunteer “fighting for Novorossiya and a multi-polar world” (right). Archived on December 5, 2017. (Source: Twitter / @ArianeDaladier)

The account also posted its own content, in non-native English, criticizing Ukraine and defending Russia’s annexation of Crimea.

Tweets by @ArianeDaladier, both archived on December 5, 2017. (Source: Twitter / @ArianeDaladier)

This appears to be an almost exclusively Ukraine-focused, pro-Kremlin troll. Nevertheless, on September 19, it retweeted a post sharing the Scottish referendum video, in the midst of its usual Ukraine-centric posts.

Retweets by @ArianeDaladier on September 19, showing the retweet of the Scottish video. Archived on December 5, 2017. (Source: Twitter / @ArianeDaladier)

This behavior suggests it was a pro-Kremlin troll, possibly run from the troll factory, whose main focus was Ukraine, but which was diverted to amplify the claim of vote-rigging in Scotland.

According to the machine scan, 671 users in total shared the video alleging voter fraud. A manual scan of the earliest 100 showed over a dozen which appear to be pro-Kremlin trolls, sharing, for example, Kremlin narratives on Ukraine, Crimea, MH17, Turkey, and the Syrian conflict. These were among the most active accounts, tweeting the link repeatedly at different users, just as @w_nicht did.
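The manual review described above starts from a similar extraction: take every account that shared the link and keep the earliest hundred, ordered by the time of their first post. The following is a minimal sketch under the same hypothetical record layout as the earlier example, not necessarily how the original scan was performed.

```python
def earliest_sharers(tweets, url, n=100):
    """Return the first `n` accounts to share `url`, ordered by the timestamp
    of their first post containing it (same hypothetical record layout as the
    sketch above). The resulting list is what a manual review would start from.
    """
    first_seen = {}
    for tw in sorted(tweets, key=lambda t: t["created_at"]):
        if url in tw["text"] and tw["author"] not in first_seen:
            first_seen[tw["author"]] = tw["created_at"]
    return list(first_seen.items())[:n]
```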

Among these apparent trolls was @glopol_analysis, which posted the video seven times as tweets or retweets, together with many other tweets alleging referendum fraud.

Posts from @glopol_analysis, sharing the vote-rigging video. (Source: Machine Scan)

Its posts included shares of another video making the same claims and using the same footage; this will be referred to as the “Boom!” video.

Post from @glopol_analysis on September 19, sharing another vote-rigging video. Archived on December 5, 2017. (Source: Twitter / @glopol_analysis)

Again, this account echoes many top Kremlin narratives, both on its Twitter feed and on its associated blog. It portrays Ukraine’s government as the product of a far-right, U.S.-led coup; attacks Turkey and anti-corruption protesters in Russia; and claims that a sarin attack in Syria was a Turkish false-flag operation.

Posts from @glopol_analysis on Ukraine, sarin, Turkey and the Russia protests. All four tweets archived on December 5, 2017. (Source: Twitter / @glopol_analysis)

According to the Organized Crime and Corruption Reporting Project (OCCRP), yet another troll posted a video of vote-rigging filmed during Russia’s 2012 election and claimed it showed the vote in Scotland.

OCCRP wrote that the video was “widely shared by a Russian troll network” with which the hoaxer was close, and “reposted widely by accounts logged under Russian names, including via a Facebook page called NovoRossiya (a reference to breakaway regions of East Ukraine) and several others where dozens of comments can be found in the Russian language”.

Facebook and Twitter searches for these reposts returned only a handful of hits, suggesting that many have since been deleted. However, a Russian-language exposé of the fake recorded some of the Russian-language posts which had accompanied it, supporting the OCCRP’s report.

Screenshot of Russian post debunking the fake, with a selection of the Russian (and some English) comments appended to the original. (Source: avmalgin.livejournal.com)

Pro-Kremlin and Russian accounts such as these played a role in helping to promote the “vote-rigging” video. They were a minority, but an active minority, sharing the claims of fraud repeatedly, and tagging other users and news outlets to amplify the message.

Early movers

The same applies to the “Boom!” video shared by @glopol_analysis. Many of the accounts which posted it appear to come from Scottish users, but some — especially among the earliest movers — resemble Kremlin trolls.

The very first account to share the “Boom!” video was called @skull322. This is a highly active account, having posted almost 200,000 times since it was created in 2008; its profile picture is taken from the Fox television series “Fringe”.

Left, @skull322’s profile (Source: Twitter / @Skull322); Right, a still image from ‘Fringe’ (Source: CNN). Compare the avatar image with the CNN still.

At 06:20 local time on September 19, @skull322 shared the video, using the same headline as the video itself, suggesting that it was either shared directly from YouTube or posted automatically.

Tweet from @skull322, sharing the video. Note the time, UTC+1. Archived on December 5, 2017. (Source: Twitter / @skull322)

@Skull322 is a sedulous promoter of pro-Kremlin narratives. Many of its tweets copy word for word the headlines of the articles they share, suggesting that the account may be automated (or merely unoriginal). For example, it repeatedly shares anti-Ukrainian content, including from propaganda sites Sputnik and Russia Insider.

Tweets from @Skull322 on Ukraine. Note the shares from Russia Insider (top) and Sputnik (third). Archived on December 5, 2017. (Source: Twitter / @skull322)

It also shares conspiratorial, outlandish, and anti-Western posts on MH17 and the conflict in Syria.

Posts on MH17 and Assad. Archived on December 5, 2017. (Source: Twitter / @skull322)

It also posts on non-Russia-related issues, including terrorism, space exploration, and UFOs. All the tweets below were posted on September 19, 2014.

Other posts from @Skull322 on the day after the referendum. Archived on December 5, 2017. (Source: Twitter / @skull322)

The account thus shares large quantities of pro-Kremlin messaging, but it is not a solely pro-Kremlin account, and its other posts bespeak a preference for conspiratorial content.

Another early amplifier, @just1fix2004, exhibited similar behavior. It also shared the video in the early hours of September 19.

Early tweet from @just1fix2004. Archived on December 5, 2017. (Source: Twitter / @just1fix2004)

This account, again, shares many posts which amplify Kremlin narratives or outlets, including on Ukraine, MH17, and Russian President Vladimir Putin. Many are YouTube shares, again suggesting that this account may be partly automated. Its favorite sources include RT, Iranian outlet Press TV, and far-right commentator Paul Joseph Watson.

Posts from @Just1fix2004 on MH17 and Putin; note the number of YouTube shares. Paul Joseph Watson is shared in the third post on the left. Both archived on December 5, 2017. (Source: Twitter / @just1fix2004)

There are nuances in the behavior of the accounts named above, and of others like them which also shared videos alleging vote-rigging in Scotland. Some appear to be run by non-native speakers whose allegiance is primarily to Russia; others appear less focused and may be automated. All, however, post considerable quantities of pro-Kremlin messaging, and have little or no apparent connection to Scotland.

Open source methods cannot confirm whether these were simply pro-Kremlin accounts, or troll-factory ones. More research is needed to determine the full extent of such pro-Kremlin accounts’ activity, and their reach and impact.

Russia complains

One claim which can be traced directly to Russia emerged even earlier, as the first results were coming out. This was a claim from a Russian election observer that the vote was not in accordance with international standards. The claim was important for the way in which it fed calls for a revote.

“Observer: Referendum in Scotland does not meet global standards.” RIA Novosti headline, September 19, 2014. Archived on September 19, 2014. (Source: ria.ru)

Within hours of the comment, a Facebook page, “Rally for a Revote”, had been created to demand a rerun of the referendum; as pointed out by the OCCRP, its first post was a Guardian article picking up on the Russian claim.

“Rally for a revote” Facebook page. The original has been deleted, but the page was archived on June 8, 2016. Note the prominence of the Guardian story. (Source: Facebook)

The “Rally for a Revote” page accompanied a petition on Change.org calling for a full revote (not merely a recount), “counted by two individuals, one of whom should be an international impartial party without a stake in the vote,” in a possible acknowledgement of the Russian observer’s complaint.

Other petitions were also launched. One demanded a unilateral declaration of independence; a second called for a “public judicial review” of the referendum process; a third was hosted on pro-independence website Yes2014.net, again demanding a revote.

A fourth petition was submitted to the official UK Parliament petitions page, calling for a recount rather than a revote.

The petitions appear to have been launched by Scottish users. The declaration of independence was launched by a user called Martin Keatings; the petition linked to the Facebook page was launched by a woman named Kirstie Keatings. Both gave locations not far from Edinburgh.

Kirstie Keatings subsequently complained about harassment on Facebook and changed the petition’s listed author to “Rally for a Revote”. The addressee was also changed from Salmond, who resigned after the vote, to his successor, Nicola Sturgeon.

Left: the header of the original petition, from a German archive made on September 23, 2014, addressed to Salmond and attributed to Kirstie Keatings. Right, the updated petition, addressed to Sturgeon and attributed to Rally for a Revote, the same name as the Facebook page. (Source: change.org)

There is no reason to suspect that these petitions were launched by anyone other than Scots dissatisfied with the outcome. In particular, Kirstie Keatings was quoted by the Scotsman (her surname spelt as “Keating”) on September 28, 2014, suggesting both a local presence and local verification.

The petitions received very different levels of support. The declaration of independence gathered 3,888 signatures. The call for a review gathered 25,905. The Yes2014 version had 18,821 signatures by September 21. The petition to the UK Parliament had 23,697 signatures by March 2015.

Far more than any of these, however, the “Rally for a Revote” petition gathered 100,261 signatures by the time it closed.

This is a remarkably high figure, given that total turnout in the referendum was 3.6 million, and that the formal petition to the UK Parliament attracted less than a quarter as many signatures. It raises the question of whether an attempt was made to artificially amplify the signatures.

Change.org only requires an email address and name to sign petitions. @DFRLab asked Change.org twice how it verifies signatures on its petitions; Change.org had not replied by the time of publication. By contrast, the UK Parliament petitions page is limited to UK residents and citizens, and requires a postcode for verification.

Left, sample signature page from change.org. Right, sample signature page for UK Parliament petitions.

According to an archive of the Change.org petition, which showed the ten most recent signatures, they included submissions from Germany, Spain, and England, confirming that signing was not restricted to any geographical area.

Signatures of the petition, from an archive made on September 23, 2014.

Signatures on the Yes2014 petition included “Dr Evadne Hinge Hinge” and “Dame Hilda Bracket” (references to comedy stage duo “Hinge and Bracket”) and “Cliff Richard’s Vaginal Deodorant Yewtree”, indicating a lack of credible verification methods.

List of signatories of the Yes2014 petition, from an archive made on September 21, 2014 (the date difference is apparently due to the archive registering time in a zone west of UTC).

The signatures of the “Rally for a Revote” petition appear to have come rapidly. A Twitter account called @Hypermobile2011 tracked them during the day, and recorded 3,000 signatures in half an hour, 7,000 signatures in two hours and 50,000 signatures in eight hours.

Posts from @hypermobile2011, tracking the evolution of signatures. Note the timestamps at the bottom of each tweet. All tweets archived on December 6, 2017. (Source for posts: Twitter / @hypermobile2011)

Little social-media evidence remains of how the petitions spread. The “Rally for a Revote” Facebook page has been deleted, so the data on the spread are no longer available. A machine scan of tweets sharing the change.org petition returned only 2,287 posts, suggesting either that many tweets were deleted in the interim, or that Twitter traffic was limited.

Again, however, links to the petitions were amplified in their later stages by conspiracy-minded, pro-Kremlin social-media users and websites, and by at least one small network of apparently automated “bot” accounts.

One post, for example, was made on September 22 by a website called Kickass Cookies (kickass-cookies.co.uk). This was not an early mover; rather, it amplified the petition after it had already gained significant traction.

Post by @Kickass_Cookies on September 22, 2014, archived on December 6, 2017. (Source: Twitter / @Kickass_Cookies)

The website calls itself an “independent, viewer-supported news platform that helps alternative and non-mainstream viewpoints reach a wider audience.” It is an aggregator rather than a primary source, and lists as its “friends” far-right, pro-Kremlin, and conspiratorial websites such as Infowars, Prison Planet, 21st Century Wire, and Iran’s Press TV.

Fragment of Kickass-Cookies’ “Some of our friends” list. (Source: Kickass-cookies.co.uk)

Its own posts on Twitter regularly amplify pro-Kremlin messaging on issues such as Ukraine, MH17, the murder of Nemtsov, and the crisis in Russian relations with Turkey. These include shares of the Kremlin’s own propaganda outlets, RT and Sputnik.

Posts from Kickass_Cookies on MH17, Nemtsov, Ukraine and Turkey. The Gorbachev article on Nemtsov and the article on Turkey were shared from Sputnik. All links archived on December 6, 2017. (Source for all posts: Twitter / @Kickass_Cookies)

According to a WhoIs search, the website’s registrant has not been validated and has opted to have their address withheld.

Results of WhoIs search on Kickass-Cookies.co.uk. (Source: WhoIs.com)
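Readers who want to reproduce the lookup can query the WHOIS system directly; the sketch below simply shells out to the standard `whois` command-line client instead of using a web service such as WhoIs.com.

```python
import subprocess

def whois_lookup(domain: str) -> str:
    """Return the raw WHOIS record for `domain`, using the standard
    command-line client (an alternative to web lookups such as WhoIs.com)."""
    result = subprocess.run(
        ["whois", domain], capture_output=True, text=True, check=True
    )
    return result.stdout

# Nominet's .co.uk records typically note whether the registrant's details
# have been validated and whether their address has been omitted -- the two
# points reported above for this domain.
print(whois_lookup("kickass-cookies.co.uk"))
```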

This site demonstrably shares pro-Kremlin messaging, but it also shares far-right and conspiracy sites. Its affiliation and identity cannot be established at this stage; however, it is clearly not a user focused on Scotland. Other than a handful of tweets on September 19, 2014, most of its uses of the word “Scotland” were attempts to hijack Scotland-related hashtags while promoting other pro-Kremlin content.

Posts from @Kickass_Cookies mentioning Scotland before the referendum; note the use of #Scotland and #ScotlandDecides to promote links with no relation to Scotland. Archived on December 6, 2017. (Source: Twitter / @Kickass_Cookies)

The post by @Kickass_Cookies on the referendum petition was retweeted fifteen times. Of those, nine were posted at exactly the same second, 22:23:41 UTC on September 22, 2014, a classic indication of automated bot activity.

Retweets of the @Kickass_Cookies post, from machine scan. Note the date and time at the right-hand side.
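Clusters of second-identical retweets are straightforward to surface programmatically. A minimal sketch follows, assuming the machine scan exports retweet records with hypothetical "user" and "created_at" (second-resolution) fields; it counts how many retweets share each timestamp and flags the accounts in any sizeable cluster.

```python
from collections import Counter

def simultaneous_retweets(retweets, min_cluster=3):
    """Flag clusters of retweets that share an identical timestamp.

    `retweets` is assumed to be a list of dicts with "user" and "created_at"
    (second-resolution string) keys -- a hypothetical layout. Humans clicking
    independently rarely land on the same second in large numbers; nine
    retweets at 22:23:41 UTC is a strong hint of automation.
    """
    by_second = Counter(rt["created_at"] for rt in retweets)
    clusters = {ts: count for ts, count in by_second.items() if count >= min_cluster}
    flagged = sorted({rt["user"] for rt in retweets if rt["created_at"] in clusters})
    return clusters, flagged
```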

These accounts have bot-like behavior patterns, and repeatedly share anti-Ukrainian, anti-NATO, and anti-Obama tweets, together with conspiracy theories on MH17. All these are characteristic of Kremlin trolls, but also of far-right users, making attribution difficult.

All, for example, shared a post on MH17, tweeting it at the same minute (6:52 pm on August 7, 2014). The post was taken from the Centre for Research on Globalization, an anti-Western site notorious for false reporting, and argued that “Obama definitely caused the Malaysian airliner to be downed.”

Simultaneous posts on MH17 by members of the botnet. All posts archived on December 6, 2017. (Source for all posts: Twitter)

Another simultaneous tweet was anti-NATO, and hijacked other hashtags to amplify its reach.

Simultaneous posts on NATO by members of the network. All posts archived on December 7, 2017. Note the selection of hashtags. (Source for all tweets: Twitter)

Yet another simultaneous post focused on Chinese and Russian military advances and their threat to the United States. The headline was taken from fringe site 21stcenturywire.com; that article was based on an RT original. The founder of 21stcenturywire.com is listed as a contributor to RT, the Centre for Research on Globalization, and far-right American site Infowars, showing how far-right, conspiratorial, and Kremlin propaganda intersect.

Simultaneous shares of the China/Russia article. All posts archived on December 7, 2017. (Source for all posts: Twitter)

Other late-coming amplifiers of the petition shared the same pro-Kremlin stance. @MarquisLeDain, for example, retweeted a post about the petition; the account repeatedly shares RT propaganda on Ukraine, posts content in Russian, and also amplified Russian accusations that Turkey traded oil with the Islamic State (“Daesh”) terrorist group.

Posts from @MarquisLeDain on Ukraine and Turkey, including a Russian-language share. All posts archived on December 7, 2017. (Source for all posts: Twitter / @MarquisLeDain)

Its profile proclaims it to be “Anglo Norman Scottish” and based in London. Despite this, it appears unable to spell the name of the English city “Southampton”.

Left, @MarquisLeDain’s profile. Right, tweet on Turkey, mentioning “south-hampton”, which is not the usual spelling. Archived on December 7, 2017. (Source: Twitter / @MarquisLeDain)

Accounts such as these were in the minority, and in some cases their pro-Kremlin messaging is matched by far-right messaging, obscuring their overall affiliation. However, they did play a role in amplifying the various calls for a revote, especially the poorly verified ones on Change.org.

Conclusion

In the aftermath of Scotland’s referendum, a range of accounts which post pro-Kremlin content amplified claims of election fraud and calls for a revote. Russia’s election observer did the same. The trolls did not create the claims or the calls; instead, they boosted the existing signal, in a manner consistent with known Kremlin operations, especially in the United States.

Open source research cannot establish definitively which of these accounts were run from the “troll factory” or associated with the Kremlin’s known information operations, and which share a looser ideological alignment; as we know from the experience of the United States, some Russian troll accounts managed to masquerade as Americans for many months, making open source attribution extremely difficult. Nor can it establish how much impact these troll posts had, compared with posts from genuinely Scottish accounts making the same claims.

However, overall, the impact of claims of fraud (not least the Russian observer’s allegation) was considerable. According to a report by the UK’s Electoral Commission, a third of all respondents to a survey conducted by the Commission “thought that fraud took place at the referendum, more than in any previous post-election survey.”

“Asked why they thought so, the main response was ‘I heard or saw stories in the media’.”

Moreover, “Respondents who identified themselves as ‘Yes’ voters (42%) were considerably more likely to think fraud took place compared with No voters (21%).”

The allegations of fraud demonstrably had an impact; pro-Kremlin accounts demonstrably boosted those allegations. The anger and disappointment felt by many Yes voters were entirely sincere, and are not the subject of this analysis; however, those sentiments were fanned by pro-Kremlin trolls, in a manner characteristic of Russian influence operations.

A number of responses are needed. First, more research is required to establish, as far as is possible, the scale of pro-Kremlin troll activity on Facebook and Twitter during the referendum campaign; this article only considers the day after, but the months before were more important politically. Second, the platforms themselves should analyze their own data for signs of Russian activity, as they have done for the United States election and the Brexit campaign.

Third, research is needed into the online petition platforms, especially Change.org, and the way in which their petitions are conducted. These seem like a sitting target for malign actors who seek to “game” online polls in order to create a political effect.

Finally, as in other countries, attention should be paid to the question of digital resilience. Troll accounts are not impossible to identify, although attribution remains challenging; bots can be identified in many ways (for a sample, see here). Online attempts at manipulating political processes in democratic countries are only likely to grow in the coming years. With Britain facing the political upheaval of Brexit, and the possibility of further referenda, online vulnerability remains a glaring problem which needs to be addressed.


Follow along for more in-depth analysis from our #DigitalSherlocks.

This post was updated on December 13 to correct the No vote in paragraph 5.