Op-Ed: In the United States, the threat of election disinfo is mostly home-grown

Identification of U.S.-based troll farm illustrates broader trend

“We don’t do marketing. We do more.” (Source: rallyforge.com)

Today, the overwhelming majority of disinformation targeting the U.S. 2020 presidential election is being manufactured domestically. Although foreign interference remains a serious concern, no Russian, Iranian, or Chinese operation has rivaled the reach of the viral falsehoods routinely spread by far-right fringe media and often amplified by President Trump. The numbers are not even close.

Well beyond election day, Americans will have to face the fact that most of the disinformation campaigns they encounter involve the work of other Americans. In many cases, these domestic actors have learned from, and improved upon, the playbook of foreign adversaries, just as those adversaries have learned from them.

A troubling example was the recent exposure of Rally Forge, an Arizona-based troll farm that had worked on behalf of U.S. political influencers for five years until Facebook and Twitter removed it on October 8, 2020. The Rally Forge network used cheap labor and fake personas to build its audience while promoting falsehoods about COVID-19 and voter fraud. At its peak, the network spanned 255 Facebook assets, 76 Instagram accounts, and 262 Twitter accounts. It had accumulated 400,000 followers across Facebook and Instagram and spent nearly $1 million on advertising, making it the second-largest U.S.-targeted inauthentic network to be removed since the 2018 midterm election.

By serving as a for-profit disinformation contractor, Rally Forge echoed methods that have become prevalent around the world. In the past two years, significant for-profit troll farms have been uncovered in Canada, Egypt, Georgia, Israel, the Philippines, and Tunisia, among other countries. These clandestine marketing firms have targeted dozens of elections and social movements. Even Russia's Internet Research Agency (IRA), itself a for-profit troll farm, has experimented with using contractors in Ghana to further obscure operations targeting the Black community in the United States. These services give disinformants a shield of plausible deniability and come with few downsides. Despite skyrocketing demand, labor remains cheap, sometimes practically free.

This same labor dynamic was true of Rally Forge. The troll farm employed mostly teenagers and encouraged them to treat their work like a summer job. The intent was to flood Facebook pages and news articles with seemingly organic content, written by paid contractors working from a common script. Some accounts used AI-generated faces to bolster their ranks with fake personas that outside observers could not easily identify.

Disinformants have also shifted away from primitive (and easily detectable) botnets and automated scripts, focusing instead on the curation of a small number of believable social media personas. Iran has sought to use its convincing fake accounts to win the trust of real-life journalists and politicians, while Russia's IRA has experimented with hiring unwitting American writers to staff its deceptive content farms. Rally Forge followed a similar mold, using large numbers of fake Facebook comments to steer public conversation while keeping its coordination virtually invisible.

But there is one crucial difference. Rally Forge was an American company, staffed by Americans and working for American clients, masquerading as different Americans while targeting American voters. It was also vastly more impactful than comparable foreign disinformation efforts. Recent operations by Russia and Iran have been detected and removed before they could reach more than a few hundred Americans. Rally Forge, on the other hand, reached hundreds of thousands.

Yet it is one thing for social media platforms to remove foreign disinformation; it is quite another for them to take action against domestic troll farms that work to the benefit of a sitting president. This is all the more difficult in the case of Trump, who has made exaggerated claims of Silicon Valley’s “anti-conservative bias” a central plank of his re-election campaign. This perhaps explains why Facebook and Twitter took no action to punish Rally Forge’s principal client, Turning Point USA, a prominent pro-Trump youth organization with 2.3 million followers on Facebook alone. When the entities behind disinformation campaigns have a direct line to the White House and Congress, it is much harder to act against them.

In some ways, the Rally Forge network is a sign of how far counter-disinformation efforts have come since 2016. Diverse teams of experts and researchers have flocked to this once under-studied field. Social media platforms like Facebook and Twitter have invested significant resources in disrupting obvious manipulation efforts. U.S. civil servants have worked hard to increase public awareness of the threat, even as the president has often sought to obfuscate the issue. If Rally Forge's campaigns had looked like those run by Russia's IRA four years ago, they would not have been as effective.

But the Rally Forge case is also a reminder that time does not stand still. Since 2016, disinformants have evolved in both their strategies and their intent. Those committed to facts must now do the same. The counter-disinformation community should expect to see more efforts orchestrated by marketing firms and middlemen. We should be ready to detect and counter smaller networks of more realistic social media personas. And we should be prepared to apply the same treatment to Americans as we do to foreign trolls, barring them from social media platforms when they engage in repeated disinformation campaigns, no matter how uncomfortable that task may be.

The alternative — fixating exclusively on foreign interference while ignoring the rising threat at home — is no alternative at all.


Emerson T. Brooking is a Resident Fellow with the Digital Forensic Research Lab.

Graham Brookie is Director and Managing Editor of the DFRLab.
