Op-Ed: Political campaigns should reject engaging in disinformation


Campaigns should implement rigorous standards to ensure they are being transparent with voters

U.S. President Donald Trump taps the screen on a mobile phone at the approximate time a tweet was released from his Twitter account, during a roundtable discussion on the reopening of small businesses in the State Dining Room at the White House in Washington, U.S., June 18, 2020. (Source: REUTERS/Leah Millis)

On May 28, 2020, President Trump issued an executive order seeking to strip liability protections afforded to social media platforms after Twitter labeled two of his tweets about voting by mail "unsubstantiated" as part of its civic integrity policy. The move has kicked off another round of public debate about whether social media companies should have the right and the responsibility to fact-check content on their platforms.

But the debate is missing an important element: how modern-day online political campaigning is changing in democracies, and not for the better. With social media and rapidly developing technology, political campaigns can now fight for votes with vast troll armies, coordinated fake accounts, paid influencers, fake campaign or news websites, micro-targeted ads, cheap fakes and deep fakes, and a whole host of other tactics to manipulate public opinion and ultimately influence voters — tactics we have come to associate with disinformation campaigns. And in recent years we have seen candidates around the world doing just that, particularly in countries considered strong democracies.

During the presidential election campaign in Taiwan earlier this year, a so-called “Net Army” ran hundreds of Facebook pages and groups to spread false content, conspiracies, and rumors in support of then pro-CCP candidate Han Kuo-yu. In India’s national election in 2019, the ruling Bharatiya Janata Party employed “IT Cells” to run real and fake social media accounts to flood social media platforms with real, misleading, and outright false campaign content coordinated each day in shared Google documents.

In the US, the Trump campaign and its supporters have made no bones about using similar tactics as a key part of his online strategy. In 2016, a Trump campaign volunteer and Roger Stone protégé built an app through which users turned over control of their Twitter accounts so it could post, follow, unfollow, and like on their behalf — an effort they called "Operation Swarm."

More recently, in a tweet earlier this month, Trump thanked "my great Keyboard Warriors," tweeting, "You are better, far more brilliant, than anyone on Madison Avenue (Ad Agencies)." Just the month before, Trump retweeted a parody account's post of a GIF (created with an iPhone app) manipulated to make it look like former Vice President Biden's tongue was lolling out of his mouth. The tweet was shared more than 14,000 times and kicked off a debate among those of us in the disinformation community about whether it constituted a deep fake.

As political campaigning and technology evolve, it is increasingly necessary for campaigns to have a strong digital volunteer base and social media presence to reach voters. As a result, political campaigns around the world are developing new tools and tactics to campaign on social media and bring politics straight to voters’ social media feeds, but some of those tactics are starting to blur the line between what we would consider good digital organizing (i.e. harnessing online volunteers) and disinformation.

In an ideal world, political campaigns would want to steer clear of that line. For starters, we know that foreign governments like Russia use disinformation to sow political discord in the United States and influence the results of our elections. Disinformation and misinformation created and spread by political campaigns and their supporters only serve to make foreign governments' work easier by feeding them the very content to use.

Social media companies, of course, also have a role to play in identifying and removing inauthentic activity built to promote or oppose candidates, but most have already determined they will give political campaigns more leeway to promote false or misleading posts and ads on their platforms. In any case, it should not be left to for-profit companies to be the arbiters of what people running for office can and cannot say online.

That means it must be up to the campaigns themselves to ensure they do not cross the line, and I would argue there should be two main criteria for distinguishing acceptable digital campaigning from disinformation. First, campaigns should consider whether their intention in promoting particular content is to mislead or manipulate voters with false or deceptive material. Second, a political campaign crosses the line if it seeks to conceal its connection to, involvement in, or direction of coordinated efforts to promote false or misleading content on social media.

As November draws nearer, campaigns should implement rigorous standards to ensure they are being transparent with voters. These standards should include labeling their own content and being upfront when they post edited videos or social media threads containing false or misleading text or graphics meant as satire. They should also hold their digital volunteers to the same standards.

Online political campaigning is evolving, and the debate needs to be about much more than whether platforms can and should be arbiters of truth. It must include what we as citizens define as the appropriate tools and tactics a campaign can and should use to secure votes. We can either set the standards now or pay for it later with elections run by paid internet armies.

Cindy L. Otis is a former CIA officer and disinformation expert. She is the author of the forthcoming “True or False: A CIA Analyst’s Guide to Spotting Fake News” and a Non-Resident Senior Fellow at the Digital Forensic Research Lab.