COMMENTARY: DFRLab predictions for 2020

DFRLab researchers reflect on what’s to come in the year ahead


(Source: NASA)

For our first commentary in our newly merged efforts, we asked some of our team what they expect from the upcoming year in terms of disinformation and foreign influence operations in their regions. What are the threats, what are the challenges, and what will the DFRLab focus on in 2020?

Jean Le Roux, Digital Research Unit (DRU) Africa

2020 should see a continuation and refinement of the tactics and strategies seen at the close of last year. In October 2019, Facebook shut down three networks, linked to Russian oligarch Yevgeny Prigozhin, that were engaged in inauthentic behavior in eight African countries. These networks engaged local actors — either co-opted ideologically or incentivized financially — to create and amplify hyperpartisan content to influence elections in the region. This is a tactical, and arguably strategic, shift in Russia’s approach to the continent, and I expect this to persist into 2020.

Elections attract disinformation merchants and state actors alike, and several countries in Africa head to the polls this year, including Ghana, Ethiopia, and the Central African Republic. Regimes in the region have historically been receptive to the use of foreign public relations agencies and consultants (Cambridge Analytica-linked SCL’s ventures in Ghana spring to mind), and I expect this to continue in the 2020 elections.

On a positive note, I believe increased cooperation between social media platforms and independent external organizations and academia is on the horizon, as these platforms deal with increased pressure to take action against the proliferation of disinformation.

Eto Buziashvili, DRU Caucasus

One of the main challenges in 2020 will be the increasing number of cases of governments engaging in social media manipulation and disinformation campaigns to influence and deceive the public, as well as to silence dissenting opinions in their own countries. In 2019, we saw evidence of state actors working with private groups and companies offering computational propaganda as a service. We also witnessed several Facebook takedowns that collectively removed thousands of pages linked to various state actors, including Russia, China, and Iran. It is also becoming more challenging to distinguish between domestic and foreign disinformation, as the tactics and narratives employed increasingly overlap.

Elections taking place in a number of countries in 2020, including the United States, Georgia, and the Western Balkans, will be of particular interest to state actors. The Kremlin’s interference in the 2016 U.S. presidential election has already inspired other state actors to copy Russian tactics and tools.

Ayushman Kaul, DRU South Asia

The growing normalization of mis- and disinformation in the political culture of numerous countries in the South and East Asia regions, including India, Pakistan, Sri Lanka, Myanmar, Taiwan, and South Korea, should be a cause for great alarm. In 2019, researchers at the University of Oxford’s Computational Propaganda Project found evidence of organized digital manipulation campaigns in 70 countries, up from 48 in 2018 and 28 in 2017. This development signals that false and misleading content is being deployed by an increasingly diverse range of regional actors, including terror groups, crime syndicates, interest groups, political organizations, and government bodies.

With a steadily increasing percentage of the population online and a plethora of inter-state rivalries, socioeconomic and cultural fissures among local populations will be ripe for exploitation by malevolent actors, leading to an elevated risk of violence and political instability sparked by falsehoods circulated online. The deadly efficacy of disinformation campaigns that leverage preexisting divisions within a society is highlighted by the anti-vaccination campaign in Pakistan, in which a small group of people spreading a false rumor — that local children who had received a polio vaccine had died because of it — led to widespread panic, riots, and attacks on health workers administering the vaccine, forcing Islamabad to curtail its nationwide polio drive.

Jakub Kalenský, DRU Central Europe

The trend we have seen over the last few years is obvious: the primary information aggressors dedicate more resources and energy to disinformation activities, increase their operations, and then educate additional malicious actors. The security community does not seem to doubt that the problems posed by disinformation will increase rather than decrease.

The main question for me will be whether decision-makers in democracies will finally start reacting in a more robust fashion. By now, it should be obvious that focusing solely on passive defense will never solve the issue. Raising media literacy and educating audiences about the threat is surely desirable, but it will never stop the information aggressors. For that, a different kind of countermeasure is necessary, one that acts as a deterrent. I believe that the U.S. elections this year might be another reminder of that.

Esteban Ponce de Leon, DRU Latin America

As Venezuela, Bolivia, and Peru hold elections in 2020, we will gain more insight into how social media influences conversations about democracy and politics in Latin America. In Venezuela’s case, the main challenge will be to monitor the information environment and see whether the Maduro regime will try to influence and manipulate the conversations around political topics on different platforms such as Twitter. In Bolivia, the gulf between those who consider Evo Morales’s resignation as president the result of a coup d’état and those who see it as a return to democracy will widen. Increasing political polarization will leave the country vulnerable to mis- and disinformation during the elections.

In addition, in the aftermath of the 2019 protests in Latin America, where citizens across the region — most notably in Ecuador, Chile, Bolivia, and Colombia — went to the streets to decry economic austerity policies and political corruption, the challenge will be to monitor potential public unrest, as well as government suppression and media coverage of protests.

Geysha Gonzalez, DFRLab HQ

The greatest danger facing the United States, particularly in the run-up to the 2020 elections, will not come from foreign actors but rather from domestic actors that are adopting, finessing, and employing the tactics of malign actors, like the Kremlin, to advance their political agendas. There is no question that the 2016 election led to an awakening to the challenge of disinformation and to how these campaigns exploit political, racial, and societal divisions with the aim of destabilizing and distorting our democracy. As our understanding of the challenge advances, however, so, too, will the tactics of malign actors. As we head into 2020, one of the things to watch for will be the use of shallowfakes and information manipulation over deepfakes and false information, both of which will be harder to disprove and debate.

If you have a suggestion, commentary, or feedback for this section, or if you would like to pitch an op-ed to us, please write to jkalensky@atlanticcouncil.org.