What to expect from foreign threat actors following the 2024 US election

US intelligence community on high alert for the intensification of influence operations in the days and weeks following the 2024 US election


Banner: Chinese President Xi Jinping and Russian President Vladimir Putin speak during the BRICS summit in Kazan, Russia, October 24, 2024. (Source: Reuters/Maxim Shemetov/Pool)

Threats of foreign malign influence from China, Iran, and Russia have loomed throughout the leadup to the November 2024 US general election and will continue after the votes are counted, according to assessments from the US Intelligence Community (IC). Repeated warnings from IC officials and agencies stress that foreign threat actors are attempting to influence the election by sowing discord and division among Americans, undermining support for US allies, and casting doubt on democratic processes and elections. Foreign threat actors have capitalized on existing sociopolitical divisions, creating and amplifying propaganda and divisive narratives that undermine the US, its democracy, and its role in international affairs.

In October 2024, the DFRLab launched its 2024 Foreign Interference Attribution Tracker (FIAT), a dashboard documenting foreign malign influence allegations from various sources around the 2024 US elections. FIAT builds on our case selection and attribution methodology first developed for our 2020 FIAT dashboard. At the time of the tracker’s launch, US government statements and actions related to foreign influence accounted for half of the identified attributions.

IC expects foreign influence operations through inauguration day

On October 16, the Office of the Director of National Intelligence (ODNI) declassified an intelligence memo, written eight days earlier, assessing that foreign actors from China, Iran, and Russia would conduct information operations through the elections and up to inauguration day in January 2025. The memo details expectations that, as in 2020, foreign adversaries will “almost certainly post and amplify claims of elections irregularities, particularly if the electoral results are counter to their preferred outcomes.” It represents the first official US government announcement addressing foreign threat actor activity expected after the election.

The memo cited previously observed instances in which China, Russia, and Iran influenced online discourse about US democratic processes, including by provoking Americans into real-world mobilization. Of particular concern is the actors’ amplification of domestic tensions in the crucial moments during and immediately after voting, when state-by-state procedures vary, rumors are plentiful, and an expected surge of unvalidated fraud claims threatens to spark unrest.

ODNI’s Foreign Malign Influence Center (FMIC) has committed to providing regular disclosures on foreign malign influence in the US elections at the 100-, 60-, 30-, and 15-day marks ahead of the general election, with additional disclosures released as needed. Although early October memos noted the likelihood that foreign influence actors would continue their campaigns after polls close, the October 16 memo was the first to provide a detailed assessment of the post-election period, with attention to the crucial stretch through inauguration day.

Another precautionary IC disclosure is a “Just So You Know” public service announcement from the FBI and the Cybersecurity and Infrastructure Security Agency (CISA) raising awareness of previously observed tactics from actors likely to interfere in the elections. Published on October 18, the joint announcement recapped some of the known capabilities of government and non-government actors that have previously peddled malign foreign narratives.

IC announcements, including the declassified ODNI memo, highlight the different approaches foreign threat actors use to target US democratic elections and their consistency in building on influence efforts that targeted the 2020 US elections. Iranian, Chinese, and Russian threat actors have ramped up malign influence campaigns targeting the 2024 election cycle since the 2022 midterms and are expected to continue even after all the votes are counted.

Iranian and Chinese actors launched cyber operations targeting presidential campaigns in addition to their ongoing influence operations pushing divisive narratives. Chinese actors focused much of their effort on down-ballot House and Senate races, amplifying conspiracy theories about candidates and undermining elections, while Iran coupled its hacking operations with social media influence operations built around narratives about the conflict in the Middle East. Russian actors conducted a range of influence operations, thoroughly documented by the US Department of Justice, using state media, social media platforms, and US influencers to create and disseminate content aligned with the Kremlin’s agenda.

Another resource is a website launched by CISA to provide election threat updates on “how foreign actors are seeking to influence and interfere with our democratic process.” At the time of publication, the site contained three joint statements and six threat intelligence updates from relevant agencies on foreign threats targeting the 2024 election cycle. Documented allegations of foreign interference from government and non-government entities help contextualize the various threats targeting Americans.

The US State Department’s Rewards for Justice program also recently announced a reward of up to $10 million for information about the identities and locations of individuals carrying out malign influence operations on social media around US elections on behalf of the Russian media organization Rybar. The announcement follows several actions by the US Department of Justice and the State Department against Russian actors involved in influence operations targeting Americans.

Documented influence operations targeting the US elections

Our 2024 FIAT tool includes reports from civil society, private firms, and technology companies documenting and detailing influence efforts from Chinese, Iranian, and Russian actors, among others. At the time of writing, Microsoft’s Threat Analysis Center had issued five reports assessing Russian, Iranian, and Chinese influence operations in the 2024 presidential election cycle. In its fifth report, Microsoft anticipated that influence actors from those countries “may seek to sow doubt about the integrity of the election’s outcome” and expected “AI usage to continue through the end of the election cycle.”

OpenAI reported in May 2024 that its tools had been used by actors associated with China, Iran, Israel, and Russia to create and disseminate content as part of influence operations ahead of the elections. The report referenced DFRLab investigations into information operations impersonating Americans, targeting US lawmakers with allegations against the United Nations Relief and Works Agency, and targeting Canadians with Islamophobic content. Both OpenAI and Meta attributed this activity to an Israeli marketing company.

OpenAI also reported that its tools were utilized by the well-known Chinese Spamouflage campaign, which uses fake accounts to sow division over sociopolitical issues, and by Russia’s Doppelganger campaign, which uses spoofed news sites to amplify pro-Kremlin propaganda. In August, the company disrupted an Iranian influence operation that used ChatGPT to generate content as part of a cross-platform activity first reported by Microsoft and later by Meta.

The DFRLab, civil society groups, and private firms have published numerous reports documenting Chinese, Iranian, and Russian influence operations. Such reports include the Foundation for Defense of Democracies’ research on nineteen websites forming part of a global Iranian influence operation, Graphika’s report on China’s Spamouflage accounts pushing divisive narratives, the Institute for Strategic Dialogue’s reporting on a set of so-called “MAGAflage” accounts fueling partisan messaging, and CheckFirst and Reset Tech’s research on Russia’s Operation Overload and its impersonation of media sites. These reports highlight the consistency and persistence of threat actors targeting US audiences through various methods and platforms.

Viral claims, hacking, and AI-generated content

In late October, US government agencies investigated a viral fake video claiming to show mail-in ballots being destroyed in Bucks County, Pennsylvania, a crucial swing state in the election, and attributed it to Russian actors. The ODNI, FBI, and CISA warned that the video is “part of Moscow’s broader effort to raise unfounded questions about the integrity of the US election and stoke divisions among Americans.” The IC anticipates additional content from Russian actors in the lead-up to and after the elections.

Screenshot of the fake video allegedly showing someone ripping a ballot in Bucks County, PA. (Source: Facebook)

Other content, viewed by millions, depicted a man claiming to have been sexually abused by Democratic vice presidential nominee Tim Walz during Walz’s time as a high school teacher; the video was debunked as AI-generated. The name of the individual in the video matched that of a former Walz student, who confirmed to the Associated Press that the video featured an impersonation of him; US intelligence officials later attributed the incident to Russian actors.

Left: Fake video purporting to be of former Walz student Matthew Metro. Right: The real Matthew Metro, interviewed in October 2024. (Source: Hawaii News Now/archive)

The FBI also issued a statement on false videos depicting ballot fraud and reiterated warnings about fabricated content that threatens to “erode trust in the electoral system.”

FBI announcement on fake videos depicting electoral fraud. (Source: FBI/archive)

Articles from spoofed websites posing as legitimate publications remain a pillar of influence campaigns targeting the US general election. These tactics, previously employed by Russian actors, persist and have been partially adopted by other state actors. Iran’s Storm-2035 campaign, disrupted in the aforementioned actions by Microsoft, Meta, and OpenAI, promulgated politically divisive messages across purportedly credible news websites. Storm-2035 relied on SEO tools and large language models to create a slurry of plagiarized and false AI-generated content targeting both Democrats and Republicans, in contrast to Russian actors’ usual efforts, which are typically more partisan.

In September, The Washington Post reported that a senior official at the Office of the Director of National Intelligence told journalists that Russia’s activities “are more sophisticated than in prior election cycles.” The ODNI official cited Russia’s use of artificial intelligence to expedite and enhance its efforts, “targeting U.S. swing states in particular” with the intent of shaping the election’s “outcome in favor of former President Donald Trump.” The activities, described as “laundering” messages under the guise of American voices, differ from the more oblique attributions of Iranian and Chinese efforts to sow dissent.

Following Iran’s hack-and-leak operation targeting the Trump campaign, Chinese actors launched another hacking campaign that extended to staffers from both the Trump and Harris campaigns and to family members of former President Trump and President Biden. The hack targeted US telecommunication systems, exploiting Cisco routers to gain access to the broadband infrastructure of AT&T and Verizon; the attribution was corroborated by multiple sources and traced to the China-affiliated actor Microsoft has dubbed “Salt Typhoon.” The attribution coincided with an October 25 joint press release from the FBI and CISA, which described the China-linked effort as “specific malicious activity” involving “unauthorized access to commercial telecommunications infrastructure.”

Foreign actors are expected to take advantage of existing divisions to amplify disinformation about voting and the election’s certification process, conspiracy theories about candidates, and calls for violence leading up to November 5 and beyond. Election officials have been ramping up security measures amid fears of electoral disinformation, violent rhetoric and threats, and harassment of election workers, particularly in the post-election period.

A Pentagon official told The Washington Post that the department is preparing for “a range of scenarios” involving potential threats from Iran, North Korea, Russia, and China. ODNI’s memo highlighted several important dates as a “window of opportunity” for foreign actors to push disinformation, including the December 11 deadline for issuing Certificates of Ascertainment, the meeting of electors to cast their votes on December 17, and the joint session of Congress on January 6.

Keep up to date on the latest attributions of foreign interference via the DFRLab’s Foreign Interference Attribution Tracker at interference2024.org.


Cite this case study:

Dina Sadek and Max Rizzuto, “What to expect from foreign threat actors following the 2024 US election,” Digital Forensic Research Lab (DFRLab), November 4, 2024,