Cross-platform, multilingual Russian operations promote pro-Kremlin content
The Russian influence operations Doppelganger and Operation Undercut utilized several tactics to spread content on X, TikTok, 9gag, and Americas Best Pics and Videos

BANNER: Screenshots of AI-generated pictures purporting to depict Ukrainian (left) and US (right) politicians, used in video content posted on ABPV. (Source: irishCollegeV5/archive, giddyStorageV9/archive)
The Russian disinformation operations known as Doppelganger and Operation Undercut promoted content attacking Ukraine, Europe, and the United States in nine languages across four platforms. On X, thousands of accounts were created to post pro-Kremlin content and to promote redirect links to fake media websites. The network relied on trending hashtags and bot-like accounts to push the content to wider audiences. On TikTok, at least twenty-four accounts posted hundreds of videos that garnered millions of views, often relying on AI-generated narration and content masking to evade detection. Identical video content also appeared on the online platforms 9gag and Americas Best Pics and Videos.
Operation Doppelganger is a Russian malign information operation known for impersonating reputable media outlets, targeting users with fake articles that promote Russia’s narratives. The DFRLab, along with other organizations, tech companies, and governments, has covered the operation’s multiple and ongoing iterations targeting various countries on different platforms since August 2022. Operation Undercut runs in parallel to Doppelganger, promoting similar narratives using AI-edited videos and images, along with screenshots from legitimate media outlets taken out of context, to undermine Ukraine. The operation has been attributed to at least three sanctioned Russian companies, including the Social Design Agency, Structura, and ANO “Dialog,” allegedly with support from cybercriminal syndicates such as the AEZA group.
We collected data from X between December 12, 2024, and February 12, 2025, and observed Doppelganger activity primarily in French, German, Polish, English, and Hebrew. We also found some content in Turkish, Ukrainian, and Russian. We observed three main types of Doppelganger posts: posts with four captioned images, posts with one video or infographic, and posts with links that redirect to Doppelganger websites. As of February 21, 2025, X had suspended 95 percent of the accounts associated with the four-image posts and 73 percent of the accounts associated with the single video or image posts in our sample.

Posts with four images
The first type of Doppelganger post attaches four stylistically uniform images captioned with supporting narratives. The campaign uses bot accounts for amplification; we observed it posting these images more than one hundred times within a few hours. The campaign also appends popular hashtags of the day for further reach, but we have not found any post that received notable authentic engagement. During our sampling period, we documented 9,184 X accounts that published 10,066 of these posts. Many of these accounts were banned soon after they began posting, but the campaign consistently replaced them with new accounts.

For example, on December 11, 2024, accounts posted messages in English accusing the US and the Biden administration of abandoning Israel during the war. Accompanying graphics featured Hebrew text elaborating on these claims, expressing concern about the situation in Israel and blaming the US for the country’s difficulties.

Similarly, on February 3, 2025, a coordinated attack targeted France and President Emmanuel Macron with identical messages, along with four custom graphics for each post, criticizing the potential deployment of French troops to Ukraine, following discussions between the two countries that began in January.
We plotted the posting times of these posts by day of the week and time of day. The resulting visualization suggests that Doppelganger activity for the four-image posts follows a cyclical routine: the campaign posts the same set of images across different bot accounts at a consistent pace for about twenty hours.
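The day-of-week and hour-of-day bucketing behind such a plot can be sketched in a few lines. This is a minimal illustration, not the DFRLab’s actual pipeline; the function name and the assumption of ISO-8601 timestamps are ours.

```python
from collections import Counter
from datetime import datetime

def posting_heatmap(timestamps):
    """Count posts per (weekday, hour) bucket from ISO-8601 timestamp strings.

    Plotting the resulting grid reveals cyclical routines, such as the
    roughly twenty-hour posting windows described above.
    """
    buckets = Counter()
    for ts in timestamps:
        dt = datetime.fromisoformat(ts)
        buckets[(dt.strftime("%A"), dt.hour)] += 1
    return buckets

# Example with made-up timestamps:
grid = posting_heatmap(["2024-12-11T09:15:00",
                        "2024-12-11T09:40:00",
                        "2024-12-12T21:05:00"])
```

Feeding in the full sample of post timestamps and rendering the counter as a weekday-by-hour grid reproduces the kind of visualization described above.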
Posts with redirect links
The second type of Doppelganger post also employs inauthentic accounts, but instead of images it attaches redirect links to fake media sites designed to look like the news websites they imitate. We found impersonations of Le Parisien, Le Point, Welt, Walla, and Obozrevatel, as well as two WordPress blogs, Les Frontieres and Kaputte Ampel. Bot-like accounts in the network then amplified these posts by posting them as replies to unrelated posts by authentic users.

For example, on January 23, 2025, shortly after the ceasefire agreement between Hamas and the Israeli government took effect, a post by an account in the network pretending to be speaking from an Israeli perspective questioned the ceasefire and shared a link to a fabricated article on a website mimicking Walla News, garnering over a thousand reposts and around 8,000 views. Bot-like accounts, most of which were later deplatformed by X, rapidly amplified the post, using it to reply to posts on popular topics such as Bitcoin, American football, and K-pop. Within fifteen hours, it had been replied to more than one thousand times by over three hundred accounts.

Before the final redirection to the Doppelganger news site, the redirect chain arrives at a site called pvolp.pro, employing a key in its URL, such as https://pvolp.pro/click?key=826d9d175943304455e5. It appears that once the redirect chain arrives at pvolp.pro with the specific key, the site redirects the user to the corresponding Doppelganger website assigned to the initial redirect URL.
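As a sketch of how such a key can be parsed out during analysis, the standard library suffices. This assumes only the /click path and key parameter visible in the sample URL; the function name is ours, and the mapping from key to destination site is inferred, not confirmed.

```python
from urllib.parse import urlparse, parse_qs

def extract_redirect_key(url):
    """Pull the `key` query parameter from a third-stage /click URL,
    or return None if the URL does not match the observed pattern."""
    parsed = urlparse(url)
    if parsed.path != "/click":
        return None
    return parse_qs(parsed.query).get("key", [None])[0]

# Key taken from the sample URL observed in the campaign:
key = extract_redirect_key("https://pvolp.pro/click?key=826d9d175943304455e5")
```

Collecting these keys across many seeded links is one way an analyst could cluster redirect URLs by their assigned destination.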
Domains such as pvolp.pro and so-called “typo-squatted” domains, often referred to as third-stage domains, continue to be served through Cloudflare, making it more difficult to pinpoint their actual hosting providers. As an update to reports published in the summer of 2024, the Doppelganger operators are no longer using Hostinger for second-stage domains, but for first-stage domains. These domains then redirect to second-stage domains, whose websites embed JavaScript code before redirecting the user to pvolp.pro.
Our findings also confirm those published by HarfangLab in July 2024, indicating that the second-stage domains appear to be hosted by VDSINA, a Russian company registered in the United Arab Emirates. As indicated in Qurium’s April 2024 publication, Doppelganger second-stage domains still appear to use Keitaro, a piece of software that enables operators to redirect traffic for targeted advertising.
While investigating VDSINA’s hosting in Europe, we discovered that the domain vdsina.com appears to be hosted by StormWall s.r.o, a Slovakia-based business that is widely popular in Russia.
Posts with a single image or video
The third type of Doppelganger post features a single attached video or image. The videos are often collages of war footage, European leaders, or stock clips processed with a vintage video filter, with subtitles and an AI-sounding narrator. The campaign also used bot accounts to post these media assets, pausing on weekends. We identified 942 X accounts that posted 2,189 times. Although many of these posts received no engagement, we found instances of posts with highly similar engagement metrics, suggesting that the campaign may have used the same system to artificially amplify some of them.

Out of thirty-seven X accounts active between December 2024 and January 2025, twenty-five showed posting patterns that split the subset into two distinct coordinated sets. While the text content of the posts and the hashtags differ, the two sets show that the accounts have engaged in posting identical German-language media content on X since October and November 2024.
One set of accounts has seemingly been dormant since December 27, 2024, while the other is still posting. This second set of nine accounts also appears older and shifted its targeting between late October and November 2024. Prior to this period, the nine accounts posted in Turkish, Polish, Russian, and Ukrainian before switching to German. As an added sign of coordination, each account’s first German-language post appeared on November 8, 2024, featuring either a video of Kamala Harris or a photo of US President Donald Trump. This suggests that multiple accounts in the network shifted purpose in coordination to target German-speaking audiences.

Earlier posts’ thumbnails also seem to follow the posting order of a series of videos, which Recorded Future found to be related to Operation Undercut. The videos posted prior to the shift mentioned above feature numbers and often come in multi-part series. Still images in Russian spread unverified claims about the number of deserters from Ukraine’s military and translated headlines of articles skeptical of Ukraine’s ability to win the war.
Most videos feature AI-generated narration with subtitles. The DFRLab also found traces of the narration seemingly translated and reposted in different languages. The videos also make heavy use of generative artificial intelligence, often depicting political leaders in unflattering circumstances. According to OpenAI, inauthentic accounts connected to Russian influence operations have used its tools to automate the generation of disinformation content across multiple languages.


More recently, the accounts in our data sample have begun experimenting with techniques to avoid moderation by adding captions with narratives over unrelated viral videos.

America’s Best Pics and 9gag
In addition, the operation’s video content has spread to multiple platforms, including 9gag, AmericasBestPics.com (ABPV), and most recently TikTok. The content often features videos identical to those posted on X by Doppelganger bots, alongside other videos. Unlike the Doppelganger accounts found on X, however, each account has distinct posting behavior; while some videos appear across multiple accounts, their timestamps and posting dates often differ.
A prior report by Recorded Future already alluded to the spread of Doppelganger content to 9gag and America’s Best Pics and Vids (ABPV), dubbing it “Operation Undercut.” On the latter platform, the DFRLab found posts exhibiting traits similar to the Doppelganger content described above. It is plausible that several hundred accounts are active on the platform, as hashtags such as “funnymeme,” “oldmeme,” and “goodolddays” often feature hundreds of videos identical to video content posted on X. The DFRLab observed that the 9gag posts also use similar hashtags.
The DFRLab collected 4,029 posts published between February 2024 and February 2025 using the three hashtags that exhibit signs of Doppelganger content. While the operation has reportedly been active since May 2024, the number of videos posted over time increased drastically across the three hashtags, reaching an average of five to six videos per day in October 2024. The posting activity also stopped between December 30, 2024, and January 7, 2025, overlapping with Russia’s end-of-year celebrations.
Moreover, accounts on ABPV also place Doppelganger content alongside regular memes and influencer videos, making inauthentic content harder to track in their feeds. The videos feature content in French, English, Turkish, Polish, and Russian. The accounts seem to follow a handle generation pattern that features an adjective and a noun, sometimes separated by an underscore, plus an optional letter “V” followed by digits. We found that the accounts that posted most often in our dataset between October 2024 and February 2025 were enormousCodeV8 for the hashtag funnymeme (18 videos), superior_officer for the hashtag oldmeme (20 videos), and liable_discussion for the hashtag goodolddays (16 videos). All these accounts posted Doppelganger video content in their feeds.
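As a rough illustration, handles fitting this pattern can be matched with a regular expression inferred from the examples above. The operators’ actual generator is unknown, so this regex is an analyst-side heuristic, not a description of their tooling.

```python
import re

# Heuristic inferred from observed handles: a lowercase adjective,
# a noun either in camelCase or after an underscore, and an optional
# "V" followed by digits.
ABPV_HANDLE = re.compile(r"^[a-z]+(?:_[a-z]+|[A-Z][a-z]+)(?:V\d+)?$")

observed = ["enormousCodeV8", "superior_officer", "liable_discussion",
            "irishCollegeV5", "giddyStorageV9"]
flagged = [h for h in observed if ABPV_HANDLE.match(h)]
```

A heuristic like this would over-match ordinary usernames on its own, so it is only useful for narrowing candidates before manual review of posted content.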
In one instance, a video capture dubbed in Polish briefly showed the Russian-language interface of an X user, commenting over a post by US Lieutenant General Keith Kellogg. In subsequent posts on other platforms, the X interface was blurred.

TikTok
On TikTok, the DFRLab identified twenty-four accounts, some active since May 5, 2024. The accounts primarily posted content targeting speakers of French (8), German (8), Polish (5), and Russian (3). At the time of writing, the network had amassed 8,194,199 views, though only two accounts remained after deplatforming on February 24, 2025.
At the time of writing, the network had posted 297 videos in German, 288 in French, 265 in Polish, and 163 in Russian. In total, it posted 1,057 videos, amassed 221,704 likes, and garnered a meager 14,402 followers. According to the video metrics, however, the Russian subset gathered over 70,000 more likes than the other subsets despite having the fewest accounts. Despite posting the most videos, the German accounts garnered 1.803 million views, a little under the 1.808 million views of the Russian accounts. The Polish and French accounts gathered the most views, comments, and shares of all the subsets.
The accounts’ posting activity took off in November 2024 and reached a peak of twenty-nine videos posted on January 29, 2025. Consistent pauses in posting also appear, with no activity at all on Saturdays and Sundays, except for December 28, 2024, when six videos were posted. The network also ceased posting entirely between December 29, 2024, and January 8, 2025, coinciding with the Russian New Year and Orthodox Christmas celebrations.
The earliest records of posting activity appear on May 13, 2024; the network then remained largely dormant, with a total of fifteen videos posted between May and September 2024. Starting on October 15, 2024, the network began posting anew, totaling twenty-seven videos that month. The network then increased its posting pace considerably from November 2024, posting 229 videos that month and averaging 7.6 to 8.6 videos daily through February 2025. At the time of writing, the network showed a significant uptick in posting activity in February 2025.
In addition, the accounts exhibit multiple indicators of inauthenticity, including AI-generated or stolen profile pictures; some also used flags as their profile picture. The accounts also seem to generate their handles using a pattern that produces a plausible first name followed by a last name and a set of digits.

On multiple occasions, the accounts also used the flag of their target audience’s country. Their descriptions invariably read, “About my channel: Bright videos, important news and topical issues,” followed by two emojis. Similar phrasing was found translated into French, Polish, Russian, and German in the descriptions of other accounts in the network, reinforcing the suspicion of coordination.

Lastly, comments also often appear inauthentic, as identical comments in Russian from various accounts appear across multiple videos with no relation to the video content. Users would ask, “Did you write this post yourself?” or “Did you work on this post for a long time?” or say, “It’s a pleasure to follow you,” followed by two emojis. These comments appear multiple times and are most often found on videos amassing more than 100,000 views, suggesting that the network may have artificially inflated its metrics using inauthentic comments and views, a tactic observed in the Operation Undercut campaign on 9gag.
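One way to surface such copy-pasted engagement is to group comments by exact text and flag strings that recur across unrelated videos. This is a minimal sketch with hypothetical sample data echoing the comments quoted above, not the DFRLab’s actual methodology.

```python
from collections import defaultdict

def repeated_comments(comments):
    """Map each comment text to the set of video IDs it appears under.

    Texts recurring across multiple unrelated videos are candidate
    signs of inauthentic, copy-pasted engagement.
    """
    videos_by_text = defaultdict(set)
    for video_id, commenter, text in comments:
        videos_by_text[text].add(video_id)
    return {t: vids for t, vids in videos_by_text.items() if len(vids) > 1}

# Hypothetical sample; video and user IDs are invented for illustration:
sample = [
    ("vid1", "user_a", "Did you write this post yourself?"),
    ("vid2", "user_b", "Did you write this post yourself?"),
    ("vid2", "user_c", "Great video"),
]
flagged = repeated_comments(sample)
```

Exact-match grouping is deliberately conservative; near-duplicate detection (e.g., normalizing emojis and whitespace) would catch more, at the cost of false positives.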

Editor’s note: Ali Chenrose is a pen name used by a DFRLab contributor for personal safety reasons.
Cite this case study:
Valentin Châtelet and Ali Chenrose, “Cross-platform, multilingual Russian operations promote pro-Kremlin content,” Digital Forensic Research Lab (DFRLab), February 26, 2025, https://dfrlab.org/2025/02/26/cross-platform-multilingual-russian-operations-promote-pro-kremlin-content.