Inauthentic YouTube channels spam partisan content from fringe sites

The channels showed spam-like behavior while posting pro-Trump and anti-Democrat clickbait in the United States

(Source: @KaranKanishk/DFRLab)

Editor’s Note: This piece was written as part of a collaboration with CNET.


A series of YouTube channels posted hyper-partisan political videos based on material lifted from fringe and conspiracy websites. The aim of each channel, however, appears to have been financial rather than ideological: the focus of the videos may have been political, but each channel used multiple tactics to avoid spam detection and grow its audience.

YouTube has cultivated a successful community of content creators through its Partner Program (YPP), which allows creators to monetize videos based on audience size and viewership rates. YouTube is the second most popular social media platform and the second largest search engine, and it is a prominent vector for disinformation, which spreads differently there because of the platform’s financial incentives and its combined social media and search engine functionality.

In this case, the DFRLab — in partnership with CNET — found more than a dozen YouTube channels that behaved in a highly similar fashion and exhibited signs of inauthenticity but showed no verifiable signs of interconnection. The scripts for the videos on the channels appeared to be verbatim readings of posts on at least five external, extremely biased websites. Two of those websites appeared to be in direct coordination with each other, but not with any of the YouTube channels.

The behavior across the YouTube channels was similar because each replicated content from the same fringe websites, made minor tweaks to thumbnail images to avoid detection via reverse image search, and employed voiceover actors to read the posts aloud. Each channel produced spam content, but no open-source evidence proved a connection between the channels’ operators.

Based on this joint investigation, YouTube was able to apply existing content moderation policies to the network. Hyper-partisan material is generally associated with disinformation spread for ideological ends; it is, however, also highly emotive and thus ripe for clickbait.

This case has everything: a Bosnian voiceover actor, possible Vietnamese YouTube channel administrators, and outright false political narratives from the United States. Make no mistake, though: the DFRLab’s open-source analysis could not determine the actors or intent behind this network, but the behavior, again, mirrored spam content posted for financial gain.

Inauthentic YouTube “news” channels

At least 15 YouTube channels observed in this case appeared to be spreading hyper-partisan and sometimes outright false stories on the platform.

One of the most popular in the identified set was “Gotcha News Network,” which garnered upwards of 102 million views in total.

The About page for Gotcha News Network showed that its videos had received more than 102 million views since the channel was created in mid-March 2017. (Source: Gotcha News Network)

The content from the channel was salacious and sensationalist in nature.

Screengrab of the channel’s front page showed the highly charged, anti-left-wing nature of the videos on the channel. (Source: Gotcha News Network/archive)

The content was presented behind a thin veneer of news-like packaging while offering limited evidence for unsupported assertions. In general, the YouTube channels pushed heavily right-wing, often wholly fabricated disinformation. The content was certainly focused on political divisions in the United States; however, disagreement also generates high click rates and audience growth. Large audiences lead to eventual monetization for the content creator, and the language used on each channel appeared more focused on reaching a larger audience than on persuading that audience of a particular view.

Captions for one of the channels’ videos, captured at the 2:15 mark, showed commentary targeting the political left in the United States. (Source: Latest News/archive)

As an example of financial motivation, a link to a Patreon payment page was included on Gotcha News Network’s videos.

Gotcha News Network included a link to a Patreon page in its most recent — and seemingly final — video. (Source: Gotcha News Network/archive)

The writers’ guild

All of the YouTube channels under analysis appeared to be pulling many of the scripts for their videos from at least five fringe, far-right websites.

For instance, one of the videos posted by Gotcha News Network, “BOMBSHELL! STORMY DANIELS CONNECTION TO OBAMA’S OVAL OFFICE! ARE WE SURPRISED?” (pictured above), featured a voiceover reading text directly from an article on Mad World News, a website often pushing xenophobic or racist narratives and conspiracy theories. The video and corresponding article falsely alleged former U.S. President Barack Obama and adult entertainment star Stormy Daniels had engaged in a “coordinated effort” to smear U.S. President Donald Trump but provided no evidence — likely because none exists — to support the claim.

Embellished with a profile photo of Trump, another channel named American News Today also produced misleading information targeting Democrats and lifted its content from far-right websites. For example, on March 11, the channel posted a highly spun video titled, “Nancy Pelosi Loses Her Head Over Trump — Pushes False Claim That Donald Cut $700M From CDC…”, the script for which was lifted from an article on another dubious website, The Patriot Journal.

Screenshot of the video, in which a narrator said that “Pelosi has sounded less and less coherent in recent months.” (Source: American News Today/archive)

In this instance, the script presented a highly editorialized, often misogynistic take on Nancy Pelosi’s statement regarding Trump’s proposed CDC budget cuts. The video incorrectly asserted that Pelosi’s criticism had been misleading, claiming that it was “not even remotely true” that the U.S. president had tried to cut funds from the Centers for Disease Control and Prevention (CDC). The argument was twisted to laud the U.S. president for ultimately increasing the funding for the CDC, ignoring the fact that Trump’s initial budget draft had indeed included budget cuts for the agency.

The article and the video provided no evidence to support their assertions, which were a deliberate mischaracterization of Pelosi’s words.

The DFRLab searched YouTube for another headline from a Patriot Journal article — “Trump Plans New Swing State Vehicle — Supporters Are Looking To The Skies For Donald’s New Blimp” — and found that four channels had plagiarized the text of the article and used it as a video script.

Screenshot of a YouTube search using an article headline from The Patriot Journal. Each thumbnail image showed minor editing done to make the videos appear distinct from one another. (Source: YouTube/archive)
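
For readers who want to reproduce this kind of check, here is a minimal sketch of the same headline search scripted against the YouTube Data API v3. The API key is a placeholder, and the results would differ today since the channels have been removed:

```python
# A minimal sketch (not the DFRLab's actual workflow): search YouTube for
# videos that reuse an article headline via the YouTube Data API v3.
# API_KEY is a placeholder; a real key comes from the Google Cloud Console.
import requests

API_KEY = "YOUR_API_KEY"
HEADLINE = ("Trump Plans New Swing State Vehicle — Supporters Are "
            "Looking To The Skies For Donald's New Blimp")

resp = requests.get(
    "https://www.googleapis.com/youtube/v3/search",
    params={
        "part": "snippet",   # return titles and channel names
        "q": HEADLINE,
        "type": "video",
        "maxResults": 25,
        "key": API_KEY,
    },
)
resp.raise_for_status()

# Print every channel that posted a video matching the headline.
for item in resp.json().get("items", []):
    snippet = item["snippet"]
    print(snippet["channelTitle"], "|", snippet["title"])
```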

The thumbnails for the four videos were nearly identical, but there were minor edits made to each in a likely attempt to avoid YouTube’s duplication and copyright filters. All four included the same image of Trump looking into a solar eclipse without sunglasses on the left and a blue background overlaid with “Special Report” at right.

The actual image of the U.S. president was distorted as if in a fun house mirror, with one visage artificially squashed and another lengthened. Additionally, the “Special Report” text was similarly manipulated to be different between thumbnails. Finally, some of the thumbnails included overlaid text such as “BREAKING” or “TRUMP PLANS NEW,” further creating a similar but still unique thumbnail.
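
Small distortions like these defeat exact-match duplicate detection but do little against perceptual hashing, which scores visual similarity rather than byte-for-byte identity. A minimal sketch, assuming the four thumbnails have been saved locally under hypothetical file names; the distance threshold of 10 is an arbitrary value for illustration:

```python
# A minimal sketch: flag near-duplicate thumbnails despite minor edits by
# comparing perceptual hashes. The file names below are hypothetical
# stand-ins for the four downloaded thumbnail images.
from itertools import combinations

import imagehash         # pip install ImageHash
from PIL import Image    # pip install Pillow

THUMBNAILS = ["thumb_1.jpg", "thumb_2.jpg", "thumb_3.jpg", "thumb_4.jpg"]

# Perceptual hashes shift only slightly under small distortions such as
# squashing, stretching, or overlaid text, unlike exact byte comparisons.
hashes = {path: imagehash.phash(Image.open(path)) for path in THUMBNAILS}

for (path_a, hash_a), (path_b, hash_b) in combinations(hashes.items(), 2):
    distance = hash_a - hash_b   # Hamming distance between the two hashes
    if distance <= 10:           # low distance suggests the same base image
        print(f"{path_a} ~ {path_b} (distance {distance})")
```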

If the incentive for these channels was financial in nature, these small tweaks to presentation could contribute to a level of differentiation that would avoid detection of spam posting, a violation of YouTube’s Community Guidelines. Indeed, the most obvious description for these videos is likely spam: repetitive, non-politically motivated (though still ideological) content designed to generate clicks and, therefore, revenue.

Assuming the videos were spam, the channels could have been seeking to join YPP, which allows content creators to monetize their videos. YPP requires channels to have a certain audience size and viewership rate, thresholds that can often be reached quickly by posting emotive or salacious content.

The content across the channels’ videos was not exclusively sourced from The Patriot Journal. The DFRLab identified other websites the videos pulled material from, including the previously mentioned Mad World News, as well as Red Wave 2020, Conservative Brief, and Explain Life.

Two of those websites — Explain Life and Conservative Brief — appeared to be closely linked. A comparison of the two websites’ XML sitemaps, files that list all public URLs of a domain, showed that the corresponding URLs for each had last been modified within minutes of one another on many days.

A comparison of XML sitemaps for explainlife.com and conservativebrief.com showed updates on the same days at almost the exact same time. (Source: ExplainLife/archive, left; ConservativeBrief/archive, right)

Drawing an even more definitive connection, the XML sitemaps for the author pages not only suggested that the websites shared the same authors but also showed identical “Last Modified” dates for those pages.

Author XML on the two websites. (Source: Explain Life/archive, top; Conservative Brief/archive, bottom)
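
This kind of sitemap comparison is simple to script. A minimal sketch, assuming both sites expose a standard sitemap at /sitemap.xml; the exact sitemap paths and timestamp formats vary by site:

```python
# A minimal sketch: compare <lastmod> timestamps across two XML sitemaps to
# look for synchronized updates. The /sitemap.xml paths are assumptions;
# WordPress sites often split sitemaps into several indexed files.
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta

import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def lastmods(sitemap_url):
    """Return every <lastmod> timestamp listed in a sitemap."""
    root = ET.fromstring(requests.get(sitemap_url).content)
    return [
        # Normalize a trailing "Z" so fromisoformat accepts it pre-3.11.
        datetime.fromisoformat(node.text.replace("Z", "+00:00"))
        for node in root.iterfind(".//sm:lastmod", NS)
    ]

a = lastmods("https://explainlife.com/sitemap.xml")
b = lastmods("https://conservativebrief.com/sitemap.xml")

# Count timestamp pairs that fall within a few minutes of each other.
window = timedelta(minutes=5)
matches = sum(1 for ta in a for tb in b if abs(ta - tb) <= window)
print(f"{matches} lastmod pairs within {window} of each other")
```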

In most cases, the video content was lifted word-for-word from an article on these websites, though it is unknown whether the YouTube channels and the websites were directly connected. Put simply, two clickbait websites coordinated with each other, but there was no evidence that this coordination extended to any of the YouTube channels included in this investigation.

Other videos on the channels appeared to closely paraphrase articles on more mainstream, but still right-wing, websites such as The Daily Caller. For example, the script for a video on the YouTube channel “Breaking News” titled “UNBELIEVABLE: White House Officials Allege Speaker Pelosi Pushed To Include Hyde Amendment Loophole” was a close but not verbatim copy of an almost identically titled article on The Daily Caller.

The script (and title, as apparent in red boxes) for a video on the “Breaking News” YouTube channel closely mirrored, but did not entirely copy, the text of an article on The Daily Caller. (Source: Breaking News/archive, left; The Daily Caller/archive, right)
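
One way to quantify “close but not verbatim” is a sequence-similarity ratio. A minimal sketch using Python’s standard difflib, with placeholder strings standing in for the video transcript and the article text:

```python
# A minimal sketch: score how closely a video script tracks an article
# using difflib from the standard library. The two strings are
# placeholders for a transcribed voiceover and the article text.
from difflib import SequenceMatcher

article_text = "...full article text here..."
script_text = "...transcribed video voiceover here..."

ratio = SequenceMatcher(None, article_text, script_text).ratio()

# A ratio near 1.0 indicates a verbatim reading, as with the Mad World
# News videos; a high but lower ratio suggests close paraphrase.
print(f"similarity: {ratio:.2f}")
```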

The Breaking News video pictured above opened with an introduction delivered by a male voice with an American English accent. The speaker promoted the channel and encouraged viewers to comment on the video. The same speaker appeared at the start of other videos on the channel, almost always saying something similar and in no case referring to the actual content of the video. This indicated that the introductory segments were likely recorded generically, in front of a green screen and well ahead of the accompanying topical content. The introductions also served to encourage engagement with the content, which, again, would improve the channel’s chances of being monetized.

As with many of the other videos on the channel, the video then abruptly shifted to a relatively monotone voiceover, read by a different person than the one in the introduction, that very narrowly paraphrased the Daily Caller story, keeping it just shy of a verbatim reading; the narrator did, however, repeatedly acknowledge The Daily Caller as the story’s source. The voiceover was read somewhat haltingly, as the narrator bungled the script a number of times, including stumbling over “anonymity” and mispronouncing U.S. Secretary of the Treasury Steven Mnuchin’s name as “Myookin” at around 52 seconds.

Together, these elements demonstrated low production quality and, in particular, the use of talent unfamiliar with the topical content.

The producers’ guild of Vietnam

While all of these channels used similar spam-like tactics, a few showed obvious signs of being operated by a Vietnamese speaker. Beyond these linguistic signs, however, there was little indication of where the channels were being run from or who was behind them; it is impossible to rule out other geographic locales as the origin of any of the channels.

That said, three of the channels — as mentioned — did exhibit linguistic signs of Vietnamese, and CNET also found a related linguistic connection.

Two of the channels, “Breaking News” and “Breaking Story,” used Vietnamese as the header for their featured channels list.

The text in the red boxes is in Vietnamese and translates to “featured channel.” Both Breaking News and Breaking Story included a third channel, The Next News Network, in their featured channels. (Source: BreakingStory/archive, top; BreakingNews/archive, bottom)

Another channel — NEWS 24H — featured old videos in Vietnamese that predated the channel’s shift to focus on clickbait U.S. political content.

Vietnamese videos in the red box were shared a month before the channel started posting about U.S. politics. (Source: News 24H/archive)

As a final Vietnamese connection, CNET found that an account with the name “Ngquyt” had apparently been soliciting voiceover services on Fiverr, an online freelance service marketplace, to record narration, if not footage, for the channels’ videos. A Google search for “ngquyt” (or, alternatively, “ng quy t”) did not yield anything relating to the Fiverr account but did return a list of pages in Vietnamese.

While much of this evidence on its own was circumstantial at best, the aggregated effect, at minimum, indicated that at least three of the channels likely had some connection to Vietnam.

The screen (voice) actors’ guild

It appeared that these channels were explicitly hiring fluent English-speaking actors with an American accent to record their scripts. CNET found that the Breaking News YouTube channel, for example, had hired at least one fluent American English freelancer via Fiverr to record voiceovers for its videos.

One video included footage of the voiceover actor (red box at bottom left) found via Fiverr. (Source: BreakingNews/archive)

While the channels’ motivation was not entirely clear, hiring talent with an American English accent allowed their videos to seem domestic in origin, adding an air of authenticity that would encourage engagement from a targeted audience: in this case, a U.S. audience. As CNET found, even that can be deceptive. In the above case, the voiceover actor was a Bosnian male who had spent significant time in the United States and who was living in Bosnia and Herzegovina when he recorded the audio.

Conclusion

The DFRLab and CNET identified a series of now-removed YouTube channels that showed signs of spam-like behavior. The videos on each channel used slightly modified variations of the same thumbnail images, copied and pasted the same headlines cribbed from external websites, and repurposed verbatim text from the same set of fringe media outlets as scripts.

While there was only circumstantial evidence to tie the channels together, the DFRLab was able to connect some of the far-right websites to each other directly, though connections between the websites and the YouTube channels remained tentative at best. Ultimately, the motivation for these now-removed channels’ existence was not clear. They may have appeared outwardly political, but ideological content also triggers an emotional response and, in turn, engagement.

This case has plenty of twists and turns. The bottom line is that some YouTube creators used a mixture of spam-like behavior and false political content to grow their audience on a platform where it pays to have a big audience.


Follow along for more in-depth analysis from our #DigitalSherlocks.