#BotSpot: How Bot-Makers Decorate Bots

Tracing the mind games of those who create fake accounts



When the world’s social media giants meet to discuss the most interesting jobs available online, creating fake accounts is not on the list. Fake social media profiles are the cannon-fodder of the propaganda wars. Automated and regimented, fake profiles can be deployed by the tens of thousands within minutes and taken down almost as quickly.

But somebody has to create that cannon-fodder. As Twitter Public Policy has grown more adept at preventing automated account creation, a whole industry has grown up, employing people to create, name, register, and verify fake accounts.

Judging by the accounts they make, those people range from the starstruck to the hyper-creative to the very, very bored. Some go to great lengths to give their throwaway accounts personality, and even a family. Others, frankly, can’t be bothered.

These are some of our favorite bots, gathered here to illustrate the different ways in which bot makers try to make their accounts stand out from the mass.

Girls and football

Many botnets use profile pictures of famous people, especially famous women. Bot creators seem to assume that more men than women use Twitter, that men are more likely to pay attention to a beautiful woman, and that when they are not staring at women, men watch football. Marketing experts are unlikely to disagree.

Many botnets seem primarily made up of beautiful women. One network, active in countries in the Persian Gulf in September, used profile pictures of actresses Cameron Diaz, Scarlett Johansson, Keira Knightley, and others.

Three of the accounts which retweeted, in the same second on July 5, an anti-Qatar post. Note the alphanumeric account handles, typical of the simpler sorts of bot. Accounts archived here. (Source: Twitter / Users left, center, and right)

Actress Emma Watson appears to be a particular favorite, with her photo cropping up on bots in the Persian Gulf and the Russian-speaking world.

Arabic- and Russian-language bots using Emma Watson’s picture. Accounts archived here. (Source: Twitter / Users left, center, and right)

Another bot stole its avatar picture from German Instagram celebrity Lorena Rae, and used it to amplify far-right messaging around the Charlottesville riots in the United States.

Left: archived Twitter page of @angeelistr (now deleted). Right: Image of Lorena Rae (Source: weheartit.com).

A botnet, which attacked @DFRLab’s colleagues in July and August, used the photos of dozens of young ladies, but shared them out among hundreds of accounts, presumably in the hope that nobody would notice that the same girl had five different names.

Four of the accounts involved in the attack on @DFRLab and friends. All four accounts archived on August 24, 2017. (Source: Twitter)

A botnet @DFRLab observed in Qatar in May took a different approach. Rather than actresses, it featured soccer players, including David Beckham and Lionel Messi (though not Messi’s great rival, Cristiano Ronaldo).

Five profile pictures of accounts which retweeted the same pro-Qatar tweet within seconds of one another on May 24, 2017. (All accounts archived on December 21, 2017) (Source of all profiles: Twitter)

The same network, marked by simultaneous tweeting of the same pro-Qatar tweet, also used glamorous men from the world of film, such as George Clooney, Frank Sinatra, and Johnny Depp.

More accounts from the same pro-Qatar tweet. As of December 21, 2017, the Sinatra account had been suspended, Clooney was restricted, but Depp was still operational. (Source of posts: Twitter)

Given the Gulf context, in which genuine, home-grown accounts seldom feature glamorous women showing their hair, this appears to be a case of bot creators adapting to local styles, and shaping their accounts accordingly.

A further set of accounts used an image of a small girl, apparently taken from a collection of “cute baby images for WhatsApp”.

(Source: https://ww3.onvacations.co)

Literally dozens of accounts used this image, retweeting the same post almost simultaneously, as this screen shot of its retweets makes clear.

Amplifiers of @FtomahAltmimi’s tweet, archived on November 12, 2017. (Source: Twitter)

All these accounts were created in March 2017 and have identical biographies; many also give a location in Mecca.

Some of the accounts using the same avatar picture and retweeting the same tweet. All accounts archived on November 12, 2017. (Source for all accounts: Twitter)
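As a minimal sketch of how such a cluster can be surfaced once account metadata has been collected, the snippet below groups accounts that share an identical biography and creation month. The handles, bios, and fields are illustrative assumptions, not data from the network itself.

```python
from collections import defaultdict

# Illustrative, pre-collected account metadata (handles and bios are
# hypothetical): handle, bio text, creation month, self-reported location.
accounts = [
    {"handle": "amplifier_01", "bio": "Love life", "created": "2017-03", "location": "Mecca"},
    {"handle": "amplifier_02", "bio": "Love life", "created": "2017-03", "location": "Mecca"},
    {"handle": "organic_user", "bio": "Journalist", "created": "2015-07", "location": "Doha"},
]

# Group accounts that share an identical bio and creation month, the
# combination that marked this particular network.
clusters = defaultdict(list)
for acc in accounts:
    clusters[(acc["bio"], acc["created"])].append(acc["handle"])

for (bio, created), handles in clusters.items():
    if len(handles) > 1:
        print(f"{len(handles)} accounts created in {created} share the bio {bio!r}: {handles}")
```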

Pink and blue

Some bot makers appear to have been more creative, building the identities of their accounts around a particular theme. One intriguing set, which also amplified traffic in the Gulf, was based around names starting with “Pinky”, and featured Korean pop profile images.

Profiles from the “Pinky” family. All accounts archived on November 12, 2017. (Source: Twitter)

Sticking with the color theme, some of the accounts amplifying the same traffic called themselves “blue”, with handles based on a numerical sequence but sharing the same screen name.

Profiles from the “Blue” family. All profiles archived on November 12 and December 12, 2017. (Source for all profiles: Twitter)

Some bot creators lack even this degree of creativity. One small botnet we identified in May featured accounts whose screen names and handles appeared to have been generated by drumming the creator’s fingers over the keyboard. One such account’s handle began asdasda, followed by an eight-digit number which was probably randomly generated; its screen name was asd asdasd. A, S, and D are the first three keys on the middle row of a QWERTY keyboard; the image is of a bot creator so bored that they simply rattled the keys.

Profile page for @asdasda95998943. Archived on May 23, 2017. (Source: Twitter)

Another bot in the same network had the Cyrillic screen name прврпр варвапва, which does not make a recognizable (or even pronounceable) name. Its handle was @S1UjVQTtIq2OOZc, a randomly-generated string.

Profile page for “прврпр варвапва”, archived on May 23, 2017. (Source: Twitter)

The letters В, А, П, and Р are adjacent on the middle row of the standard Cyrillic keyboard (the Russian В being in the position of D on a QWERTY keyboard). The creator seems to have randomly generated the handle and done the bare minimum to give it a screen name.

Layout of the standard Russian Cyrillic keyboard. Note the placing of В, А, П and Р. (Source: altec.colorado.edu)

These accounts, especially the ones which lack even a profile picture, suggest that the person who created them was singularly bored, and decided to stick with a minimalist recipe that worked, as long as nobody looked.
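One rough way to flag this pattern automatically is to check whether a screen name’s letters come almost entirely from the home row of a QWERTY or standard Russian keyboard. The sketch below is a heuristic only; the 80 percent threshold is an arbitrary assumption and would need tuning against real data.

```python
# Middle-row (home-row) keys of the QWERTY and standard Russian layouts.
QWERTY_HOME_ROW = set("asdfghjkl")
RUSSIAN_HOME_ROW = set("фывапролджэ")

def looks_keyboard_mashed(name: str, threshold: float = 0.8) -> bool:
    """Flag names whose letters are drawn almost entirely from one home row,
    e.g. "asd asdasd" or "прврпр варвапва"."""
    letters = [ch for ch in name.lower() if ch.isalpha()]
    if len(letters) < 4:
        return False
    return any(
        sum(ch in row for ch in letters) / len(letters) >= threshold
        for row in (QWERTY_HOME_ROW, RUSSIAN_HOME_ROW)
    )

print(looks_keyboard_mashed("asd asdasd"))       # True
print(looks_keyboard_mashed("прврпр варвапва"))  # True
print(looks_keyboard_mashed("Emma Watson"))      # False
```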

Humans copying bots, copying humans

Some bot creators appear to have gone for mass-production, to such an extent that they look like humans imitating automated creation software, which is creating imitation humans. Bot studies are full of this sort of paradox.

For example, one Japanese-language net features a very large number of accounts, all with sequential usernames, but with varying clusters of accounts using different avatar images.

The numerical sequence of still-active accounts starts with @vvjdthcs501, which had not yet tweeted as of December 12. @vvjdthcs506 was in the same position, while @vvjdthcs502, and similar accounts whose handles ended in 100, 200, 300 and 400, had all been suspended.

Suggesting the scale of this botnet, accounts whose handles ended in 999, 1234 and 1999 had also been suspended.

Some, whose handles ended in numbers between 500 and 800, shared this avatar and bio, with or without a background image.

The “K” family. Note the numerical progression in the handles. All accounts archived on December 12, 2017. (Source: Twitter)

Others, whose handles were above 1100, shared a different avatar.

A separate series of bot accounts using a different avatar, but the same sequential handle. Numbers 113 and 114 were suspended, while 112, 116 and 117 were restricted, suggesting that Twitter had spotted some of the bots, but not the naming convention. All accounts archived on December 12, 2017. (Source: Twitter)

A third set, whose serial numbers ran between 1880 and 1902, shared yet another combination of avatar and background.

The third set of accounts in the sequence. Note the identical bios, avatars, and background, as well as the numerical order of handles. All accounts archived on December 12, 2017. (Source: Twitter)

Many of the accounts in this series appear not to have been activated yet: they have a primitive screen name (the Japanese hiragana character for “a”), no avatar or background, and no posts. This suggests a dormant capability, perhaps a backup as the active accounts are taken offline. Their serial numbers run all the way up to 2099.

Profile page for @vvjdthcs2099, as of December 12, 2017. Archived on the same day. (Source: Twitter)

It largely follows verified, blue-check accounts, a technique often used by bot herders to give their accounts the appearance of humanity.

Accounts followed by @vvjdthcs2099, as of December 12, 2017. Archived on the same day. (Source: Twitter)
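The pattern itself is simple to quantify once an account’s following list has been pulled: an overwhelmingly verified list of friends, with few or no organic-looking connections, fits the camouflage described above. The data below is a hypothetical friend list, not @vvjdthcs2099’s actual one.

```python
# Hypothetical (handle, is_verified) pairs for the accounts a suspect follows.
friends = [
    ("news_outlet_a", True),
    ("news_outlet_b", True),
    ("celebrity_c", True),
    ("random_person", False),
]

verified_share = sum(1 for _, verified in friends if verified) / len(friends)
print(f"{verified_share:.0%} of followed accounts are verified")
# A share close to 100%, on an account with no posts of its own, is one
# more signal that the following list is camouflage rather than interest.
```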

This is not the only botnet to behave in this way. An even more obvious one is built around handles starting with @hhkk, followed by a four-digit number. These accounts seem to be divided into clusters sharing the same avatar, bio, and background, as the following examples indicate.

Profiles of serial numbers 0205 to 0213. All accounts archived on December 7, 2017. (Source for all accounts: Twitter)

Yet a third series took a similar approach, but used whisky-related imagery.

Bots from the @ccfftt series, numbers 133, 146 and 148. All archived on December 18, 2017. (Source: Twitter)

All these appear to be semi-automated, production-line creations. In each series, some accounts had been suspended, some restricted, and some left untouched, indicating that they had been created carefully enough to pass Twitter’s initial barriers.
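A minimal sketch of how the naming convention itself, rather than any individual account, can be surfaced: split each handle into an alphabetic stem and a trailing serial number, then group by stem. The handle list is illustrative, echoing the series described above.

```python
import re
from collections import defaultdict

# Illustrative handles echoing the sequential series described above.
handles = ["vvjdthcs501", "vvjdthcs506", "vvjdthcs2099",
           "hhkk0205", "hhkk0213", "ccfftt133", "organic_user"]

series = defaultdict(list)
for handle in handles:
    match = re.fullmatch(r"([a-z]+)(\d{3,})", handle, re.IGNORECASE)
    if match:  # alphabetic stem followed by a serial number
        stem, number = match.groups()
        series[stem.lower()].append(int(number))

for stem, numbers in series.items():
    if len(numbers) > 1:
        print(f"@{stem}* series: {len(numbers)} accounts, "
              f"serials {min(numbers)} to {max(numbers)}")
```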

They are obvious to the eye, however, as when the @hhkk network all liked the same tweet at the same time.

Tweet from the Daily Beast, liked by many members of the @hhkk group simultaneously; note the identical avatars at the bottom. The accounts had been suspended by December 21, 2017. (Source: Twitter)
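The giveaway is easy to check for programmatically once the retweet or like timestamps of a single tweet have been collected: group the engagements by the exact second and look for large clusters. The handles and timestamps below are illustrative.

```python
from collections import defaultdict

# Illustrative (handle, timestamp) pairs for retweets of a single tweet.
retweets = [
    ("hhkk0205", "2017-12-07 09:15:02"),
    ("hhkk0206", "2017-12-07 09:15:02"),
    ("hhkk0207", "2017-12-07 09:15:02"),
    ("organic_user", "2017-12-07 14:41:50"),
]

# Group retweeters by the exact second of their retweet; large groups in the
# same second are the telltale sign described above.
by_second = defaultdict(list)
for handle, timestamp in retweets:
    by_second[timestamp].append(handle)

for second, handles in sorted(by_second.items()):
    if len(handles) >= 3:
        print(f"{len(handles)} accounts retweeted in the same second ({second}): {handles}")
```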

Works of art

Occasionally, botnets resemble works of art, and their creators appear to have devoted significant time to making them. One botnet @DFRLab uncovered amplifying posts on South African politics appeared especially high-quality. Many of the accounts in the network were copycats of apparently genuine ones, but swapped the lookalike characters “i” and “l”, or “o” and “0”, in the usernames.

Many of them were created in April 2014.

Left to right: comparison of the profiles of @megkind with @megklnd; @TeamSpoiler with @TeamSpoller; @giraffepink with @glraffepink; @toshalebeth with @toshaiebeth; and @emmaddelrey with @emmaddeirey. In each case, the upper account is the original; the lower account is a copycat created in April 2014, and retweeted one or more of @Adamitv’s tweets. Archived on December 19, 2017 (archive links embedded in each account name). (Source: Twitter)

Some of the apparent fakes had creation dates earlier than the accounts they imitated, but still exchanged an “i” and an “l”, or an “o” and a “0”, in the handle, which suggests that these bot accounts were originally created under different names and later renamed to mimic their targets.

Comparison of profiles for “Daria Morgendorffer”: @DariaWisdom (left) and @DarlaWisdom (right). @DarlaWisdom retweeted @Adamitv’s posts, and appears to be a bot, but has an earlier creation date than @DariaWisdom, which nevertheless appears to be a genuine account, posting its own content, albeit infrequently. Archived on December 19, 2017. (Source: Twitter)

This botnet was remarkable for its craftsmanship, and the way in which its constituent accounts only posted at low rates — apparently in a bid to evade automated detection.
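The character swaps themselves are mechanical enough to reverse. As a rough sketch, assuming a reference list of genuine handles is available, normalizing “l” back to “i” and “0” back to “o” makes the copycats collide with their originals; the handles below are taken from the comparisons above, plus one unrelated control.

```python
# Map the lookalike characters used by this network back to the originals:
# "l" stands in for "i", and "0" (zero) for "o".
LOOKALIKES = str.maketrans({"l": "i", "0": "o"})

def normalise(handle: str) -> str:
    return handle.lower().translate(LOOKALIKES)

# Suspected amplifiers and a reference list of apparently genuine accounts
# (drawn from the examples above; in practice both lists would be collected).
suspects = ["megklnd", "TeamSpoller", "glraffepink", "DarlaWisdom", "unrelated_user"]
genuine = ["megkind", "TeamSpoiler", "giraffepink", "DariaWisdom"]

genuine_by_normal_form = {normalise(g): g for g in genuine}

for suspect in suspects:
    original = genuine_by_normal_form.get(normalise(suspect))
    if original and original.lower() != suspect.lower():
        print(f"@{suspect} looks like a copycat of @{original}")
```

Because both lists are normalized before comparison, the check catches swaps in either direction, including the fakes noted above whose creation dates predate the accounts they now imitate.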

Our favorite botnet took a similar approach, but with a fatal flaw. Its creation appears to have been automated to copy and paste the account biographies of real users, giving it the appearance of credibility, but whoever set up the accounts did not cross-reference the bios with the names and images.

For example, @hehDelanababnor’s biography reads, “Micah Challenge is a global campaign of Christians speaking out against poverty & injustice in support of the MDGs.” It seems to be copied from an account called Micah Challenge Aust, handle @micaheaxa:

Profiles for @hehDelanaBabnor and @micahaexa; compare the bios. Archived on December 21, 2017. (Source: Twitter)

Curiously, a number of other accounts which appear botlike have the same bio, suggesting that this may be a popular approach among bot makers.

The profiles of @JaylenRey, @ciomdaj20, and @CarmelitaWxyx, all of which appear botlike, archived on December 21, 2017. (Source for profiles: Twitter)
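The fatal flaw, copied bios under mismatched names, can be checked the same way: index a reference set of bios by their exact text and flag any suspect account whose bio is a verbatim copy under a different handle. The data structures below are illustrative, echoing the Micah Challenge example.

```python
# Illustrative bios scraped from suspected accounts, and a reference set of
# accounts the bio text actually belongs to (handles echo the examples above).
MICAH_BIO = ("Micah Challenge is a global campaign of Christians speaking out "
             "against poverty & injustice in support of the MDGs.")

suspect_bios = {
    "hehDelanababnor": MICAH_BIO,
    "JaylenRey": MICAH_BIO,
    "unrelated_acct": "Coffee, code, and cats.",
}

reference_bios = {
    "micaheaxa": MICAH_BIO,
}

# Index reference bios by exact text, then flag verbatim copies that appear
# under a different handle.
owners_by_bio = {bio: handle for handle, bio in reference_bios.items()}

for handle, bio in suspect_bios.items():
    owner = owners_by_bio.get(bio)
    if owner and owner != handle:
        print(f"@{handle} reuses the bio of @{owner}")
```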

Other bios from accounts in the network, which were suspended before @DFRLab could archive them, included a comments editor at the Guardian newspaper; a sports journalist for the New York Times; the largest media firm in Africa; a student at Alcorn State University; a luxurious island sanctuary; and a world expert on silver.

One which we did manage to archive is the charming account Erik Young, a shy and bearded young man who describes himself as “just a woman who loves Jesus”.

Erik Young, the woman who loves Jesus. The account has been suspended, but was archived on August 28, 2017. (Source: Twitter)

Bot creators have many skills. It may be that a sense of humor is one of them.


Follow along for more in-depth analysis from our #DigitalSherlocks.