Porn bots are more or less entrenched in social media, despite the platforms’ best efforts to eradicate them. We’re used to seeing them flood the comment sections of memes and celebrity posts, and if you have a public account, you’ve probably noticed them watching and liking your stories. But their behavior keeps shifting just enough to stay ahead of the automated filters, and now things are getting weird.

While porn bots once mostly tried to lure people in with suggestive or downright crude lines (like the increasingly popular “DON’T WATCH MY STORY if you don’t want to MASTURBATE!”), these days the approach is a bit more abstract. It’s become common to see bot accounts leave a single innocuous, completely off-topic word, sometimes accompanied by an emoji or two. On one post I stumbled across recently, five separate spam accounts, all using the same profile picture (a close-up of a guy in a red thong spreading his ass), commented “Virgin 🌿,” “Music 🎶,” “Sapphire 💙,” “Serenity 😌” and “Faith 🙏.”

Another bot, whose profile picture is a headless shot of someone’s underwear-clad torso, commented “Michigan 🌟” on the same post. Once you’ve noticed them, it’s hard not to start keeping a mental diary of the most ridiculous examples. “🦄farming,” one bot wrote. On another post: “terror 🌟” and “😍🙈insect.” Weird one-word comments are everywhere; it’s as if the porn bots have completely lost the plot.

What we’re seeing, in reality, is the latest maneuver scammers are using to help their bots slip past Meta’s detection technology. That, and maybe they’re getting a little lazy.

Side-by-side screenshots of an Instagram comment section showing multiple comments from porn bots

Screenshots from Engadget

“They just want to join the conversation, so having to craft a coherent sentence probably doesn’t make sense to them,” Satnam Narang, a research engineer at cybersecurity company Tenable, told Engadget. Once fraudsters get their bots into the mix, they can have other bots pile likes on those comments to boost them further, explains Narang, who has been investigating social media fraud since the days of MySpace.

Using random words helps scammers stay under the radar of moderation systems that may be scanning for certain keywords. In the past, they’ve tried tricks like putting spaces or special characters between the letters of words the system might flag. “You can’t necessarily ban an account or remove an account if they just comment the word ‘insect’ or ‘terror,’ because it’s very benign,” Narang said. “But if they say, ‘Check my story’ or something like that … that could flag their systems.” It’s an evasion technique, and the fact that these comments keep surfacing on big-name accounts suggests it works. It’s all part of the dance.
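To make the cat-and-mouse concrete, here is a minimal, purely illustrative sketch of the kind of keyword filter the article describes; the phrases, normalization and matching logic are assumptions made for the example, not how Meta’s systems actually work. A benign one-word comment gives such a filter nothing to match on, and the older letter-spacing trick defeats a simple substring check:

```python
import re

# Toy illustration only: a naive keyword filter of the sort the article describes,
# not Meta's actual system. The flagged phrases below are hypothetical examples.
FLAGGED_PHRASES = {
    "dont watch my story",
    "check my story",
    "link in bio",
}

def is_flagged(comment: str) -> bool:
    """Return True if the comment contains a known spam phrase after light normalization."""
    text = comment.lower()
    text = re.sub(r"[^a-z\s]", "", text)      # drop emoji, punctuation, digits
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return any(phrase in text for phrase in FLAGGED_PHRASES)

print(is_flagged("DON'T WATCH MY STORY if you don't want to..."))  # True: old-style come-on
print(is_flagged("Michigan 🌟"))                                    # False: a benign word gives nothing to match
print(is_flagged("c h e c k  my  s t o r y"))                       # False: letter-spacing defeats substring matching
```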

This dance is one that social media platforms and bots have been doing, seemingly endlessly, for years. Meta says it blocks millions of fake account creation attempts across its apps every day and catches “millions more, often minutes after creation.” Yet spam accounts are still prevalent enough to show up en masse on high-traffic posts and to creep into the story views of even users with few followers.

The company’s latest transparency report, which includes statistics on the fake accounts it removes, shows that Facebook took down over a billion fake accounts last year alone, but it doesn’t currently offer that data for Instagram. “Spammers use every platform available to them to deceive and manipulate people on the internet and are constantly adapting their tactics to avoid enforcement,” a Meta spokesperson said. “That’s why we invest heavily in our enforcement and review teams and have specialized detection tools to identify spam.”

Comments from Instagram porn bots

Screenshot from Engadget

Last December, Instagram released a set of tools aimed at giving users more visibility into how it handles spam bots and giving content creators more control over their interactions with those accounts. Account holders can now, for example, bulk delete follow requests from profiles flagged as potential spam. Instagram users may also have noticed the increased appearance of the “hidden comments” section at the bottom of some posts, where comments flagged as offensive or spam can be moved to minimize encounters with them.

“It’s a game of whack-a-mole,” Narang said, and the scammers are winning. “You think you’ve got it, but then it just pops up somewhere else.” Scammers, he says, are very good at working out why they’ve been banned and finding new ways to dodge detection accordingly.

You might assume that today’s social media users are too savvy to fall for obviously bot-written comments like “Michigan 🌟,” but according to Narang, scammers’ success doesn’t necessarily depend on tricking hapless victims into handing over their money. They often participate in affiliate programs, and all they need is to get people to visit a website, usually branded as an “adult dating service” or the like, and sign up for free. The bots’ “link in bio” typically points to an intermediary site hosting a handful of URLs that may promise XXX chats or photos and lead to the service in question.

Scammers can earn a small amount of money, say a dollar, for each real user who creates an account; if someone goes on to sign up with a credit card, the kickback is much higher. “Even if one percent of [the target demographic] signs up, you earn some money,” Narang said. “And if you’re running multiple different accounts and have different profiles pushing those links, you’re probably making a decent chunk of change.” Scammers on Instagram are likely running spam bots on TikTok, X and other sites as well, Narang said. “It all adds up.”
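To put hypothetical numbers on it: a bot network that funnels 10,000 visitors to a landing page and converts one percent of them at a dollar per signup nets $100 from a single campaign; run dozens of accounts across several platforms and that figure multiplies quickly.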

Comments from porn bots on a Pikachu meme on Instagram

Screenshot from Engadget

The damage from spam bots goes beyond the headaches they cause the few people who are actually tricked into signing up for a sketchy service. Porn bots mostly use photos of real people stolen from public profiles, which can get awkward once the spam account starts sending follow requests to everyone the person pictured knows (speaking from personal experience here). And getting Meta to take down these cloned accounts can be a grueling process.

Their presence also adds to the challenges facing real content creators in sex and sex-adjacent industries, many of whom rely on social media to reach a wider audience but must constantly fight to avoid being deplatformed. Fake Instagram accounts can amass thousands of followers, siphoning potential visitors away from real accounts and casting doubt on their legitimacy. And real accounts sometimes get swept up in Meta’s hunt for bots, putting those who post racy content at even greater risk of suspensions and bans.

Unfortunately, the bot problem is not one with an easy solution. “They’re just constantly finding new ways [around moderation], coming up with new schemes,” Narang said. Scammers will always follow the money and, for that matter, the crowd. While porn bots on Instagram have devolved to posting nonsense to dodge moderators, more sophisticated bots targeting a younger demographic on TikTok are posting somewhat plausible comments on Taylor Swift videos, Narang says.

The next big thing in social media will inevitably come along sooner or later, and the scammers will go there too. “As long as there’s money to be made,” Narang said, “there will be incentives for these crooks.”

https://www.engadget.com/instagram-porn-bots-latest-tactic-is-ridiculously-low-effort-but-its-working-181130528.html?src=rss