Tested: Instagram isn't safe for kids as spam bots are peddling pornography

This article may contain personal views and opinions from the author.
Ever since Facebook bought Instagram in 2012, the once photo-oriented app has been changing quite a bit, from an ever-growing focus on shopping to pushing a TikTok-like short-form video format in an effort to compete with the highly popular Chinese app.

I can't say any of those changes were for the better, but I've always been a fan of Instagram. Since it's the only social media app I use, I'm quite familiar with it and notice even the smallest changes Facebook applies to it.

But one thing seemingly hasn't changed in the last few months, and that's how rampant "Story view" spammers are on Instagram. So over the last few weeks I decided to test a few things and share the results.

But first, let me explain how the "Story view" scam bots actually work.

How porn-peddling scam accounts try to reach you on Instagram


First, you need to have published an Instagram Story, or several. If you happen to catch the exact moment a wave of scam accounts is "on the hunt for victims", you'll quickly get Story views from suspicious accounts that are not on your friends list. Here are some I got almost instantly upon posting an Instagram Story:



If you have several Instagram Stories posted, say three, you may notice that the scam accounts have all viewed only your second or third Story, which a human normally wouldn't do, as users have to go through your first Story to reach the rest. So that's another red flag that these might be automated bot accounts.

You can see in detail who viewed your Instagram Story by opening it and swiping up. And if you've gotten views from scam accounts, you'll notice that their profile descriptions are all highly sexual, attempting to lure you into opening their profiles with promises of pornography.

Now, the fact that children may get views from those accounts is bad enough, but it gets worse if they do what the accounts want and click on them.

Their profiles often contain sexually suggestive images, along with invitations to visit shady websites that are definitely up to no good for anyone inexperienced enough to actually open them.

You can't even properly report Instagram accounts that feature pornography


Instagram's Community Guidelines page clearly states that nudity (and thus pornography) isn't allowed. Exceptions include nudity in art and sculptures, photos of breastfeeding, and medical imagery, but not much else.

So I was pretty surprised when I ended up seeing an Instagram account featuring pornography and, after reporting it, received the message shown below. Not only does Instagram not have actual humans review pornography reports (or at least it didn't a few weeks ago, when this happened), but the AI Instagram uses determined that the post "probably doesn't go against" Instagram's Community Guidelines.



As mentioned earlier, Instagram claims to forbid pornography. The photo I reported for this experiment featured a woman wearing a transparent top, with her breasts clearly visible. To be fair, even though the account in question remained active and untouched after my report, checking it now, about a month later, it appears to have been removed.

But whatever the process for that removal was, it feels quite out of our hands as users.

In fairness to Instagram, it seems to be trying



I took these screenshots last month, and the accounts shown have since been deleted, so eventually Instagram did what it should have done. The question is – when is the next scam bot wave coming, and what's the chance of it reaching your child's, niece's, or nephew's Instagram account next?

Last month we reported that, according to several sources, Instagram will require video selfies at registration to prevent bot-generated accounts, but we've yet to see when this much-needed feature will come into effect.

Instagram was also working on an Instagram Kids app, supposedly to be safer for young ones ages 10-12, but its development was recently put on hold. The one and only Instagram app available is meant for anyone ages 13 and up, which is pretty young for the things one can encounter on Instagram, including the aforementioned, highly explicit scam accounts.

Recently, Instagram finally started asking users to confirm their age and, with child safety in mind, should eventually start blocking anyone who lies about it. Those who haven't provided a birthday may be asked to do so more than once if they stumble upon "sensitive or graphic" content, although considering that Instagram's AI seems pretty lenient about what qualifies as sensitive and graphic, I personally don't feel too sure about the effectiveness of this change.

Instagram also has a huge Help Center, which provides almost therapy-grade advice to users on things like body image issues and, oddly enough, even eating disorders. Whether children will ever actually find and read those pages is questionable, as is whether this is just an effort by the Facebook-owned company to wash its hands of some of its recent scandals. A notable one suggested that Facebook has always known (and ignored) how toxic Instagram is to young ones.

But at the end of the day, it's best for parents to take matters into their own hands when it comes to which apps their children are allowed to use. Even after all the efforts from big tech, the internet is still a wild wasteland, where anyone lacking the knowledge to avoid danger may end up in some sort of trouble.
