The New Feature-
Instagram has recently rolled out a controversial new feature aimed at making the platform a safer and more private experience for young people. New Instagram users under 16 years old (or under 18 in certain countries) will now get a private account by default unless they choose otherwise. According to the company, this will stop or limit young people from hearing from unknown adults they don’t want to interact with. That is, if you have a private account, strangers will not be able to see or respond to your posts, Reels, and Stories. Your content will not show up in the Explore tab, which limits your audience to your followers. As for those who already have a public account, Instagram plans to show a notification explaining the benefits of a private account and how they can switch to one.
The changes come as Instagram is under pressure from lawmakers, regulators, parents, and child-safety advocates worried about the impact of social media on kids’ safety, privacy, and mental health. These groups have long criticized Instagram for exposing teens to explicit content and for fueling undue anxiety: while scrolling, teens feel the need to be accepted, fixate on comments and likes, and struggle to handle the resulting peer pressure. And to top it all, there is no regulation on the trolling culture we live in.
And with this new feature, the platform is addressing the problem of teens coming into unwanted contact with suspicious or bad adults. According to Instagram, adults who, while not breaking Instagram’s rules, have shown “potentially suspicious behavior” — such as being blocked or reported by young people — will have limited ability to interact with and follow teens.
Further breaking it down, they won’t see teenagers’ posts among the recommendations in Instagram’s Explore and Reels sections, and Instagram won’t suggest they follow teens’ accounts. If these adults search for specific teens by username, they won’t be able to follow them, and they’ll be blocked from commenting on teens’ posts.
The Loopholes-
But does this mean all the famous TikTokers are being put out of work? Well, certainly not. Many of us have entered the wrong age before, and it is no secret that plenty of users well over 18 still sign up with false birth dates. That is a hard problem to tackle. To this, Instagram says it is working on better methods of verifying users’ ages, so it can determine when policies for teens should apply and do a better job of keeping kids under 13 off its apps. Instagram already uses artificial intelligence to scan profiles for signals that suggest whether a user is older or younger than 18. But is it good enough? Certainly not.
Then comes the next problem: these children still have the liberty to accept such anonymous requests, so ultimately the feature fails all over again. And the feature only sets a user’s profile to private by default — isn’t it easy to make your profile public again? Even if you don’t know how, all one needs to do is google it. Doesn’t that bring us back to square one?
This feature came in the aftermath of a letter sent to Facebook raising concerns over its announced preparation of a separate Instagram platform for kids under 13. The feature seems like a bandage on broken glass, but Facebook CEO Mark Zuckerberg has defended the idea, saying that under-13s are already using Instagram, so it would be better to provide them a dedicated version. But all I hear is that Instagram’s age-verification metrics have been failing, and we have zero remedies for it.
This might or might not solve the problem of safety, but the effect on mental health remains, and adding a new platform doesn’t seem to be the apt answer for it. It will still make young users feel the constant need to focus on personal branding and social acceptance. Will the hate comments, the trolling, or the anxiety of getting maximum likes go away because you made a different platform? Does it solve anything at all? The social dilemma is still there, and it needs to be addressed by such platforms.
New Feature or a Strategic Move-
And this move seems like a strategy to promote the new platform. Once this feature fails, the idea of a new platform will look like the next best solution, and parents might give in to it with less reluctance than they show now. Doesn’t it seem like the perfect strategy? Put up a so-called feature that is bound to fail, which lets them pretend they care, and then, once the feature fails to deliver, launch the new platform so that it looks like the perfect solution.
Such platforms have a moral and ethical responsibility that they have not just been ignoring but exploiting, milking our anxiety for as much as possible. But isn’t it high time they become more morally conscious and make changes for the better, rather than adding features that seem useless even to a not-so-tech-savvy person? Or shall I say, rather than using strategies that only benefit them? Putting a bandage over broken glass is not the solution, and launching a new platform is worse. It creates a new problem without dealing with the last one.