Here's how Instagram's safety tools will protect teens

Enhanced privacy for Instagram users under 18 aimed at addressing growing concerns around negative effects of social media

By Reuters | Web Desk
A logo of mobile application Instagram is seen on a mobile phone, during a conference in Mumbai, India, September 20, 2023. — Reuters

Instagram will no longer allow users to take screenshots or record screens of certain content, as parent company Meta announces AI-based safety tools to protect teens from blackmail.

The significant overhaul introduces enhanced privacy and parental controls for Instagram accounts of users under 18, aimed at addressing growing concerns about the negative effects of social media.

Meta will automatically move all designated Instagram accounts to "Teen Accounts", which will be private by default, the company said on Tuesday.

Users of such accounts can only be messaged and tagged by accounts they follow or are already connected to, while sensitive content settings will be set to the most restrictive level available.

The previously tested features will stop scammers from tricking youngsters into sending private images via direct messages on Instagram or Facebook Messenger, BBC reported.

This means that if someone sends a photo or video using the 'view once' or 'allow replay' feature, they don’t need to worry about it being screenshotted or recorded in-app without their consent.

"We also won’t allow people to open 'view once' or 'allow replay' images or videos on Instagram web, to avoid them circumventing this screenshot prevention," Meta stated in a statement for the latest safety guardrails.

Scammers often use their targets' following and follower lists to try to blackmail them, but they will no longer be able to exploit this information.

Meta announced that people’s follower and following lists will be hidden from accounts detected engaging in scam-like behaviour.

Other safety tools include the nudity protection feature in messages, which is being rolled out globally for Instagram DMs.

According to Meta, this feature, which will be enabled by default for users under 18, will blur images detected to contain nudity when sent or received in Instagram DMs and will warn people of the risks associated with sending sensitive images.

Moreover, an informative PSA will appear on the feeds of millions of teens using Instagram to educate them on how to stay safe from "sextortion scams" and what to do if they are targeted.

As part of the update, Instagram users under 18 will be notified to close the app after 60 minutes of use each day. The accounts will also come with a default sleep mode that silences notifications overnight.

Meta said it will move identified users into Teen Accounts within 60 days in the US, UK, Canada and Australia, and in the European Union later this year. Teens around the world will start to get Teen Accounts in January.