Meta is continuing its flurry of teen safety features for Instagram as the company faces mounting questions about its handling of younger users’ privacy and safety in its apps. The latest batch of updates is meant to tighten its protections against sextortion.
With the changes, Meta says it will make it harder for “potentially scammy” accounts to target teens on Instagram. The company will start to send follow requests from such accounts to users’ spam folders or block them entirely. The app will also start testing an alert that notifies teens when they receive a message from such an account, warning them that the message appears to be coming from a different country.
Additionally, when the company detects that a potential scammer is already following a teen, it will prevent them from viewing the teen’s follower list and the accounts that have tagged the teen in photos. The company isn’t saying exactly how it determines which accounts are deemed “potentially scammy,” but a spokesperson said it relies on signals such as the age of the account and whether it has mutual followers with the teen it’s attempting to interact with.
Meta is also making changes to prevent the spread of intimate images. Instagram will no longer allow users to screenshot or screen record images shared over DMs via the app’s ephemeral messaging feature, and will no longer allow these images to be opened from the web version of Instagram. The app will also expand the nudity protection feature it began testing earlier this year to all teens on the app. The tool automatically blurs images shared over DMs when nudity is detected and provides warnings and resources alongside them.
The changes are meant to address the realities of how sextortion scams, in which scammers coerce teens into sending intimate images that are then used to threaten and blackmail them, are often carried out over Instagram. A report from Thorn and the National Center for Missing & Exploited Children (NCMEC) earlier this year found that Instagram and Snapchat were the “most common” platforms used by scammers “as initial contact points.”
These scams are carried out by individuals and groups that sometimes organize on Meta’s own platforms. Alongside the updates, Meta said that it removed 800 Facebook groups and 820 accounts linked to a group known as the Yahoo Boys that “were attempting to organize, recruit and train new sextortion scammers.”
Meta’s updates come as it faces increasing pressure to strengthen safety features for its youngest users. The company is currently facing a lawsuit from more than 30 states over the issue. (Earlier this week, a federal judge rejected Meta’s attempt to have the lawsuit dismissed.) New Mexico is also suing the company and has alleged that Meta didn’t do enough to stop adults from sexually harassing teens on its apps, particularly Instagram.