Meta is stepping up efforts to protect children on Instagram
San Francisco (AFP) - Meta, the parent company of Facebook and Instagram, on Thursday announced new measures to fight sextortion, a form of online blackmail where criminals coerce victims, often teens, into sending sexually explicit images of themselves.
The measures include stricter controls on who can follow or message teen accounts, as well as safety notices in Instagram direct messages and Facebook Messenger warning when a suspicious conversation may involve an account based in another country.
The measures beef up Instagram’s “Teen Accounts,” which were announced last month and are designed to better protect underage users from the dangers associated with the photo-sharing app.
The company is also restricting potential scammers' ability to view follower lists and interactions, and blocking screenshots in private messages.
Additionally, Meta is rolling out its nudity protection feature globally in Instagram direct messages, blurring images that may contain nudity and warning teens before they send one.
In certain countries, including the US and Britain, Instagram will show teens a video in their feeds about how to spot sextortion scams.
The video highlights warning signs of such scams, including people who come on too strong, ask to exchange photos, or try to move the conversation to another app.
“The dramatic rise in sextortion scams is taking a heavy toll on children and teens, with reports of online enticement increasing by over 300 percent from 2021 to 2023,” said John Shehan of the US National Center for Missing & Exploited Children.
“Campaigns like this bring much-needed education to help families recognize these threats early,” he added in a Meta blog post announcing the measures.
The FBI earlier this year said sextortion online was a growing problem, with teenage boys the primary victims and offenders often located outside the US.
From October 2021 to March 2023, US federal officials identified at least 12,600 victims, with 20 of the cases involving suicide.
Meta’s move to protect children comes as pressure has been building across the globe on the social media giant founded by Mark Zuckerberg and its rivals.
Last October, some 40 US states filed a complaint against Meta’s platforms, accusing them of harming the “mental and physical health of young people” through risks including addiction, cyberbullying and eating disorders.
For the time being, Meta declines to verify the age of its users itself, citing privacy concerns, and is instead pushing for legislation that would require age verification at the level of a smartphone’s operating system, i.e. by Google’s Android or Apple’s iOS.