KUALA LUMPUR – Meta has announced new safety features aimed at protecting users, especially young people, from falling victim to sextortion.
The parent company of Facebook and Instagram said the latest initiatives include testing new tools that make it harder for scammers to identify and target potential victims, both on its own platforms and across the broader internet.
A key feature is nudity protection in Instagram direct messages, which uses on-device machine learning to analyse images sent in chats and automatically blur those containing nudity.
“Nudity protection will be turned on by default for teens under 18 globally, and we’ll show a notification to adults encouraging them to turn it on.
“When nudity protection is turned on, people sending images containing nudity will see a message reminding them to be cautious when sending sensitive photos and that they can unsend these photos if they’ve changed their mind,” Meta said in a press statement on its website.
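Meta has not published the implementation, but the client-side flow it describes can be illustrated with a minimal Python sketch, assuming a hypothetical on-device classifier: the app runs the check locally and blurs a flagged image before it is displayed. The `contains_nudity` stub, file names and blur radius below are illustrative, not Meta's actual code.

```python
from PIL import Image, ImageFilter

def contains_nudity(image: Image.Image) -> bool:
    """Hypothetical stand-in for the on-device ML classifier.

    Meta has not published its model; this stub always flags the
    image so the blurring path below can be demonstrated.
    """
    return True

def protect_image(path: str, blur_radius: int = 40) -> Image.Image:
    """Blur an incoming image locally if the classifier flags it."""
    image = Image.open(path)
    if contains_nudity(image):
        # A heavy Gaussian blur makes the content unrecognisable while
        # the recipient decides whether to reveal the photo.
        return image.filter(ImageFilter.GaussianBlur(radius=blur_radius))
    return image

blurred = protect_image("incoming_photo.jpg")
blurred.save("incoming_photo_blurred.jpg")
```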
The company is also using algorithms and account signals to identify potential scammers and prevent them from connecting with teenagers on its platforms.
“While these signals aren’t necessarily evidence that an account has broken our rules, we’re taking precautionary steps to help prevent these accounts from finding and interacting with teen accounts,” it said.
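Meta does not say which signals it relies on, but precautionary filtering of this kind can be sketched as a simple scoring rule: no single signal proves a violation, yet enough of them together blocks an account from reaching teens. The signal names and threshold below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Account:
    age_days: int                  # how long the account has existed
    teen_follow_requests: int      # recent follow requests sent to teens
    teen_reports: int              # reports filed against it by teens

def is_potentially_scammy(account: Account, threshold: int = 2) -> bool:
    """Count soft signals; crossing the threshold triggers a
    precautionary restriction rather than an outright ban."""
    signals = [
        account.age_days < 30,              # newly created account
        account.teen_follow_requests > 20,  # mass outreach to teens
        account.teen_reports > 0,           # prior complaints from teens
    ]
    return sum(signals) >= threshold

def may_contact_teen(sender: Account) -> bool:
    """Gate interactions with teen accounts on the check above."""
    return not is_potentially_scammy(sender)
```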
For users who are already in contact with scammers, Meta is introducing new pop-up messages directing them to resources, including helplines and support hubs, for assistance.
“This means when teens report relevant issues – such as nudity, threats to share private images, or sexual exploitation or solicitation – we’ll direct them to local child safety helplines where available.”
In addition to its internal efforts, Meta is collaborating with other technology companies through initiatives like the Lantern programme, which enables the sharing of signals to combat sextortion scams across various online platforms.
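The Lantern programme's data exchange is not publicly specified, but cross-platform signal sharing is often built on hashed identifiers, letting participants match known bad actors without exchanging raw personal data. The snippet below is a generic sketch of that idea, not Lantern's actual protocol.

```python
import hashlib

def signal_hash(value: str) -> str:
    """Hash an identifier (e.g. an email address tied to a sextortion
    scam) so it can be shared without exposing the raw value."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# One platform contributes hashed signals to the shared pool...
shared_pool = {signal_hash("scammer@example.com")}

# ...and another platform checks identifiers it sees against the pool.
def seen_elsewhere(identifier: str) -> bool:
    return signal_hash(identifier) in shared_pool
```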
Other companies in the Lantern programme include Discord, Google, Mega, Quora, Roblox, Snap and Twitch. – April 12, 2024