Meta Asks Users To Submit Their Intimate Photos To Prevent ‘Revenge Porn’


Meta has come up with an idea to prevent your explicit photos and videos from being used as ‘revenge porn’ on Facebook and Instagram.

Meta, which owns Facebook and Instagram, has partnered with a UK nonprofit to build a tool that lets you submit your most intimate photos to a central website in order for them to be recognised and then removed from multiple platforms.

In a blog post on Thursday, Meta said the tool is for “adults over 18 years old who think an intimate image of them may be shared, or has already been shared, without their consent.”


RT reports: The new platform, which Meta developed together with the UK Revenge Porn Helpline and 50 other NGOs, aims to prevent the publication of ‘revenge porn’ in the first place, rather than merely removing the sensitive files after they have already appeared online.

Concerned users are being asked to submit photos or videos of themselves naked or having sex to a hashing database through the StopNCII.org (Stop Non-Consensual Intimate Images) website.

The tool assigns hashes, or “digital fingerprints,” to those materials, which participating platforms can then use to detect and block attempts by perpetrators to upload them.

Meta said that the system had been developed “with privacy and security at every step.” Only the hashtags are being shared with StopNCII.org and the tech platforms participating in the project, while the explicit images and clips never leave the user’s device and remain “securely in the possession of the owner,” it assured.
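The privacy claim rests on a simple design: the hash is computed on the user's own device, and only that hash, not the image, is sent to the database. A minimal sketch of the idea is below, using Python's standard-library SHA-256 for illustration; the real StopNCII system reportedly uses perceptual hashing, which also matches slightly altered copies, and the function and variable names here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digest of the image locally.

    The raw image never leaves the device; only this
    hex string is shared with the hash database.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# On the user's device: hash the intimate image and submit only the hash.
local_image = b"<raw image bytes>"          # stays on the device
submitted_hash = fingerprint(local_image)

# On a participating platform: the database holds hashes, never images.
blocked_hashes = {submitted_hash}

# When someone tries to upload a file, the platform hashes it
# and checks for a match before allowing publication.
attempted_upload = b"<raw image bytes>"
if fingerprint(attempted_upload) in blocked_hashes:
    upload_allowed = False
else:
    upload_allowed = True
```

Note that a cryptographic hash like SHA-256 only matches byte-identical copies; a perceptual hash trades that strictness for robustness to resizing or re-encoding, which is why systems of this kind prefer it.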

The new tool represents “a sea-change in the way those affected by intimate image abuse can protect themselves,” Revenge Porn Helpline manager Sophie Mortimer insisted.

But the question remains whether people will actually be willing to use it, considering Meta’s bad rap for mishandling user data.