Cyberflashing, the practice of sending unsolicited and unwanted nude photos to people via social media apps or through other means, is a serious offence punishable by prison in some countries, including Scotland, Singapore and, soon, England and Wales.
Now, it appears that Instagram’s parent company Meta is finally getting serious about it too.
Developer Alessandro Paluzzi noticed that Instagram is working on a feature called “Nudity protection.”
The feature’s description says it can “securely detect and cover nudity,” and that Instagram cannot access these photos. Users will be able to choose whether to view the photos, turn the feature on and off, and get safety tips on dealing with sensitive images.
In a statement to The Verge, Meta confirmed the feature, which aims to help users protect themselves from nude photos and other unwanted messages, is in the early stages of development. The company compared the feature to its filters that allow users to hide abusive messages and comments on the platform.
“We’re working closely with experts to ensure these new features preserve people’s privacy, while giving them control over the messages they receive,” a Meta spokesperson told the outlet.
Meta plans to share more details about this feature in the next few weeks.
A UK government report from March, which announced the upcoming anti-cyberflashing legislation, cited research from 2020 finding that 76 percent of girls aged 12 to 18 had been sent unsolicited nude images of boys or men. The law — slated to be part of the Online Safety Bill — would treat cyberflashing as seriously as in-person flashing, and proposes a maximum sentence of two years in prison for offenders.