According to Liz Fernandez, Meta’s Product Communication Manager, the photo-sharing application Instagram is working on a feature that uses a “nudity protection” filter to block unwanted nude photographs received over direct messages (DMs). The company is collaborating with experts to ensure the new capability protects people’s privacy while giving them control over their messages, she said in a statement to The Verge. The feature was first spotted by Alessandro Paluzzi, a reliable leaker and reverse engineer, who tweeted an image of what users could see when opening the feature along with the statement, “Instagram is working on nudity protection for conversations.”

How will the tool work?

The new nudity protection feature will work by identifying images sent to the user over DM that may contain nudity. When the user opens the message, the image will be covered, and they can choose whether or not to view it. According to Meta, the technology will not allow the company to view the actual messages or share them with third parties. Unwanted nude images have become a major problem on social media. Other applications, such as Bumble, already employ AI-powered tools to address the issue, while Twitter has struggled to detect large-scale non-consensual nudity and child sexual abuse material.
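The flow described above — classify the image on the recipient's device, keep it covered, and reveal it only on an explicit user action — can be sketched roughly as follows. This is a minimal illustration, not Instagram's actual implementation; every name here (`InboundImage`, `detect_nudity`, and so on) is hypothetical, and the classifier is a toy stand-in for a real on-device ML model.

```python
# Hypothetical sketch of a blur-until-reveal DM flow.
# All names are illustrative; this is not Instagram's API.
from dataclasses import dataclass


@dataclass
class InboundImage:
    data: bytes
    flagged: bool = False   # set by on-device detection
    revealed: bool = False  # recipient's explicit choice


def detect_nudity(data: bytes) -> bool:
    """Stand-in for an on-device classifier. Running this locally is
    what lets the server avoid ever inspecting the image content."""
    return data.startswith(b"NSFW")  # toy heuristic for illustration only


def receive(data: bytes) -> InboundImage:
    img = InboundImage(data)
    img.flagged = detect_nudity(data)  # classify before anything is shown
    return img


def render(img: InboundImage) -> str:
    # Flagged images stay covered until the recipient opts in.
    if img.flagged and not img.revealed:
        return "blurred"
    return "visible"


def reveal(img: InboundImage) -> None:
    img.revealed = True  # explicit tap-to-view action
```

For example, `render(receive(b"NSFW..."))` would return `"blurred"` until `reveal()` is called, while an unflagged image renders as `"visible"` immediately. The key design point the article attributes to Meta is that detection happens on the device, so the covered-or-not decision never requires the server to see the image.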
