Instagram hopes that blurring nudity in messages will make teens safer

Parents have long raised concerns about the types of images their children can be exposed to on Instagram, and the social media brand is now taking a new step to fight sexual exploitation on the app. In a major change to its interface, Instagram will begin automatically blurring nude images in direct messages. 

The change, announced by Instagram’s parent company Meta on April 11, is part of a series of new tools designed to minimize child sexual abuse and exploitation across social media brands. The tools will “help protect young people from sextortion and intimate image abuse” and also “make it more difficult for potential scammers and criminals to find and interact with teens,” Meta said in a press release. The company is also testing new ways to “help people spot potential sextortion scams.” 

Instagram has been criticized in the past for not doing enough to stop children from receiving sexualized content. Is the blurring of direct messages the first step toward a new beginning for the brand? 

How will Instagram’s nudity blurring feature work?

When a user receives an “image containing nudity, it will be automatically blurred under a warning screen, meaning the recipient isn’t confronted with a nude image and they can choose whether or not to view it,” Meta said. Instagram will additionally show the user a message “encouraging them not to feel pressure to respond, with an option to block the sender and report the chat.” This feature will be automatically enabled for users under 18, while adults will receive messages encouraging them to activate it. 


On the other end, people who try to send a nude image will “see a message reminding them to be cautious when sending sensitive photos, and that they can unsend these photos if they’ve changed their mind,” said Meta. Additionally, anyone trying to forward a nude image will receive a message urging them to reconsider. 

Some people have expressed skepticism that Instagram will be able to tell whether an image is explicit, but Meta said that its new tool "uses on-device machine learning to analyze whether an image sent in a DM on Instagram contains nudity." Because the analysis happens on the user's phone rather than on Meta's servers, "nudity protection will work in end-to-end encrypted chats, where Meta won't have access to these images."
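Meta has not published any implementation details beyond that quote, but the general flow it describes — a classifier running locally on the device scores each incoming image, and the client blurs flagged images behind a warning before anything is shown — can be sketched in outline. Everything below is illustrative: the function names, the score, and the threshold are assumptions for the sake of the example, not Meta's actual code.

```python
def box_blur(pixels, radius=1):
    """Naive box blur over a 2D grid of grayscale values (0-255).

    Each output pixel is the average of its neighborhood. A real client
    would use an optimized image library, but the effect is the same.
    """
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            vals = [pixels[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            row.append(sum(vals) // len(vals))
        out.append(row)
    return out


def present_incoming_image(pixels, nudity_score, threshold=0.8):
    """Decide how to display an incoming DM image.

    `nudity_score` stands in for the output of an on-device ML model
    (hypothetical here). The image and the verdict never leave the
    device, which is why this approach still works in end-to-end
    encrypted chats. The 0.8 threshold is an arbitrary placeholder.
    """
    if nudity_score >= threshold:
        return {"image": box_blur(pixels), "warning": True}
    return {"image": pixels, "warning": False}
```

The key design point is that the server never sees the image: classification and blurring both happen client-side, and the recipient can still tap through the warning to reveal the original.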

The blurring feature will be tested in the coming weeks, with a global rollout to follow in the next few months, NBC News reported.

How will this change children’s interactions on Instagram? 

Some say it won’t. While users can block a sender and report their chat if they receive a nude, the situation on Instagram won’t change “until there is a way for a teen to say they’ve received an unwanted advance, and there is transparency about it,” Arturo Béjar, a former engineering director at Meta, told The Associated Press.

Béjar, who has testified to Congress about his experience working at Meta, said the “tools announced can protect senders, and that is welcome. But what about recipients?” Béjar told the AP he had gathered evidence that 1 in 8 teens receive an unwanted advance on Instagram every week, asking, “what can they do if they get an unwanted nude?”


There is still ambiguity over how well these tools will work — especially because this is not the first time Meta has tried to curb online exploitation. The company “has had long-standing policies that ban people from sending unwanted nudes or seeking to coerce others into sharing intimate images,” TechCrunch said. But even with these rules in place, it “doesn’t stop these problems from occurring and causing misery for scores of teens and young people.”

There is also the fact that even with these tools at users’ disposal, predatory accounts remain online. Meta is trying to change that, saying in its press release that it is using technology to “help identify where accounts may potentially be engaging in sextortion scams, based on a range of signals that could indicate sextortion behavior.” While this sounds like a net positive, it is “not clear what technology Meta is using to do this analysis, nor which signals might denote a potential sextortionist,” said TechCrunch. The company is also taking steps such as hiding the “message” button on profiles deemed potentially extortionist, though how effective these efforts will be remains to be seen. 
