Instagram hopes that blurring nudity in messages will make teens safer

The option will be turned on by default for users under 18

An example of the blurred image technology on Instagram, provided by Meta (Image credit: Meta)

Parents have long raised concerns over the types of images their children can be exposed to on Instagram, and now the social media brand is taking a new step to try to fight sexual exploitation on the app. In a major change to its interface, Instagram will begin automatically blurring nude images in direct messages.

The change, announced by Instagram's parent company Meta on April 11, is part of a series of new tools designed to minimize child sexual abuse and exploitation across social media brands. The tools will "help protect young people from sextortion and intimate image abuse" and also "make it more difficult for potential scammers and criminals to find and interact with teens," Meta said in a press release. The company is also testing new ways to "help people spot potential sextortion scams." 

Justin Klawans, The Week US

Justin Klawans has worked as a staff writer at The Week since 2022. He began his career covering local news before joining Newsweek as a breaking news reporter, where he wrote about politics, national and global affairs, business, crime, sports, film, television and other Hollywood news. Justin has also freelanced for outlets including Collider and United Press International.