
Instagram’s Selective Blurring Of Nudity Falls Woefully Short Of Protecting Kids

Instagram is finally taking action against sexual exploitation on its platform, just one day after being called out in the National Center on Sexual Exploitation’s (NCOSE) Dirty Dozen List. Instagram, which is owned by Meta, will use artificial intelligence to automatically blur images of nudity in the direct messages (DMs) of users under 18 years old. 

While the new policy may seem like a welcome step in the right direction, it is far from enough. Minors can still click “view image anyway” and easily bypass the blurring on an explicit direct message, and many children will want to click on a blurred image simply to see what it is. In fact, the change differs little from Instagram’s existing policy banning the posting of nude images, which users easily circumvent or override.

“Why is Meta even putting the burden on children to make the choice about seeing sexually explicit content when their own policies expressly prohibit it?” asked Tori Rousay, corporate advocacy manager and analyst at NCOSE.

Even TikTok simply disables all direct messaging for users aged 13 to 15, recognizing that this private channel of communication exposes children to various forms of victimization, even without the sharing of explicit imagery. Disabling DMs for users under 18 would be an obvious next step for Meta in combating sexual exploitation on its platforms.

“Disabling direct messaging for minors, like TikTok does … would be a significant step and something the National Center on Sexual Exploitation has also been asking Meta to do,” Rousay said. “Further, prohibiting …”

