NEWS

Meta confirms Instagram issue that's flooding users with violent and sexual Reels

Feb 27, 2025

Meta has admitted to CNBC that Instagram is experiencing an error that’s flooding users’ accounts with Reels videos that aren’t typically surfaced by its algorithms. “We are fixing an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” the company told the news organization. “We apologize for the mistake.” Users have taken to social media platforms to ask whether others have also recently been flooded with Reels containing violent and sexual themes. One user on Reddit said that their Reels page was inundated with videos of school shootings and murder.

Others said they’re getting back-to-back gore videos, including stabbings, beheadings and castration, as well as nudity, uncensored pornography and outright rape. Some said they still see similar videos even though they had enabled Sensitive Content Control. Social media algorithms are designed to show you videos and other content similar to what you usually watch, read, like or interact with. In this case, though, Instagram has been showing graphic videos even to people who haven’t been interacting with similar Reels, and sometimes even after a user has taken the time to mark a violent or sexual Reel as “Not Interested.”

The Meta spokesperson didn’t tell CNBC what exactly the error was, but some of the videos people have reported seeing shouldn’t have been on Instagram in the first place under the company’s own policies. “To protect users… we remove the most graphic content and add warning labels to other graphic content so that people are aware it may be sensitive or disturbing before they click through,” the company’s policy reads. Meta’s rules also state that it removes “real photographs and videos of nudity and sexual activity.”

This article originally appeared on Engadget at https://www.engadget.com/apps/meta-confirms-instagram-issue-thats-flooding-users-with-violent-and-sexual-reels-051631670.html?src=rss