Meta confirms Instagram issue that’s flooding users with violent and adult content Reels

Meta has admitted to CNBC that Instagram is experiencing an error that’s flooding users’ accounts with Reels videos that aren’t typically surfaced by its algorithms. “We are fixing an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” the company told the news organization. “We apologize for the mistake.” Users have taken to social media platforms to ask whether others have also recently been flooded with Reels containing violent and sexual themes. One user on Reddit said that their Reels page was inundated with videos of school shootings and murder.

The Meta spokesperson didn’t tell CNBC what exactly the error was, but some of the videos people have reported seeing shouldn’t have been on Instagram in the first place, based on the company’s own policies. “To protect users… we remove the most graphic content and add warning labels to other graphic content so that people are aware it may be sensitive or disturbing before they click through,” the company’s policy reads. Meta’s rules also state that it removes “real photographs and videos of nudity and sexual activity.”

Source: https://www.engadget.com/apps/meta-confirms-instagram-issue-thats-flooding-users-with-violent-and-sexual-reels-051631670.html