Instagram Users Flooded with Graphic Content Due to Algorithm Error

Meta, the parent company of Instagram, has acknowledged an issue with its algorithm that is causing users to see graphic and violent content in their Reels feed. According to a statement given to CNBC, "We are fixing an error that caused some users to see content in their Instagram Reels feed that should not have been recommended… We apologize for the mistake."

User Reports of Graphic Content

Many users have taken to social media to report being flooded with graphic and violent Reels, including depictions of school shootings, murder, gore, nudity, and uncensored pornography. Some report that they continue to see such content even after enabling Instagram's Sensitive Content Control. The issue has also affected users who had never interacted with similar content before, and in some cases it persisted even after they clicked "Not Interested" on a Reel with violent or sexual content.

Meta’s Community Standards

According to Meta’s community standards, the company removes the most graphic content and adds warning labels to other graphic material to protect users. The standards also state that real photographs and videos of nudity and sexual activity are not allowed on the platform. Under these policies, some of the videos reported by users should never have appeared on Instagram in the first place.

Conclusion

Meta has acknowledged the issue and is working to fix the error. In the meantime, users are advised to be cautious when using the platform and to report any graphic or violent content they encounter.
