According to reports published by The Guardian, Facebook is planning to modify its internal rules for moderating graphic content. The leaked documents offer new insight into how the company views and decides what kinds of posts its users are allowed to publish, and the changes are intended to make the platform safer for users of all age groups.

The Facebook Files reveal internal manuals covering violent threats, graphic violence, animal cruelty and abusive content on Facebook. The Guardian explained that it reviewed more than 100 training manuals that show how Facebook moderates this kind of harmful content on the site.

These guidelines essentially ask moderators to distinguish between a credible threat and someone simply blowing off steam. They also list groups of vulnerable individuals for moderators; posts targeting these groups are usually escalated or deleted from the site automatically.

Facebook has recently been in the news over high-profile incidents, including the takedown of the iconic Vietnam War photograph and broadcasts of killings and suicides, which have prompted the company to adjust its policies. Facebook adds new users every day, and in order to protect its billions of users it has to come up with new, more advanced plans for keeping people and their data safe.