Facebook announced a new policy on Thursday that will limit the spread of groups on its social network that offer health advice, as well as groups tied to violence. The company will no longer show health groups in its recommendations, saying in a blog post that "it's important that people get health information from authoritative sources." In the past, closed groups have been used by Facebook users to spread misinformation about vaccines and COVID-19.

Similarly, the company said it would limit the spread of violence-related groups by removing them from recommendations, limiting their presence in search results, and reducing how often their content appears in people's News Feeds. The move comes as Facebook tries to crack down on such groups, including a militia group in Kenosha, Wisconsin, that used Facebook to organize an event that ended in the killing of two people.

Facebook initially said that after the incident it had deleted pages related to the group, attributing the fact that they had remained up despite violating company policies to "operational errors." But BuzzFeed later reported that Facebook had not deleted the pages; an administrator of the group had. Only then did Facebook admit that it had never deleted them.

In addition, Facebook will now archive groups that have no administrators, who are key to managing groups. Administrators who quit can invite other group members to replace them, and Facebook will also suggest the role to active members; if no one steps forward, the company will archive the group. Facebook also said it would prevent administrators and moderators of any group removed for policy violations from creating new groups "for a period of time," though the company did not say how long that would last.