Facebook will send notifications directly to users who like, share, or comment on COVID-19 posts that violate the company's terms of service. The new feature works like this: if a user interacts with a post and the post is later deleted, Facebook sends the user a notification that the post has been removed. Clicking the notification takes the user to a landing page with a screenshot of the post and a brief explanation of why it was deleted.

This is an extension of Facebook's earlier attempts to crack down on misinformation. Previously, the company displayed a banner in the News Feed urging users who had interacted with deleted content to "help friends and family avoid false information about COVID-19." A Facebook product manager told Fast Company that users were often confused about what the banner meant. The company wants the new approach to be more direct than the banner, while avoiding scolding users or exposing them to the misinformation again. The notifications do not debunk the claims in the deleted posts, Fast Company wrote, and they do not apply to posts that are later labeled with fact-check tags, which means less dangerous misinformation can still spread.

Facebook has been slow to respond to misinformation the company does not consider dangerous. Although conspiracy theories about the COVID-19 vaccine had been circulating for months, Facebook only began deleting COVID-19 vaccine misinformation in December.