How the Oversight Board is delivering results for users on Facebook and Instagram

  • Published: 7 Feb 2024
Over the past few years, many of our recommendations to Meta have become a reality, improving how the company treats people and communities on Facebook and Instagram.

Our work has led Meta to review its content moderation policies, state its rules more clearly, and apply them more consistently. Meta now tells more users which specific policy area was violated when their posts are removed, and is better aligning its content moderation with human rights principles.

In response to our recommendations, Meta has introduced a Crisis Policy Protocol to make its responses to crisis situations more consistent, launched a review of its Dangerous Individuals and Organizations policy, and created a new Community Standard on misinformation.

In response to a recommendation in our "breast cancer symptoms and nudity" decision, Meta has also enhanced its techniques for identifying breast cancer content on Instagram, contributing to thousands of additional posts being sent for human review that would previously have been removed automatically.

Each decision and policy advisory opinion has brought further transparency to otherwise frequently opaque content moderation processes, including by revealing the number of newsworthiness exceptions the company applies in administering its rules.

Our policy recommendations have triggered public discourse about how digital platforms can approach some of the most complex challenges in content moderation.
