By Radha Iyengar Plumb, Head of Product Policy Research

We have a responsibility to keep people safe and prevent abuse on our platform - and we're committed to being transparent about how we do this. That's why we make our Community Standards available to everyone and why we share the changes we make to our policies each month. Today, we're releasing the third edition of our Community Standards Enforcement Report to show how we're doing at enforcing these policies.

This kind of transparency lets people hold us accountable and helps us get much-needed feedback. But transparency is only helpful if the information we share is meaningful and accurate. In the context of the Community Standards Enforcement Report, that means metrics based on sound methodology that accurately reflect what's happening on our platform.

To this end, we established the Data Transparency Advisory Group (DTAG) last year. This is an independent body made up of international experts in measurement, statistics, criminology and governance. Their task was to provide an independent, public assessment of whether the metrics we share in the Community Standards Enforcement Report are accurate and meaningful measures of Facebook's content moderation challenges and our work to address them.

Data Transparency Advisory Group's Findings

In November, the group announced its mandate and structure and shared initial findings after a number of reviews with our teams. We used these early findings to improve how we defined our metrics and to make sure the information we released alongside the numbers in the report gave readers helpful context about our processes and practices.

After that initial release, we gave the advisory group detailed and confidential information about our enforcement processes and measurement methodologies so they could continue their analysis. This included more details on our methods to sample, label and measure violations of our policies. They reviewed these practices and have now released the results of that review, which address three main questions:

  1. Is Facebook accurately identifying content, behavior and accounts that violate its Community Standards?
  2. Are the publicly released metrics the most informative way to categorize and measure those violations and Facebook's response to them?
  3. What additional information does the public need to understand Facebook's enforcement efforts and evaluate the legitimacy of those policies?

Is Facebook Accurately Enforcing its Rules?

The advisory group's first question looks at whether our approach to content moderation is adequate. They concluded that our process - which includes a combination of automated and human review - is appropriate given the scale at which we operate and the amount of content people post.

Moreover, the group found that the way we audit the accuracy of our content review system is well designed, if executed as described. In their report, the group recognizes the technical challenges that must be balanced in building an effective detection and enforcement system at scale. In particular, they note that our current systems include mechanisms for people to report content to us and systems to detect harmful content even if no one has reported it. They recommended that Facebook bring more transparency to both processes and build additional ways for users to provide input into the policy development process itself.

Are the Metrics the Right Ones?

The advisory group evaluated the three metrics we included in the first two editions of the Community Standards Enforcement Report: prevalence, actioned content and proactive rate. They found our metrics to be reasonable ways of measuring violations and in line with best practices.
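To make the distinction between these metrics concrete, here is a minimal sketch in Python using entirely hypothetical counts and simplified definitions - it illustrates how the reported figures relate to one another, not Facebook's production methodology.

```python
# Hypothetical counts for a single policy area in one reporting period (illustrative only).
actioned_after_user_report = 400_000   # content acted on after a user reported it
actioned_proactively = 3_600_000       # content acted on before anyone reported it

# "Content actioned" is the total volume of content we took action on.
content_actioned = actioned_after_user_report + actioned_proactively

# "Proactive rate" is the share of actioned content found before a user report.
proactive_rate = actioned_proactively / content_actioned

print(f"Content actioned: {content_actioned:,}")
print(f"Proactive rate: {proactive_rate:.1%}")  # 90.0% with these made-up numbers
```

Prevalence, by contrast, is not a count of enforcement actions at all; it is an estimate of how often violating content is actually seen, which is why it relies on sampling rather than simple tallying.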

These findings are the crux of the report - specifically their findings around prevalence, which we use to understand the likelihood that people on Facebook may see violating content. The group noted that our prevalence metrics are similar to the way law enforcement authorities measure crime, and they recommended that we include other metrics, modeled on measurements law enforcement uses, to give a fuller picture. Law enforcement looks at how many people were the victims of crime - but it also looks at how many criminal events authorities became aware of, how many crimes may have been committed without their knowledge and how many people committed crimes. The group recommends that we provide additional metrics like these, while noting that our current measurements and methodology are sound.
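To illustrate why prevalence behaves more like a victimization survey than a tally of enforcement actions, here is a hedged sketch under simplified assumptions: views are sampled uniformly at random and human labels are treated as ground truth. The function name and all numbers are hypothetical and are not drawn from Facebook's actual methodology.

```python
import math
import random

def estimate_prevalence(labels, z=1.96):
    """Estimate the share of sampled views that showed violating content.

    `labels` is a list of 0/1 outcomes from human review of a random sample
    of content views (1 = the viewed content violated policy). Returns the
    point estimate and a normal-approximation 95% confidence interval.
    """
    n = len(labels)
    p_hat = sum(labels) / n
    margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, max(0.0, p_hat - margin), p_hat + margin

# Simulate a labelled sample in which roughly 0.3% of views showed violating content.
random.seed(0)
sample = [1 if random.random() < 0.003 else 0 for _ in range(50_000)]

estimate, low, high = estimate_prevalence(sample)
print(f"Estimated prevalence: {estimate:.3%} (95% CI {low:.3%} - {high:.3%})")
```

The crime-statistics analogy carries through here: counts of actioned content only capture what the system caught, while a sampled prevalence estimate also reflects violating views the system may have missed.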

Based on the group's feedback - as well as Facebook's own internal review - we have added new metrics to our Community Standards Enforcement Report, including the amount of content that people appealed after we removed it and the amount of content we restored. We've also outlined a number of changes we want to implement in future reports, such as further details about the content we take action on. The group notes that these changes will add valuable transparency.

What More Does the Public Need to Know?

In its assessment, the advisory group notes the Community Standards Enforcement Report is an important exercise in transparency. But they also highlight other areas where we could be more open in order to build more accountability and responsiveness to the people who use our platform. This includes recommendations of ways to enhance community input into our governance model and concrete suggestions on how to expand the information provided in the report to give people more context.

Final Recommendations

After answering each of these questions, DTAG lays out 15 recommendations for Facebook, which largely fall under the following categories:

  • Additional metrics we could provide that show our efforts to enforce our policies, such as the accuracy of our enforcement and how often people disagree with our decisions
  • Further breakdowns of the metrics we already provide, such as the prevalence of certain types of violations in particular areas of the world, or how much of the content counted in our content actioned metric we removed versus covered with a warning screen
  • Ways to make it easier for people who use Facebook to stay updated on changes we make to our policies and to have a greater voice in what content violates our policies and what doesn't

We already have plans to implement some of these in upcoming reports. For others, we're looking at how best to put the suggestions into practice. And for a few, we simply don't think the recommendations are feasible given how we review content against our policies, but we're looking at how we can address the underlying need for additional transparency that the group rightfully raises.

There are a number of suggestions the group offered outside of the report that we're incorporating into our practices as well. These range from technical suggestions for statistical tests to confirm the validity of our metrics to suggestions for how we can make the enforcement report easier to understand.

We agree with the calls for further transparency from this group as well as those from other organizations, advocates and academic experts - and we're committed to continuing to share more about how we enforce our Community Standards in the future.

We thank the Data Transparency Advisory Group for their time, their rigorous review and their thoughtful recommendations that will help inform our efforts as we enforce our standards and bring more transparency to this work.
