By Antonia Woodford, Product Manager

Ahead of the European Parliament Elections in May, we have made fighting misinformation a top priority. One of the ways we reduce the spread of false news is by partnering with independent, third-party fact-checkers around the world. Today we're announcing the expansion of this program in the EU with five new local fact-checking partners: Ellinika Hoaxes in Greece, FactCheckNI in Northern Ireland, Faktograf in Croatia, Observador in Portugal and Patikrinta 15min in Lithuania. These organizations will review and rate the accuracy of content on Facebook.

Our fact-checking partners are all accredited by the International Fact-Checking Network (IFCN), which applies standards such as non-partisanship and transparency of sources. These partners are also part of a collaborative effort led by IFCN to fact-check content related to the European Parliament elections, called FactCheckEU. Starting today, all FactCheckEU participants will be able to rate and review claims on Facebook.

Our program now includes 21 partners fact-checking content in 14 European languages: Croatian, Danish, Dutch, English, French, German, Greek, Italian, Lithuanian, Norwegian, Polish, Portuguese, Spanish and Swedish. When a fact-checker rates a story as false, we show it lower in News Feed, significantly reducing its distribution and the number of people who see it. In our experience, once a story is rated as false, we've been able to reduce its distribution by 80%. Pages and domains that repeatedly share false news will also see their distribution reduced and their ability to monetize and advertise removed. This helps curb the spread of financially motivated false news.

This program is part of our three-part framework to improve the quality and authenticity of stories in News Feed. We remove accounts and content that violate our Community Standards or ad policies, reduce the distribution of false news and inauthentic content like clickbait, and inform people by giving them more context on the posts they see. In line with this approach, we're working on additional measures to protect elections in the EU, including:

Introducing a Click-Gap signal: Our News Feed ranking system uses many signals to ensure that people see stories that are relevant and interesting to them. We recently introduced a new signal called click-gap, which can identify whether a website is producing low-quality content by looking at the percentage of total Facebook clicks it gets and comparing that number to its web graph: the clicks it gets from other sources, or its status within the broader internet. Sites that have a high click-gap ratio are likely to produce lower-quality content like misinformation.
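To make the click-gap idea above concrete, here is a minimal, hypothetical sketch of how such a ratio could be computed and thresholded. The function names, inputs, and threshold are illustrative assumptions; the actual signal, its data sources, and its cutoffs are Facebook-internal and not public.

```python
# Hypothetical illustration of a click-gap check. All names and the
# threshold value are assumptions for this sketch, not Facebook's.

def click_gap_ratio(facebook_clicks: int, external_clicks: int) -> float:
    """Ratio of clicks a domain gets from Facebook vs. the rest of the web."""
    # Guard against division by zero for domains with no external traffic.
    return facebook_clicks / max(external_clicks, 1)

def is_suspicious(facebook_clicks: int, external_clicks: int,
                  threshold: float = 10.0) -> bool:
    """Flag domains whose traffic is disproportionately Facebook-driven.

    The threshold here is an arbitrary illustrative value.
    """
    return click_gap_ratio(facebook_clicks, external_clicks) > threshold

# A site with 50,000 Facebook clicks but only 1,000 clicks from the
# rest of the web has a click-gap ratio of 50 and is flagged; a site
# with comparable traffic from both sources is not.
print(is_suspicious(50_000, 1_000))   # True
print(is_suspicious(50_000, 40_000))  # False
```

The intuition is simply that legitimate popular sites tend to draw traffic from many sources (search, direct visits, other sites), so a domain whose clicks come overwhelmingly from Facebook is an outlier worth demoting.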

Expanding the context button: To help people evaluate the credibility of an article, we provide a button that displays more context about the article's source, such as the publisher's Wikipedia entry and where the article is being shared. We're expanding this to indicate if a Page has a history of sharing misinformation, as well as 'Trust Indicators,' which are publisher-provided links to a publication's fact-checking principles, code of ethics, corrections policy, ownership/funding, and editorial team.

Informing publishers with new updates to the Page Quality Tab: To better inform Page managers about our policies around repeatedly sharing misinformation, we're adding information about a Page's misinformation violations to the Page Quality tab. If a Page has repeatedly published misinformation, we demote all of that Page's content on Facebook and revoke its ability to advertise or use monetization products. The Page Quality tab tells Page admins whether they have repeatedly shared misinformation, whether they haven't, or whether they're at risk of reaching 'repeat offender' status. We plan to add the ability for Page admins to see if their Page is receiving any demotions for sharing clickbait as well.

Informing group administrators about our misinformation tools: We'll soon let group admins know if a third-party fact-checker has rated content that was posted in their groups as false. We'll demote group content in News Feed if a group repeatedly shares misinformation, much in the same way that Pages and domains get a demotion on all their content if they frequently share false news.

Finally, we're improving how we identify content that contains a claim a fact-checker has already debunked, so we can demote that too. Publishers who share this content will receive a notification that they've shared misinformation, just as the people who shared the original claim do.

Misinformation is a complex and evolving problem, and we have more work to do. We're investing heavily to get ahead because we believe in providing a space for civic discourse during elections. We'll continue to take steps to ensure this discourse is safe, authentic, and accurate.

You can read more about how we're maintaining the integrity of information on Facebook here.


Disclaimer

Facebook Inc. published this content on 25 April 2019 and is solely responsible for the information contained herein.