By Lukas I. Alpert and Benjamin Mullin
Facebook Inc. is considering changing how it prioritizes news stories in users' feeds to give better placement to media outlets deemed more trustworthy, as the company continues efforts to limit the exposure of false news, people familiar with the matter said.
Under its new approach, Facebook would evaluate parameters such as public polling about news outlets and whether readers are willing to pay for news from particular publishers. Such variables would inform the algorithm that determines which publishers' posts are pushed higher in the feed, one of the people said.
Such a move would thrust Facebook into an even more active role in deciding what content is acceptable on its site. The company hasn't decided whether to proceed with the shift, and it may choose not to do so.
It is also considering other changes to the news feed, some of which it could announce as early as Thursday. The company plans to change how it handles video, by giving priority to those that users engage with and playing down those that generate views by automatically playing when seen in a person's feed.
Facebook is also looking to double down on a strategy of promoting content shared by friends and family over posts published by news outlets, other people familiar with the situation said. One of the goals would be to encourage more conversation among Facebook users. There is some consternation among publishers that this strategy could lead to a significant reduction in traffic referred from Facebook over time.
Digiday earlier reported some details of Facebook's plans.
The shift in news prioritization would be Facebook's most aggressive attempt so far to rein in false news. Chief Executive Mark Zuckerberg has sought to keep Facebook from taking on editorial responsibilities, saying repeatedly that Facebook wants to minimize the spread of false information on its platform without becoming the "arbiters of truth."
The company already has joined with outside fact-checkers like PolitiFact and Snopes to mark completely false stories, lowering their prominence in the news feed. And it has launched features such as "related articles" that push readers to think twice before sharing a story.
But up to now the social network has been reluctant to make editorial decisions about the quality or veracity of what is posted on its platform. Some critics have said the company, as the most powerful distributor of media content on the web, has a duty to police its feed and work hard to weed out misinformation.
Some publishers, however, have said they would be uneasy ceding editorial control to Facebook.
It is unclear how the new ranking system would affect publishers' reach; the effect would likely vary from outlet to outlet. Some publishers fear that promoting community posts over news could significantly eat into the traffic they receive via Facebook.
Facebook has adjusted its algorithm on multiple occasions over the years to weed out content it believed was cluttering users' feeds, such as "clickbait" stories. Dealing with "fake news" -- stories that are outright hoaxes or conspiracy theories, or that include demonstrably false information -- has proved trickier.
Criticism of Facebook's approach increased after the 2016 U.S. election cycle, during which hundreds of Russian-linked accounts spread disinformation, placed politically themed advertisements and even tried to organize partisan events amid the highly charged political atmosphere.
Late last year, executives from Facebook, Twitter Inc. and Google were called before Congress to explain how their platforms had been used to spread misinformation and influence political events during the election cycle.
Facebook conceded at the hearing that an estimated 126 million users were exposed to posts made and promoted by Russian-backed actors during the last presidential election cycle. The company admitted it had struggled at times to determine who was behind such activity but said it has worked to better identify suspicious players.
Write to Lukas I. Alpert at firstname.lastname@example.org