That was a tough sell to Mr. Kaplan, said people who heard him discuss the Common Ground and Integrity proposals. A former deputy chief of staff to George W. Bush, Mr. Kaplan became more involved in content-ranking decisions after 2016 allegations that Facebook had suppressed trending news stories from conservative outlets. An internal review didn't substantiate the claims of bias, Facebook's then-general counsel Colin Stretch told Congress, but the damage to Facebook's reputation among conservatives had been done.

Every significant new integrity-ranking initiative had to seek the approval of not just engineering managers but also representatives of the public policy, legal, marketing and public-relations departments.

Lindsey Shepard, a former Facebook product-marketing director who helped set up the Eat Your Veggies process, said it arose from what she believed were reasonable concerns that overzealous engineers might let their politics influence the platform.

"Engineers that were used to having autonomy maybe over-rotated a bit" after the 2016 election to address Facebook's perceived flaws, she said. The meetings helped keep that in check. "At the end of the day, if we didn't reach consensus, we'd frame up the different points of view, and then they'd be raised up to Mark."

Scuttled projects

Disapproval from Mr. Kaplan's team or Facebook's communications department could scuttle a project, said people familiar with the effort. Negative policy-team reviews killed efforts to build a classification system for hyperpolarized content. Likewise, the Eat Your Veggies process shut down efforts to suppress political clickbait more aggressively than clickbait on other topics.

Initiatives that survived were often weakened. Mr. Cox wooed Carlos Gomez Uribe, former head of Netflix Inc.'s recommendation system, to lead the newsfeed Integrity Team in January 2017. Within a few months, Mr. Uribe began pushing to reduce the outsize impact of hyperactive users.

Under Facebook's engagement-based metrics, a user who likes, shares or comments on 1,500 pieces of content has more influence on the platform and its algorithms than one who interacts with just 15 posts, allowing "super-sharers" to drown out less-active users. Accounts with hyperactive engagement were far more partisan on average than normal Facebook users, and they were more likely to behave suspiciously, sometimes appearing on the platform as much as 20 hours a day and engaging in spam-like behavior. The behavior suggested some were either people working in shifts or bots.

One proposal Mr. Uribe's team championed, called "Sparing Sharing," would have reduced the spread of content disproportionately favored by hyperactive users, according to people familiar with it. Its effects would be heaviest on content favored by users on the far right and left. Middle-of-the-road users would gain influence.
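The mechanics described above can be illustrated with a small, entirely hypothetical sketch — not Facebook's actual code, and the square-root discount and function names are invented for illustration. Under flat engagement counting, an account with 1,500 interactions carries 100 times the influence of one with 15; a "Sparing Sharing"-style change would discount interactions beyond a typical activity level so hyperactive accounts contribute with diminishing weight:

```python
def raw_influence(interactions: int) -> float:
    """Baseline engagement counting: every like, share or comment
    counts equally, so hyperactive accounts dominate."""
    return float(interactions)


def discounted_influence(interactions: int, typical: int = 15) -> float:
    """Hypothetical 'Sparing Sharing'-style discount: interactions up to
    a typical activity level count in full; interactions beyond it
    contribute with a square-root falloff (an assumed curve, chosen
    only to show diminishing weight)."""
    if interactions <= typical:
        return float(interactions)
    return typical + (interactions - typical) ** 0.5


super_sharer = 1500  # account interacting with 1,500 pieces of content
normal_user = 15     # account interacting with 15 posts

# Baseline: the super-sharer has 100x the influence of the normal user.
print(raw_influence(super_sharer) / raw_influence(normal_user))  # 100.0

# Discounted: the gap shrinks to a small multiple.
print(discounted_influence(super_sharer) / discounted_influence(normal_user))
```

Per the article, the version Mr. Zuckerberg approved cut the proposal's weighting by 80%, which in a scheme like this would mean blending only a fifth of the discount's adjustment back into the baseline score.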

Mr. Uribe called it "the happy face," said some of the people. Facebook's data scientists believed it could bolster the platform's defenses against spam and coordinated manipulation efforts of the sort Russia undertook during the 2016 election.

Mr. Kaplan and other senior Facebook executives pushed back on the grounds it might harm a hypothetical Girl Scout troop, said people familiar with his comments. Suppose, Mr. Kaplan asked them, that the girls became Facebook super-sharers to promote cookies? Mitigating the reach of the platform's most dedicated users would unfairly thwart them, he said.

Mr. Kaplan in the recent interview said he didn't remember raising the Girl Scout example but was concerned about the effect on publishers who happened to have enthusiastic followings.

The debate got kicked up to Mr. Zuckerberg, who heard out both sides in a short meeting, said people briefed on it. His response: Do it, but cut the weighting by 80%. Mr. Zuckerberg also signaled he was losing interest in the effort to recalibrate the platform in the name of social good, they said, asking that they not bring him something like that again.

Mr. Uribe left Facebook and the tech industry within the year. He declined to discuss his work at Facebook in detail but confirmed his advocacy for the Sparing Sharing proposal. He said he left Facebook because of his frustration with company executives and their narrow focus on how integrity changes would affect American politics. While proposals like his did disproportionately affect conservatives in the U.S., he said, in other countries the opposite was true.

Other projects met Sparing Sharing's fate: weakened, not killed. Partial victories included efforts to promote news stories garnering engagement from a broad user base, not just partisans, and penalties for publishers that repeatedly shared false news or directed users to ad-choked pages.

The tug of war was resolved in part by the growing furor over the Cambridge Analytica scandal. In a September 2018 reorganization of Facebook's newsfeed team, managers told employees the company's priorities were shifting "away from societal good to individual value," said people present for the discussion. If users wanted to routinely view or post hostile content about groups they didn't like, Facebook wouldn't suppress it if the content didn't specifically violate the company's rules.

Mr. Cox left the company several months later after disagreements over Facebook's pivot toward private encrypted messaging. He hadn't won most of the fights he had engaged in over integrity ranking and the Common Ground product changes, people involved in the effort said, and his departure left the remaining staffers working on such projects without a high-level advocate.

The Common Ground team disbanded. The Integrity Teams still exist, though many senior staffers left the company or headed to Facebook's Instagram platform.

Mr. Zuckerberg announced in 2019 that Facebook would take down content that violated its specific standards but, where possible, take a hands-off approach to policing material that didn't clearly violate them.

"You can't impose tolerance top-down," he said in an October speech at Georgetown University. "It has to come from people opening up, sharing experiences, and developing a shared story for society that we all feel we're a part of. That's how we make progress together."

Write to Jeff Horwitz at Jeff.Horwitz@wsj.com and Deepa Seetharaman at Deepa.Seetharaman@wsj.com