By Georgia Wells, Jeff Horwitz and Deepa Seetharaman

Facebook Inc., Twitter Inc. and YouTube have limited posts from President Trump, and Twitter has temporarily locked his account, as the social-media platforms worked to tamp down content that could further fuel protests in the U.S. capital.

The companies on Wednesday also limited other posts that they deemed to be inciting violence or undermining the electoral process, in what amounted to the industry's strongest actions to date to rein in controversial content on their platforms.

Twitter first said that it limited the ability of users to share content including a video about the protests from President Trump due to a "risk of violence," and said it was taking other measures to slow the spread of potentially dangerous content. Twitter later said it had required the removal of several of Mr. Trump's posts, including the one with the video, saying that they represented "repeated and severe violations" of its policies. The company said it had locked his account for 12 hours, and warned that further violations of its rules could lead it to permanently suspend Mr. Trump's account.

Facebook and YouTube also took down Mr. Trump's video entirely on Wednesday afternoon, and Facebook removed a subsequent post from the president.

"This is an emergency situation and we are taking appropriate emergency measures, including removing President Trump's video," Guy Rosen, Facebook's vice president of integrity, wrote on Twitter. "We removed it because on balance we believe it contributes to rather than diminishes the risk of ongoing violence."

The takedown occurred after the video had been viewed 2.8 million times on Facebook, according to CrowdTangle, a Facebook-owned social-media analytics tool.

The actions late Wednesday, some of which occurred within minutes of each other, escalated the social-media companies' efforts from November, when they began grappling with false claims about the election as Mr. Trump made premature declarations of victory and claimed there was a plot to steal votes.

The companies have faced pressure from two directions: from some people, largely on the right, who have labeled their efforts to limit such content censorship that is disproportionately applied to conservatives, and from others -- including many on the left -- who have demanded the platforms take more aggressive action against such distortions.

Many of those who have called for more intervention said that Wednesday's unrest in Washington, D.C., in which pro-Trump rioters stormed the U.S. Capitol, demonstrated that the platforms' efforts have fallen short.

"These events were planned, propagated, and set into motion on social media," said John Redgrave, chief executive of Sentropy Technologies Inc., which sells tools for online communities to protect against abusive and malicious content. "No platform can stand back and wash their hands of it because the environment that has led to this grew directly from the fertile soil of extreme, unmitigated discourse."

U.S. election officials have characterized the November election as the most secure in U.S. history. Still, the false statements from Mr. Trump and some of his supporters unleashed weeks of viral claims about attempted election theft and posts that encouraged violence.

Beyond taking down praise for the storming of the Capitol and some of President Trump's posts, Facebook announced a raft of emergency measures to tame discourse on the platform.

More group moderators will be required to preapprove posts, and Facebook says it will automatically disable comments on posts that "start to have a high rate of hate speech or content that incites violence." The company also adjusted its recommendation systems to automatically suppress content deemed likely to violate its policies.

"The violent protests in the Capitol today are a disgrace," a spokesperson for Facebook said in a statement.

Twitter on Wednesday reminded users in a series of messages that calls to violence are against its rules, and said it has been significantly restricting the engagement of tweets that threaten the integrity of U.S. institutions.

In the video Mr. Trump posted on social media, he described the rioters as very special, called the election stolen and fraudulent and said he understands how the protesters feel. Mr. Trump also encouraged protesters to go home.

"This claim of election fraud is disputed, and this Tweet can't be replied to, Retweeted or liked due to a risk of violence," Twitter said in a note appended to Mr. Trump's video before the company removed it.

YouTube, a unit of Alphabet Inc.'s Google, removed a video posted to Mr. Trump's channel that violated its policies against alleging that widespread fraud changed the outcome of the 2020 U.S. election. A spokesman for YouTube said it would allow copies of the video if they were uploaded with additional context.

Researchers who focus on social media on Wednesday were trying to piece together where the calls for violence originated and how they spread, but said it was clear that at least some of the activity took place on Facebook.

Claire Wardle, co-founder and U.S. director of First Draft News, a nonprofit dedicated to fighting disinformation, said Facebook has been too slow to respond.

"Facebook just does not have the moderation tools to deal with this stuff at scale and when it matters," Dr. Wardle said. "A lot of this stuff has been tolerated for years and years -- and it's not going to stop today."

Write to Georgia Wells at Georgia.Wells@wsj.com, Jeff Horwitz at Jeff.Horwitz@wsj.com and Deepa Seetharaman at Deepa.Seetharaman@wsj.com

(END) Dow Jones Newswires

01-06-21 2011ET