By Aaron Tilley

Zoom Video Communications Inc.'s rise as a lifeline for many businesses and individuals during the pandemic is drawing the company into a fraught area that has caused problems at much larger tech firms: policing its service.

Zoom has been criticized by some lawmakers and users in recent months after it blocked public events planned on its service around politically sensitive topics. At times, those it blocked, as well as their allies, have accused the company of censorship, echoing charges some lawmakers last week leveled at the chief executives of Facebook Inc., Twitter Inc. and Google-parent Alphabet Inc.

Zoom was founded in 2011 largely to provide video-communications services to businesses. But when Covid-19 struck, Zoom won popular adoption, including by many academic institutions. Many of those users have relied on a free service the company has offered. Corporate customers still account for the bulk of its revenue.

Zoom said users may not use its service to break the law, promote violence, display nudity or commit other infractions. The policies are similar to, though not as sweeping as, those of social-media companies, which also target posts where groups masquerade as others. The videoconferencing app's rules have evolved this year, and the company has said it wasn't well prepared to handle politically sensitive issues when its use took off during the pandemic.

In cases where Zoom has blocked a public event, the company has said it acted once it became aware of a virtual gathering that would transgress its rules or local laws. "Zoom does not monitor meeting content," a company spokeswoman said.

Zoom in September blocked the use of its service for a webinar at San Francisco State University. The meeting was due to feature Leila Khaled, a member of the Popular Front for the Liberation of Palestine, which the U.S. government has designated a terrorist organization.

The Lawfare Project, a pro-Israel group, said it lobbied Zoom to block the meeting. Zoom, which didn't identify who alerted it to the meeting, said it was blocked because Ms. Khaled's affiliation with a group the U.S. has designated a terrorist organization violated its service terms. Facebook and Alphabet's YouTube also removed the meeting, saying it violated their policies around supporting people or organizations considered dangerous.

Zoom also blocked a series of follow-up webinars in October organized in part by a pro-Palestinian group, in conjunction with staff at several U.S. and overseas universities, to address what they said was censorship by the company. Zoom's action was earlier reported by BuzzFeed News. Zoom again said the meetings violated its rules because organizers said Ms. Khaled was scheduled to appear. Meetings about Zoom's alleged censorship where she wasn't due to appear were allowed to take place, the company said.

"It's an issue of concern to everyone who is in higher education who is more or less dependent on Zoom," said Andrew Ross, a New York University professor of social and cultural analysis and moderator of the event.

It wasn't Zoom's first exposure to the minefield of international politics. In June, it responded to a request from the Chinese government and blocked the accounts of activists involved in a videoconference on the 1989 Tiananmen Square massacre. Zoom took action not just against accounts of participants in China, but also against accounts overseas, including in the U.S., saying it was complying with Chinese law. It later restored the accounts outside China, saying they had been suspended by mistake, and promised to do better in the future.

Tech companies increasingly are caught between those who say they need to take more responsibility for potentially harmful or controversial content people disseminate using their services, and others who accuse the companies of censoring certain viewpoints. But the battle has previously focused on social-media platforms, rather than a communications tool like Zoom.

"The more visible Zoom is, the more pressure on them to be content police," said Daphne Keller, a former Google lawyer and now a program director at Stanford University's Cyber Policy Center. "Content moderation is a big thankless job," she said. "You're constantly doing a job that users disagree with, or half the public disagrees with, and you get yelled at by Democrats and Republicans."

Zoom says it is looking to better handle content issues, but it is relatively small, with around 3,000 employees in total. Twitter, by comparison, has more than 5,000 employees, and Facebook has around 15,000 people dedicated just to content moderation. Both have years of experience dealing with such dicey issues and have evolved their policies.

Aparna Bawa, Zoom's chief operating officer, said it is adding staff to its trust and safety team to deal with sensitive issues. "We've taken significant steps," she said at The Wall Street Journal Tech Live conference last month.

The China incident, she said, sparked soul-searching at Zoom. She said the company aims to "balance both our obligations in local jurisdictions and our own principles for the free and open exchange of ideas."

Zoom says it is becoming more careful about which events to block or allow. The Council on Foreign Relations in September held a virtual meeting with Iran's foreign minister. The minister was sanctioned by the U.S. last year, so the meeting would have violated Zoom's rules. Zoom allowed the meeting to take place after the think tank showed it had approval from the U.S. government for a prior meeting with the minister.

Zoom is having to adapt in other areas, too, because of its unexpected mass popularity. Early in the pandemic, Zoom struggled with a surge of "Zoombombing," in which people gain unauthorized access to a meeting and share hate speech or pornographic images, a scourge that spread when users didn't properly lock down their sessions. Zoom adjusted its software, changing default settings and introducing virtual waiting rooms, to prevent outsiders from gaining access to meetings, though instances of Zoombombing still happen.

Zoom's efforts to moderate content have also made the company a target of criticism from U.S. lawmakers. Sen. Marco Rubio of Florida, along with other senators, asked Zoom to explain its actions surrounding the takedown of the Tiananmen Square-themed meeting. In a letter, Josh Kallmer, the company's government-relations head, said Zoom wasn't prepared to handle politically sensitive issues. The company would be more careful in handling government requests for account removals, he said, and would reject Chinese censorship requests concerning meetings outside China.

"We believe that our presence -- and the option for citizens to use a service like Zoom -- in countries like China and others with restrictive laws is creating more room for discourse and open communication to happen," Mr. Kallmer told senators in a letter viewed by the Journal.

Write to Aaron Tilley at aaron.tilley@wsj.com

(END) Dow Jones Newswires

11-03-20 0814ET