By Asa Fitch

Amazon.com Inc. said it is halting law-enforcement use of its facial-recognition software, adding its voice to a growing chorus of companies, lawmakers and civil rights advocates calling for greater regulation of the surveillance technology amid widespread concern about its potential for racial bias.

Facial-recognition technology has long been criticized for perceived bias, with studies showing most algorithms are more prone to misidentifying African-Americans' and other minorities' faces than Caucasians'. Those concerns have intensified as police tactics and law enforcement's use of technology have come under intense scrutiny amid the wave of nationwide protests triggered by the killing of George Floyd, a black man who died in police custody.

For tech companies, the protests and expressions of support for the Black Lives Matter movement are generating fresh introspection over the role their products and services play in society.

Facebook Inc. Chief Executive Mark Zuckerberg last week said his company would review how it treats discussions of police use of force and would seek to involve more diverse sets of people in management decisions. Amazon CEO Jeff Bezos has called criticism of his support for the Black Lives Matter movement "sickening."

The retailing giant said it has been advocating for strong government regulation of the use of facial-recognition technology and that Congress appears ready to take on that challenge. "We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested," Amazon said in a blog post Wednesday. The company wouldn't elaborate on its actions.

A police reform bill House Democrats introduced Monday would prohibit federal law enforcement's use of real-time facial recognition.

Some activists said the provision didn't go far enough. Neema Guliani, senior legislative counsel at the American Civil Liberties Union, said legislation should prohibit all use of facial recognition on police body-camera footage and called for federal funding restrictions on local law-enforcement agencies that don't impose the same limits.

Amazon has sold its Rekognition face-recognition software widely, including to police departments and other U.S. law-enforcement agencies. The company said it would continue to allow the use of its tools by organizations that deploy facial recognition to combat human trafficking and find missing children.

International Business Machines Corp. earlier this week said it had stopped developing and offering facial-recognition software following long-running criticism that such technologies are less accurate on African-Americans' and other minorities' faces than on Caucasians'.

IBM's new chief executive, Arvind Krishna, notified Congress of the decision in a letter Monday, in which he also advocated for police reform and expanding educational support for people of color following mass demonstrations against what activists and policy makers have denounced as institutionalized racism in American police forces.

"We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies," Mr. Krishna wrote in the letter, addressed to five Democratic members of Congress, four of them African-Americans.

"Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe," he said. "But vendors and users of Al systems have a shared responsibility to ensure that Al is tested for bias, particularity when used in law enforcement, and that such bias testing is audited and reported."

IBM had been working on facial-recognition products for years before its decision this year to back away from the technology. The company reacted to growing bias concerns by releasing a set of training images last year for facial-recognition algorithms designed to make them more accurate. It also called for the technology to be regulated but not banned outright.

Facial recognition's accuracy has improved dramatically in recent years, due to advancements in algorithms and ever-faster computers. But with the growing utility of the technology -- which has become ubiquitous as a means of unlocking smartphones and making digital payments -- academic researchers and activists have raised alarms about embedded racial and gender biases.

In a landmark study two years ago, Massachusetts Institute of Technology researchers found significant differences in the accuracy of facial-recognition systems, including IBM's and Microsoft Corp.'s, depending on sex and skin tone. Later academic work backed up the existence of biases, including a major government-funded study in December. The software also has been shown to be less accurate for women.

Tech companies have carved out different stances on how to handle their facial-recognition tools. Amazon, which has defended its Rekognition product against charges of bias, had continued to offer it to law enforcement.

Microsoft, another major facial-recognition vendor, has said it is handling the technology cautiously. Its president, Brad Smith, said last year that the company had declined to sell it to a police department in California over concerns it would be used for mass surveillance.

Companies across the U.S. are scrambling to react to public sentiment spurred by Mr. Floyd's death. ViacomCBS Inc.'s Paramount Network has canceled the long-running TV show "Cops." Nascar on Wednesday banned Confederate flags at its events. Greg Glassman, the founder and chief executive of CrossFit Inc., said he had decided to retire after his inflammatory remarks about the killing of Mr. Floyd prompted sponsors and gym owners to cut ties with his company.

As recent Black Lives Matter protests spread, employees and activists have assailed big tech companies' dealings with law enforcement, demanding in some cases that they cut ties. While Amazon has come out in support of police reform, a group of activists this week launched a petition demanding that the company sever all ties with police departments and U.S. Immigration and Customs Enforcement.

"Tweets and token donations mean nothing coming from a company that openly colludes with the agents and institutions of systemic racism and anti-Blackness," said Myaisha Hayes, campaign strategies director at MediaJustice, one of the groups backing the petition. "Amazon needs to examine its structural role in the systemic oppression of Black people."

Some Amazon shareholders had urged the company to commission an independent report on whether its surveillance products were contributing to human-rights violations. "Amazon partners with over 600 police departments, providing police with access to Ring doorbell video surveillance data," the shareholder proposal calling for the review said. The proposal didn't pass at Amazon's annual meeting in May.

As tech companies debate the uses of facial-recognition tools, many cities across the country have banned their use by government departments, including police. San Francisco, home to many of tech's biggest names, became the first American city to pass such a ban, in May 2019.

--Sebastian Herrera contributed to this article.

Write to Asa Fitch at asa.fitch@wsj.com