The report underlines the challenge that governments worldwide and tech groups such as Meta (owner of Facebook, Instagram and WhatsApp), Google's YouTube, Snap Inc's Snapchat and ByteDance's TikTok face in enacting safeguarding measures, especially for minors.

Britain passed legislation last October that set tougher rules for social media platforms, including a mandate for them to prevent children from accessing harmful and age-inappropriate content by enforcing age limits and age-checking measures.

The law gave Ofcom the power to fine tech companies that fail to comply with the new requirements, but the penalties have not yet come into force because the regulator must first produce codes of practice to implement them.

Messaging platforms led by WhatsApp have opposed a provision in the law that they say could force them to break end-to-end encryption.

All of the 247 children aged 8 to 17 interviewed for the report - which was commissioned by Ofcom and carried out between May and November - came across violent content online, mostly via social media, video-sharing and messaging sites and apps, Ofcom said.

In a statement, Ofcom said the report by research agency Family Kids & Youth found that violent gaming content, verbal discrimination and footage of street fights were commonly encountered by the children.

Many children said they felt they had no control over the content suggested to them and reported only a limited understanding of recommender systems - which use data to predict someone's preferred content. The children referred to these systems as "the algorithm," the report said.

"Today's research sends a powerful message to tech firms that now is the time to act so they're ready to meet their child protection duties under new online safety laws," Gill Whitehead, Ofcom's Online Safety Group Director, said.

She said Ofcom would consult on how it expects the industry to ensure children have an age-appropriate, safer online experience.

(Reporting by Muvija M; editing by William James and Mark Heinrich)