UK’s Ofcom urges tech firms to stop pushing harmful online content to kids

Regulator also calls for strong age verification to safeguard children while online

12:45 PM MYT


KUALA LUMPUR – Ofcom, the UK’s online safety regulator, has urged technology companies to stop their algorithms from recommending harmful content to children and inundating them with such material, and has called for strong age verification measures to keep children safe.

In response to this, it has put forward proposals outlining actions that social media platforms and other online services must implement to improve children’s safety while online.

Protecting children so they can enjoy the benefits of being online without experiencing the potentially serious harms that exist in the online world is a priority for Ofcom, the regulator said.

Under the country’s Online Safety Act, social media apps, search services and other online services must prevent children from encountering the most harmful content, relating to suicide, self-harm, eating disorders and pornography.

They must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material; bullying content; and content promoting dangerous challenges.

Ofcom also said that online services must establish whether children were likely to access their site, or part of it. 

“Also, if children are likely to access it, the company must carry out a further assessment to identify the risks their service poses to children, including the risks that come from the design of their services, their functionalities and their algorithms. 

“They then need to introduce various safety measures to mitigate these risks,” it added.

It said that these were among the 40 safety measures proposed, aimed at making sure children enjoy safer screen time when they are online. 

“Others include robust age checks; our draft codes expect services to know which of their users are children in order to protect them from harmful content.

“Safer algorithms are also important. According to our proposals, any service employing systems that recommend personalised content to users, particularly those at high risk of harmful content, must design their algorithms to filter out the most harmful content from children’s feeds. Additionally, they should downrank other harmful content,” it added.

It said effective moderation was also important: all services, including social media apps and search services, must have content moderation systems and processes in place to act quickly on harmful content, and large search services should apply a “safe search” setting for children.

This cannot be turned off and must filter out the most harmful content, it added.

Further, Ofcom also proposed stronger senior accountability and support for children and parents.

“Our draft codes also include measures to ensure strong governance and accountability for children’s safety within tech firms. 

“These include having a named person accountable for compliance with the children’s safety duties; an annual senior-body review of all risk management activities relating to children’s safety; and an employee Code of Conduct that sets standards for employees around protecting children,” it added. – May 8, 2024
