UK’s Ofcom urges tech firms to stop pushing harmful online content to kids

Regulator also calls for strong age verification to safeguard children while online

12:45 PM MYT


KUALA LUMPUR – Ofcom, the UK regulator for online safety, has urged technology companies to stop their algorithms from recommending harmful content to children and inundating them with such material, and has called for strong age verification measures to protect young users.

To that end, the regulator has put forward proposals outlining actions that social media platforms and other online services must implement to improve children’s safety online.

Ofcom said protecting children, so they can enjoy the benefits of being online without experiencing the potentially serious harms that exist in the online world, is a priority for the regulator.

According to the country’s Online Safety Act, social media apps, searches and other online services must prevent children from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. 

They must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material; bullying content; and content promoting dangerous challenges.

Ofcom also said that online services must establish whether children were likely to access their site, or part of it. 

“Also, if children are likely to access it, the company must carry out a further assessment to identify the risks their service poses to children, including the risks that come from the design of their services, their functionalities and their algorithms. 

“They then need to introduce various safety measures to mitigate these risks,” it added.

It said that these were among the 40 safety measures proposed, aimed at making sure children enjoy safer screen time when they are online. 

“Others include robust age checks; our draft codes expect services to know which of their users are children in order to protect them from harmful content.

“Safer algorithms are also important. According to our proposals, any service employing systems that recommend personalised content to users, particularly those at high risk of harmful content, must design their algorithms to filter out the most harmful content from children’s feeds. Additionally, they should downrank other harmful content,” it added.

It said effective moderation was also important: all services, including social media apps and search services, must have content moderation systems and processes to take quick action on harmful content, and large search services should use a “safe search” setting for children.

This cannot be turned off and must filter out the most harmful content, it added.

Further, Ofcom also proposed stronger senior accountability and support for children and parents.

“Our draft codes also include measures to ensure strong governance and accountability for children’s safety within tech firms. 

“These include having a named person accountable for compliance with the children’s safety duties; an annual senior-body review of all risk management activities relating to children’s safety; and an employee Code of Conduct that sets standards for employees around protecting children,” it added. – May 8, 2024
