Social media firms must ‘tame toxic algorithms’ under new measures to protect children

UK technology firms have been told to ‘tame toxic algorithms’ and implement practical steps to protect children online.

It all comes under Ofcom’s new measures titled ‘Child Protection Code of Practice,’ which social media sites, apps and search engines will have to follow.

Ofcom is the government-approved regulatory and competition authority for the broadcasting, telecommunications and postal industries within the United Kingdom.

One of the first elements listed, age checks, requires "much greater use of highly effective age-assurance". Anything that promotes suicide, self-harm, eating disorders or pornography is classified as harmful content.

Dangerous challenges, harmful substances, incitement of hatred against people with certain characteristics, instructions for acts of serious violence, and actual or serious violence against people or animals are also classified as harmful under the UK Online Safety Act.

This should impact all services that do not currently ban harmful content, as they will now be expected to enforce age checks to prevent children from viewing it.

Dame Melanie Dawes, chief executive of Ofcom, said the code goes "far beyond existing industry standards" and aims to "bring about a step-change in online safety for children in the UK".

“We want kids to enjoy life online. But for too long, their experiences have been seriously affected by harmful content they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. This has to change.”

The regulatory watchdog added that it "will not hesitate to use the full range of our enforcement powers to hold platforms to account."

The draft also includes measures to ensure stronger accountability within technology firms for the protection of children, in particular by making a named person accountable for compliance with child protection duties.

OnlyFans is being investigated by Ofcom

The measures have been published just days after Ofcom announced, on 1 May, that it had launched an investigation into OnlyFans.

The regulator is looking at whether the company is doing enough to prevent children from accessing pornography on its site.

While OnlyFans does have age-assurance measures in place, Ofcom says it has "reviewed the submissions we received" and that "there are grounds to suspect that the platform has not implemented its age verification measures in a way that ensures that people under 18" are adequately protected from pornographic material.

Updates on the investigation are expected in due course.
