The final version of rules that the regulator says will offer children in the UK “transformational new protections” online has been published.

Sites will have to change the algorithms that recommend content to young people and introduce beefed-up age checks by 25 July or face big fines.

Platforms that host pornography, or offer content encouraging self-harm, suicide or eating disorders, are among those that must take more robust action to prevent children accessing their content.

Ofcom boss Dame Melanie Dawes said it was a “gamechanger”, but critics say the restrictions do not go far enough and are “a bitter pill to swallow”.

Ian Russell, chairman of the Molly Rose Foundation, which was set up in memory of his daughter, who took her own life aged 14, said he was “dismayed by the lack of ambition” in the codes.

But Dame Melanie told BBC Radio 4’s Today programme that age checks were a first step as “unless you know where children are, you can’t give them a different experience to adults.

“There is never anything on the internet or in real life that is fool proof… [but] this represents a gamechanger.”

She admitted that while she was “under no illusions” that some companies “simply either don’t get it or don’t want to”, the Codes were UK law.

“If they want to serve the British public and if they want the privilege in particular in offering their services to under 18s, then they are going to need to change the way those services operate.”

Prof Victoria Baines, a former safety officer at Facebook, told the BBC it is “a step in the right direction”.

Talking to the Today programme, she said: “Big tech companies are really getting to grips with it, so they are putting money behind it, and more importantly they’re putting people behind it.”

Under the Codes, algorithms must also be configured to filter out harmful content from children’s feeds and recommendations.

As well as the age checks, there will also be more streamlined reporting and complaints systems, and platforms will be required to take faster action in assessing and tackling harmful content when they are made aware of it.

All platforms must also have a “named person accountable for children’s safety”, and the management of risk to children should be reviewed annually by a senior body.

If companies fail to abide by the regulations by 24 July, Ofcom said it has “the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK”.

The new rules are subject to parliamentary approval under the Online Safety Act.
