
Millions of under 13s will be exposed to harmful online content despite crackdown, campaigners warn

2024-06-18 HaiPress

Ofcom has vowed to review and update the children's safety measures 'as technologies and harms evolve'

Credit: skynesher/E+

Millions of children under 13 will be exposed to harmful content on social media despite a crackdown by the online watchdog, leading peers and campaigners have warned.

Eight peers, including Baroness Morgan, a former culture secretary, and Lord Bethell, a former health minister, have written to Ofcom warning that its draft children’s code fails to enforce minimum age limits on social media sites.

In their letter, the eight, including online campaigner Baroness Kidron, said this would mean millions of children under the age of 13 would continue to be exposed to products and content that even the social media platforms deemed inappropriate.

“We are bewildered at the decision to do nothing at all to protect children under 13, and at the same time give regulated companies safe harbour,” they said.

“After so many years in the making and so many promises to parents and children, the failure to use your powers in full will undermine faith in a regulatory solution altogether.”

The problem stems from Ofcom’s decision not to require the social media companies to apply “highly effective” technology when determining the age of children.

Officials say the only way to guarantee a child is 13 or over would be to ask for physical ID, such as credit cards or passports, but many teenagers do not have them. This would mean children could be denied access simply because they do not have the necessary ID.

Tools ‘inadequate’ 

However, alternative methods such as age assurance technology, where a child’s age is assessed through facial or behavioural characteristics, are not judged by Ofcom to be precise enough.

Such measures, like physical ID, will only be required of platforms such as pornography sites, where the minimum age is set at 18.

Its draft code states: “We considered whether it would be appropriate and proportionate to recommend that services that state a minimum age in their terms of service should use effective measures to enforce that provision, for instance, highly effective age assurance.

“We determined that this would not be proportionate given we have limited independent evidence that age assurance technology can correctly distinguish between children in different age groups to a highly effective standard and, given this, there is a risk that this could have serious impact on children’s ability to access services.”

Peers fear children will be exposed to online harms if more is not done to restrict access to certain sites

Credit: AIMSTOCK/E+

Ofcom estimates as many as 60 per cent of eight to 11-year-olds have social media profiles, equivalent to 1.6 million children across the UK. A third of five to seven-year-olds are said to use social media unsupervised.

Andy Burrows, of the Molly Rose Foundation, a charity set up by Ian Russell, whose 14-year-old daughter took her own life after being bombarded with suicide content, urged Ofcom to rethink its approach.

“While the Online Safety Act doesn’t specify a minimum age for social media usage, it requires platforms that do choose to set a minimum age to enforce their terms consistently,” he said.

“It’s therefore surprising that Ofcom isn’t setting out clear requirements on how to do this in its code of practice, and this raises questions about whether tech companies will be suitably incentivised to invest in age assurance technology to detect and remove under-13s.

“Enforcing minimum age limits should be a cornerstone of online safety regulation and we encourage Ofcom to reconsider its proposals.”

Ofcom said: “Our draft codes are designed to protect all children from harm, whatever their age. Under our proposals, all services which do not ban harmful content, and those at higher risk of it being shared on their service, will be expected to implement highly effective age-checks to prevent children from seeing it.

“The law also requires platforms to apply their terms of service – including any minimum age requirements – consistently. Those that fail to do so can expect to face enforcement action, including fines.

“Over time, we will continue to review and update the children’s safety measures as our evidence base, technologies and harms evolve.”

 

The next government must deliver the online safety regime that Parliament passed and the public expect

By Beeban Kidron

As someone who has been ringing the alarm bells for more than a decade about the unprecedented harm posed to children and society by unaccountable tech companies,I am disappointed to see that the technology that so profoundly shapes our lives is not playing a more central role in the general election.

There are five key issues that the next government must tackle and some fundamental themes that run through all of them.

First and foremost, online safety is a job not yet done.

The next government must deliver the online safety regime that Parliament passed and the public expect.

There is growing concern that Ofcom’s draft children’s code does not enforce against underage use. This means that millions of children under the age of 13 will continue to be exposed to products and content that even the services themselves deem inappropriate.

This is not what we were led to expect, nor will it satisfy the millions of parents who have been promised seismic change. Equally, parents will be disappointed at the decision to ignore the Act, which mandates that children of different ages and developmental stages be offered “age-appropriate services” by design.

Any child or parent can tell you a five-year-old has different needs from a 15-year-old, and yet Ofcom suggests that there is limited evidence that harm affects children of different ages differently.

‘Promises must not be broken’

Novel legislation will have teething problems, but whether through discrete amendments to the Online Safety Act, the use of Secretary of State powers, or simply support for Ofcom to create a bold regime, the next government must ensure the Online Safety Act brings forward effective codes, rather than leaving promises broken and children unprotected.

Secondly, privacy protections are fundamental to our new AI world. The next government must protect and enhance the privacy rights of UK citizens.

In the last six months we have seen an unprecedented attack on data rights, from the proposal to allow AI to monitor all benefit claimants and their connected bank accounts, to the watering down of the hard-won high bar of privacy afforded to children, which is so critical to their safety.

Thankfully, the snap election sent the Data Protection and Digital Information Bill packing. But it revealed vulnerabilities in the current system, including the failure to robustly uphold the data rights of individuals and creators against generative AI companies, and widespread abuses within the ed tech sector, which have left children less protected at school than on the bus getting there.

Thirdly, an AI Bill is required that is fit for our future.

The next government must start to build a place for democracy with the technology of the present and the future in mind.

Over the past 12 months, discussions about generative AI have taken up an enormous amount of airtime in public life. The Government took a “wait and see” approach to new laws while our European counterparts seized the moment to introduce guardrails.

Those in and out of the tech sector are clearly indicating that unfettered generative models create risks ranging from model collapse caused by an overload of synthetic content, to mass joblessness, misinformation and lethal robots with no human override. To those I would add the security of public data sets, the syphoning off of value to a handful of players, wide-scale abuse of creative content and child safety concerns.

The lobbying efforts of Silicon Valley – home to the most well-resourced companies in history – will be overwhelming, as will the legal challenges launched by the most expensive lawyers money can buy. But ultimately, as Shoshana Zuboff has written, “the digital must live in democracy’s house”. Every other trillion-pound sector is subject to regulation: the pharmaceutical, financial and automobile industries. These are not regulations for the sake of regulation.

They have been put in place to prevent harm, create sustainable businesses, and ensure that corporations are not able to outsource all their negative externalities onto the public.

An AI Bill should consider both the frontier risks and the economic and social risks of AI. It should set out high-level principles and processes by which it can (and must) be applied domain by domain.

AI has the potential to create efficiencies and advances that we want, but these need to work for us and not simply repeat the extraction and concentration of monopoly power that we have seen over the past two decades. Such a Bill needs to be broad, flexible and forward-looking to ensure the legislation can anticipate a rapidly changing environment.

Fourthly, tech literacy is a job for the government. The next government must commit to a broad, deep and intelligent literacy programme that is accessible to every UK citizen.

‘Let’s supercharge tech literacy’

From nursery through tertiary education, and in professions from health to law, business to the civil service, we have a level of tech ignorance that undermines all efforts to understand, use, legislate for or build our new world.

Meanwhile, a handful of companies with seemingly limitless resources absorb many of those with the relevant skills, creating yet another asymmetry between tech and those who use or are subject to it.

We need to supercharge tech literacy. This is not about coding or computer science, though they too should be supercharged. It is about a basic understanding of the concepts, impacts and design choices that are being made.

Finally, governance requires parliamentary oversight.

The next government must initiate parliamentary oversight of digital regulation. Many legitimate (and some hyperbolic) concerns have been raised about the democratic oversight and accountability of regulators who have been given great powers and responsibilities under new legislation, and who are the intense focus of tech lobbying.

The next government must demonstrate the political will to support robust enforcement by regulators, and heed the cross-party call to establish a joint committee in Parliament with a dual remit: to ensure that regulators are using their powers so that the intent and spirit of the regulation is followed, and to guard against both reticence and overreach.

It is only when the tech sector is subject to democratic oversight and responsive to the needs of the population that it can be a force for good and a symbol of progress.

Baroness Kidron is chairman of 5Rights, a charity that campaigns for online safety.
