Ofcom cracks on with online safety role, kids first

Mary Lennighan

November 9, 2023

Ofcom has made a flying start as the UK’s official online safety regulator, launching its first consultation and focusing on one of the least controversial elements of its role – the protection of children.

The UK telecoms regulator was granted new powers under the Online Safety Act, which became law a fortnight ago. Technically, it now has the power to impose hefty fines and other sanctions on companies – big tech firms, social media platforms and so forth – that fail to police potentially harmful content on their sites.

But before it gets to that stage, there’s a lot of work to be done. Ofcom is responsible for publishing a series of consultations on various elements of the law as part of a phased approach to implementing it in full. The first, a consultation on protecting people from illegal harms online, launched on Thursday, just as Ofcom said it would.

As part of that consultation Ofcom has released draft codes and guidelines, which will form the basis of the illegal harms codes due to come into force at the back end of next year. As you might expect, there are myriad documents to wade through, the most useful being Ofcom’s own summary of its proposals – an admittedly still pretty lengthy précis.

Ofcom is calling on the industry and a range of experts to comment on the documents to help it develop the final version, which it intends to publish in autumn 2024. The services in question will then have three months to conduct their own risk assessments while the codes go through the parliamentary approval process, it explained.

Once the codes come into force, “we can begin enforcing the regime,” Ofcom said. Essentially, Ofcom will be able to flex its new online safety regulatory muscle from around the beginning of 2025.

Meanwhile, Ofcom will also get on with phase two of the process, categorised as ‘child safety duties and pornography,’ launching a separate consultation before the end of this year. Specifically, it says it will propose guidance on how adult sites should comply with their duty to ensure children cannot access pornographic content. A further consultation, on additional protections for children from content promoting suicide, self-harm, eating disorders, cyberbullying and suchlike, will follow in spring 2024.

Presumably with that in mind – and doubtless because the child protection angle of the new law is good PR, even while its censorship and privacy elements remain divisive – Ofcom shared the results of new research into the scale and nature of children’s online experiences.

Headline findings include the fact that three in five children aged 11-18 have been contacted online in a way that made them feel uncomfortable; 30% have received an unwanted friend or follow request; and 16% have been sent naked or semi-naked pictures, or have been asked to share such images of themselves.

Ofcom’s draft rules propose that what it terms larger and higher-risk services, presumably the likes of Instagram, TikTok, Snapchat and so on, should by default adhere to a set of guidelines on the way children can be contacted. They should not be presented with a list of suggested friends or be visible in other users’ connections lists, for example, nor should they be contactable via direct message by an account outside their own connection list, or have visible location information.

There are also suggestions for how big tech platforms could tackle fraud and terrorism – the use of automatic keyword detection and transparency on account verification, for example – and, more broadly, a core list of measures that services can adopt to mitigate the risk of all types of illegal harm. The list covers the naming of an accountable person, easy reporting and blocking, and safety tests for recommender algorithms.

“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular,” said Ofcom CEO Dame Melanie Dawes, in a statement.

Regulators are often accused of dragging their feet, but in this case it’s fair to say Ofcom is moving as quickly as it can, even if the Online Safety Act itself was an eternity in the making. It’s almost a year since the regulator appointed Gill Whitehead as Group Director, Online Safety, so it has had plenty of time to prepare its research and its draft documents.

Let’s see how closely it sticks to that three-phase timetable and, more importantly, how effective it proves to be in holding the big tech platforms to account. We’ll have to wait another year or so to get a good view on that.

About the Author

Mary Lennighan

Mary has been following developments in the telecoms industry for more than 20 years. She is currently a freelance journalist, having stepped down as editor of Total Telecom in late 2017; her career history also includes three years at CIT Publications (now part of TeleGeography) and a stint at Reuters. Mary's key area of focus is the business of telecoms, looking at operator strategy and financial performance, as well as regulatory developments, spectrum allocation and the like. She holds a Bachelor's degree in modern languages and an MA in Italian language and literature.
