Online Safety Act becomes law


On 26 October 2023 the Online Safety Act (‘the Act’) received Royal Assent, enacting rules designed, in the UK government’s words, to make the UK ‘the safest place in the world to be online’.

The Act requires social media companies and providers of search engines to keep the internet safe for children and to give adults more choice over what they see online, placing legal responsibility on tech companies to prevent and rapidly remove illegal content, for example terrorism-related material and revenge pornography. They will also have to stop children seeing material that is harmful to them, such as bullying, content promoting self-harm and eating disorders, and pornography.

Businesses covered by the Act that fail to comply with the rules could face significant fines of up to £18 million or 10% of their global annual revenue, whichever is higher, which for the largest companies could reach billions of pounds. Failure to take steps required by Ofcom to protect children could also see senior managers and officers facing imprisonment, as we highlighted in an earlier article during the bill’s passage through Parliament: Online Safety: Senior managers to face imprisonment for non-compliance (shoosmiths.com).

Most provisions of the Act will come into force within two months of Royal Assent. However, Ofcom’s codes of practice and the accompanying secondary legislation, which are to be finalised over the coming months and through 2024, will largely determine how entities can demonstrate compliance with the Act.

On 9 November 2023, Ofcom published the first of four major consultations, on illegal harms. It focuses on proposals for how internet services that enable the sharing of user-generated content, and search services, should approach their new duties relating to illegal content. Responses to the consultation are due by 5pm on 23 February 2024. The accompanying first draft Code of Practice covers activity such as child sexual abuse material, grooming and fraud.

Dame Melanie Dawes, Ofcom's Chief Executive, said: 

“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”

Ofcom’s initial analysis suggests that more than 100,000 online services could be subject to the new rules, ranging from very large, well-resourced companies to small and micro-businesses across a wide range of sectors. All will need to assess, based on the risks they face, what safety measures they must put in place to comply.

Different safety measures will be appropriate for different types of service, and Ofcom’s recommendations will vary depending on a service’s size and degree of risk. The onus sits with service providers themselves to properly assess the risks their users may encounter and to decide what specific steps they need to take, in proportion to the scale of those risks and the resources and capabilities available to them.

We will be monitoring the implementation of the Act over the coming months. Please get in touch if you would like us to keep you updated on how the Act may affect the operation of your business, or to help you adopt an appropriate risk-based strategy to ensure compliance.

 

Disclaimer

This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2024.

 

