The Online Safety Act 2023 - what to do now

The Online Safety Act 2023 introduces broad reforms to the way many businesses must operate their online services. Here, Matthew MacLachlan explains the action to take now to comply.

The Online Safety Act 2023 (‘the Act’) received Royal Assent in October 2023. Although the Act is now law, it will not be enforced by the regulator, Ofcom, until secondary legislation is passed and Ofcom’s draft codes of practice are finalised. Organisations which comply with Ofcom’s codes of practice will be treated as complying with the relevant duties of the Act.

In advance of the final versions, Ofcom has advised organisations to familiarise themselves with the various draft guidance and codes on which it is currently consulting, with responses due by 23 February 2024. This is the first phase of the guidance. Ofcom then intends to finalise its codes of practice from Q4 2024 into early 2025. Once the codes become final, organisations will have a three-month period to comply.

In the meantime, organisations should understand online safety harms and work through the draft list of risk factors and accompanying risk profiles Ofcom has provided in its guidance (see Annex 5). These are essentially detailed questionnaires about an organisation’s services and the specific risks and practices associated with them.

This article addresses the scope of the new Act and broadly touches on its duties, explaining the steps organisations can take at these early stages to prepare.  

Overview

The Act requires organisations providing internet services to ask themselves three questions:

  • assuming that they do not publish illegal or pornographic content, is the content of their internet service generated directly by users and encountered by other users (a ‘user to user service’), or do they provide a search engine (a ‘search service’);
  • if so, do their systems and processes ensure they meet duties of care promoting the safety of UK internet users (particularly regarding illegal content, lawful content which is harmful to children and/or fraudulent advertising); and
  • in meeting the duties of care, do their safety measures and policies have particular regard to the importance of protecting users’ right to freedom of expression?

Working through these questions, and acting on the answers, will help organisations avoid the so-called ‘intermediary liability’ imposed by the Act. Intermediary liability primarily arises when an organisation is on notice of hosting inappropriate content but fails to act expeditiously.

The Act imposes various duties of care on a sliding scale depending on an organisation’s size. Although organisations are not required to conduct general monitoring of their internet services, such monitoring may prove necessary in practice.

In addition to creating a wave of new communications offences, the Act provides for penalties including fines of up to £18m or 10% of global turnover, whichever is higher. The greatest risk may be that of a service cessation order for non-compliance, which, if imposed, must be approved by the court.

Whilst the Act will not necessarily create new avenues for individuals to sue organisations, the government envisages that precedents set by regulatory decisions made under the Act will be relevant to legal action brought by individuals, and that “legal action [will] become more accessible to users as the evidence base around online harms grows” (per the Government’s consultation response to the Online Harms White Paper).

Rather than focusing on individual content, organisations should consider their entire systems and processes in light of the duties in the Act. It may be that only part of a service falls to be regulated. Ofcom uses the example of a site only providing retail services, which will not fall to be regulated because users cannot interact with each other. If the site, however, also has a chat forum, then the chat forum will be regulated. Further details about which services fall within the Act may emerge following the outcome of Ofcom’s consultations.

Scope of the Act 

As the new ‘online safety regulator’, Ofcom has already provided an initial analysis indicating that more than 100,000 online services could be subject to the new rules, ranging from very large and well-resourced organisations to small and micro businesses in a wide range of sectors.

Ofcom is clear that the onus sits with service providers themselves to properly assess the risks their users may encounter and decide what specific steps need to be taken in proportion to the size of the risk and their resources.

So-called ‘user to user’ and ‘search’ services fall within the scope of the Act if UK users are a target market, or if the service has a significant number of UK users. These are not simply social media sites or search engines. ‘User to user’ services encompass a broad range of organisations in many fields, including forums, online gaming sites, dating sites, instant messaging services, and cloud storage providers. They also include generative AI services where the content has been uploaded by a user, or where a user embeds AI to produce content encountered by others. A ‘search service’ covering only a single website or database will not be regulated, but one which searches multiple sites or databases will be. Examples include internet search providers, price comparison sites and travel booking sites.

Exemptions apply to certain services, for example where email is the only user-generated content, or where functionality is limited so that users can only communicate by posting comments or reviews in a limited manner. Other specific exemptions apply, amongst others, to services provided by those with legal responsibility for education or childcare. These exemptions highlight the need for organisations to consider the Act specifically in the context of their own activities.

Where an organisation is not directly regulated, it could be caught by so-called ‘indirect regulation’, i.e. a requirement to comply imposed by contracts with counterparties or other service providers. Whilst the scope of this is not yet known, in circumstances where senior managers can be personally liable, it is conceivable that organisations with agreements with user to user or search services will be subject to contractual obligations to assist the regulated entities with compliance.

Types of regulated content

Organisations need to assess the risk of each kind of illegal harm set out in the Act. 

Ofcom has identified 15 ‘kinds’ of priority illegal content, all of which are the subject of individual offences. These include content that: relates to terrorism offences; encourages or assists suicide or serious self-harm; amounts to harassment, stalking, or controlling or coercive behaviour; is fraudulent; relates to financial services offences; and relates to proceeds of crime offences.

Aside from priority content, organisations must also assess the risk of harm from relevant non-priority offences, i.e. types of illegal harm from offences not specifically listed in the Act.

If a service is likely to be accessed by children, then duties to protect their online safety apply, including to prevent children from encountering:

  • ‘primary priority content’ (for example, content which encourages self-harm); 
  • ‘priority content that is harmful to children’ (for example, content which is abusive, incites hatred or is bullying); and 
  • ‘non-designated content that is harmful to children’ (i.e. content “of a kind which presents a material risk of significant harm to an appreciable number of children in the United Kingdom”). 

Duties of care

The duties of care in the Act require organisations to focus on assessing risks, providing proactive safeguarding, empowering users, and being accountable. Organisations must, for example, be alert to the risk of illegal content appearing on their services, offences being committed through using their services, or offences being facilitated by use of their services.

Risk assessments must cover the likely user base of the service; the risk of users encountering illegal content or, where applicable, content which is harmful to children; the risk that the service facilitates access to such content; and the risk of harm. Organisations must take steps to mitigate the risks identified, following Ofcom’s codes of practice.

Safeguarding must be conducted actively through proportionate measures, systems and processes to prevent access to the above types of content, to mitigate and manage risks and to minimise the presence of that content. For example, proportionate systems and processes must be designed for services likely to be accessed by children to prevent children of any age from encountering, by means of the service, content which is harmful to them. The findings of an organisation’s most recent children’s content risk assessment, for example, together with the size and capacity of the organisation, will assist in determining what measures will be proportionate.

The Act’s user empowerment duties require the establishment of reporting and complaints mechanisms enabling users to easily report content which they consider to be illegal or harmful to children, for example, or to complain if they consider the organisation is not complying with a duty. As organisations must balance these duties against the duty to ensure freedom of expression, complaints may include those where the proactive technology of the service results in lawfully expressed content being taken down, or being made less likely to be encountered by other users, in a way not envisaged.

Mechanisms for compliance can include identity verification, enabling verified users to avoid interacting with non-verified users, but these may also result in organisations obtaining large volumes of personal or special category data subject to regulation under the UK GDPR regime. In a UK GDPR context, organisations should consider whether they can now rely on regulatory duties under the Act as a lawful basis for processing personal data, particularly as ‘safeguarding a vulnerable individual’ is proposed as a legitimate interest providing a lawful basis for processing under the new Data Protection and Digital Information (No. 2) Bill, currently being considered by the House of Lords. The largest organisations also have duties to ensure that all users have control over the content they see, through, for example, content filters.

Transparency and accountability are ensured through transparency reports which must be provided to Ofcom annually, even by smaller or mid-sized organisations. The largest organisations have broader risk assessment duties and must publish their assessments. If organisations provide more than one regulated service, a notice must be given in respect of each of them. Most organisations must also clearly explain how they will protect users in their terms of service.

Assessment of risk

Through its risk profiles in the draft guidance (see Annex 5), Ofcom encourages organisations to consider how perpetrators may exploit a service’s design for illegal purposes. Ofcom’s risk profile of social media sites, for example, indicates that they have an increased risk of all kinds of illegal harm because the sites seek to maximise engagement between users. The risks that Ofcom identifies include the exploitation of ‘virality’ to share illegal content; that young users could be targeted by the sending out of many messages; and that services may be used for large-scale foreign interference campaigns. 

By contrast, for gaming services, for example, Ofcom indicates that risks include allowing hateful content to be spread and the sites becoming areas of ‘normalised harassment’, where insults become part of user gaming interactions. Risk factors should be considered for different parts of relevant services. User profiles, for example, present the risk of fake profiles or the use of legitimate ones for ‘gendered illegal harms such as harassment/stalking’.

Ofcom’s proposed safety measures are categorised by the size and risk of the service. ‘Large’ services are those with an average user base of more than seven million per month in the UK, and a service’s risk level will be one of the following:

  • ‘low risk’ for all kinds of illegal harm in a service’s risk assessment; 
  • ‘specific risk’ where a service is assessed as being medium or high risk for a specific kind of harm for which Ofcom proposes a particular measure; or 
  • ‘multi-risk’ for services which face significant risks of illegal harms.

The safety measures cover areas such as governance and accountability, content moderation, reporting and complaints, terms of service, default settings and user access. They depend on where a service falls within the above categories, so ‘large’ services have more obligations per category than ‘smaller’ services. Under Ofcom’s measures, a ‘low risk’ ‘large’ service, for example, must ensure that governance bodies carry out an annual review and must record how the service has assessed risk management activities, whereas ‘low risk’ ‘smaller’ services do not have to do so. 
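For organisations building internal compliance tooling, the categorisation described above can be expressed as a simple decision rule. The sketch below is purely illustrative and does not form part of Ofcom’s guidance: the seven million monthly UK user threshold and the three risk labels reflect the draft codes summarised above, but the names used, the simplified boundary between ‘specific risk’ and ‘multi-risk’, and the example figures are assumptions for illustration only.

```python
# Illustrative only - a rough encoding of the draft size/risk categorisation,
# not part of Ofcom's guidance. Names and the specific/multi-risk boundary
# are simplifying assumptions.
from dataclasses import dataclass, field

LARGE_SERVICE_THRESHOLD = 7_000_000  # average monthly UK users (draft codes)


@dataclass
class ServiceProfile:
    avg_monthly_uk_users: int
    # kinds of illegal harm the service has assessed as medium or high risk
    medium_or_high_risk_harms: list = field(default_factory=list)


def size_category(profile: ServiceProfile) -> str:
    """'Large' if the average UK user base exceeds seven million per month."""
    return "large" if profile.avg_monthly_uk_users > LARGE_SERVICE_THRESHOLD else "smaller"


def risk_category(profile: ServiceProfile) -> str:
    """Map a risk assessment outcome onto the three draft risk labels."""
    if not profile.medium_or_high_risk_harms:
        return "low risk"       # low risk for all kinds of illegal harm
    if len(profile.medium_or_high_risk_harms) == 1:
        return "specific risk"  # medium/high risk for one specific kind of harm
    return "multi-risk"         # significant risks across several kinds of harm


# Example: a 'large', 'low risk' service would expect measures such as an
# annual governance review, as described above.
example = ServiceProfile(avg_monthly_uk_users=9_000_000)
print(size_category(example), "/", risk_category(example))  # large / low risk
```

In practice, the applicable measures for the resulting category would then need to be read across from Ofcom’s codes of practice once finalised.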

Difficulties arise when considering Ofcom’s requirements alongside the terms of the Act, particularly as Ofcom uses different terminology for similar concepts. Organisations should be alert to the risk of Ofcom making changes to its risk profiles or safety measures over time.

Conclusion

Organisations should review their services now to determine whether all or parts of them are regulated by the Act. If they are, organisations should comply by following Ofcom’s codes of practice once finalised, and consider the draft codes and requirements in the meantime.

Early engagement offers organisations the prospect of using the new regime to their advantage, limiting liabilities to users and counterparties, and taking advantage of potential new lawful bases for processing personal data.

First published by Matthew MacLachlan in the Privacy & Data Protection Journal, Volume 24, Issue 3.

Disclaimer

This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2024.
