Digital Decade: Online Safety Bill
Exploring the UK's proposals to manage online harms
Introduction
In 2019, the UK government published the Online Harms White Paper, the precursor to the Online Safety Bill ("OSB"). The OSB provides for the regulation of online and internet services by tackling the dissemination of illegal and harmful content online. It aims to make the UK the safest place in the world to be online while defending freedom of expression. Four years on, the OSB has been through substantial revisions and questions remain over whether it meets its objectives.
In this insight, we consider the current state of the OSB and highlight some of the key provisions. We also consider what organisations can do to prepare for these upcoming changes. This review is based on the version of the OSB published on 18 January 2023, together with related policy[1] and guidance[2] documents.
Background and status update
In September 2022, the new Secretary of State for the (then) Department for Digital, Culture, Media and Sport (DCMS) reviewed the scope of the OSB to address concerns raised by some MPs over the balance between 'protecting users from harmful content online and protecting users' rights to freedom of expression'.
Following that review, key amendments were agreed, including the removal of the 'harmful communications offence' and the adult safety ('legal but harmful') duties. These have been replaced by a 'Triple Shield', which covers: (i) the removal of illegal content; (ii) requiring platforms to take down content that breaches their own terms of service; and (iii) providing adults with tools to control the content they see – in other words, new user empowerment tools.
Around the same time (November 2022), the EU Digital Services Act ("DSA") entered into force, seeking to regulate similar issues. In practice, the OSB and DSA are variations on the same theme of online safety, transparency, and accountability.
Effective date
The OSB recently passed its second reading in the House of Lords, and the government intends for the bill to pass before the end of the current parliamentary session.
Diverging approach – EU vs UK
The OSB's EU counterpart, the DSA, diverges from it in several respects, which will be covered in a separate insight. Two key differences are who the DSA applies to and the consequences of non-compliance:
- Applicability: the DSA applies to providers of 'intermediary services' (to users) in the EU. 'Intermediary services' are: 'mere conduit' services (e.g., internet exchange points, virtual private networks, and domain name registries), 'caching' services (e.g., content delivery networks), 'hosting' services (cloud computing services, web hosting services), and online search engines. In summary, the obligations imposed by the DSA relate to the type of service provided, as opposed to the service provider itself. How this plays out in practice and the extent to which it results in varying obligations between the UK and EU remains to be seen.
- Non-compliance: under the DSA, non-compliance could lead to fines or, for repeat offenders, a ban on operating in the EU single market. Under the OSB, no such bans exist, but the court has the power to make, on application by Ofcom (the OSB regulator) and in the event of non-compliance, service restriction orders against those providing an ancillary service (for example, payment services) to a Regulated Service (as defined below).
Deep dive
Key provisions
- What is its primary purpose?
To create a safe online environment for internet users in the UK, especially children, by placing obligations on social media and search engine platforms to regulate illegal and harmful content available online, while protecting freedom of expression.
- Who does it apply to?
Obligations are imposed on companies that provide user-to-user (e.g., social media sites) and search (e.g., search engines) services ("Regulated Services") to users based in the UK. Those obligations vary depending on how the provider is classified: i.e., whether the provider is a 'Category 1' (large user-to-user), 'Category 2A' (search) or 'Category 2B' (other user-to-user) regulated service. The conditions for those classifications will only be known once the Secretary of State has published regulations specifying the applicable threshold conditions.
- Territorial scope
The OSB has extra-territorial effect: it applies to companies, wherever they are based geographically, that offer Regulated Services to users based in the UK.
- Key provisions / requirements
- Illegal content: Providers of Regulated Services have obligations and duties to prevent users (both children and adults) from accessing illegal content. This includes content that amounts to an offence relating to terrorism, child sexual exploitation and abuse, assisting suicide, threats to kill, fear or provocation of violence, harassment, or stalking. At present, only content that amounts to an offence under law is considered illegal.
- Protecting children: Providers of Regulated Services have a particular duty to protect children from accessing illegal as well as harmful content. This arises where, having undertaken an assessment, the service is likely to be accessed by children. At this point the OSB does not define what types of content would fall into this category and, for example, what is meant by 'primary priority content that is harmful to children'. The Secretary of State does, however, have the power to do so (i.e. define this type of content) where there is 'a material risk of significant harm to an appreciable number of children' affected. There is also some guidance on the scope of harmful content in the government guidance published on 16 December 2022[3], namely: 'pornographic content, online abuse, bullying or harassment, and content that does not meet a criminal level but which promotes or glorifies suicide, self-harm or eating disorders.'
- Fraudulent advertising: Providers of Regulated Services are required to take action to prevent users from encountering any fraudulent advertising content and to minimise the likelihood of fraudulent advertisements being published on their service. In addition, providers will be required to swiftly take down any such content when notified. They must include clear language in their terms of service as to the technology they are using to comply with their obligations. In determining what a 'fraudulent advertisement' is, the OSB has stipulated certain requirements covering a wide range of offences, from criminal fraud to misleading statements in the financial services area.
- Communication offences: Subject to certain exemptions (e.g., internet service providers transmitting this content), providers of Regulated Services as well as individual users and corporate officers, will be guilty of an offence under the OSB for spreading false information, making death threats or sending messages threatening serious harm. An offence of sending flashing images is also provided for. Possible prison sentences vary depending on the offence committed but can be up to five years. A company ('body corporate' under the OSB) need not be a provider of Regulated Services to be caught by these provisions under the OSB. As already mentioned, the 'harmful' communications offences that existed in earlier versions of the OSB have been removed.
- Regulatory supervision
This will be performed by Ofcom and the Secretary of State.
- Enforcement and fines (other than for communications offences)
- For non-compliance, there are possible fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
- There is also the potential of imprisonment for up to two years, which could apply to senior managers responsible for ensuring compliance with information notices received from Ofcom (i.e., information requested to enable Ofcom to perform its safety functions) and/or corporate officers (if breaches of the OSB occurred with the consent, connivance, or neglect of those individuals).
Other regulatory developments to be aware of
- Content reporting procedures: Providers of Regulated Services are required to adopt systems and processes that allow users and other affected persons to report illegal and harmful content. They are also required to put in place complaints procedures that are accessible and easy to use (including by children). Complaints made through these procedures may lead to a provider of Regulated Services being investigated by Ofcom.
- Terms of service: Providers of Regulated Services are required to set out in their terms of service how users are protected from illegal content and how children are prevented from accessing any harmful content. These terms must be clear and accessible. Again, any complaint by users or affected persons in this regard may lead to a provider of Regulated Services being investigated by Ofcom.
- Pornographic content: Providers of Regulated Services, unless exempt under the OSB (such as providers of platforms where emails are the only user-generated content allowed), must ensure that children are not normally able to access pornographic content on their platforms by adopting measures such as age verification. Further guidance on this will be provided by Ofcom.
- Fees: Providers of Regulated Services, unless exempt under the OSB, may be required to pay yearly fees to Ofcom if they meet certain threshold figures which will be published by Ofcom and the Secretary of State.
What you should be doing now in preparation
Under the OSB, providers of Regulated Services will have a duty to actively monitor content and remove illegal content when it appears. Providers of Regulated Services also need to provide 'adequate protection' to citizens using their services to reduce the risk of harm. These changes will necessitate a systemic review for companies operating within the scope of the OSB.
The starting point for many providers will be a thorough audit of the services provided, together with appropriate risk assessments for those in scope. Providers should also build potential development work into their roadmaps, as many obligations imposed by the OSB will require technical solutions and backend changes. Finally, providers should revisit their terms of service to ensure future compliance.
Conclusion: our thoughts
From a legal perspective, the potential liability of individuals (including imprisonment) is an obvious area to be aware of and manage. Companies will need to identify workable solutions to meet the requirements of the OSB when, as anticipated, it comes into force. The introduction of effective age verification services, a necessity for those companies whose sites pose the highest risk of harm to children, is a key area to address. Companies failing to meet this requirement could face serious consequences, including enforcement action.
The various communications offences provided for in the OSB are of general application, meaning both individuals and bodies corporate can be liable. In other words, it is not just social media and search companies that could face enforcement action, even though they are the main focus of other sections of the legislation.
Regulated providers should monitor any further guidance and legislation provided by Ofcom and the Secretary of State to fully understand their obligations under the OSB. For companies wishing to get ahead before the passing of the OSB, the UK government has (as above) published guidance that could provide further assistance.
[1] Department for Science, Innovation and Technology and Department for Digital, Culture, Media & Sport, 'Policy paper: Overview of expected impact of changes to the Online Safety Bill' (18 January 2023)
[2] Department for Science, Innovation and Technology and Department for Digital, Culture, Media & Sport, 'Guidance: A guide to the Online Safety Bill' (16 December 2022)
[3] See footnote 2.