The Online Safety Bill is now law after receiving Royal Assent on Thursday 26 October 2023, marking a new era of internet safety in the UK.
The Act creates a new regulatory regime designed to address illegal and harmful content published online. The new rules focus on keeping children safe on the internet and on empowering adults by giving them greater choice over the content they see online.
The Act seeks to ensure that tech companies actively assess the risks of harm posed by content on their services and design their systems and processes to keep users safe.
The Act sets new online standards by imposing a number of legal duties on certain internet service providers, namely:
The legislation seeks to protect children by requiring those firms within scope to:
The legislation also seeks to protect adults online by requiring firms within scope to:
Ofcom has been appointed as the regulator responsible for supervising and enforcing the new rules brought in by the Act. It has been given enforcement powers, including the ability to fine internet service providers up to £18 million or 10% of their global annual revenue, whichever is greater, for non-compliance. Ofcom can also pursue criminal sanctions against senior managers of companies that fail to comply.
Looking forward, the majority of the Act’s provisions will come into force in two months’ time. However, certain provisions have been commenced early to establish Ofcom as the regulator and to enable it to begin implementing the new online safety laws. Ofcom is set to publish and consult on draft guidance and codes of practice on 9 November 2023. Following consultation, finalised guidance and codes of practice are expected in 2024, helping in-scope companies prepare to comply with the new regime.
Although the Act is an important milestone for online safety, the Bill’s passage into law was a long process, owing in part to concerns about privacy and freedom of speech. Most recently, the debate has centred on whether tech companies such as WhatsApp would be required to scan encrypted messages for harmful content. For now, the UK government has not imposed any such scanning requirements and has confirmed that it will not do so until it is technically feasible and accredited technology exists that meets minimum accuracy standards in detecting only child sexual abuse and exploitation content. As a result, this debate is likely to resurface as and when such technology becomes available.
Companies that provide online content should carefully consider whether they fall within the scope of the Online Safety Act and, if so, ensure that they comply with its provisions, both to help make the internet a safer place and to avoid civil and criminal liability.
For more information, please contact our privacy & data team.