The online safety bill: the new role of tech companies


The Queen's Speech yesterday hailed the eagerly awaited Online Safety Bill, which is due to be published imminently. The bill will impose a duty of care on tech companies for the safety of their users. Tech companies will be held to account if they fail to remove illegal content online, including terrorist material, child sex abuse, suicide promotion and cyberbullying. Interestingly, in our post-Brexit era, the UK Government has gone a step further than the European Commission did in the Digital Services Act: the obligation on tech companies in the Online Safety Bill extends to lawful content that is considered 'harmful'.

Companies caught by this strict new regime will include those hosting user-generated content that is accessible in the UK and those facilitating private or public interactions between users, at least one of whom is based in the UK.

The bill reserves numerous enforcement rights for Ofcom, including the ability to fine tech companies up to 10% of their annual turnover or £18 million (whichever is higher).

While the potential fines will most likely become one of the key talking points of this new piece of legislation over the coming months, for me it will be imperative to understand more about how tech companies are expected to police lawful but harmful content. The ongoing tension between rights of free speech and the accountability and protection of users will of course need to be carefully balanced, as will the need for tech companies to have certainty around what is required of them.

The days of being just a 'platform provider' when it comes to content and social media platforms are behind us. Under this new bill, tech companies will have to take a more active role in monitoring the content posted on their platforms - a move likely to be welcomed by most in an era where dangerous and harmful content can be accessed by anyone, including children, at the click of a button. But if the UK government is going to impose obligations on companies to monitor lawful content that is considered harmful, tech companies will be demanding clarity on precisely what counts as 'harmful'. If that certainty isn't achieved, it could undermine the ambition of our self-proclaimed pro-tech government to ensure "a more inclusive, competitive and innovative digital economy for the future".

