Earlier this month, the House of Lords and House of Commons Joint Committee on the Draft Online Safety Bill published a report (the ‘report’) setting out its findings and recommended improvements.
The Online Safety Bill (the ‘draft Bill’), published in draft in May 2021, aims to fundamentally reshape much of the UK’s online space in an attempt to combat online harms such as misinformation and fraud.
In this article, we discuss the immediate consequences service providers are likely to face if the Committee’s recommendations are adopted.
What services does the Bill apply to?
The Bill applies to providers of search services or user-to-user services, ‘Service Providers’, where those services are:
- available to individuals in the UK; and
- there are reasonable grounds to believe that content on the service poses a material risk of significant harm to those individuals.
Some specific exemptions are listed, but are otherwise outside the scope of this article.
Service provider requirements
Privacy by design should be a familiar phrase by now – the idea that the easiest way to protect personal data is by considering and accounting for it at the design phase of any project, website or service, to ensure appropriate protections are carried through to completion.
The report suggests a similar shift for online harms. This would include, for example, adding friction (i.e. increasing the number of steps needed) to share posts on Facebook, in order to slow the spread of misinformation through viral posts. The report suggests this change could be as effective as the entire fact-checking ecosystem.
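As a rough illustration of what ‘friction’ might mean in practice, the sketch below adds an extra confirmation step before a flagged post can be reshared. All names here are hypothetical and purely illustrative; neither the draft Bill nor the report prescribes any particular implementation.

```python
from dataclasses import dataclass


@dataclass
class Post:
    id: int
    flagged_as_misinformation: bool


def share_steps_required(post: Post) -> list[str]:
    """Return the steps a user must complete to reshare a post."""
    steps = ["tap_share"]
    if post.flagged_as_misinformation:
        # Added friction: an interstitial warning plus an explicit
        # confirmation before the share goes through.
        steps += ["read_warning_interstitial", "confirm_share"]
    return steps


print(share_steps_required(Post(1, False)))  # ['tap_share']
print(share_steps_required(Post(2, True)))
```

The point is that a one-tap reshare becomes a three-step flow for flagged content, which the report argues slows viral spread.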
Other proposed design steps include avoiding auto-playing content unless opted-in, and taking measures to ensure algorithmic content recommendation does not funnel users into content ‘rabbit holes’.
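A minimal sketch of the two design measures above, assuming hypothetical setting and item names (nothing here is specified by the report): auto-play stays off unless the user has opted in, and a recommender caps how many consecutive items it serves from the same topic.

```python
def should_autoplay(user_settings: dict) -> bool:
    # Default off: auto-play only when the user has explicitly opted in.
    return user_settings.get("autoplay_opt_in", False)


def recommend(candidates: list[dict], max_same_topic: int = 2) -> list[dict]:
    """Re-order candidates so that no more than max_same_topic consecutive
    items share a topic; excess repeats are deferred to the end of the feed
    (a real system would interleave them more carefully)."""
    feed, deferred = [], []
    run_topic, run_len = None, 0
    for item in candidates:
        if item["topic"] == run_topic and run_len >= max_same_topic:
            deferred.append(item)  # break up the 'rabbit hole' run
            continue
        if item["topic"] == run_topic:
            run_len += 1
        else:
            run_topic, run_len = item["topic"], 1
        feed.append(item)
    return feed + deferred
```

For example, four recommendations on the same topic followed by one on another would be re-ordered so the run is broken after two items.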
The suggested measures would be included as minimums in a mandatory code of practice, and could in theory require a redesign of all affected services.
Pseudonymity, anonymity and account creation
Many services offer anonymous or pseudonymous accounts, where the user cannot be identified by either other users or by the service itself. The report indicates that such accounts lead to an increase in online abuse, and can hinder investigation of online activity. A number of requirements are proposed, including allowing users to set how they interact (or don’t) with anonymous/pseudonymous accounts, and making it more onerous to create new accounts to prevent use of disposable ‘troll’ accounts.
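The per-user control described above might look something like the sketch below, in which each user chooses whether unverified (anonymous or pseudonymous) accounts can interact with them. The names and structure are hypothetical, not drawn from the report.

```python
from dataclasses import dataclass


@dataclass
class Account:
    handle: str
    identity_verified: bool


@dataclass
class Preferences:
    # The user's choice of whether anonymous/pseudonymous accounts
    # may interact with them.
    allow_unverified_replies: bool = True


def can_reply(sender: Account, recipient_prefs: Preferences) -> bool:
    """An unverified sender may only reply if the recipient has chosen
    to receive interactions from anonymous/pseudonymous accounts."""
    return sender.identity_verified or recipient_prefs.allow_unverified_replies
```

Verified accounts are unaffected; only interactions from unverified accounts are filtered according to the recipient’s setting.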
In addition, the report calls for the creation of a private-sector ID verification and compliance industry to allow services to adequately identify their users. The report does note that controls will be needed to protect users who may have legitimate reason to be concerned about disclosure of their identity, such as activists in repressive states, or LGBT users in jurisdictions where homosexuality may be illegal.
The report also recommends that Service Providers should, among other things:
- Explain how content is promoted and recommended to users; and
- Remind users of the type of activity and content that can be illegal online.
Service Providers that may be affected by the Bill should follow its development carefully and start considering what changes they may need to make, including both new policies and service redesign.