The vile racist online abuse directed at penalty takers Marcus Rashford, Jadon Sancho and Bukayo Saka after England’s defeat in the Euros final has again focused much-needed attention on the government’s Online Safety Bill, published on 12 May. Most voters and politicians agree that “something must be done” to protect people (and especially children) more effectively from online harms such as racist abuse. But what exactly can and should be done? The Bill embodies many of the practical legal and regulatory dilemmas that politicians and their cheerleaders tend to skirt around, but which Ofcom will be required to solve.

Dawes: Ofcom must ‘grow new skills’

Ofcom’s Chief Executive, Melanie Dawes, has told staff that she plans to recruit around 100 extra personnel in the short term to regulate social media and another 200 or so later. Many of these will have direct experience of online and emerging technologies – an area where Ofcom is weak at the moment. They will be sorely needed to help catch the various curveballs lobbed at the regulator by the Bill.

No self-respecting regulator turns down the chance to expand its empire. Ofcom is no exception. Having taken on postal services and brought video-on-demand back in house, Ofcom is now peering with some apprehension over the lip of the potentially poisoned chalice incised with the letters Online Safety Bill.

The Bill is 145 pages long, with 123 pages of explanatory notes and a 146-page impact assessment. Various (lengthy) summaries of its content are available online, so there is little point in my boring you by repeating that exercise here. I want to focus on the challenges for Ofcom and the proposed new system of regulation, on less familiar aspects of the Bill, and on whether at the end of the day it is likely to work – that is, provide more effective protection while respecting freedom of expression and not hobbling the tech sector.

The principle behind the Bill is that it imposes a ‘duty of care’ on social media companies to protect their users from harm. This might be through exposure to illegal material (eg inciting terrorism) or potentially harmful but legal material (eg cyberbullying by children of other children). Ofcom will be the independent regulator enforcing the duty of care.

The first point is that the Bill is (deliberately) very intrusive. Failure to comply with Ofcom’s requests for information will be a criminal offence (this is not the case with broadcasters or on-demand providers). Ofcom will have the power to compel companies to provide interviews. By analogy with video-on-demand providers, many regulated online services – stretching from the tech giants to services which allow users to upload and share user-generated content (UGC), known as ‘user-to-user’ services, and search services – will be required to register with Ofcom and will normally be required to pay an annual fee.

The Impact Assessment (not referred to by many commentators) suggests that perhaps an enormous 24,000 businesses will need to register. It also estimates that the reporting, record-keeping and other continuing requirements will cost the tech industry in total over £1 billion, including £346 million in regulator fees and £14.2 million to update terms of service. This figure does not account for potential fines or the broader impacts, including the implications for freedom of expression, the effects on innovation and competition, and the potential requirement for some businesses to adopt age verification measures.

Fair enough, many may say. They can afford it. Some (perhaps many) may. But as Ofcom discovered with on-demand services, a number may not. Ofcom will need to tread carefully here.

Mark Bunting, Ofcom’s Director of Online Safety Policy (pictured), who is leading the regulator’s work in this area, will need to tread with even greater circumspection when it comes to Ofcom’s codes of practice. These are supposed to define, for regulated services, what (for example) is harmful and what is not, so that those services can put systems and processes in place to prevent harm while at the same time protecting freedom of expression and democratic values.

I fear this may be an impossible attempt to square the circle. In some ways the Bill is very Boris Johnson, who famously declared that he likes to have his cake and eat it. It aims to create a regulatory system which is all things to all voters – keeping people safe online while respecting freedom of expression, privacy and our democracy. Where material online is unlawful, as with terrorism, hate speech or child sexual exploitation, the boundaries can be drawn with relative clarity in Ofcom’s codes and guidance. But when it comes to content which is potentially harmful but lawful, the Bill is silent (except to refer vaguely to people of ‘ordinary sensibilities’). Ofcom’s new online codes of practice (like its Broadcasting Code) will necessarily be fuzzy.

This vagueness matters much less with broadcasting and on-demand services because the amount of material they generate which needs to be regulated is absolutely minute compared with that of potentially 24,000 regulated online services. There will be no time for Ofcom to investigate individual cases carefully at the boundaries of regulation and create precedent. Instead there is a genuine risk that many social media providers will minimise their risk of being whacked by a gargantuan Ofcom fine (a maximum of £18 million or 10% of turnover) by over-regulating from the outset by means of algorithms.

No doubt many of these tough issues will be debated when the Bill is examined by a joint committee of MPs in the coming months before a final version is formally introduced to Parliament. I fear however that a number of flaws in the Bill will be glossed over by politicians, journalists and lobby groups with little practical experience of regulation, while Ofcom itself will loyally keep silent. The real problems lie in the detail of the regulation, such as the codes of practice. I predict these will only be consulted on – let alone implemented – many months if not years after the Bill becomes law. Even after consultation, Ofcom’s draft online codes of practice may have a rough ride. They must be approved by both Houses of Parliament. If either House rejects the draft codes, Ofcom must withdraw them and prepare new versions.

All of this work to create a workable system of online regulation, despite the intermittent outcry from the media and certain politicians, will take years. It is far, far more complicated than with broadcasting or on-demand. In the meantime, step by step, over the coming months and years, many social media companies will continue to take their own measures to improve the online safety of their users. By the time the new system of regulation is implemented, it will be fascinating to see exactly how much, how heavily and where Ofcom will need to intervene.