UK: Draft Online Safety Bill poses serious risk to free expression


In May 2021, the UK government published its long-awaited proposals on ‘online harms’, now re-cast as the Draft Online Safety Bill. It is a long and complex piece of legislation, running to 145 pages, that is highly likely to act as a chokehold on freedom of expression and to severely interfere with the right to privacy in the country. We urge the UK government to reconsider its proposal and go back to the drawing board in order to fully protect freedom of expression online.

From its original billing as tackling ‘online harms’, the Draft Online Safety Bill (Draft Bill) has been presented as a gold standard that would make the UK “the safest place in the world to be online”. The Draft Bill comes amid a backlash in recent years against the growing power of a handful of US tech companies over what people get to see and say online. The largest platforms, such as Google, Facebook and Twitter, have been regularly lambasted for doing too little, too late to protect users, in particular women politicians and journalists, from abuse by online trolls. Most recently, the Covid-19 pandemic has highlighted the challenges raised by misinformation online. The Draft Online Safety Bill is also the result of an ongoing debate about a new approach to platform regulation that is preventative and based on risk assessments.

ARTICLE 19 has been involved in platform regulation debates for over ten years. Whilst we recognise the shortcomings of the largest platforms and the need for greater accountability, we have been wary of legal frameworks that would give either private companies or regulators broad powers to control or censor what people get to see or say online. We also have concerns about risk-based approaches that are not grounded in human rights. Our experience is that these approaches are often synonymous with calls for general monitoring, the weakening of encryption, and the removal of illegal or undesirable content within unduly short timeframes.

For this reason, we have continued to defend the principles of (at least conditional) immunity from liability and a prohibition on general monitoring. If a regulatory framework is thought necessary, we have recommended that such frameworks should focus on transparency obligations and due process mechanisms rather than content, whilst recognising that even those obligations could be instrumentalised and abused. After all, much of the debate on intermediary liability over the years has focused on issues of process more than content itself. We have also called for tiered approaches bearing in mind that the largest platforms are able to take on more duties and responsibilities than smaller competitors. We set this out in our response to the government consultation on Online Harms.

State control: a chokehold on freedom of expression

How does the Draft Bill fare against those principles? As a free speech organisation, we find the Draft Bill startling in its complexity and deeply disquieting for what it represents, namely an attempt at regulating the totality of human communications and interactions online in, or targeted at, the UK. Although some attempts are made at protecting freedom of expression, the Draft Bill also gives incredibly broad powers to the Secretary of State to control its implementation in ways previously unseen in modern Western democracies. We also agree with the assessment already made by our colleagues at Open Rights Group, Index on Censorship, Big Brother Watch, Global Partners Digital and EFF, who have all expressed serious concerns about the Draft Bill’s impact on freedom of expression.

Our own key concerns are as follows:

  • Unduly broad scope: To begin with, the Draft Bill is incredibly broad in its scope. It applies to both user-to-user and search services (clause 2). In practice, private messaging services such as WhatsApp, Signal or Telegram would be covered. Coupled with extensive risk assessment duties to minimise the presence of illegal or ‘legal but harmful’ content and more generally to protect children (e.g. clauses 7-11), this raises the prospect of weakening encryption and scanning people’s every private communication. This is also apparent from the provisions dealing with ‘technology notices’ (e.g. clause 64), whereby Ofcom could require the provider of a user-to-user service to use ‘accredited’ technology to, for example, identify terrorism content on its service.
  • Overly complex scheme, vague definitions and duties: The scheme of the Draft Bill itself is very complex in its approach to the various types of illegal or ‘legal but harmful’ content. Illegal content is effectively divided into four categories: ‘terrorist’ content, child sex abuse material, priority illegal content as specified by the Secretary of State in regulations, and other illegal content where the actual or intended victim is an individual (clause 41). ‘Legal but harmful’ content gives rise to separate duties, which also vary depending on whether the content is harmful to adults or to children. ‘Content harmful to adults’ is (i) content that is so designated by the Secretary of State; or (ii) content that the provider has reasonable grounds to believe poses a material risk of having, including indirectly, a significant adverse physical or psychological impact on an adult of ‘ordinary sensibilities’ (clause 46(1) to (3)) (our emphasis). This definition is subject to further qualifiers (clause 46(4) and (7)). It is obvious that this kind of scheme will benefit lawyers, not freedom of expression or privacy. ARTICLE 19 has previously warned that the concept of ‘legal but harmful’ speech is extremely problematic, since it regulates protected speech (see also Graham Smith here). Similarly, the House of Lords Communications and Digital Committee’s report of 22 July 2021 on freedom of expression in the digital age concluded that it could not support the Government’s proposed duties [in clause 11]. In particular, the Committee was “not convinced that they are workable or could be implemented without unjustifiable and unprecedented interference with freedom of expression”. We agree. Unfortunately, it is not just the definitions that are too vague. Extensive due diligence assessments are expected, but very little guidance is given as to what measures ought to be taken in response to those assessments. In practice, these will be left to codes of practice developed by Ofcom that must effectively meet the approval of the government (clause 33).
  • Overbroad discretion and control of the government: Most worryingly, the Draft Bill gives incredibly broad powers to the Secretary of State to define the scope of regulated services (clause 39(12) and (13)) and the categories of content that companies ought to remove, both as ‘priority’ illegal content and as legal but harmful content, by way of secondary legislation (clauses 41 and 46 above). The Secretary of State also gets to decide – whether or not by way of regulations – the minimum standards of accuracy for the purpose of technology notices (clause 66(5)), the frequency of transparency reporting obligations (clause 49(6)), what constitutes ‘qualifying worldwide revenue’ for the purpose of sanctions (clause 85(14)), and the threshold figure for the purpose of payment of fees (clause 53). Numerous other provisions in the Draft Bill indicate a very close grip of the government over the implementation of its provisions by Ofcom (Part 6), from statements of strategic priorities (clause 109) to directions in special circumstances (clause 112) or guidance about how Ofcom should exercise its powers (clause 113). The Secretary of State would also be able to issue directions to Ofcom when they “believe that modifications are required to ensure that the code of practice reflects government policy” (clause 33(1)). While the Secretary of State would lay the codes of practice before Parliament for approval, we are concerned that in practice this would be no more than a rubber-stamping exercise of the government’s approach.
  • Disproportionate sanctions: The sanctions for failing to comply with the obligations under the codes of practice are severe, ranging from significant fines to criminal liability for senior managers in some circumstances and service blocking orders. These are highly likely to have a chilling effect on freedom of expression. This is all the more concerning given that some of the duties themselves are not well defined or will be further developed in ‘codes of practice’. We are especially concerned about the possibility for Ofcom to seek a blocking order from the courts against services that fail to comply with their duties under the Draft Bill (clauses 92 and 93). Website (or service) blocking is almost always disproportionate under international human rights law because, in most cases, the blocked websites would contain legitimate content. In practice, blocking is a sanction that would penalise users, who would no longer be able to access the services they like because a provider has not removed enough content to the liking of Ofcom or the Minister. These are also the kinds of measures that have been adopted in places such as Turkey. It is therefore regrettable that the UK is signalling that these types of draconian measures are acceptable. Similarly, the Draft Bill creates criminal liability for senior managers who fail to comply with demands for information from Ofcom (clause 73). Again, we are concerned that this would encourage managers to be overzealous in complying with their duties, especially by removing content quickly.

Insufficient and unworkable safeguards for freedom of expression

Given the risks to freedom of expression and privacy inherent in the current scheme of the Draft Bill, the government has included a number of duties to ‘take into account the importance’ of protecting users’ freedom of expression and protecting users from ‘unwarranted’ infringements of privacy (clause 12). It has also created exemptions for ‘recognised’ news publishers so that they are out of scope of the Draft Bill, as well as duties to protect journalistic content (clause 14) and a new category of ‘content of democratic importance’ (clause 13).

While these duties and exemptions may be well-intentioned, ARTICLE 19 believes that they still raise significant concerns for freedom of expression. For a start, the duty in clause 12 is merely ‘to have regard’ to the importance of freedom of expression. In the case of privacy, the Draft Bill seems to assume that some infringements are warranted whereas others are not; but it does not provide any further guidance on what may be considered acceptable or beyond the pale.

The Draft Bill further creates an exemption for ‘recognised news publishers’, as defined in clause 40. We are concerned that carve-outs for the media are only likely to reinforce the power of incumbents at the expense of citizen journalists, smaller bloggers or activists who do not fulfil the criteria for the exemption, including when they engage in journalistic activity for non-profit purposes. In other words, it would create a tiered system where the speech of some actors is valued more than that of others simply by virtue of who they are rather than what they say. Moreover, the creation of an exemption almost inevitably creates a need to define and decide who falls within it. As a result, the Draft Online Safety Bill is reminiscent of antiquated laws under which the press is granted special privileges by a government or press regulator if it fulfils certain requirements. This is a concern echoed by many others (see e.g. here, here and here).

Clause 14 of the Draft Bill creates a ‘duty to protect journalistic content’. In particular, service providers within scope will be required to specify in their terms of service ‘by what methods content present on the service is to be identified as journalistic content’. This is likely to be as unworkable as asking automated systems to identify ‘illegal content’, since journalism should be understood broadly to include the dissemination of information and ideas to the public by any means. Moreover, it once again raises the question of who will ultimately determine what constitutes journalism. Ofcom is likely to play a significant role, since it will be scrutinising whether regulated services are complying with their duties.

Finally, the Draft Bill creates a category of ‘content of democratic importance’, which seems to stand in for the protection of political expression in the UK and appears to assume that content published by ‘news publishers’ or politicians would automatically fall within that category. This provision is confusing, especially given that the Draft Bill provides no guidance on how to resolve the inevitable conflicts that will arise between the providers’ various duties, from preventing risks of ‘harm’ to protecting freedom of expression.

Weakening privacy and security of communications

The Draft Bill doesn’t only raise significant concerns for freedom of expression: it would also significantly weaken the security and privacy of people’s communications. The Draft Bill does not reproduce the prohibition of general monitoring found in the EU E-Commerce Directive, nor has the government provided any commitments to that effect. That is because the Bill is intended to enable monitoring, at least in relation to certain categories of content. But even ‘specific’ monitoring is only possible by scanning all of people’s communications. The government knows this, and knows that it would therefore be impossible to comply with a prohibition on general monitoring.

Risk assessments aimed at protecting children from harmful content would almost inevitably entail age-verification mechanisms, which have been severely criticised by digital rights groups, including on data protection grounds. Tech experts have also raised concerns that online ID/age verification might be the death knell of online search or non-browser web access. Would UK users effectively be required to click through numerous pop-up windows or provide some form of ID every time they search for and try to access content deemed ‘legal but harmful’? These are all serious issues that deserve much greater scrutiny. While age-verification mandates may well contribute to a budding safety industry, they are also likely to constitute a disproportionate interference with the rights to freedom of expression and privacy.

Next steps

The parliamentary committees that will give the Draft Bill its pre-legislative scrutiny have just been announced. Their task could not be more important for the future of online communications. The Draft Bill needs more than just quick fixes. It is overly complex, state-controlled and unworkable. It will chill free expression and make people’s communications less secure. ARTICLE 19 believes that the Committee should tell the government to go back to the drawing board.