UK: Joint civil society briefing for Online Safety Bill’s second reading

On 13 April 2022, ARTICLE 19, together with Big Brother Watch, Index on Censorship, Open Rights Group, the Adam Smith Institute and Global Partners Digital, published a joint briefing for the second reading of the Online Safety Bill.


A coalition of human rights and non-governmental organisations committed to protecting the fundamental rights to freedom of expression and privacy has issued a joint briefing for the second reading of the United Kingdom’s Online Safety Bill.

The Internet has brought about a revolution in people’s ability to connect, share information, organise, and learn, in the UK and across the world. During the COVID-19 pandemic, the Internet kept families and loved ones in touch, helped local communities organise help for neighbours, and provided people with entertainment. In Ukraine, the Internet has enabled vital information on the Russian invasion to be disseminated worldwide, helped record evidence of potential war crimes, and kept Ukrainians connected to the rest of the world. In short, the Internet is a powerful resource that helps individuals fully enjoy their human rights, and it brings countless benefits to individuals and societies.

We also know that the Internet, like society, has a darker side and that it is used for illegal, abusive and hateful purposes. We all agree that more can be done to make people safer on the Internet, not least by ensuring that the rule of law is properly upheld online. We also believe that greater transparency over platforms’ policies, scrutiny of their use of algorithmic content moderation systems, and empowerment of users to challenge content moderation decisions would all improve individuals’ online experiences.

However, we do not believe that the Online Safety Bill will effectively address these challenges. On the contrary, we believe that the Bill, as drafted, will lead to censorship of legal speech by platforms, will undermine people’s privacy and security, putting them at greater risk of harm, and will give the government unacceptable controls over what we can and cannot say online. In this briefing, we set out our major concerns with the Bill.

Our key concerns

  1. It will mean online platforms, not courts, enforce UK law. The Bill requires online platforms to determine whether the speech of people in the UK is legal or not and then remove it if they believe it is illegal, undermining the rule of law. Private companies should not be making decisions over the legality of people’s behaviour; this is the role of transparent and accountable public authorities such as courts. More concerning is the fact that online platforms will inevitably turn to machines, not people, to make these difficult assessments. At the same time, the Bill does nothing to ensure that the police and courts are properly resourced to prosecute, convict and sentence those who break the law online, depriving victims of justice.
  2. The Bill will lead to the removal of protected speech considered legal but harmful. We are particularly concerned about the provisions of the Bill that will place pressure on the largest platforms to remove content the government has designated to be ‘harmful’. This means that behaviours and forms of speech that are permitted offline could be censored online, creating two different standards of permissible speech. It leaves it to the whim of the government of the day to decide what is subjectively ‘harmful’ in society and to then place pressure on online platforms to remove such content.
  3. It will mean constant online surveillance. To comply with their duties in the Bill, online platforms will be forced to take steps to prevent users from coming across illegal or ‘harmful’ content in the first place. In practice, this will mean constant monitoring of everything that people say and do. This form of ‘general monitoring’ is banned in many jurisdictions, including the European Union, but will now be effectively mandated in the UK. In fact, the Bill allows communications regulator Ofcom to mandate the use of ‘proactive technology’ to identify and remove any kind of content the platform believes could be illegal or content that is deemed to be harmful to children. These kinds of proactive technologies often have high rates of inaccuracy and incorporate a range of systemic biases, making them inappropriate tools for identifying illegal or harmful content in contexts where their decisions directly impact individuals’ freedom of expression.
  4. Private messages will no longer be private. The duties in the Bill will apply not only to public online spaces, but also to private communication channels such as WhatsApp. There is no way platforms will be able to comply with their duties without proactively monitoring these private channels. In an offline comparison, this would be equivalent to the Royal Mail opening and reading every letter posted, or telecoms providers listening to every phone call made. Our ability to communicate privately, which protects journalists, human rights defenders, and vulnerable and marginalised groups, should not be put at risk in this way.
  5. The sanctions are excessive and will lead to over-removal of protected speech. The Bill proposes a range of extremely heavy sanctions if an online platform fails to comply with its duties. These range from fines to the shutting down of websites and even the imprisonment of individual members of staff. Penalties such as these are commonplace in authoritarian regimes, not democracies. They would create a strong incentive for online platforms to ‘play it safe’ and remove all content that may potentially be harmful, further exacerbating risks to freedom of expression.
  6. Ofcom will no longer be an independent regulator. The degree of government control over the UK’s supposedly independent regulator, Ofcom, is unprecedented. The Bill gives significant powers to government ministers to determine what is ‘harmful’ content, to set out Ofcom’s ‘strategic priorities’, to tell Ofcom how it should carry out its duties, and even to direct Ofcom to modify codes of practice. Together, these provisions wholly undermine any suggestion that Ofcom will be fully independent and impartial as a regulatory body for online platforms.

There are parts of the Bill that we do welcome, including the requirement for Ofcom to publish a statement each year setting out the steps it has taken to ensure that the rights to freedom of expression and privacy are protected. We also support the statutory duty on online platforms to allow users and affected persons to easily make complaints in relation to the removal of content (as well as other content-related matters) and the provisions requiring online platforms to be more transparent about their content moderation policies.

However, the threats to freedom of expression and privacy are clear. It is vital that Parliament acts to materially amend this legislation in order to ensure these fundamental rights are not seriously damaged.