Watching the watchmen: Content moderation, governance, and freedom of expression
ARTICLE 19’s latest policy outlines how to regulate the content moderation of social media platforms in a way that protects the right to freedom of expression and information. ARTICLE 19 is calling for states to ensure that transparency, accountability, and human rights are at the heart of any new regulatory framework.
Until recently, social media companies benefited from broad or conditional immunity from liability for hosting illegal content. But in the last few years, the biggest social media platforms have repeatedly failed to grasp and address the concerns of their users and governments: from the Cambridge Analytica scandal, to Facebook’s failure to remove incitement to genocide against the Rohingya in Myanmar, to YouTube’s struggle to take down the video of the Christchurch terrorist attack in New Zealand. They have made no one happy.
In response to these criticisms, governments around the world have come up with various proposals on how to address these problems. But while these proposals claim to be about regulating platforms, in reality, they are more about regulating users’ content.
Effectively, some governments are asking social media companies to police our communication and decide what kinds of speech are ‘illegal’, or even ‘legal but harmful’. As a result, rather than tackling the root problems, these proposals would put even more power into the hands of very few private companies. What’s more, if regulators enforcing new rules are not financially and politically independent, governments can abuse them to exercise even greater control.
Our policy therefore sets out the arguments for and against greater regulation of platforms’ content moderation. Ultimately, we argue that States must refrain from unnecessary content regulation and, where regulation is necessary, must adopt the least restrictive measures possible, protecting media pluralism, diversity, and human rights, including the right to freedom of expression.
ARTICLE 19 recommends that, at a minimum:
- States should resist the temptation to unnecessarily regulate online content moderation. If the main problem is the concentration of power and dominance of a small number of platforms, then governments and lawmakers should address those concerns directly. We also advocate for oversight of social media companies by independent, multi-stakeholder institutions, such as Social Media Councils.
- Transparency, accountability, and the protection of human rights must form the overarching principles of any regulatory framework.
- Conditional immunity from liability for third-party content must be maintained, but its scope, and the applicable notice-and-action procedures, must be clarified. Removing or limiting platforms’ immunity from liability would give them an incentive to remove either too much or too little content.
- Governments must not impose a general obligation on social media companies to monitor content. This would likely lead to increased censorship and violate freedom of expression.
- Any regulatory framework must be strictly limited in scope. It should focus only on illegal (not ‘harmful’) content, should not apply to private-messaging or news services, and should only apply in the country that passes the regulation.
- Obligations under any regulatory scheme must be clearly defined. These should include transparency and an obligation to promote media diversity. They should not include compliance targets or an overly broad ‘duty of care’ to prevent ‘harm’.
- Any regulator must be independent in both law and practice, i.e. free from political or commercial interference.
- Any regulatory framework must be proportionate. Governments should not adopt measures that, while intended to hold large social media companies to account, in reality impose an undue burden on smaller services.
- Any regulatory framework must provide access to effective remedies for users. These should include internal complaint mechanisms, access to judicial remedies, and alternative dispute resolution mechanisms.
- Large platforms should be required to unbundle their hosting and content-curation functions and to make them interoperable with other services. Currently, platforms both host content and curate it using algorithms or human moderators. Unbundling would separate these two functions, so that users could choose a different provider to curate the content they see. This would encourage providers to compete to offer the service that best safeguards users’ privacy and freedom of expression. Users would finally have a viable alternative to switch to, without having to leave the platform they currently use. ARTICLE 19’s new policy, Taming Big Tech, deals with this in greater detail.
Over the last decade, large tech companies have proved unwilling, or too slow, to address the challenges that their platforms pose to freedom of expression and other rights. As societies around the world take forward important discussions on how to regulate the tech sector and social media platforms, our policy proposals aim to ensure that this is done in a way that respects freedom of expression and leads to greater transparency and better decision-making on the part of companies. They also aim to ensure that regulation does not further consolidate the power of the largest platforms.
We urge governments and legislators to implement our recommendations in laws and regulations in this area so that freedom of expression and other human rights are adequately protected, and so that no single entity – private or public – can control the flow of information in society.