As UNESCO continues its process of developing Guidelines for regulating digital platforms (the Guidelines), ARTICLE 19 remains concerned about their potential implications for freedom of expression. We recognise UNESCO’s good intentions behind this initiative and appreciate some improvements in the current draft in comparison to earlier versions. However, the conceptual underpinning and many of the proposals in the Guidelines are problematic from the perspective of international freedom of expression standards. We suggest that UNESCO refrain from developing specific guidelines for digital platform regulation and, instead, focus on establishing a set of broad principles that can lay the groundwork for global multi-stakeholder discussions in this field.
The Guidelines, currently in their third version, are being developed by UNESCO following the Windhoek+30 Declaration on Information as a Public Good through multi-stakeholder consultations. ARTICLE 19 has actively participated in the process concerning the Guidelines. We published comments in response to the second UNESCO consultation, issued a statement regarding version 2.0 of the Guidelines, and attended the February 2023 Internet for Trust conference in Paris, during which the Guidelines were discussed, as well as several consultation meetings organised by UNESCO following the Paris conference.
ARTICLE 19 appreciates UNESCO’s good intentions behind this initiative and the consideration given to feedback from various stakeholders, which have resulted in considerable improvements to the document so far. However, we regret that our key recommendations are not reflected in the current version. In particular, we wish to highlight the following problems.
The general approach of the Guidelines
ARTICLE 19 has raised significant concerns regarding UNESCO’s general approach to these Guidelines.
These concerns include the lack of clarity regarding the intended objective, purpose, and precise legal nature of the Guidelines. They also extend to the Guidelines’ focus on government regulation as the primary means of protecting freedom of expression on online platforms. In particular, the Guidelines state that they ‘encourage as much worldwide consistency as possible in platform regulation policies to avoid internet fragmentation’.
Unfortunately, this approach persists in the latest version of the Guidelines. We disagree with the notion that a uniform or ‘consistent’ approach to platform regulation can be applied across different regions or jurisdictions. We reiterate our previous observation that ‘in many jurisdictions, independence of the regulator is not a reality and State regulation of digital platforms not desirable. In these contexts, the Guidelines may well be used to legitimise tightening control over online public discourse and silencing of critics’. Similar concerns have been raised by civil society actors from across the Asia Pacific region.
ARTICLE 19 remains convinced that to mitigate these risks, UNESCO should avoid providing regulatory guidelines and instead concentrate on establishing a set of broad and high-level recommendations that would underpin any regulatory approaches in the area. We therefore recommend that the Guidelines are transformed into such a set of broad principles that can serve as a foundation for global multi-stakeholder discussions in this area. These principles would focus on the transparency and human-rights based principles that have long been championed by UNESCO.
Lack of clarity
The Guidelines continue to lack clarity in their scope and specific recommendations. For instance, they appear to limit their scope to ‘large companies’ but do not provide clear criteria for determining what qualifies as a ‘large company’. Another example is that they state several times what platforms ‘are expected’ to do and that they ‘would be expected to take steps to restrict known child sex abuse material and live terrorist attacks’. What this means in legal terms and whether this is meant to have any implications for a platform liability regime is unclear. It is also unclear how the latter ‘expectation’ relates to the – important – recommendation for States to refrain from imposing a general monitoring obligation.
Focus on content removals
The Guidelines remain too focused on ‘detecting, identifying or removing content’, i.e. on regulating and restricting user speech, despite their stated emphasis on systems and processes. A related concern is that the Guidelines are misguided in suggesting that speech itself can ‘carry systemic risks’ (paragraph 59), when, in fact, systemic risks originate from the design and functioning of the platforms’ services.
We welcome the removal of the reference to ‘content that risks significant harm to democracy and the enjoyment of human rights’ from the previous version as well as the significant reduction of references to vaguely defined problematic content like ‘hate speech’ or ‘disinformation and misinformation’. It is, however, unclear why a specific section addressing online gender-based violence was retained, how it aligns with the overall approach of the Guidelines, and what the exact scope of this term is.
Enforcement of platforms’ policies
ARTICLE 19 is disappointed that the Guidelines continue to suggest that a regulatory system should have the power to ‘take enforcement action against the digital platforms deemed non-compliant with its own policies’. As we observed in our last statement, ‘it is problematic to tie enforcement actions in this broad manner to companies upholding their policies. These policies often provide for restrictions of speech that go well beyond those permitted under international human rights law and enable companies to censor many categories of lawful speech that they – or their advertisers – may consider harmful, inappropriate or controversial.’ It also, once again, brings back the focus on a wide range of content that companies pledge to be removing under their own policies.
Scope of the Guidelines
Finally, the Guidelines continue to suggest that direct messaging services may be within their scope and do not explicitly exempt them from any of the expectations placed on platforms to limit certain types of content.
We reiterate that such an approach could seriously undermine online encryption. As political discourse increasingly shifts towards restricting both encryption and anonymity, it is all the more important for the Guidelines to explicitly reaffirm users’ rights to encryption and anonymity. Both are vital for safeguarding users’ right to privacy and ensuring that they feel confident that they can freely express themselves in online communications.
ARTICLE 19 urges UNESCO to address all our comments from both this and previous submissions, as well as the concerns expressed by numerous civil society organisations around the globe regarding the Guidelines’ potential impact on freedom of expression.
We do not believe that a single additional round of consultation – as suggested in version 3.0 – will be sufficient or that it will lead to an adequate outcome by the second half of 2023. We therefore echo other civil society actors’ calls for a pause in the current timeline and a comprehensive re-evaluation of the Guidelines’ current approach.