Digital Services Act: Appeal from European NGOs to the Members of the European Parliament and the Representatives of Member States of the European Union

Media Diversity Institute supports and signs the following letter

Honourable members of the European Parliament,

Honourable representatives of Member States of the European Union,

On 15 December 2020, the European Commission published its proposal for a “Digital Services Act” (DSA). That text, now under examination by the European Parliament and the Council of the European Union, takes on increased importance in a context marked by the spread of online hate speech, linked in particular to conspiracy theories and misinformation.

This online hate phenomenon predates the Covid-19 pandemic, but since its outbreak the audience of conspiracy theorists and extremists on social media has grown sharply. This ever-expanding dissemination of speech promoting exclusion, hatred and violence has a lasting negative impact on the democratic functioning of our societies, particularly on fundamental rights such as freedom of expression and of information. It has become clear that social media platforms bear a considerable responsibility in the fight against these threats, because of their capacity to remove or retain such content and to ensure its reliability.

The very essence of the European Union is to guarantee and ensure respect for these rights and freedoms, as defined in the Charter of Fundamental Rights of the European Union as well as in the European Convention on Human Rights (ECHR).

The undersigned NGOs wish that, by adopting the DSA, the European Commission, the European Parliament and the representatives of the Member States firmly commit themselves to the fight against online hate speech and put in place the necessary democratic guarantees for social media users.

1.     The DSA must clearly protect social media users: what is not allowed offline is not allowed online either

The new digital legal framework must provide users with effective legal instruments to combat hate speech and to seek sanctions against its perpetrators. The rights of victims, bystanders and citizens must be recognized and protected by giving them the right to effective remedies against the perpetrators of such speech and their accomplices.

To that end, the DSA should be based on a common definition of illegal hateful content, applicable in all member states of the European Union, and should therefore refer explicitly, at a minimum, to Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law, and to Regulation 2021/692 of 28 April 2021 establishing the “Citizens, Equality, Rights and Values” programme. In its first article, the 2008 Framework Decision commits member states to act against “public incitement to violence or hatred directed against a group of people or a member of such a group defined by reference to the race, colour, religion, descent or national or ethnic origin” and against speech that “publicly condones, denies, or grossly trivialises crimes of genocide, crimes against humanity and war crimes”, including online (“by public dissemination or distribution of tracts, pictures or other material”, in the words of the text). Regulation 2021/692 of 28 April 2021, in its article 4, explicitly invites member states to “prevent and combat inequalities and discrimination based on sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation” and “to support, advance and implement comprehensive policies aimed at combating racism, xenophobia and all forms of intolerance, including homophobia, biphobia, transphobia, interphobia and intolerance on the basis of gender identity, online and offline.”

2.     The DSA must facilitate the implementation of the partnership of trust between regulatory authorities, platforms, and representatives of organized civil society

The DSA allows the Digital Services Coordinator in each member state to grant trusted flagger status to NGOs, experts and public bodies. The monitoring exercises carried out to test the correct application of the 2016 Code of Conduct on countering illegal online hate speech underline the relevance of the trusted flagger system¹. They also demonstrate that, following a notification from a trusted flagger, platforms can remove illegal content without difficulty and within very short deadlines.

To strengthen the efficiency of trusted flaggers, they must have access to automated monitoring tools, particularly those designed by specialized public or private companies. Trusted flaggers, like Digital Services Coordinators, must also have the right to consult platform data that is useful in the fight against hate speech, in order to deepen their monitoring, in compliance with European data protection rules and in particular the GDPR.

Such an approach would allow them to improve and consolidate their work in order to promote a more balanced dialogue between civil society, institutional actors and platforms.

  1. The Code of Conduct was signed in 2016 between the European Commission and the main platforms: https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en

3.     The DSA must ensure the independence of trusted flagger NGOs, as well as that of Digital Services Coordinators

To ensure the proper functioning of this system, and in the face of the authoritarian tendencies observed in some member states, the DSA should define criteria guaranteeing the independence of national regulatory authorities. Likewise, the DSA must explicitly provide organisations and representatives of civil society who wish to become trusted flaggers with a procedure of appeal to the European regulatory authority. As a last resort, these organisations and representatives must also be able to appeal to the Court of Justice of the European Union.

4.     The DSA must affirm the responsibility of the platforms for the rapid removal and delisting of illegal content

The platforms’ business model does not give priority to the rapid removal or delisting of hateful content from social media. Nothing is more effective at stirring up controversy and attracting views than highly emotional content, such as a conspiracy theory posted by a user with many followers.

The DSA offers a real opportunity to put in place a binding legal framework that can encourage platforms to better protect users from hate speech. To this end, the platforms must, in coordination with the national Digital Services Coordinator and the trusted flaggers, set up a free, easy-to-use, accessible and efficient reporting system. The DSA should oblige platforms to respond at short notice to requests, reports and complaints made by users and trusted flaggers. If a platform refuses to remove content that is clearly illegal, the DSA must require it to provide the Digital Services Coordinator and the user with relevant information about the type of moderation used and the justification for the decision taken, as well as the possibility of appeal. Only a system providing for a judicial penalty in the event of a breach by the platforms will give them an incentive to remove hateful content as soon as they become aware of it. The sanctions that may be imposed on the platforms for non-compliance with the DSA must be a demonstrable deterrent and must be pronounced by a judge or an independent regulatory authority.

5.     The DSA must impose real transparency on the platforms

The DSA must provide for the supervision and guarantee of the platforms’ compliance with their reporting and transparency obligations, under the control of each Digital Services Coordinator.

In this context, the platforms must make public precise and relevant information on the human and technological means used to moderate content. These transparency reports must include the criteria and definitions used by the platforms for the removal or blocking of manifestly illegal content, as well as quarterly statistics for each member state. This data must in particular cover normal and priority reports, withdrawals, geo-blocking and other types of decisions, the format of the content, the type of hate speech, the type of moderation used and the time taken to act. The objective is to give regulatory authorities and trusted flaggers a better understanding of how the technological tools of moderation are designed, in particular those based on artificial intelligence, how they are used by platforms, and the role, number and training of human moderators.

Digital Services Coordinators must also have the technical and human resources needed to audit the platforms’ moderation work. Where failures are found, they must exercise their sanctioning powers. The penalties should represent at least 6% of the annual turnover of the platform concerned. In addition, the platforms must submit themselves to independent audits.

6.     The DSA must require the platforms to designate a legal representative in each member state

NGOs, users and regulators do not always have real interlocutors from the platforms at the national level. They also come up against non-EU legislation that does not allow the removal of illegal content or the transmission of the information about illegal content needed to initiate proceedings. The designation by the platforms of a national legal representative, legally responsible for the activity of the platform it represents and able to receive all notifications from users, regulatory authorities and judicial authorities (requests for information, procedural acts, decisions, etc.), would make it possible both to fight hate speech more effectively and to take better account of the interests of victims of hate speech.

7.     The DSA must affirm the importance of the monitoring of these provisions by an independent judge

Social media users and trusted flaggers must have the possibility of referring the matter to an independent judge in the event of a platform’s non-removal of reported content considered illegal. As part of an emergency procedure, the judge could, within a very short time, order the withdrawal, delisting or blocking of access to illegal content, to stop its dissemination and prevent it from causing damage. Referral to the judge must also be open to any interested party, in particular to users contesting a decision they consider unjustified with regard to the publication of their content.

8.     The DSA must allow an independent judge to prevent the republication of identical content and mirror sites declared illegal

In line with the judgment of the Court of Justice of the European Union of 3 October 2019, the courts and Digital Services Coordinators of the member states must be able to order a web host to delete all information identical and/or equivalent in content to content previously declared illegal, by resorting, “if this appears necessary”, to “automated research techniques and means”².

  2. Judgment of the Court of Justice of the European Union (CJEU) of 3 October 2019 in Case C-18/18, concerning a request for a preliminary ruling under Article 267 TFEU from the Oberster Gerichtshof (Supreme Court, Austria) in the proceedings Eva Glawischnig-Piesczek v Facebook Ireland Limited. On the model of the Copyright Directive of 26 March 2019, social media can set up automated filters to prevent the reappearance of copyright-protected content.

The undersigned NGOs invite you to respond positively to their appeal and to the commitment made by Ms von der Leyen on 16 September 2020 to fight effectively against “all forms of hate crimes and hate speech, whether they are based on race, religion, gender or sexuality”.

    • CEJI – A Jewish Contribution to an Inclusive Europe
    • Center Communautaire Laïc Juif (CCLJ) [Belgium]
    • CESIE-centro studi e iniziative europeo [Italy]
    • Dare to be Grey [Netherlands]
    • Estonian Human Rights Centre [Estonia]
    • Europäische Akademie Nordrhein-Westfalen [Germany]
    • European Association of Judges (EAJ)
    • European Council of WIZO Federations (ECWF)
    • European Federation of Centres of Research and Information on Cults and Sects (FECRIS)
    • European Forum for Urban Security (EFUS)
    • European Network for Education and Training (EUNET)
    • European Observatory for Non-Discrimination and Fundamental Rights (EONDFR)
    • Federation for EDucation for Europe (FEDE)
    • Fighting Online Antisemitism (FOA) [Israel]
    • Greek Helsinki Monitor [Greece]
    • Humanist Union of Greece [Greece]
    • I Am Here International
    • International Network Against Cyber Hate (INACH)
    • Robert Schuman Institute for Europe (IRSE)
    • International Confederation of Intellectual Workers (ICIW)
    • #jesuislà [France]
    • Latvian Centre for Human Rights [Latvia]
    • Ligue des Droits de l’Homme (LDH) [France]
    • International League Against Racism and Anti-Semitism (LICRA)
    • LOVE-Storm [Germany]
    • MDI UK [United Kingdom]
    • MDI Global
    • MDI Western Balkans
    • Minority Rights Group [Greece]
    • Never Again Association [Poland]
    • Respect Zone [France]
    • Robert Schuman European Centre (CERS) [France]
    • ROMEA [Czech Republic]
    • Union Syndicale des Magistrats [France]
    • University Women of Europe (UWE)
    • ZARA [Austria]