DSA: Honecker sends his regards

"When there is hate content, content that calls, for example, for revolt or killing [...] platforms will be obliged to delete it. If they fail to do so, they will be immediately penalised", said Internal Market Commissioner Thierry Breton in July. According to him, the European Commission could have access to social networks blocked completely on the basis of the Digital Services Regulation (DSA) if operators fail to take action against illegal content during social unrest. "We have teams that can intervene immediately", Breton threatened.[1] One inevitably wonders who they are.

A Moloch of articles
So let's take a look at the regulation itself. The DSA was published in the Official Journal of the European Union on 27 October 2022 and will apply - in all its details - from 17 February 2024.[2] Certain obligations for very large online platforms and very large online search engines have already applied since 25 August 2023.

The regulation is a hodgepodge of 156 recitals and 93 articles. It is similar in structure and complexity to the General Data Protection Regulation (GDPR). But this time the stakes are higher than data protection: it is about regulating our communication on the Internet.

As a regulation, the DSA is directly applicable throughout the EU. Member States are nevertheless required to designate a digital services coordinator, or to entrust this task to an existing authority, by 17 February 2024 at the latest; the necessary legal bases must be created at national level. According to Article 50 of the Regulation, these coordinators "shall remain free from any direct or indirect external influence and shall not seek or take instructions from any other public authority or private party." So much for the theory.

This is where things get interesting, because it is hard to imagine these coordinators acting without instructions and without ties. In Germany, for example, the task is to be entrusted to the Federal Network Agency (Bundesnetzagentur). But the Federal Network Agency is an authority subordinate to the Federal Ministry for Economic Affairs and Climate Action. Even if it were not bound by instructions (which is unlikely), what about the ministry's supervisory oversight?[3]

The question becomes even more fascinating when we examine the powers of these coordinators (Article 51): they will, for example, be authorised to carry out inspections of any premises used by intermediary service providers; to impose "corrective measures"; and to ask the competent judicial authority of their Member State to order a temporary restriction of recipients' access to the service. The details of these measures are not specified. However, the measures taken by the coordinators must be "effective, dissuasive and proportionate". This does not bode well.

The return of snitches
The coordinators in the Member States must report regularly to the Commission. One may legitimately wonder how these coordinators, whether or not they are bound by instructions, are supposed to keep an overview of digital communication in their respective States. Here too, the Commission has come up with an idea: Article 22 regulates "trusted flaggers". This status is awarded by the coordinator to entities that meet certain conditions. In reality, these are informers who monitor communications on online platforms and intervene as soon as they suspect an infringement. The provider must give priority to reports from these trusted flaggers (via the easy-to-access, user-friendly reporting system that it is obliged to set up) and take appropriate action.

So who will these trusted flaggers be? We already know them; they have been with us for several years. They were brought to the attention of the general public in the summer of 2020 with the Covid-19 infodemic report.[4] AFP, CORRECTIV, Pagella Politica/Facta, Full Fact and Maldita.es, with the support of the Google News Initiative, formed the rapid response force in the fight against the "most viral and potentially dangerous allegations we encountered in spring 2020". They are all part of the International Fact-Checking Network (IFCN)[5], which in turn belongs to the Poynter Institute.[6] Its backers include - among others - the Democracy Fund, the Lumina Foundation for Education, the National Endowment for Democracy (NED), the Omidyar Network Fund and the Open Society Foundations (OSF).[7]

To put it in a nutshell: the Commission, with the help of the Member States, is setting up a surveillance system based on a tried and tested method of bottom-up informing and denunciation. For the German cultural and linguistic area, we know that this worked under Metternich, Hitler and Honecker. The Commission does this - perfidiously - by invoking the protection of citizens' fundamental rights, which it in fact restricts even further with this form of censorship.

It becomes even more perfidious when we look at the Commission's official impact assessment for this regulation:[8] "Positive effects on the internal market and competition can be expected, leading to an estimated 1 to 1.8% increase in cross-border digital trade". So once again the message is clear: the well-being of the internal market comes first. From now on, coordinators and trusted flaggers are to make sure of it.


This analysis was first published in Le Courrier des Stratèges on 1 August 2023

---

[1] https://twitter.com/franceinfo/status/1678299190030487553

[2] https://eur-lex.europa.eu/legal-content/FR/TXT/?uri=CELEX:32022R2065

[3] https://www.bvdw.org/der-bvdw/news/detail/artikel/digital-services-coordinator-in-deutschland/

[4] https://covidinfodemiceurope.com/report/covid_report_fr.pdf

[5] https://ifcncodeofprinciples.poynter.org/signatories

[6] https://www.poynter.org/

[7] https://www.influencewatch.org/non-profit/poynter-institute-for-media-studies/

[8] https://digital-strategy.ec.europa.eu/en/library/impact-assessment-digital-services-act