“This legislation… is a Trojan horse: it presents a facade of respecting democratic principles… But behind this liberal facade, the exact opposite is happening: an attack is taking place against the constitutional order.”

European Commission President Ursula von der Leyen opened her speech at Davos this year by underscoring the “top concern” among the World Economic Forum’s partner companies, which also happens to be one of the Commission’s biggest worries: “misinformation and disinformation.” These two risks, she said, are “serious because they limit our ability to tackle the big global challenges we are facing – climate, demographics and technological changes, and spiralling regional conflicts and intensified geopolitical competition.”

The primary solution to the problem of mis- and disinformation, according to Von der Leyen, is to forge a grand coalition between “business and governments,” which, as luck would have it, fits snugly with the WEF’s primary mission in life: to promote public-private partnerships at all levels and in all areas of government, primarily for the benefit of its partner companies.

“It has never been more important,” VdL said, “for the public and private sector to create new connective tissue. Because none of these challenges respect borders. They each require collaboration to manage risks and forge a path forward.”

Through its Digital Services Act (DSA), the European Commission has already put into operation arguably the most ambitious manifestation yet of this grand coalition between government and business. The DSA imposes a legal requirement on very large online platforms (VLOPs) and very large online search engines (VLOSEs) to rapidly remove misinformation, disinformation, and hate speech. The European Commission has primary, but not exclusive, regulatory responsibility for these companies. In a few weeks’ time, the same requirements will apply to all other online service providers, though responsibility for execution and enforcement will lie with national authorities.

A Global Impact?

As time goes by, the implications of the DSA are likely to extend far beyond EU borders. Like its predecessor, the General Data Protection Regulation (GDPR), it could even have a global impact, through three mechanisms described by Anupam Chander, the Scott K. Ginsburg Professor of Law and Technology at Georgetown University, in his paper, “When the Digital Services Act Goes Global”:

First, companies could adopt DSA-compliant practices worldwide. This is a common form of the Brussels Effect in Anu Bradford’s account—when companies align their global practices with Brussels’ rules largely because of the possible efficiency of adopting those same standards worldwide. This is also the main mechanism in Nunziato’s account of the global effects of the DSA.

Second, governments might find much to envy in the Digital Services Act—which validates burgeoning efforts to bring the internet under government control, provides special tools for speeding up the removal of illegal content under local law, includes procedural rules that might limit the power of platforms to label or suppress other content, conveys power to evaluate risk mitigation measures, and sets out “break glass” crisis control mechanisms—complete with the possibility of getting six percent of the company’s global revenue for violations.

A third mechanism is possible as well. The European Union could itself promote the DSA as a global model, perhaps incorporating parts of it into its model free trade agreements.

As I noted in a previous post, the institution that gets to define what actually constitutes mis- or disinformation on very large internet platforms and search engines for the EU’s roughly 450 million citizens (as well as arguably untold millions of citizens far beyond Europe’s borders) is the European Commission itself:

The same institution that is in the process of dynamiting the EU’s economic future through its endless backfiring sanctions on Russia and which is mired in Pfizergate, one of the biggest corruption scandals of its 64-year existence.* Now the Commission wants to take mass censorship to levels not seen in Europe since at least the dying days of the Cold War. In this task it will have, in its own words, “enforcement powers similar to those it has under anti-trust proceedings,” adding that “an EU-wide cooperation mechanism will be established between national regulators and the Commission.”

As the DSA becomes directly applicable law across the EU’s 27 member states in the coming weeks, it is accompanied by a wall of silence in the mainstream media (quelle surprise!). Even on Twitter/X there is little discussion, which may mean that Elon Musk’s social media company is trying to abide by the EU’s new censorship regime after already facing an “illegal content” probe over the Israel-Gaza war. Most EU citizens, meanwhile, have probably never even heard of this new regulatory architecture being constructed around the worldwide web, making this arguably the quietest coup in modern European history.

One of the rare voices of criticism I have found on the matter is an op-ed in Berliner Zeitung by a retired German judge called Manfred Kölsch. Titled “Judge Warns: Freedom of Expression in EU Is in Acute Danger,” the article is worth reading in its entirety (click here for an English-language translation on German financial journalist Norbert Häring’s blog). But for the purposes of this post, I have included a few of the choicest excerpts (translated with the help of a fluent German-speaking family member).

“A Trojan Horse”

Kölsch begins the article by unpicking the Orwellian aspects of the DSA:

This legislation on digital services is a Trojan horse: it presents a facade of respecting democratic principles. The EU Commission stresses that the DSA is intended to establish “strict rules to safeguard European values” and Article 1 of the DSA directly states: “Everyone has the right to freedom of expression”.

Behind this liberal facade, however, the exact opposite is happening: an attack is taking place against the constitutional order. Due to the complexity of the matter and the sheer volume of information available, its introduction is going unnoticed. The DSA opens up the possibility of [EU or national authorities] demanding the removal of entries that are not unlawful from very large online platforms and search engines…

Platform operators are required to “pay particular attention to how their services could be used to disseminate or amplify misleading or deceptive content, including disinformation.” (Recital 84). In addition, Art. 34 of the DSA makes a clear distinction between unlawful information and information with only “detrimental effects”.

However, the term “disinformation” is not defined in the DSA. But in 2018 the EU Commission did define it as including information that can cause “public harm”. In doing so, it determined (p.4) that public harm is to be understood as “threats to democratic political processes and political decision-making as well as to public goods such as the protection of health, the environment and security”.

There can be no doubt that false, misleading or just inconvenient entries need not be unlawful. Nevertheless, they can be declared unlawful at any time on the basis of the DSA. The EU Commission sets the standard by which disinformation is judged. However, this means that politically unsavoury opinions, even scientifically argued positions, can be deleted, and not only that: if it is classified as unlawful, there are social consequences.

One inevitable result is that citizens begin self-censoring to align their messages on the platforms with what is currently acceptable within the corridors of power…. The cornerstone of any free society — the perpetual exchange of intellectual and political ideas, even with opposing opinions — will therefore crumble.

Another layer of censorship comes from the fact that the major platforms will have to analyse entries for “systemic risks” they may pose, evaluate them accordingly and then take “risk mitigation measures”. Systemic risks are deemed to exist if there are “likely (or foreseeable) adverse effects” on “social debate”, “public safety” or “public health”. Such entries must be deleted or blocked.

From Covert to Overt

We have already seen this sort of thing play out in the US, but in a broadly covert manner. As the Twitter Files disclosures showed, federal law enforcement and intelligence agencies helped to curtail, block and shadow-ban government-threatening lines of thought, such as suspicion and hostility toward vaccine mandates and interest in the Hunter Biden laptop, both of which have been well vindicated. Thanks to the EU’s Digital Services Act, the online censorship is about to become overt and legally permissible.

As Kölsch notes, the DSA threatens to choke public expression and debate on sensitive issues (a few examples off the top of my head: EU support for Israel’s war crimes in Palestine, Ukraine’s flailing military campaign, the Pfizergate scandal, the EU Commission and Council’s escalating economic war against EU Member Hungary over its refusal to support further expenditure on the Ukraine war, etc.) through a number of mechanisms:

Due to the generalised nature of the clauses used in the DSA, the platforms concerned will always find a reason to delete inconvenient entries. The coordinator will have the power to order sanctions, and the fact-checkers and content flaggers will have unlimited possibilities when it comes to submitting texts for deletion.

Unjustified deletions will be further encouraged by the use of automatic content recognition technologies, which is unavoidable due to the sheer volume of information to be processed. The European Court of Justice ruled (in a recent case concerning the General Data Protection Regulation; N.H.) that these technologies… are not capable of predicting the likelihood of future behaviour. Even the Advocate General at the ECJ has explained that the available technologies are not capable of making the judgements required by the DSA, e.g. whether an entry will have a foreseeable detrimental effect on the “public debate” or “public health” that would justify deletion…

[Nonetheless], due to the threat of fines of up to 6% of global turnover in the previous year for infringements, the platforms are incentivised to practise so-called “overblocking” (i.e. the excessive deletion of permitted expressions of opinion and information or the restriction of their dissemination; N.H.) for financial reasons alone.

The Commission and national EU governments will have additional means of applying pressure on platforms to ensure they respond quickly and robustly to their requests to remove illegal, harmful or misleading content. For example, the Commission and/or national governments will be able to impose fines of up to 1% of annual turnover if a company fails to comply with information requests under the Act. And to help expedite matters, a punitive fine of up to 5% of the average daily global turnover or revenues of the platform in the preceding financial year can also be levied.

One of the darkest aspects of the DSA regulation, says Kölsch, is the emphasis on preventative actions against illegal, mis- or dis-information, which sounds eerily reminiscent of “pre-crime”, the term coined by Philip K. Dick to express the idea that the occurrence of a crime can be anticipated and prevented before it even happens:

The monitoring obligation of all actors is preventative. It is always about “expected critical [effects]”,… “foreseeable adverse effects” on “social debate”, “public safety” or “public health”. The Advocate General at the ECJ has said what is legally necessary: These represent a “particularly serious interference with the right to freedom of expression” because, “by restricting certain information before it is disseminated, they prevent any public debate about the content”, thus “[d]epriving freedom of expression of its actual function as a motor of pluralism”. The Advocate General correctly points out that preventive information controls ultimately abolish the right to fundamentally unrestricted freedom of expression and information.

Contravening EU Laws on Freedom of Expression

Lastly, Kölsch warns that the DSA not only undermines Germany’s federalist model of governance but also contravenes many of the EU’s and national laws on freedom of expression and information:

This surveillance bureaucracy goes against federalism, which is anchored in the German constitution. Until now, media supervision was a responsibility of the 16 federal states (Bundesländer). According to the DSA, content-flaggers and fact-checkers are to be viewed as “trustworthy” if they have already proven themselves in the past in identifying objectionable content. In plain language this means: the previously known informants under the regime of the previously applicable Network Enforcement Act# will gratefully recognise that their position has now acquired the character of a monopoly.

A careful look behind the facade of the rule of law reveals that the DSA knowingly undermines the right to freedom of expression and information guaranteed by Article 11 of the EU Charter of Fundamental Rights, Article 10 of the European Convention on Human Rights and Article 5 of the Basic Law (Germany’s written constitution, approved by the Allies back in 1949 when the first post-war government was established in West Germany).

Here is the text of Article 11 of the EU Charter of Fundamental Rights:

Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.

Not everything about the DSA is bad. The Electronic Frontier Foundation (EFF), for example, has praised many aspects of the regulation, including the protections it provides on user rights to privacy by prohibiting platforms from undertaking targeted advertising based on sensitive user information, such as sexual orientation or ethnicity. “More broadly, the DSA increases the transparency about the ads users see on their feeds as platforms must place a clear label on every ad, with information about the buyer of the ad and other details.” It also “reins in the powers of Big Tech” by forcing them to “comply with far-reaching obligations and responsibly tackle systemic risks and abuse on their platform.”

But even the EFF warns that the new law “provides a fast-track procedure for law enforcement authorities to take on the role of ‘trusted flaggers’ and uncover data about anonymous speakers and remove allegedly illegal content – which platforms become obligated to remove quickly.” The EFF also raises concerns about the dangers posed by the Commission’s starring role in all of this:

Issues with government involvement in content moderation are pervasive and whilst trusted flaggers are not new, the DSA’s system could have a significant negative impact on the rights of users, in particular that of privacy and free speech.

And free speech and a free press are the foundation stones of any genuine liberal democracy, as the American Civil Liberties Union (ACLU) notes:

The First Amendment protects our freedom to speak, assemble, and associate with others. These rights are essential to our democratic system of governance. The Supreme Court has written that freedom of expression is “the matrix, the indispensable condition of nearly every other form of freedom.” Without it, other fundamental rights, like the right to vote, would cease to exist.


* While VdL was making her speech at Davos on the need to merge government and corporate power to censor illegal and objectionable information, a reasonably large majority within the European Parliament, consisting primarily of the three main voting factions, voted to keep the details of the COVID-19 vaccine contracts signed between the Commission and Pfizer and Moderna secret — despite the fact that the 2020 contract signed between Pfizer and the Commission was already published in its entirety by Italian broadcaster RAI in April 2021.

While acknowledging the “maladministration” involved in the Commission’s vaccine procurement practices, the European Parliament also voted to reject two proposals from the conservatives that aimed to pressure the Commission into digging up the infamous text messages between von der Leyen and Pfizer CEO Bourla. Those messages are currently the subject of a criminal investigation by the European Public Prosecutor’s Office as well as a lawsuit by the New York Times. 

Readers will be unsurprised to learn that the Parliament’s latest move to protect VdL from further scrutiny went virtually unreported in the Brussels-based media and was also barely discussed on Twitter. But the disconnect is rapidly widening between the total lack of transparency and accountability flaunted by Europe’s political elite and the Commission’s escalating attacks on even the most basic notions of privacy, anonymity and freedom of expression for the rest of us.

# From Wikipedia:

The Network Enforcement Act, also known colloquially as the Facebook Act (Facebook-Gesetz), is a German law that was passed in the Bundestag [in 2017] that officially aims to combat fake news, hate speech and misinformation online and can be seen as a precursor to the DSA. According to the Federal government of Germany, the law is necessary to combat an increasing spread of hate speech online, as well as defamation and fake news…

Like the DSA, the NEA incentivizes over-blocking, because it allows citizens and tech companies to make judgements on questionable speech immediately, blocking it within 24 hours, instead of allowing it to propagate or cause harm while waiting for a court’s decision…

Reporters Without Borders (RSF) stated that the Act could “massively damage the basic rights to freedom of the press and freedom of expression.” Human Rights Watch has called the law “flawed”, stating it could lead to unaccountable, overbroad censorship. It added that the law will set a dangerous precedent for other governments that also wish to restrict online speech by forcing companies to censor on their behalf. Indeed, RSF also noted that Germany’s law had influenced Russia’s implementation of its own hate speech law.

This entry was posted in Guest Post by Nick Corbishley.