Last Friday, the European Commission sent X its preliminary findings that the company is in breach of the Digital Services Act (DSA), the EU’s online safety regulation. After the announcement, some media outlets quickly reported that the Commission had “accused Elon Musk’s X of spreading false information.” Musk, in turn, accused the Commission in conspiratorial terms of having offered X an “illegal secret deal”: the Commission would not fine the company if X secretly censored speech without telling anyone.
Both characterizations are one-sided. The preliminary findings are narrower in scope than the original proceedings and say nothing explicit about the “information manipulation” that is still under investigation. And the “illegal secret deal” is neither illegal nor particularly secret.
To promote transparency and public debate, my organization, The Future of Free Speech, is tracking DSA enforcement. Drawing on the Commission’s official press releases, the following provides a factual overview of the DSA enforcement proceedings against X.
Origin of the findings
Last October, following Hamas’s attack on Israel, the European Commission sent X a request for information under the DSA concerning the “suspected spread of illegal content and disinformation, in particular terrorist and violent content and hate speech.”
The EU’s chief digital enforcer, Commissioner Thierry Breton, accompanied the request with a letter that raised serious concerns. He sent similar letters to Meta, TikTok, and YouTube. Twenty-eight civil society organisations, including my own, wrote to Commissioner Breton to express our concern that he appeared to be conflating illegal content with disinformation, which is normally protected by freedom of expression, among other issues.
Then, in December 2023, the European Commission opened formal proceedings against X. This enforcement phase gives the Commission the power to adopt interim measures and non-compliance decisions. According to the Commission’s press release, the opening of the formal proceedings focused on five main areas:
- Systemic risk. Very large online platforms (VLOPs) like X must assess and mitigate the systemic risks posed by their services (Articles 34 and 35 of the DSA). Systemic risk is a vague concept in the DSA that requires balancing multiple competing objectives, including adverse effects on civic discourse, public security, and freedom of expression. We have publicly expressed our concerns about these provisions on multiple occasions. The press release suggests that the systemic-risk obligations were an important basis for investigating the spread of illegal content and “information manipulation” on X.
- Notice and action mechanism. Online platforms must promptly inform users about content moderation decisions and provide them with information on possible remedies (Article 16).
- Deceptive interfaces. Online platforms must not design, organize, or operate their online interfaces in a way that deceives or manipulates users or that otherwise materially distorts or impairs their ability to make free and informed decisions (Article 25). In particular, the European Commission was concerned about the blue tick (more on this below).
- Advertising repository. VLOPs must compile a repository of the advertisements presented on their platform and make it publicly available through a searchable tool until one year after an ad was last displayed (Article 39).
- Researchers’ access to data. VLOPs must provide researchers with appropriate access to platform data (Article 40).
Before issuing its preliminary findings, the Commission sent X two further requests for information: one about its decision to reduce the resources devoted to content moderation, and one about its risk assessment and mitigation measures regarding generative AI.
What are the preliminary findings?
According to publicly available information, the preliminary findings sent to X are the first of their kind for any company. Given this, and the high profile of X and Musk, the findings have received significant media attention. But what do they actually say?
The preliminary findings build on the areas identified when the formal proceedings were opened, but, importantly, they are narrower in scope. They cover only three areas:
- Deceptive interfaces (Article 25). According to the Commission, the design and operation of X’s “blue tick” “verified account” interface departs from industry practice and deceives users. Since anyone can subscribe and obtain “verified” status, users’ ability to make free and informed decisions about the authenticity of accounts and the reliability of their content is undermined. The Commission also says there is evidence that malicious actors abuse “verified accounts” to deceive users.
- Advertising repository (Article 39). The Commission believes X fails to meet the DSA’s advertising transparency requirements. X does not provide a searchable and reliable advertisement repository and instead implements design features and access barriers that undermine transparency. This design limits the oversight and research needed to detect emerging risks in online ad delivery.
- Researchers’ access to data (Article 40). The Commission alleges that X falls short in giving researchers access to public data as required by the DSA. X’s terms of service prohibit eligible researchers from independently accessing public data, for example by scraping. Moreover, the process for granting access to X’s application programming interfaces (APIs) appears to dissuade researchers from carrying out their research projects or to leave them no choice but to pay disproportionately high fees.
It is noteworthy that the preliminary findings say nothing about the assessment and mitigation of systemic risks (Articles 34 and 35). On this point, the Commission’s press release states that the investigation into “the effectiveness of measures taken to counter the spread of illegal content and information manipulation” is ongoing. Because this line of investigation could have particularly adverse effects on freedom of expression, we would stress that it is important for the Commission to proceed with extreme caution. The preliminary findings are also silent on the notice and action mechanism (Article 16), which the press release does not mention at all.
[Table: Comparison of focus areas at the opening of formal proceedings and in the preliminary findings]
What about the “illegal secret deal”?
Despite Musk’s accusation of an “illegal secret deal,” the offer in question appears to be nothing more than the “commitment procedure” established by the DSA. This is suggested by Commissioner Breton himself, who claimed that it was X’s team that “has asked the Commission to explain and clarify the settlement process” for “[their] concerns.”
Through the commitment procedure, a VLOP can offer commitments to ensure compliance with the relevant provisions of the DSA and thereby bring the investigation to an end. This mechanism limits transparency and can be abused, for instance if the commitments go beyond what the DSA requires and restrict speech excessively, to the detriment of freedom of expression. But despite its flaws, the commitment procedure is clearly laid out in the law.
Moreover, the DSA requires the Commission to make its commitment decisions public. That alone may not be enough to allow adequate scrutiny of decisions that can affect political and public debate. Still, it seems disingenuous to characterize deals that are at least to some extent public as “secret.”
Judging from Musk’s public statements, X does not appear likely to take the commitments route. Following the preliminary findings, X may examine the documents in the Commission’s investigation file and respond to the Commission’s concerns. If the preliminary findings are confirmed, the Commission may adopt a non-compliance decision, which can entail a fine of up to 6% of X’s total worldwide annual turnover and an order requiring X to take measures to address the infringement. A non-compliance decision may also trigger an enhanced supervision period to ensure that X complies with the measures it takes to remedy the infringement.
For more information about this investigation and the Commission’s other enforcement actions, see The Future of Free Speech’s DSA Enforcement Tracker.