As Israel and Hamas go to war, the Digital Services Act faces its first major test

The EU’s first actions in the context of the conflict will inevitably set standards for this landmark law’s long-term impact

Banner: European Internal Market and Industry Commissioner Thierry Breton speaks at a news conference in Brussels, February 23, 2023. (Source: Reuters/Yves Herman/File Photo)

The European Union’s long-awaited Digital Services Act, a package of regulations often billed as a once-in-a-generation rewriting of the rules of the internet, is undergoing its first real-world test. Following the October 7, 2023, Hamas terrorist attack and Israel’s subsequent declaration of war, European policymakers are grappling with the conflict’s broader ramifications for the global internet ecosystem and its billions of users.

The Digital Services Act (DSA) is a democratic attempt at setting clear standards for digital intermediary services, with a special set of rules for what are called “very large” online platforms and search engines operating in Europe. While these are European laws focused on European Union residents, their impact will be felt throughout the internet ecosystem and by users everywhere.

The full force of the DSA won’t be in effect until February 2024 (this date is subject to change), but the implementation process is already under way. In April 2023, the EU announced which platforms fall under the law’s jurisdiction. Those same platforms were required to submit their first risk assessments by October, and they are already working to facilitate the first independent audits required under the law, which are expected by the end of 2024.

In the meantime, the EU is furiously working to hire sufficient staff to implement the law and consulting with experts – including the DFRLab – to define what exactly is expected of companies, particularly on questions of transparency and data access. Only on October 20 did the Commission publish the final version of the Delegated Regulation that lays out what is expected in an independent audit.

The DSA is designed to address systemic risks – not over-indexing on individual pieces of content, but ensuring the information and technological systems we rely on are transparent, understood, and accountable. The expectation is that the law will enable regulators, the public, and researchers to have a better sense of what is actually happening on the platforms through which business is conducted and, as is the case right now, war is discussed, shaped, and reported. The DSA does this by introducing responsibilities and a system of accountability and transparency for all providers of “digital intermediary services,” and by creating more significant special rules for “very large online platforms” (VLOPs) and “very large online search engines” (VLOSEs). As the Commission noted, “Very large online platforms have to meet risk management obligations, external risk auditing and public accountability, provide transparency of their recommender systems and user choice for access to information, as well as share data with authorities and researchers.”

But as the European Union is midstream in defining how exactly companies should comply with these rules, a major conflict is challenging the task. The information environment surrounding the war between Israel and Hamas is the first public road test of the DSA during an international crisis. The EU’s first actions in the context of this conflict will inevitably set some standards for this landmark law’s impact and future approaches, and therefore merit examination.

Rubber meets the road: the EU’s first messages to platforms at the start of the conflict

When Hamas surreptitiously slipped into Israel on October 7, capturing and killing Israeli civilians, the internet exploded with videos, photos, and commentary – some real, some fake, and some recycled. The EU had already engaged in a number of public tiffs with X’s owner, Elon Musk, with particularly pointed engagement from Thierry Breton, European Commissioner for the Internal Market. As Israel mobilized its military and began bombing Gaza, experts quickly noted the difficulty of managing and confirming the flow of information on X, given Musk’s recent decisions to change verification rules, remove link headlines, and make a number of other controversial platform and policy changes, including the dismantling of the entire trust and safety team. In this context, public officials and independent researchers began asking how the DSA might apply to X at this moment, making the platform an unsurprising starting point for EU officials.

With the DSA still being formalized, the first sign of EU action ironically came in the form of tweets from Commissioner Breton’s X account containing screenshots of letters, signed by Breton and directed to the heads of tech companies subject to the law. The language of each tweet highlighted unique concerns for each platform, ranging from violent and illegal content to mis- and disinformation, the protection of minors, and civic integrity in upcoming elections.

Commissioner Breton’s tweets on DSA enforcement regarding the conflict. (Source: @ThierryBreton)

These letters, tweeted between October 10 and October 13, were addressed to four VLOPs: X (formerly Twitter), Meta, TikTok, and YouTube. Commissioner Breton also mentioned the letters in an October 18 speech to the European Parliament. The letters focused on the spread of disinformation and illegal content on the respective platforms in the context of the current conflict; in the case of Meta and YouTube, they included references to upcoming elections in European countries. In each letter, Breton discussed the requirements to comply with the DSA, suggested further formal requests would be forthcoming, and noted that how companies respond to requests could have ramifications in formal investigations.

In addition to his posts on X, Breton created an account on X competitor Bluesky on October 10 and cross-posted the letters there. The letters were not released on any other platforms, websites, or distribution mechanisms, nor was there any subsequent documentation or release from the European Commission as an institution. It is unclear whether the companies received the letters through any means other than the posts on X and Bluesky.

Timeline of public communications from the European Commission regarding the DSA and the current conflict. (Source: jmalaret/DFRLab)

The DSA empowers European authorities to make formal requests for information and to investigate companies; investigations can include site visits, the examination of internal systems and more sensitive information, and other mechanisms to confirm a company’s actions and compliance. It is important to note that we have no way of knowing what conversations the European Commission and the platforms are having, or what actions they are taking, beyond what is communicated publicly. As a result, there has been a fair amount of confusion over where these letters sit in the architecture of the DSA. That they were sent by Commissioner Breton, rather than by operational regulatory officials or bodies, added to this confusion.

Somewhat confusingly, the letters requested that all companies except for Google/YouTube respond within twenty-four hours. The language in the letters is unclear as to whether companies were asked to respond to the questions in the letters themselves or to the forthcoming formal requests for information mentioned in the text. Regardless, X is the only company to have issued a public response. While Breton’s letter was addressed to Elon Musk, X CEO Linda Yaccarino responded with a letter of her own, posted to the platform on October 12. In it, Yaccarino largely avoided discussion of X’s overall compliance with the DSA, instead touting the company’s community guidelines and system of collaborative “community notes.” Critics observed that her response failed to address the core concerns outlined in Breton’s letter: the lack of clarity on X’s community guidelines and moderation standards; graphic and false content circulating on the platform; and the gutting of its trust and safety team and resources. It is unclear if X had additional communications with the Commission.

The same day X posted Yaccarino’s response to the Breton letter, the Commission announced its first recognizable DSA action: a formal request for information from X, which could potentially signal the launch of an investigation. Unlike Breton’s letters, the European Commission announced this decision via an official press release; the Commission also tweeted the announcement from its institutional account, with Breton subsequently posting it on his individual X and Bluesky accounts.

The Commission’s press release stated that the request is focused on “X’s compliance with the DSA, including with regard to its policies and actions regarding notices on illegal content, complaint handling, risk assessment and measures to mitigate the risks identified.” It cited the powers the DSA provides the Commission to make these requests, but interestingly did not mention the current conflict. It also stated that X had until October 18 to answer questions related to “the activation and functioning of [its] crisis response protocol,” which is not mentioned in any other recent documentation, and until October 31 to answer broader inquiries into its risk mitigation measures.

On October 19, the Commission announced in its daily news update that it was sending similar requests for information to Meta and TikTok; at the time of writing, the Commission had not announced a similar request for YouTube. It subsequently tweeted about each request on its official X account, which Breton then retweeted on his own account. Breton did not post about these requests on Bluesky; the European Commission is not officially present on the platform.

Both posts stated that the Commission had requested the companies to provide information on “…measures [they have] taken to comply with obligations related to risk assessments and mitigation measures,” though there were also some differences. For Meta, the request focused on the “dissemination and amplification of illegal content and disinformation,” making specific reference to election integrity and “the terrorist attacks across Israel by Hamas.” For TikTok, the Commission focused on the spread of illegal content, citing “terrorist and violent content and hate speech” and “alleged spread of disinformation.” It also mentioned requests related to other parts of the DSA focused on the protection of minors online. The TikTok release did not specifically reference the current conflict, however.

The daily news update stated that Meta and TikTok had until October 25 to respond to questions “related to the crisis response” and until November 8 to respond to issues related to election integrity and minors online.

Understanding these actions in the context of the DSA

Making sense of this flurry of activity is not easy, though the individual letters from Breton to the platforms overlapped significantly in substance.

While the letters from Commissioner Breton referenced DSA procedures, they did not seem to have a basis in DSA processes. The language of the letters asked that platforms be “timely, diligent, and objective” in removing illegal speech and that they employ “proportionate and effective mitigation measures” to address disinformation. This language is similar, but not an exact match, to the DSA’s Articles 16 and 35, which respectively require all providers of hosting services to respond to notices of illegal content in a “timely, diligent, non-arbitrary and objective manner,” and all VLOPs to put in place “proportionate and effective mitigation measures, tailored to the specific systemic risks.”

The letters’ twenty-four-hour response window also seemed to reference the DSA’s investigation powers and the fines that can follow from non-compliance. While the DSA does empower the Commission to set a deadline by which a company has to respond to formal requests for information, there is no formal mechanism in the law for letters such as those sent by Breton, and the language in his letters was not clear as to whether it referred to forthcoming formal requests or to the requests in the letters themselves.

Likewise, we do not have access to the formal requests the Commission actually made of X, Meta, and TikTok. We can only parse what was stated in the October 12 press release and the October 19 daily news update, both of which lacked clarity. Formal requests for information are provided for under Article 67 of the DSA, which requires such requests to include “the legal basis and the purpose of the request…what information is required… the period within which the information is to be provided, and the fines provided for in Article 74 for supplying incorrect, incomplete or misleading information.” The press releases suggest this required information was included.

The EC’s references to X needing to report on its “crisis protocol” and to Meta and TikTok needing to report on “the crisis response” could refer to the DSA’s Article 36, which lays out the concept of a crisis mechanism. However, without more information, it is not clear whether this is the case or whether the difference in language used for X versus Meta and TikTok was intentional. It is possible the word “crisis” is being used as shorthand for the conflict between Israel and Hamas. Given the rush to apply the still-in-progress DSA to the current crisis, it is notable that the only Commission communications making direct reference to the conflict are Breton’s letters and the Commission’s public statement regarding its requests to Meta.

Certainly, these public comments should be understood in the context of upcoming European Union elections, in which Breton is expected to compete for the presidency. But as the DSA is still new and in development, we don’t know exactly what will come next. The Commission is empowered to require platforms to change policies and features, share information, open up internal systems to review, and even levy fines of up to 6 percent of a provider’s global turnover if the platform refuses to respond to requests or fails to comply with decisions. The law was designed to be collaborative, so a fair amount of back and forth is expected. How much of that occurs in the public eye, however, is yet to be known.

A moment of confusion

The general reaction to the EU’s opening salvo on the DSA seemed largely to be confusion. Without clarity on major portions of the DSA’s rules themselves, experts first struggled to determine whether the letters tweeted by Breton were official communications under the DSA or something else. While it is important to recognize that the EU is attempting to respond midstream during a highly volatile conflict, there is a risk that this confusing first move could undermine the credibility of the endeavor at its most critical moment. It is difficult to distinguish between what is a stopgap measure in the midst of a crisis and what is precedent-setting or an expression of the Commission’s long-term intent for implementing the DSA.

A number of civil society groups expressed alarm at the Breton letters, arguing that they conflated “disinformation” and “illegal content,” and that they established a precedent of an arbitrary twenty-four-hour response deadline not based in law. Groups further bristled at the letters pushing companies to respond quickly to law enforcement without specific reference to such law enforcement requests. These concerns are undoubtedly underscored by civil society’s earlier warnings about similar issues, raised when Breton suggested the DSA empowers the EU to shut down or sanction platforms for failing to remove “hateful content.” Notably, when sixty-seven civil society groups wrote to Commissioner Breton to ask him to clarify that point, his response left many feeling he had not addressed their concerns.

Indeed, the EU spent years carefully crafting this law to keep it from becoming a vehicle for state control of content moderation. Much of the promise of the DSA lies in the long-term hooks it provides for researchers and users alike to have more and better information. The law sets up a range of transparency and data-sharing requirements that are either still in design or just being rolled out for testing. For example, the EU recently launched the first version of its DSA Statements of Reason Database, which offers real-time updates from online platforms on what content is taken down and why. In February 2023, the EU also launched its Transparency Center with reports from platforms that signed onto the EU Disinfo Code of Practice, with updates as recent as September 2023. All of these tools could help us understand how large social media platforms moderate information in times of conflict and beyond. But we’re just getting started.
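To make this concrete, here is a minimal Python sketch of what research access to such a statements-of-reason resource could look like. The endpoint URL and the parameter and field names used here (SEARCH_URL, platform_name, created_from, created_to, decision_ground) are hypothetical placeholders, not the database’s documented API; the sketch simply illustrates the shape of the workflow a researcher might follow – pull takedown statements for one platform over a crisis window and tally the stated grounds for removal.

```python
# Hypothetical sketch: querying a DSA statements-of-reason resource.
# NOTE: SEARCH_URL and all parameter/field names below are placeholders,
# not the documented API of the EU's DSA Transparency Database.
from collections import Counter

import requests

SEARCH_URL = "https://transparency.dsa.ec.europa.eu/api/statements"  # hypothetical


def fetch_statements(platform: str, date_from: str, date_to: str) -> list[dict]:
    """Fetch statements of reason for one platform within a date range."""
    params = {
        "platform_name": platform,  # hypothetical parameter name
        "created_from": date_from,  # hypothetical parameter name
        "created_to": date_to,      # hypothetical parameter name
    }
    resp = requests.get(SEARCH_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("data", [])


if __name__ == "__main__":
    # Tally the stated grounds for removal during the first week of the conflict.
    statements = fetch_statements("X", "2023-10-07", "2023-10-14")
    grounds = Counter(s.get("decision_ground", "unspecified") for s in statements)
    for ground, count in grounds.most_common():
        print(f"{ground}: {count}")
```

If the database’s real interfaces mature as intended, this kind of simple aggregation is exactly what would let outside researchers verify, rather than take on faith, how platforms moderate content during a crisis.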

The actions of the last few weeks risk derailing sensitive and complex conversations about what information is most useful and in what form; who should have access to that information and how; what it means to have meaningful algorithmic transparency; and how the European Commission will approach its new powers. It will not be possible to leverage the DSA to improve our digital ecosystem without the active participation of social media platforms. How the EC approaches them in this crisis moment will undoubtedly have an impact on the direction of these bigger conversations.


Cite this analysis:

Rose Jackson and Jacqueline Malaret, “As Israel and Hamas go to war, the Digital Services Act faces its first major test,” Digital Forensic Research Lab (DFRLab), October 26, 2023, https://dfrlab.org/2023/10/26/as-israel-and-hamas-go-to-war-the-digital-services-act-faces-its-first-major-test.