European Commission Says X Must Remove Anti-Semitic, Terrorist Content

It isn’t clear how Elon Musk’s stance on free speech will be impacted by the European Commission. Credit: Ministério Das Comunicações

The European Commission, the executive arm of the European Union, announced on December 18 that it is formally investigating X, formerly Twitter, for not adhering to the Digital Services Act while operating in the European Union. “On the basis of the preliminary investigation conducted so far … which, among others, concerned the dissemination of illegal content in the context of Hamas’ terrorist attacks against Israel, the Commission has decided to open formal infringement proceedings against X under the Digital Services Act,” also known as the DSA.

The Electronic Frontier Foundation says users across all social media platforms, except X, have complained about the takedown of content expressing solidarity with Palestine. However, there remains the problem of old footage being presented as something it is not. This sort of material remains on X.

The European Commission began contacting social media companies in August 2023 about the policies they would need to have in place when new obligations took effect in November.

Very large online platforms, like X, were advised by the Commission to have measures in place to ensure no illegal content was distributed. This includes giving users a way to flag illegal content without being profiled, and mechanisms for removing such content and preventing its spread. For companies like X, TikTok and YouTube, this also means the platform must: “be audited by an independent auditor, share their data with the Commission and national authorities so that they can monitor and assess compliance with the DSA, allow vetted researchers to access platform data.”

Dina Sadek, a Middle East research fellow at the Atlantic Council’s DFRLab, told Al Jazeera, “There’s old and recycled footage circulating online that is overwhelming and makes it difficult for users to discern what is real and what is not.”

This is happening on the accounts of those who support either side of the conflict. But the European Commission is only focused on one side. 

“We simply can’t accept what we currently see: online platforms becoming a tool for terrorists. A tool for spreading anti-Semitic and violent illegal content,” said Vice-President of the European Commission Věra Jourová on October 18. “Incitement to terror, illegal hate speech, praising of the killings, disinformation – can trigger violence in real life.”

X has a voluntary community-based system for flagging inaccurate content, which the European Commission has praised. But on October 12, the body requested information from X related to “indications received by the Commission services of the alleged spreading of illegal content and disinformation, in particular the spreading of terrorist and violent content and hate speech.”

The letter further states, “In this particular case, the Commission services are investigating X’s compliance with the DSA, including with regard to its policies and actions regarding notices on illegal content, complaint handling, risk assessment and measures to mitigate the risks identified. The Commission services are empowered to request further information to X in order to verify the correct implementation of the law.”

The Commission is doing this “on the basis of the preliminary investigation conducted so far, including on the basis of an analysis of the risk assessment report submitted by X in September, X’s Transparency report published on 3 November, and X’s replies to a formal request for information.”

The DSA “obliges platforms to actively seek and take down terrorist content,” Jourová continued. “And if content is flagged, the platforms must take down terrorist content within 1 hour. If platforms do not cooperate, they can be fined up to 4% of their annual turnover.”

IP Correspondent