Key Points:
- The EU is investigating social media platform X under the Digital Services Act (DSA). Concerns were raised after X cut its content moderation team by nearly 20%.
- The Commission is seeking information on X’s efforts to combat illegal content and the impact of generative AI. X has until May 17 to provide the information.
- The investigation focuses on X’s compliance with the Digital Services Act’s obligations regarding illegal content, information manipulation, and transparency.
- The Digital Services Act requires online platforms to mitigate disinformation risks and remove hate speech while protecting freedom of expression.
The European Union has launched its first major investigation into social media platform X, formerly Twitter, under the Digital Services Act, a robust new law regulating online content. This move follows concerns raised by the EU about X’s reduction in content moderation resources.
In a statement released on Wednesday, the European Commission disclosed that it had requested information from X regarding its recent cuts to content moderation teams. According to X’s transparency report submitted to the regulator in March 2024, the company slashed its team of content moderators by almost 20% compared to figures reported in October 2023. Furthermore, as revealed in the transparency report, X downsized its linguistic coverage within the EU from 11 languages to 7.
The Commission is particularly interested in understanding the impact of these reductions on X’s ability to combat illegal and harmful content online. It has also expressed concerns about the implications of generative artificial intelligence for electoral processes, the dissemination of illegal material, and the protection of fundamental rights. X has been given until May 17 to provide the requested information on its content moderation resources and generative AI. Additional answers to the Commission’s inquiries must be furnished by May 27.
This investigation is part of the Commission’s formal probe into potential breaches of the Digital Services Act. The probe was opened in December 2023 amid concerns over X’s handling of illegal content related to the Israel-Hamas conflict. The Commission’s investigation focuses on X’s compliance with its obligations to combat the dissemination of illegal content, its efforts to counter information manipulation, and its transparency measures.
The European Union aims to gather further evidence regarding X’s adherence to the Digital Services Act. This evidence includes X’s transparency report from March and previous responses to inquiries regarding its actions to address disinformation risks associated with generative AI.
The Digital Services Act, enacted in November 2022, mandates that large online platforms like X take proactive measures to mitigate disinformation risks and implement stringent procedures to remove hate speech, all while safeguarding freedom of expression. Violations of the law could result in fines of up to 6% of a company’s global annual revenues.