The European Union is seeking information from social media platform X about cuts to its content moderation resources as part of its first major investigation into the company under tough new laws regulating online content.

The European Commission, the EU’s executive body, said in a statement on Wednesday that it had requested information from X under the Digital Services Act, its groundbreaking technology law that requires online platforms to take a much tougher approach to policing illegal and harmful content on their services.

The commission said it was concerned about X’s transparency report submitted to the regulator in March 2024, which showed it had cut its team of content moderators by nearly 20% compared to the number of moderators reported in a transparency report in early October 2023.

X reduced language coverage within the EU from 11 languages to seven, the commission said, again citing X’s transparency report.

The commission said it wanted further details from X on risk assessments and mitigation measures related to the impact of generative artificial intelligence on electoral processes, the dissemination of illegal material and the protection of fundamental rights.

X, formerly known as Twitter, was not immediately available for comment when contacted by CNBC.

X must provide the requested information about its content moderation resources and generative AI by May 17, the commission said. Answers to the commission’s remaining questions must be provided no later than May 27.

The commission said its request for information was a further step in a formal investigation into suspected breaches of the EU’s recently introduced Digital Services Act.

The commission launched formal infringement proceedings against X in December last year after concerns were raised about its approach to dealing with illegal content surrounding the Israel-Hamas war.

At the time, the commission said its investigation would focus on X’s compliance with its obligations to counter the spread of illegal content in the EU, the effectiveness of the social media platform’s steps to combat information manipulation, and its measures to increase transparency.

EU officials said the requests for information were intended to build on evidence gathered so far in the DSA investigation into X. That evidence included X’s transparency report from March, as well as responses to previous requests for information about what X is doing to address misinformation risks associated with generative AI.

The DSA, which came into force in November 2022, requires major online platforms like X to reduce the risk of misinformation and put in place strict procedures to remove hate speech, while balancing this with concerns about freedom of expression.

Companies found to have breached the rules face fines of up to 6% of their global annual revenue.