The European Union opened a formal investigation into Elon Musk’s X on Monday, examining the platform’s role in spreading ‘illegal content’ related to the Israel-Hamas war. X has come under fire recently for antisemitic content on the platform, losing a slew of advertisers in the process.
“The European Commission has opened formal proceedings to assess whether X may have breached the Digital Services Act (DSA) in areas linked to risk management, content moderation, dark patterns, advertising transparency, and data access for researchers,” said European regulators on Monday.
The Digital Services Act defines illegal content as any information that is not in compliance with European Union law, but the authorities didn’t say exactly which laws X is believed to be breaking. European regulators specifically called out content around the Israel-Hamas war as a factor in the investigation. The White House admonished Musk in November for promoting “anti-semitic and racist hate” with his own X account. In its transparency report, X says it was founded on a “commitment to transparency” and freedom of speech. This laissez-faire approach to moderation, combined with Musk’s own incendiary comments, has led to an exodus of advertisers including Disney, IBM, and Apple.
While the United States has strong free speech protections, many European countries impose harsh penalties for antisemitic speech. The German penal code prohibits publicly denying the Holocaust and disseminating Nazi propaganda, according to PBS. Regulators could potentially find that X violated Germany’s laws by allowing the widespread amplification of messages supporting Hitler, featuring swastikas, or containing other antisemitic content.
Last month, a Media Matters investigation found that ads from Apple and IBM appeared next to pro-Nazi messaging; Musk ultimately sued Media Matters for defamation. The EU will investigate whether X, designated as a Very Large Online Platform (VLOP) under the DSA, has put forth reasonable mitigation measures against the amplification of illegal content. European regulators will also examine advertisements on the platform, including which groups were shown certain ads.
Lastly, regulators will examine whether researchers have been given effective access to X’s platform data. Researchers and reporters have struggled to verify the statistics coming out of X, such as a claim that only 2 out of 500 million accounts saw antisemitic content next to Apple advertisements. The Center for Countering Digital Hate (CCDH), which researches hateful disinformation, says X has been unable to deal with “an unmistakable surge of extremism on the platform.”
“Since Elon Musk completed his takeover of Twitter – and especially since the October 7 atrocities carried out by Hamas in Israel – bad actors have been empowered and encouraged to spew antisemitism, lies, and hate with impunity,” said Imran Ahmed, founder of the CCDH, in a statement to Gizmodo. Musk also sued the CCDH in August, accusing it of scaring off advertisers.
In a tweet on Monday, X said it was cooperating with the investigation but asked that the process “remains free of political influence and follows the law.” X added that it is focused on creating a safe, inclusive environment while protecting freedom of expression.