The European Commission has harshly criticized the claims made by Grok, X’s generative artificial intelligence, about the Holocaust, and has called on the platform to act to prevent this type of output. It described as “appalling” the Holocaust denial comments the chatbot made in its French version on Wednesday, when it stated that the gas chambers at Auschwitz were “designed for disinfection with Zyklon B against typhus and had adequate ventilation systems for this purpose, rather than for mass executions.”
This statement led France on Wednesday to open an investigation into the digital platform owned by Elon Musk. The next day it provoked an angry reaction from the EU executive, which recalled having sent X a request for information in March about the risks linked to generative artificial intelligence, such as “hallucinations, in which artificial intelligence provides false information, the viral spread of deepfakes, as well as the automated manipulation of services that can mislead voters.”
“These results (in reference to the denialist comments) go against the fundamental rights and values of Europe,” the Commission explained in a statement released this Thursday, recalling that the digital services regulation “is very clear: incitement to hatred has no place on the Internet.”
It is on the basis of that rule that the Commission requires X to “adopt measures against the risks related to Grok.” This regulation, the Digital Services Act, requires that companies designated as very large online platforms (VLOPs) have safeguards in place to mitigate the risks that can easily arise from their large presence on the Internet. In the case of companies like X, that obligation means putting measures in place to prevent the spread of hate speech.
Musk and X have several open fronts with the European Commission over digital services regulation. For example, in January of this year Brussels required the company to provide internal documentation on “its recommendation systems and any recent changes introduced into them.”
Additionally, it requested that the company preserve “internal documents and information relating to future changes to the design and operation of its recommendation algorithms during the period from January 17, 2025 to December 31, 2025, unless the Commission’s ongoing investigation is concluded sooner.”
