As part of a major overhaul of digital law, the European Commission on Wednesday proposed relaxing certain points of the General Data Protection Regulation (GDPR).
The GDPR has shaken up the field since it came into force in the EU in 2018. By imposing greater transparency on the collection and use of data, the text has changed certain practices.
The “ePrivacy” directive, amended after the GDPR came into force, brought “cookies” (trackers placed to follow users’ browsing) firmly into the public eye, since all platforms must ask for user consent.
But since the rise of artificial intelligence (AI), big technology companies have regularly voiced concern that European regulations are too strict.
Some companies, such as Meta and LinkedIn, have nonetheless managed to use European data to train their AI models.
- Why does the Commission want to change these rules?
In its proposed amendments, the Commission touts a simplification and clarification of European law in the digital sector.
Regarding cookies, it cites “fatigue related to consent and the proliferation of banners.” The text also intends to “stimulate the opportunities of a dynamic business environment (…), in particular in the sharing and reuse of data, in the processing of personal data or in the training of artificial intelligence systems and models.”
- Will the cookie banner disappear?
Fewer systematic banners asking for permission to store cookies: this is the direction in which the Commission wants to take the “ePrivacy” directive.
The rules would now be centered on the GDPR alone, and should allow Internet users to save their choices directly in their browser or in other applications, so as to avoid repeated banners.
However, “given the importance of online revenue sources for independent journalism”, media outlets will still be able to seek consent directly from Internet users who visit their sites.
- Will AI be able to use personal data?
The European executive introduces a new legal basis for the use of personal data by AI models.
On the basis of “legitimate interests”, companies will be able to feed their models during the training or testing phase, within the limits of users’ “interests or fundamental rights and freedoms”.
The Commission also wants to “clarify” the definition of “personal data”, restricting it to elements that allow a person to be clearly identified.
At the same time, the text published on Wednesday provides for other relaxations and simplifications. For example, for the notifications that must be sent to the authorities when personal data is leaked, the risk threshold triggering the obligation is raised and the reporting deadline is extended.
- Why is this development a cause for concern?
Several associations defending digital rights have spoken out against the Commission’s project.
Reacting to its formal presentation on Wednesday, the Austrian association Noyb condemned a text that “massively undermines the protection of Europeans”. “These proposed changes are a gift to America’s big tech companies because they open up a lot of new loopholes for their legal departments to exploit,” it added.
On Thursday, 127 European associations and organizations had already expressed concern about “the biggest rollback of digital fundamental rights in EU history”.
“GDPR is one of the few mechanisms that gives citizens a way to speak out against companies or powerful authorities when they overreach,” they said.
