The European Commission says one thing with the Digital Omnibus, but does something else
- 6 February 2026
Last November, the European Commission published the Digital Omnibus: a collection of legislative proposals that threaten to undermine all kinds of important digital rights. In this article, we dive deeper into what these proposals entail.
Unprotected encrypted data
The European Commission proposes to no longer qualify pseudonymized data as personal data. As a result, the General Data Protection Regulation (GDPR) would no longer apply to it. This is already the case for anonymized data, but the difference is that for anonymized data there is no key with which the data can be linked back to an individual. With pseudonymized personal data, such a key exists. Anyone with this key can find out who the person behind the pseudonymized data is. That is why data protection is important.
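That difference can be illustrated with a minimal sketch (the key and field names are hypothetical, not taken from any specific system): pseudonymization replaces an identifier with a keyed token, and anyone holding the key can regenerate the same token and link records back to the person.

```python
import hmac
import hashlib

def pseudonymize(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed token (HMAC-SHA256)."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

# Hypothetical pseudonymization key held by the data controller.
key = b"secret-pseudonymization-key"

# A "pseudonymized" record: the name is gone, but not irreversibly.
record = {"person": pseudonymize("alice@example.com", key), "diagnosis": "flu"}

# Anyone holding the key can recompute the token and re-link the record:
assert record["person"] == pseudonymize("alice@example.com", key)
```

Truly anonymized data, by contrast, has no such key: there is nothing to recompute, so the link to the person cannot be restored this way.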
In recent years, large-scale data collection and the development of technology have made it easier to link data together. By combining different data sets, it can even become possible to trace anonymized data back to a person. Think of travel data, for example: someone who keeps taking the same route to work can quite easily be identified. Because of this, there are high standards for anonymization, which require taking into account the possibility that someone may still be identified. The European Commission is now completely ignoring this by seeking to exclude pseudonymized data from the GDPR. And that has significant consequences for individuals in Europe, because pseudonymized data is collected at a large scale. Currently, safeguards must be in place to protect this data, including the consideration that the data can still be traced back to a person. With this proposal, all these safeguards disappear.
Special categories of personal data may be processed precisely when it is extra risky
Currently, the GDPR prohibits the processing of special categories of personal data. This includes, for example, data about a person's health, ethnicity, religion, sexuality, and biometric data. This prohibition exists because the processing of such data is inherently risky for the people it concerns. For example, a person's fingerprint or facial identification data are unique to that person. Misuse of this data can have serious consequences for the person themselves and/or for society at large. For example, AI applications can be used to create deepfakes that make someone appear to do or say something they never did or said, or to be somewhere they have never been. Data about someone's ethnicity, religion, or sexual orientation can be misused to discriminate against them. And data about someone's health can reveal a lot about their personal life. It is therefore logical that there is a ban on processing this data. But if it is up to the European Commission, that ban will be lifted for the development and use of artificial intelligence. This is completely incomprehensible, because it is precisely in AI applications that we see human rights risks, including risks to the rights to equal treatment, freedom of religion, and freedom of expression. Lifting the ban on the processing of personal data that carries demonstrably greater risks is contrary to the goal of the GDPR.
The AI Act is already being turned into a paper tiger
The AI Act was adopted in 2024, after years of negotiation and following a democratic process. Even before the AI Act has fully entered into force, the European Commission is trying to weaken important rights. For example, it proposes to abolish the registration requirement for high-risk AI systems if the provider itself does not see any "significant" risk to health, safety, or fundamental rights. So even if an AI application is considered high-risk under the law, the provider can decide that the risks are not so bad and choose not to include it in the register. As a result, people, NGOs, and regulators will no longer be able to monitor which high-risk AI applications are being used.
The European Commission also wants to delay the ability of regulators to impose fines on those that put dangerous AI applications on the market. This turns the AI Act into a paper tiger, even before the law has fully entered into force.
The European Commission says its goal is to promote innovation and economic growth. It also promises that fundamental rights will be protected. But nothing could be further from the truth. The proposals made by the European Commission have little to do with simplification and the reduction of administrative burdens. And if these proposals are adopted, the most important fundamental rights in these laws will be irreparably weakened. This will not only have a huge impact on the way we view and apply these laws. It will also radically change our society and private lives. Because if your most sensitive and intimate personal data can be used for AI, and your data can be processed on a large scale in encrypted form without anyone having to comply with the obligations of the GDPR, what protection remains?
Foreign Affairs shares our concerns
The Dutch Ministry of Foreign Affairs also warns that the European Commission's proposals pose too great a risk to fundamental rights. Moreover, the Ministry sees that the proposals do not fully correspond with the European Commission's stated objectives. The Netherlands therefore wants to ask for clarification on the proposals. We think this will not make much of a difference. To prevent our rights from being taken away, more is needed than a friendly conversation. The Netherlands and other EU Member States must send these poor proposals back to the Commission. Weakening and undermining laws that have been established through a democratic process does not suit the European Union. Nor does undermining the rights of European citizens and businesses through backdoor measures and under false pretenses.