
Europol is set to amass one of the world's largest collections of porn of European youngsters

This story begins with the creation of a new European agency, but ends with one of the world's largest databases of amateur porn by European youngsters—built by the police.

Sharing is caring

The European Commission wants to be able to force technology companies to monitor their users' chats. Their aim is admirable: to combat child and adolescent sexual abuse. But the proposal is perhaps the biggest threat to confidential communications on the internet today.

Central to the proposal is the creation of a new authority, the EU Centre. That new authority is to share its office space with Europol, Europe's law enforcement agency supporting national police forces. This is necessary, according to the lawmaker, for the “improved data exchange possibilities” and to ensure that “staff [...] have more career opportunities without the need to change location.” And of course, the law stipulates that the two bodies “provide each other with the fullest possible access to relevant information and information systems [...]”. In practice, therefore, the separation between the two agencies will be no more than a planter.

Context matters

The EU Centre will be tasked with filtering reports of material of possible child abuse. Those reports come from platforms like WhatsApp or Signal. To monitor for potential sexual abuse of minors, those companies use artificial intelligence to assess the messages of their users. However, that technology is notoriously bad at assessing context. This matters because sharing an intimate photo in a conversation between two minors is only a problem if one of the two does so involuntarily. And computers are poor at making precisely that distinction, because they have difficulties interpreting context.

If you are a young person exploring your sexuality, or simply in a loving relationship, you should start to fear the police.

If the computer flags a particular photo, video, or chat as potentially abusive to children, the platform will often be unable to determine whether a child is seriously at risk. If there is a chance of abuse, the platform is obliged to report the material to the EU Centre. These platforms are incentivized to do so: better to report too often than too little, to minimize publicity risks.

Clearly unfounded

That EU Centre must then “expeditiously assess and process reports from platforms […] to determine whether the reports are manifestly unfounded […].” An image is only “manifestly unfounded” if there is clearly no sexual abuse. This is obvious if the picture depicts two elderly people. Or if the video shows a kitten drinking water from a bowl. But many of the photos will show bare skin or genitals. If, in those cases, it is not immediately clear that only adults are involved, young people could be involved. And then there could be abuse. And so, it is no longer “manifestly unfounded”.

This proposal harms everyone, even the very group it aims to protect.

The only way to determine whether it is actually abuse is by finding out the context. But the EU Centre doesn't know the context. And so just about any report of a photo with nudity and young people in it, and any conversation that is not overtly between adults, is potentially abusive and therefore not “manifestly unfounded”.

False positives will increase the workload

In that case, the EU Centre forwards the photo, chat, or video to the police of the relevant member state. If the user is Dutch, the report is thus forwarded to the Dutch police. They will have to investigate whether there really is abuse. The police must then look for the context: the story behind the photo or conversation. They have investigative powers to do so. So, they can knock on the platform's door, for instance, to investigate the user's conversation history. Then it will become clear whether there is abuse. Or not (as will often be the case).

The Irish police already have experience with this: of the more than 4,000 photos forwarded to them by a US children's rights organization last year, at least 10% turned out to be free from child abuse (an analysis of the Irish police figures can be found in a post by our Irish colleagues at ICCL). The actual number of false reports is probably much higher, as only 20% of reports were confirmed to involve child abuse. (But perhaps even more worrying: the police keep identifying data on all reports. All of them. Even those reports that have been determined to have nothing to do with child abuse. Just imagine being registered in such a database.)

Police build collection of amateur porn

Speaking of databases… If, in the EU Centre's estimation, the report is not “manifestly unfounded”, a copy of that report must also be sent to Europol. The report is thus thrown over the planter. Europol thereby gets access to a gigantic set of intimate photos, videos, and chats of young people in particular. All of these are intimate photos, and in numerous instances there is nothing wrong with them at all. Highly sensitive, according to the European Commission.

Europol gets access to a gigantic set of amateur porn of European youth.

The proposal isn't clear on what Europol should do with these reports. It will probably amount to something like “analyse” and “correlate and enrich with other information”. The proposed legislation also does not specify how long Europol may retain those photos. And yet, many of those intimate photos, videos, and chats fall outside Europol's mandate, which is, after all, “serious international crime and terrorism”.

What was the purpose again?

In short, if this proposal is passed, the consequences will be disastrous. If you are a young person exploring your sexuality, or simply in a loving relationship, you should fear the police. You can no longer trust that your most intimate conversations will remain just between you and your partner.

You do not help children and young people by making Europol compete with Pornhub.

And for politicians who are now thinking, “Good point, we should add a retention period”: you are missing the point. To protect children, there is no need at all for every photo, video, or chat that might indicate sexual abuse of young people to go to Europol. And certainly not if it is outside their mandate. After all, what was the purpose again? To protect children and young people, right? We can do that much better by empowering victims, making it easier for users to report abuse, increasing the capacity of the police's sex crimes squads, and making sure perpetrators are punished. Not by reading everyone's confidential communications, or letting Europol compete with Pornhub.
