This blog is the published version of our introduction to the Privacy Platform's panel at the Computers, Privacy and Data Protection (CPDP) conference in January 2020.
A matter of perspective
The title of this panel discussion is ‘Facial recognition: a ‘convenient’ and ‘efficient’ solution looking for a problem?’. It is a good representation of the way technology more often than not finds its way into society. It represents a perspective that takes the technology as a starting point and tries to find a possible match with our society. It basically asks the question: “How and where can we use this technology?” Sometimes a bit of ethics or human rights is thrown into the mix, raising questions like: “Under what circumstances or conditions are we allowed to use this technology?” or “What safeguards should apply?”
Whose convenience and efficiency?
If we take one step back, we can ask whose convenience and efficiency we are actually talking about. When it comes to facial recognition surveillance technology in public space, two main parties will benefit from a slow and steady introduction of the technology: governments hungry for control and industry seeking product-market fit.
Governments are likely to face less resistance when far-reaching surveillance technologies are introduced slowly, because of the normalisation of the technology, and because citizens grow fatigued and weary after being mobilized by civil society against every expansion of the deployment.
Corporate calls for ethics committees serve to aid function creep, supporting a slow and steady expansion of the technology. Once the implementation of the surveillance technology is normalized over time, its use can be expanded, creating new demand for it.
“Two main parties will benefit from a slow and steady introduction of the technology: governments hungry for control and industry seeking product-market-fits.”
A more fundamental discussion
We believe this perspective distracts from a more fundamental discussion, skipping the question of whether we should want facial recognition surveillance in public space at all, and leaving us with insufficient protection of our fundamental rights and free societies.
When we take a fundamental rights perspective, a totally different set of questions will need to be addressed.
No demonstrable necessity
First, as the European Data Protection Supervisor wrote in his blog, any infringement on our fundamental rights must be demonstrably necessary. And in this regard, convenience and efficiency do not amount to necessity. Here we should be very critical, because it would not be the first time surveillance measures were introduced with a poorly motivated necessity. So is there any evidence that we need this technology at all? Are there really no alternatives that infringe less on our rights and freedoms, and have fewer consequences for our society?
Incompatible with our data protection framework
Secondly, any deployment of facial recognition surveillance will need a valid legal basis, but the technology is incompatible with our data protection framework. Both the GDPR and the Law Enforcement Directive forbid the processing of biometric data. They would not be legal instruments if there were no exceptions, but even then the processing must be strictly necessary and proportionate, respect the essence of the right to data protection, and provide for suitable and specific measures to safeguard the fundamental rights and interests of the data subject. Is that even possible? We don’t think so, since facial recognition surveillance in public space inherently requires mass-scale processing of biometric data.
Thirdly, the effects of the technology as a means of surveillance in public space will always be untargeted. Facial recognition surveillance in public space cannot be used as a targeted capability. The faces of everybody passing the camera system will be scanned and analysed, possibly even without people knowing about it. This means our ability to move through public space anonymously is at stake. Do we think it is acceptable to be reduced to walking barcodes?
“Do we think it is acceptable to be reduced to walking barcodes?”
No space for face surveillance
Then there is the issue of the actual physical space where the technology will be deployed. Human rights should be the same for everyone. If we want people to feel safe and not be discriminated based on their sexual preference, there is no space for face surveillance technology near gay saunas and bars. If we want people to be free and safe to practice their religion of choice there is no space for face surveillance near mosques, synagogues and churches. If we want people to engage in the public debate there is no space for face surveillance on our streets. This raises the question: In a free democratic society, is there space for face surveillance at all?
Last but not least, the effect of face surveillance is and will be discriminatory. Especially in a surveillance context, where the imagery is recorded in uncontrolled circumstances, facial recognition has an accuracy problem. This problem follows the lines of gender as well as skin color, and will reinforce and exacerbate systemic inequality. It is yet unknown whether it is possible to de-bias this technology. Even if it were, that might not solve the problem, since technology that works equally well on everybody might not work equally well for everybody. What is known are the examples of unethical data collection to still the hunger of machine learning algorithms in an attempt to wash the technology of racial bias.
Facial recognition ís a problem
We should not forget that fixing the technological deficiencies will not fix the problem: the application of facial recognition surveillance in public space is an even bigger threat to our human rights when it works.
To come to our conclusion: if we agree that we should prioritise our rights and freedoms in an open and free society, then facial recognition surveillance is not a solution looking for a problem; it is a problem.