Why we must stop biometric surveillance in the EU

MEP Patrick Breyer in conversation with Ella Jakubowska (EDRi)

19.04.2021

Surveillance camera / CC0 alberto-rodriguez-santana

EDRi (the European Digital Rights network) has launched a European Citizens’ Initiative calling for a ban on the use of biometric mass surveillance in public spaces. How could these technologies affect the lives of everyday people?

Ella Jakubowska (EDRi): Everyone’s daily lives risk being affected by biometric mass surveillance technologies in public spaces. Take the example of going to your doctor to get a prescription for some medicine, and then to a pharmacy to pick it up. It’s your right to be able to do this with respect for your privacy – it’s your health, after all. Or, let’s say your kid goes to a protest about something that they feel passionately about, such as the global climate emergency.

With public facial recognition, governments or corporations could be tracking these activities without your consent or even your knowledge. Despite it being none of their business, they could keep records of what you are doing, where you are going and who you are meeting. They’re treating you as if you are suspicious just for going about your daily life. Your kid’s face could be captured and used in the future to mark them out as a ‘troublemaker’, which could influence their education and job prospects. This isn’t just hypothetical: police in France, Slovenia and the UK are all using biometrics to monitor legitimate protesters despite not having a legal justification for doing so.

This sort of ‘Minority Report’ future is no longer unimaginable – which is why we need to act today to ban biometric mass surveillance practices.

Patrick Breyer (Greens/EFA): These technologies are also flawed and inaccurate. We’ve already seen innocent citizens routinely reported to the police because the technology gets it wrong. They have the potential to change our societies fundamentally. Surveillance methods based on the analysis of our individual body characteristics, such as facial features or movement patterns, turn us into walking barcodes that can be scanned anytime and anywhere. And there is little to no escape: while traditional digital surveillance is attached to your mobile devices, you cannot leave your face at home to escape biometric monitoring.

Automated detection of our behaviour, i.e., the way we walk, talk or look, will eventually cause us to adapt the way we act in public spaces. This form of self-censorship has already been reported from countries like Serbia, where the government started to roll out a massive biometric surveillance project throughout its capital city, Belgrade. We have heard reports of citizens refraining from participating in protests criticising the government because they were afraid that their faces might end up in police databases.

Feeling forced to adapt our behaviour is a violation of our fundamental rights and is bound to intensify with the increase of inescapable biometric surveillance of public spaces. Surveillance, distrust and fear may gradually transform our society into one in which citizens are afraid to be different. Where we’re scared of each other. Where we don’t dare to disagree. Where privacy is treated with suspicion because we should have “nothing to hide”. This is not the strong and diverse society we want to live in. We must not give up our freedoms in exchange for a little “security”.

Close-up blue eye / CC0 amanda-dalbjorn

In what way has the COVID-19 pandemic highlighted the need for a ban on these technologies?

Ella Jakubowska (EDRi): Unfortunately – but not surprisingly – many governments and corporations have exploited the pandemic to expand surveillance infrastructures and collect vast troves of sensitive data in ways that would never have been allowed before. This will have a huge impact on all of our lives for years to come. Legitimate public health concerns have been exploited to introduce unnecessary biometric identification tools like facial recognition in public spaces and creepy drone surveillance. In their rush to find quick solutions, governments are forgetting that data about our bodies, our faces and our identities needs the highest levels of protection.

The whole point of “emergency” responses is that they need to be limited to that emergency. In fact, we need our human rights to be protected even more stringently during a crisis because that’s when they are most vulnerable to abuse. The picture we have seen across Europe in response to the corona crisis has been the opposite. The Polish government, for example, used facial recognition in their app to monitor quarantine, which allowed them to gather sensitive and revealing biometric data about large swathes of the population.

Patrick Breyer (Greens/EFA): We have seen the use of face surveillance to look for quarantine offenders. This demonstrates that once the technology is established, its use will be continuously expanded. Ultimately, our every movement could be tracked.

Surveillance / CC0 maxim-hopman

Studies show that many of the technologies used are biased and can cause discrimination. What effect could their use have on societies and individuals’ rights?

Ella Jakubowska (EDRi): We know that facial recognition is a fundamentally racist technology, and its use in the US has led to a series of highly-publicised and traumatic false arrests of men of colour. The crux of the problem is how the use of biometric mass surveillance technologies can amplify and codify underlying inequalities and discrimination in our societies. The European Network Against Racism, for example, has reported how data-driven policing is exacerbating already discriminatory policing practices in Europe.

By definition, these technologies are designed to differentiate between people: who is ‘normal’ compared to who is not. Who fits into stereotypical boxes of gender and who does not? Who walks in an ‘odd’ way? Anyone who does not conform to this arbitrary, technologically imposed standard will be marked out as deviant or weird. In practice, the communities that already face the highest levels of societal discrimination, such as people with disabilities, people of colour, homeless people, and trans and non-binary people, will be the most harmed. Our report, Ban Biometric Mass Surveillance, has shown that pretty much every single human right can be violated as a result.

Patrick Breyer (Greens/EFA): Many people consider technology to be objective by default. The common assumption is that machines act neutrally, while humans are subject to error-prone biases that might impact objective decision-making. But this is not the case.

First of all, the algorithms that power biometric surveillance technologies are developed by humans who are inherently biased based on the way they have been socialized. Secondly, these algorithms are ‘trained’ to identify faces based on millions of pictures that were shown to them. These training data sets predominantly contain images of white males. Many studies have shown that people with darker skin tones, and particularly non-white females, are systematically misidentified by facial recognition technologies. The same goes for trans or non-binary people.

However, as Ella pointed out, it is not only the bias built into the algorithms that facilitates discrimination, but the ways in which they are applied. The use of facial recognition technologies and other forms of biometric surveillance is legitimized by allegedly enhancing the security of our public spaces. This means that communities that are already heavily monitored, such as migrant communities, will become subject to even greater surveillance. 

Vulnerable groups and minorities must be fully and actively involved in leading the campaign to ban biometric mass surveillance. It is our responsibility to listen to, strengthen and support the voices of those communities.

Wall of surveillance cameras / CC0 matthew-henry

Why is it urgent that we act now?

Ella Jakubowska (EDRi): In April, the European Commission is going to propose new laws on the use of facial recognition and other biometric technologies as part of their plans to regulate artificial intelligence across the EU. It is not often that we get the chance to bring in laws to stop the harmful use of technologies before they become completely embedded in our streets and city centers.

The European Citizens’ Initiative (ECI) launched by the Reclaim Your Face campaign gives us, as EU nationals, a rare chance to make our voices heard in policy debates that can often seem remote and not applicable to people’s lives. Biometric mass surveillance is something that will impact each of us, our friends, and our families. This ECI gives us the chance to speak up for our communities, and also in solidarity with people that are not EU nationals but will be the most harmed by this tech.

We have an opportunity now to call for genuine, positive change. We’re trying to raise awareness about why this is not just a decision for lawmakers in the conference rooms of Brussels – but for everyone that cares about protecting a democratic society.

Patrick Breyer (Greens/EFA): As a long-standing activist for civil liberties, and particularly for our right to privacy, I have witnessed a number of worrying transformations of our societies based on opening a Pandora’s box of surveillance technology. Once new technologies are put in place – almost always under the guise of strengthening citizens’ security – their exceptional use soon becomes the norm. In other words, as soon as we allow biometric surveillance technologies to be released upon our public spaces, there is no going back. We will not be able to control the expansion. We have witnessed this phenomenon with other technologies, such as CCTV and the tracking of mobile phones.

In the European Union, however, the European Commission has not proposed such a legislative ban and seems to be unwilling to do so – even though it is well aware of the dangers that come with these technologies: The Commission has classified biometric surveillance as a ‘high-risk’ technology that might have a highly dangerous impact on our fundamental rights. At the same time, Commission officials have told us that, according to the results of a consultation of stakeholders, only a small minority would favor a ban of biometric surveillance technologies. This is how they will probably justify not proposing a ban. The civil society campaign ‘Reclaim Your Face’ and a representative poll we commissioned will prove them wrong.

Parallel to this Citizens’ Initiative, the Greens/EFA Group in the European Parliament has started a campaign to push the Commission to propose a ban of biometric surveillance technologies in public spaces. With a series of (online) events, videos, studies, conferences and a soon-to-be released videogame, we aim to raise awareness of the dangers that come with these technologies.

With the united force of these campaigns, one initiated by civil society and one rooted within the European Parliament itself, we have a unique chance to pressure the European Commission into safeguarding our fundamental rights and proposing a ban on these highly intrusive technologies.

👉 We need at least a million signatures for the European Citizens’ Initiative. 👈
