{"id":7393,"date":"2021-06-24T13:40:51","date_gmt":"2021-06-24T13:40:51","guid":{"rendered":"https:\/\/www.greens-efa.eu\/dossier\/?p=7393"},"modified":"2021-09-23T10:23:25","modified_gmt":"2021-09-23T10:23:25","slug":"reinforce-rights-not-racism","status":"publish","type":"post","link":"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/","title":{"rendered":"Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe"},"content":{"rendered":"<div class=\"gouter \" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p><em><a href=\"https:\/\/www.greens-efa.eu\/dossier\/renforcer-les-droits-pas-le-racisme\/\">Lire cet article en fran\u00e7ais<\/a><\/em><\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p style=\"text-align:left\" class=\"has-medium-font-size\"><strong><em><a rel=\"noreferrer noopener\" aria-label=\"Gwendoline Delbos-Corfield MEP (opens in a new tab)\" href=\"https:\/\/twitter.com\/GDelbosCorfield\" target=\"_blank\">Gwendoline Delbos-Corfield, Greens\/EFA MEP,<\/a><\/em><\/strong><em> in conversation with Laurence Meyer (Digital Freedom Fund) <\/em><br><\/p>\n<\/div><\/div>\n\n<div class=\"gouter wp-block-separator\" style=\"\"><div class=\"greens-content blk_core_separator\">\n<hr class=\"wp-block-separator\"\/>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter \" style=\"\"><div class=\"greens-content blk_core_heading\">\n<h4 style=\"text-align:left\"><em>What is biometric mass surveillance?<\/em><\/h4>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">Biometric mass surveillance is the monitoring, tracking, and otherwise processing 
of the biometric data of individuals or groups in an indiscriminate or arbitrarily targeted manner. Biometric data includes highly sensitive data about our body or behaviour. When used to scan everyone in public or publicly accessible spaces (a form of mass surveillance), biometric processing violates a wide range of fundamental rights.<br><\/p>\n<\/div><\/div>\n\n<div class=\"gouter wp-block-image\" style=\"\"><div class=\"greens-content blk_core_image\">\n<div class=\"wp-block-image\"><figure class=\"aligncenter\"><img loading=\"lazy\" width=\"1024\" height=\"680\" src=\"https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/tobias-tullius-4dKy7d3lkKM-unsplash-1024x680.jpg\" alt=\"\" class=\"wp-image-7398\" srcset=\"https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/tobias-tullius-4dKy7d3lkKM-unsplash-1024x680.jpg 1024w, https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/tobias-tullius-4dKy7d3lkKM-unsplash-300x199.jpg 300w, https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/tobias-tullius-4dKy7d3lkKM-unsplash-768x510.jpg 768w\" sizes=\"(max-width: 706px) 89vw, (max-width: 767px) 82vw, 740px\" \/><figcaption>Mass surveillance \/ \u00a9tobias-tullius on unsplash\n\n<\/figcaption><\/figure><\/div>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><strong>Gwendoline Delbos-Corfield: <\/strong>&#8220;Thanks for joining me today in the context of the Greens\/EFA campaign to ban biometric mass surveillance in public spaces. Biometric mass surveillance is progressing fast and we know it poses a threat to our fundamental rights. 
We recently had a great discussion following our online screening of the<a href=\"https:\/\/www.pbs.org\/independentlens\/documentaries\/coded-bias\/\"> <em>Coded Bias <\/em>documentary<\/a>, during which we focused on the risks and challenges of mass surveillance and algorithmic transparency. I\u2019m happy we\u2019re able to continue that conversation today. I\u2019d like to focus on something we are not talking about enough: how these surveillance technologies can be discriminatory.<\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">As the racial and social justice lead at the<a href=\"https:\/\/digitalfreedomfund.org\/\"> Digital Freedom Fund<\/a>, could you tell us a bit more about how the use of biometric mass surveillance technologies can discriminate against people?<\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><strong>Laurence Meyer:<\/strong><em>&#8220;Thank you for inviting me to this important discussion! To answer your question, we first need to recognise that all systems of surveillance have a discriminatory impact in societies in which racism, hetero-sexism, cis-genderism, ableism, classism, and so on, are systemic. 
Multiple<\/em><a href=\"https:\/\/equineteurope.org\/2020\/the-other-pandemic-systemic-racism-and-its-consequences\/\"><em> studies<\/em><\/a><em> have shown that systemic discrimination is very real across Europe.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>What we mean when we talk about systemic discrimination is that certain people are negatively impacted (and others are positively impacted) in their everyday life, due to the way they are categorised by certain attributes. This is not only on an interpersonal level \u2013 through homophobic insults, for example &#8211; but also on a macro level, when looking for housing, when job-hunting, when in education, when crossing borders, when in contact with the police, etc. Concretely, it means that the way I am identified, according to certain criteria (skin pigmentation, how I walk, the make-up I wear or don\u2019t wear, the shape of my nose or the width of my shoulders) has direct consequences on my access to resources.&nbsp;<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>For Western systems of facial recognition, it has been quite widely documented that the criteria used to differentiate and classify people (Is this person a man? A woman? A white woman? A black woman?) have led to the misidentification of people who are not white cis-men. Dark-skinned black women and dark-skinned non-binary persons are regularly being misidentified by facial recognition technology. In some cases, they are even misidentified as monkeys. 
This clearly isn&#8217;t far off from the historic tropes that fuelled racist imagery.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>This has highly problematic consequences when these same systems are used in education, at borders, by law enforcement, and in all areas where the problems caused by systemic discrimination have long been documented. The use of these systems can even lead to wrongful arrests.<\/em><br><\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div data-cont>\n\t\t\t\t\t<div class=\"sq ce  ce-img-right\" style=\"\">\n\t\t\t\t\t\t<div data-adders=\"0.6;do\" class=\"img ani-slide-left\" style=\"background-image: url(https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/claudio-schwarz-purzlbaum-9XFwNI21Qsk-unsplash.jpg);background-size:cover;\"><\/div>\n\t\t\t\t\t\t<div data-adders=\"0.6;do\" class=\"pr ani-slide-right\"><div class=\"txt tl z8 fs-subline \"><u>What is biometric mass surveillance used for and who uses it?<\/u><\/div><\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>The other problem is the criteria used to identify people in cases of mass surveillance. This could be, for example, when law enforcement officers use facial recognition technology to look for people in a public space who could match their watch list. Even if the people are not misidentified, discrimination is still likely to occur because of the content of the watch list. A good example of this is the Gang Matrix, a database developed by the London police. 
Many young black men ended up on this database without ever having been accused of a crime, and sometimes even after having been victims themselves. If this database were to be used in conjunction with a facial recognition system, it would lead to many innocent people being monitored closely, and possibly even to wrongful arrests based solely on racial criteria.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>This is what we mean when we talk about the over-policing and the over-surveillance of racialised bodies.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>Finally, because facial recognition systems are built on the assumption that the way we look tells you everything you need to know about a person, they reinforce problematic ways of categorising us. If we\u2019re serious about eliminating all forms of oppression and making sure that the way we look becomes irrelevant to whether or not we can access resources, the development of these technologies is a huge step in the wrong direction.&#8221;<\/em><br><\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><strong>Gwendoline Delbos-Corfield:<\/strong> &#8220;You already mentioned it, but I want to come back to this important point. We know that these systems have higher inaccuracy rates for underrepresented groups such as women, people with darker skin, and other marginalised groups. 
In fact, a recent<a href=\"http:\/\/proceedings.mlr.press\/v81\/buolamwini18a\/buolamwini18a.pdf\"> study<\/a> demonstrated that, when attempting to determine gender, error rates in commercial facial analysis programmes were more than 34% for darker-skinned women, compared with less than 0.8% for lighter-skinned men. Can you tell us a bit more about why facial recognition technologies have significantly higher error rates for some groups of people?&#8221;<br><\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><strong>Laurence Meyer:<\/strong><em>&#8220;There is a short and a long answer to this question. Firstly, I would say that there are higher error rates for white women, white gender diverse persons, men of colour, women of colour and gender diverse persons of colour. A longer explanation, but one that is important to mention, is that all these groups of people aren&#8217;t misidentified at the same rate.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>The short answer is that because facial recognition systems are being trained mostly on white cis-men and designed by the same demographic, they are better able to recognise their features.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>The longer answer is that these systems reproduce the way their creators understand the difference between men and women. 
If one thinks, consciously or not, that a man is a man or a woman is a woman because of specific bodily attributes &#8211; for example, the width of their shoulders, their size, the shape of their faces, the colour of their skin &#8211; this way of thinking tends to exclude a lot of people. If this bias is fed into the algorithms used by facial recognition systems, it amplifies the exclusion. These technologies reproduce systems of exclusion that predate them.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>It makes me think of Sojourner Truth\u2019s speech, \u201c<\/em><a href=\"https:\/\/thehermitage.com\/wp-content\/uploads\/2016\/02\/Sojourner-Truth_Aint-I-a-Woman_1851.pdf\"><em>Ain\u2019t I a Woman?<\/em><\/a><em>\u201d and the historical exclusion of women of colour, and specifically of black women, from womanhood on the grounds of bodily features. It also brings to mind how disabled persons have been historically dehumanised. We have to remember that, for many people, being human doesn&#8217;t have much to do with being a man or a woman. And actually, being a man or a woman doesn&#8217;t necessarily have so much to do with physical characteristics. This issue of higher error rates concerning certain categories of people touches upon something much deeper than just a question of bias. In my view, it is the question of who is worthy of attention and identification and who is worthy of being seen. 
We can draw parallels with the portraits we see in most European museums\u2026 and the portraits we don&#8217;t see.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">Certain faces are overwhelmingly represented, while others are hardly seen at all. It poses the question: who gets to decide which technology is useful to all of us? How could we do better? Technological issues cannot be, and should not be, disconnected from the bigger societal picture. They do not appear <em>ex nihilo<\/em>, but are products of a certain vision of the world.&#8221;<\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><strong>Gwendoline Delbos-Corfield:<\/strong> &#8220;I definitely agree with this. We tend to think that technology is disconnected from our societies, but the reality is that the two are interconnected. One of my biggest concerns is that citizens living in countries that have questionable records on fundamental rights and the rule of law might now see their governments using these technologies to further restrict their rights.<\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">In Hungary, the government continues to restrict the lives of LGBTIQ+ people. If biometric technologies get into the wrong hands, autocratic governments will be able to monitor and control the lives of their political opponents and marginalised groups to an even larger degree. 
In Serbia, a country which Freedom House rates as only \u2018partly free\u2019, it seems as though the government has already begun deploying high-resolution cameras equipped with facial recognition technology in the city of Belgrade. If we\u2019re not careful, Belgrade could become the first European city to be totally covered by this biometric surveillance technology. This is happening right on the doorstep of the European Union.<\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">Yet, many people still question whether biometric mass surveillance is really an issue here in Europe. <\/p>\n<\/div><\/div>\n\n<div data-cont>\n\t\t\t\t\t<div class=\"sq ce  ce-img-left\" style=\"\">\n\t\t\t\t\t\t<div data-adders=\"0.6;do\" class=\"img ani-slide-right\" style=\"background-image: url(https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/jurgen-jester-_PizUeTnvFE-unsplash.jpg);background-size:cover;\"><\/div>\n\t\t\t\t\t\t<div data-adders=\"0.6;do\" class=\"pr ani-slide-left\"><div class=\"txt tl z8 fs-subline \"><u>Is biometric mass surveillance really an issue in Europe?<\/u><\/div><\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\n<div class=\"gouter \" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p><br><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">I often hear that mass surveillance is an American or a Chinese problem. In the U.S., there are well-documented wrongful arrests, like the case of Nijeer Parks in New Jersey. 
In China, there is widespread use of government surveillance as part of the social credit system.&nbsp;<\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">With the European Commission\u2019s proposal for a regulation on Artificial Intelligence now on the table, where do we stand in Europe on biometric mass surveillance? What can we learn from other countries\u2019 contexts? And how do we raise awareness of the dangers of mass surveillance here in Europe?&#8221;<br><\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><strong>Laurence Meyer:<\/strong><em><strong>&#8220;<\/strong>I think this belief that mass surveillance is a US or a Chinese issue, one that doesn&#8217;t concern Europe, clearly points to the lack of coverage that biometric mass surveillance receives here. In the U.S., researchers, and specifically many women of colour, have published studies that journalists can use to support their investigative efforts. In Europe, studies that take an intersectional approach struggle to receive financial support.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>This can give the impression that the problem doesn&#8217;t exist. This problem of visibility is particularly acute when talking about the harmful use of biometric technologies because they can often be used without us knowing. This is the case when facial recognition technology (FRT) films us covertly or when, as was the case in Sweden, law enforcement officers use a facial recognition app in their everyday work without any prior authorisation. 
The reality is that we don\u2019t know the extent to which facial recognition systems are used, or whether they have already led to wrongful arrests in Europe. And that is, in and of itself, really worrying.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>But we know of many cases in which FRTs were deployed without a sufficient legal framework. EDRi compiled a<\/em><a href=\"https:\/\/reclaimyourface.eu\/evidence-in-eu-countries\/\"><em> pretty comprehensive list of them<\/em><\/a><em>. This clearly shows that it is far from being just an American issue.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>Another thing to add is that we already know that wrongful arrests are happening in Europe. This is particularly the case in law enforcement practices with discriminatory dimensions, such as identity checks without any substantive suspicion of wrongdoing. In France, in 2018, a French-Cameroonian man was<\/em><a href=\"https:\/\/www.20minutes.fr\/societe\/2229407-20180301-cru-blague-comment-ressortissant-francais-retrouve-enferme-centre-retention\"><em> sent to a detention facility<\/em><\/a><em> after an identity check because he couldn\u2019t present his ID to the officers who stopped him in the street.<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>The American cases show us one thing for sure: the use of biometric tools in policing doesn&#8217;t prevent wrongful arrests, nor does it prevent harmful and discriminatory treatment. It does, however, increase surveillance of all of us, marking us as data to be registered, identified and categorised. 
This is also a European problem.&nbsp;<\/em><\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em>Systemic discrimination isn\u2019t just an American problem. Mass surveillance isn\u2019t just a Chinese problem. Biometric technologies are increasingly being used everywhere, whether we know about it or not. Biometric mass surveillance won&#8217;t magically make existing problems disappear. Instead, it will amplify them.&#8221;<\/em><br><\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><strong>Gwendoline Delbos-Corfield:<\/strong> &#8220;These technologies really do magnify the discrimination that women, people of colour and other marginalised groups already face in their everyday life.<\/p>\n<\/div><\/div>\n\n<div class=\"gouter \" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p><br><\/p>\n<\/div><\/div>\n\n<div data-cont>\n\t\t\t\t\t<div class=\"sq ce  ce-img-right\" style=\"\">\n\t\t\t\t\t\t<div data-adders=\"0.6;do\" class=\"img ani-slide-left\" style=\"background-image: url(https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/arvin-keynes-V4mNfkDmiX4-unsplash.jpg);background-size:cover;\"><\/div>\n\t\t\t\t\t\t<div data-adders=\"0.6;do\" class=\"pr ani-slide-right\"><div class=\"txt tl z8 fs-subline \"><u>What can we do to stop biometric mass surveillance?<\/u><\/div><\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">The Greens\/EFA group are fighting for the rules on biometric mass surveillance in the EU to be tightened in the coming years. 
Unfortunately, we also know that this won\u2019t necessarily be possible everywhere in the world. Already, the European Commission is funding various surveillance projects around the world, including the development of a biometric ID system in Senegal and surveillance drones in Niger. We need to make sure that EU money is not used to endanger human rights for people outside of the EU and that other regions do not become a testing ground for these dangerous technologies.<\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\">Let me finish by saying that I really believe that now is the time for us to make a huge impact. Now is the time to stop the spread of these dangerous and discriminatory mass surveillance technologies. Here in the EU, right now, we have a real opportunity to ban biometric mass surveillance.<\/p>\n<\/div><\/div>\n\n<div class=\"gouter has-medium-font-size\" style=\"\"><div class=\"greens-content blk_core_paragraph\">\n<p class=\"has-medium-font-size\"><em><strong>Thank you for your commitment to this cause, Laurence, and thank you again for joining me today.&#8221;<\/strong><\/em><br><\/p>\n<\/div><\/div>\n\n<div style=\"height:30px; background:transparent; display:block;\" class=\"spacer\"><\/div>\n\n<div class=\"gouter wp-block-separator\" style=\"\"><div class=\"greens-content blk_core_separator\">\n<hr class=\"wp-block-separator\"\/>\n<\/div><\/div>\n\n<div class=\"gouter \" style=\"\"><div class=\"greens-content blk_core_html\">\n<link href=\"https:\/\/actionnetwork.org\/css\/style-embed-whitelabel-v3.css\" rel=\"stylesheet\" type=\"text\/css\"><script src=\"https:\/\/actionnetwork.org\/widgets\/v4\/form\/stay-informed-get-the-latest-news-on-our-campaigns-7?format=js&amp;source=widget\"><\/script><div id=\"can-form-area-stay-informed-get-the-latest-news-on-our-campaigns-7\" style=\"width: 100%\"><!-- this div is the target for our 
HTML insertion --><\/div>\n<\/div><\/div>","protected":false},"excerpt":{"rendered":"<p>Gwendoline Delbos-Corfield MEP in conversation with Laurence Meyer, from the Digital Freedom Fund [link], about the dangers of the increasing use of biometric mass surveillance &#8211; both within the EU and outside it, as well as the impact it can have on the lives of people who are already being discriminated against.<\/p>\n","protected":false},"author":2,"featured_media":7414,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v19.6.1 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe - Greens\/EFA<\/title>\n<meta name=\"robots\" content=\"noindex, follow\" \/>\n<meta property=\"og:locale\" content=\"en_GB\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe - Greens\/EFA\" \/>\n<meta property=\"og:description\" content=\"Gwendoline Delbos-Corfield MEP in conversation with Laurence Meyer, from the Digital Freedom Fund [link], about the dangers of the increasing use of biometric mass surveillance - both within the EU and outside it, as well as the impact it can have on the lives of people who are already being discriminated against.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/\" \/>\n<meta property=\"og:site_name\" content=\"Greens\/EFA\" \/>\n<meta property=\"article:published_time\" content=\"2021-06-24T13:40:51+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-09-23T10:23:25+00:00\" \/>\n<meta property=\"og:image\" 
content=\"https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/claudio-schwarz-purzlbaum-9XFwNI21Qsk-unsplash-1024x683.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1024\" \/>\n\t<meta property=\"og:image:height\" content=\"683\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Kristian Auth\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Kristian Auth\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimated reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/\",\"url\":\"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/\",\"name\":\"Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe - Greens\/EFA\",\"isPartOf\":{\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/#website\"},\"datePublished\":\"2021-06-24T13:40:51+00:00\",\"dateModified\":\"2021-09-23T10:23:25+00:00\",\"author\":{\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/#\/schema\/person\/aad78798c39c5d5715258f9a4443ab22\"},\"breadcrumb\":{\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/#breadcrumb\"},\"inLanguage\":\"en-GB\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.greens-efa.eu\/dossier\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Reinforce rights, not racism: Why we 
must fight biometric mass surveillance in Europe\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/#website\",\"url\":\"https:\/\/www.greens-efa.eu\/dossier\/\",\"name\":\"Greens\/EFA\",\"description\":\"Greens\/EFA\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.greens-efa.eu\/dossier\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-GB\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/#\/schema\/person\/aad78798c39c5d5715258f9a4443ab22\",\"name\":\"Kristian Auth\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-GB\",\"@id\":\"https:\/\/www.greens-efa.eu\/dossier\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/6d72018db9976b54c8d20213147088c7?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/6d72018db9976b54c8d20213147088c7?s=96&d=mm&r=g\",\"caption\":\"Kristian Auth\"}}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe - Greens\/EFA","robots":{"index":"noindex","follow":"follow"},"og_locale":"en_GB","og_type":"article","og_title":"Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe - Greens\/EFA","og_description":"Gwendoline Delbos-Corfield MEP in conversation with Laurence Meyer, from the Digital Freedom Fund [link], about the dangers of the increasing use of biometric mass surveillance - both within the EU and outside it, as well as the impact it can have on the lives of people who are already being discriminated against.","og_url":"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/","og_site_name":"Greens\/EFA","article_published_time":"2021-06-24T13:40:51+00:00","article_modified_time":"2021-09-23T10:23:25+00:00","og_image":[{"width":1024,"height":683,"url":"https:\/\/www.greens-efa.eu\/dossier\/wp-content\/uploads\/2021\/06\/claudio-schwarz-purzlbaum-9XFwNI21Qsk-unsplash-1024x683.jpg","type":"image\/jpeg"}],"author":"Kristian Auth","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Kristian Auth","Estimated reading time":"11 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/","url":"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/","name":"Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe - 
Greens\/EFA","isPartOf":{"@id":"https:\/\/www.greens-efa.eu\/dossier\/#website"},"datePublished":"2021-06-24T13:40:51+00:00","dateModified":"2021-09-23T10:23:25+00:00","author":{"@id":"https:\/\/www.greens-efa.eu\/dossier\/#\/schema\/person\/aad78798c39c5d5715258f9a4443ab22"},"breadcrumb":{"@id":"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/#breadcrumb"},"inLanguage":"en-GB","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www.greens-efa.eu\/dossier\/reinforce-rights-not-racism\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.greens-efa.eu\/dossier\/"},{"@type":"ListItem","position":2,"name":"Reinforce rights, not racism: Why we must fight biometric mass surveillance in Europe"}]},{"@type":"WebSite","@id":"https:\/\/www.greens-efa.eu\/dossier\/#website","url":"https:\/\/www.greens-efa.eu\/dossier\/","name":"Greens\/EFA","description":"Greens\/EFA","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.greens-efa.eu\/dossier\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-GB"},{"@type":"Person","@id":"https:\/\/www.greens-efa.eu\/dossier\/#\/schema\/person\/aad78798c39c5d5715258f9a4443ab22","name":"Kristian Auth","image":{"@type":"ImageObject","inLanguage":"en-GB","@id":"https:\/\/www.greens-efa.eu\/dossier\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/6d72018db9976b54c8d20213147088c7?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/6d72018db9976b54c8d20213147088c7?s=96&d=mm&r=g","caption":"Kristian 
Auth"}}]}},"_links":{"self":[{"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/posts\/7393"}],"collection":[{"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/comments?post=7393"}],"version-history":[{"count":53,"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/posts\/7393\/revisions"}],"predecessor-version":[{"id":8161,"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/posts\/7393\/revisions\/8161"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/media\/7414"}],"wp:attachment":[{"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/media?parent=7393"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/categories?post=7393"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.greens-efa.eu\/dossier\/wp-json\/wp\/v2\/tags?post=7393"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}