With the Winter Olympics starting this week in Beijing, and the FIFA World Cup in Qatar later this year, human rights organisations are sounding the alarm on ‘sportswashing’. Gwendoline Delbos-Corfield MEP and Patrick Breyer MEP investigate how big sporting events operate as a distraction from persecution, mass surveillance and human rights violations.
Are mass-scale sports events being used to normalise the use of surveillance technologies in our cities? Biometric mass surveillance, like high-tech facial recognition, is on the rise – but what is the cost to our human rights?
What do sports events have to do with human rights violations?
On 4th February, the Winter Olympics will kick off in Beijing, China, against a dark backdrop of serious human rights concerns. International human rights organisations have warned participants about rights violations and security risks linked to the games, calling the event an apparent attempt by China to “sportswash” its abusive human rights record. What’s more, there are grave security concerns about the ‘My2022’ app that participants are required to download and use. The app monitors athletes’ health and travel data, but has been found to contain a “devastating” encryption flaw that leaves this sensitive data open to attack.
Later this year, the FIFA World Cup will be hosted in Qatar. The country has come under criticism for the abuse of two million migrant workers, limited press freedom, a total absence of LGBTIQ+ rights, and a male guardianship system that severely limits the basic rights of women and girls.
With this year’s two biggest sporting events hosted by major human rights abusers, it is hard not to see a worrying trend that normalises a complete disregard for human rights. The effect is similar to ‘greenwashing’, where corporations use marketing to tout environmentally friendly credentials in order to boost their sales and image. In the same way, authoritarian states use ‘sportswashing’ to polish their poor public image.
Alarmingly, research shows that China and Qatar are not the only countries ‘sportswashing’ their human rights reputation. Ahead of the upcoming 2024 Paris Olympics, organisers in Europe are also exploring the possibility of using biometric mass surveillance, like facial recognition, in cities hosting major sporting events.
An Olympic-sized threat to human rights and democratic societies
One of the biggest threats to democracy and a diverse society is the use of biometric mass surveillance technology. The potential for abuse is unprecedented. These technologies threaten our right to self-determination and human dignity, as when athletes at the 2022 Winter Olympics in Beijing are forced to use an app that fails to handle their personal data securely. These are core fundamental rights that must be respected in a democracy governed by the rule of law.
European countries are experimenting with increasingly intrusive technologies without ever demonstrating their effectiveness and added value, despite continuous requests for evidence. Left unregulated, these technologies have the potential to change our societies fundamentally. It’s time to acknowledge the adverse effects of biometric surveillance methods on our fundamental rights. We need to act now, before it’s too late.
In order to understand the full impact on our rights, we commissioned a study to find out exactly what is at stake. According to our team of international researchers, biometric mass surveillance in public places, of the kind on display at the Winter Olympics in Beijing, puts our rights and our privacy at risk by:
- violating our right to private life. Biometric data, like an image of your face captured on CCTV, are stored for surveillance purposes before any offence has been committed.
- causing an unjustified loss of personal development and personal autonomy. Individuals who feel they are being monitored tend to censor themselves: they may modify their behaviour or avoid meeting someone in a publicly accessible place.
- posing a genuine, ongoing and serious threat to self-determination and to dignity. Data collected through video and audio surveillance, as well as biometric characteristics that are used to identify or categorise people, relate to the human body and the human mind. They need to be protected. These rights should not be restricted in a democracy governed by the rule of law.
Our study also finds that biometric mass surveillance is not in line with our right to freedom of expression and the right to freedom of assembly:
- By promoting self-censorship. Freedom of expression is an “essential foundation” of democracy and the rule of law, and “one of the basic conditions for its progress”, according to the European Court of Human Rights. States have a positive obligation to ensure its effectiveness: citizens must be able to express themselves without fear, and must not be monitored unless this is duly justified, necessary and legally framed. Chinese leaders promised to uphold these rights in the run-up to the 2022 Beijing Winter Olympics, but have since arrested journalists, lawyers and women’s rights activists for expressing their beliefs.
Biometric mass surveillance is also in conflict with the absolute right to hold a belief:
- Technology that identifies or guesses people’s emotions or thoughts can be used to manipulate them and push them to over-monitor their own behaviour. This contradicts the right to hold a belief, which is an absolute right. Such powerful technologies cannot be used without the informed consent of the people concerned, not even in the name of internal security or crime prevention.
The risks linked to errors and to the theft of biometric identifiers are numerous and common. To name a few:
- Technical errors: Technology can falsely recognise or authenticate a person, or fail to recognise or authenticate a person where it should. A striking example comes from an independent report, which concluded that the facial recognition system used by the London Metropolitan Police is “verifiably accurate in just 19% of cases”, meaning that “81% of ‘suspects’ flagged by [the] technology [are] innocent”.
- Human-based errors and weaknesses: The code behind the categories used to detect and classify people is written by humans and is therefore subjective, so errors may arise. The way the technology is deployed may itself lead to unwanted effects, like reinforcing stereotypes.
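Figures like the 81% cited above reflect a general statistical effect, the base-rate problem: when genuine suspects are rare in a crowd, even a small false-positive rate swamps the true matches. A minimal sketch with purely hypothetical numbers (not the Metropolitan Police’s actual parameters) illustrates the point:

```python
# Illustrative sketch of the base-rate problem behind such error figures.
# All numbers below are hypothetical, chosen only to show why even a
# seemingly accurate system can flag mostly innocent people.

def flagging_precision(crowd_size, suspects_in_crowd,
                       true_positive_rate, false_positive_rate):
    """Share of people flagged by the system who are genuine matches."""
    innocents = crowd_size - suspects_in_crowd
    true_hits = suspects_in_crowd * true_positive_rate
    false_hits = innocents * false_positive_rate
    return true_hits / (true_hits + false_hits)

# A crowd of 100,000 containing 10 genuine suspects, scanned by a system
# with a 90% detection rate and a seemingly tiny 0.1% false-positive rate:
precision = flagging_precision(100_000, 10, 0.90, 0.001)
print(f"{precision:.1%} of flagged people are genuine matches")
```

Under these assumed numbers, only about 8% of the people flagged would be genuine matches; the other roughly 92% would be innocent passers-by, which is why accuracy claims for mass scanning can be so misleading.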
What can sports fans do to keep enjoying the Olympics and the World Cup without sportswashing? Call for an EU ban on biometric mass surveillance now!
We don’t want to live in a society in which people are tracked, judged and classified based on their appearance, identity or behaviour.
The Member States of the European Union face a crucial political choice: between upholding the principles and values of the rule of law and respect for human rights, or straying from this path and going down the road of division and discrimination.
Huge sports events and sportswashing cannot be allowed to open the door to these dangerous technologies. This is why we need the European Commission to impose a ban on biometric mass surveillance technologies in all public spaces throughout the European Union. In this way, we can send a strong signal against human rights violations across the globe.