The Digital Services Act is coming. What will it mean for you?

The European Parliament has voted on a new law for online services, known as the Digital Services Act. Alexandra Geese MEP talks us through some of the problems of the digital age and what the Greens/EFA are doing to protect our fundamental rights.

We shop online, we bank online, we listen to music online. Since the COVID-19 pandemic struck, most of us now work online, chat with our friends on WhatsApp and take our exercise classes via Zoom. The last EU law to set out rules for online services was adopted in 2000 (the e-Commerce Directive). A lot has happened since then. Back in 2000, the world was still using ICQ and MSN Messenger. It’s safe to say the way we use the internet has changed dramatically.

The last couple of decades have seen the rise of a few giant online platforms. By now, it’s hard to imagine life without Facebook, Google and Amazon. And while millions of us use these services, it’s undeniable that they’ve shifted the balance of power.

Over half (57%) of all Europeans are on social media networks. And, in most EU countries, a majority of us get news from social media every day. Online platforms have a direct impact on our fundamental rights, our society and democracy.

These big changes have come with some big challenges. We’re all aware of the spread of disinformation. We all see the rise of hate speech and online harassment. New ways of influencing elections and votes. Manipulative features that trick us out of our data. Companies tracking our clicks and targeting ads based on what they think we’ll buy next. Fake traders selling scams or dangerous goods with no accountability.

Clearly, after more than 20 years, it is time for an updated European digital law.

What will the Digital Services Act do?

The Digital Services Act will tackle some of these challenges. New rules will clarify how illegal content is taken down. New transparency requirements for companies will help ensure products sold online are safe and that buyers can see exactly who they are buying from. It will also give us more control over what we see online and a real choice about whether to allow companies to advertise to us (and how). If done right, the Digital Services Act will make the internet both safer and fairer for all of us.

Of course, some of the new rules have been met with powerful opposition. The Greens/EFA have been fighting to protect our fundamental rights online. We want the EU to set a global standard for regulating digital content.

The journey is far from over. We’ve managed to convince the European Parliament to adopt some powerful new rules into its version of the Digital Services Act. But we still need to persuade European governments to take the changes on board during the upcoming negotiations with the Council of the European Union on the final law.

Read on to find out what we’ve managed to include in the new Digital Services Act so far, and what it could mean for your daily life online.

Clear rules on surveillance advertising

What’s the issue?

“It’s so weird. Yesterday I said to my friend that I needed a new pair of Speedos, and now I’m seeing ads for Speedos everywhere I look”. Sound familiar?

Our personal info is valuable. Many platforms will do anything they can to get it. We’re being spied on. What we google. What we click on. How long we lingered on that website. How we answered that online quiz. It’s all used to create a detailed profile and target us with advertising. Online services and apps invade our privacy with intrusive default settings, misleading wording and choices hidden deep in their interfaces. We’re left unaware that we’re being surveilled or why we’re being targeted with certain ads.

How can the Digital Services Act fix online advertising?

By banning surveillance advertising! This is what the Greens/EFA campaigned for, and we managed to achieve a partial ban. The European Parliament now officially supports a ban on surveilling minors for advertising purposes and on using any sensitive data, such as our sexual orientation or our political beliefs.

It’s shocking that we even need to say this but, crucially, children and minors should never be targeted for surveillance advertising. It’s good to know that a majority of the European Parliament is on our side here. (And, tech giants, before you even think about trying this: don’t collect additional data to identify children for the sole purpose of respecting this obligation. There, closed that loophole).

However, powerful conservative politicians are not on board. Tech giants have been heavily lobbying these MEPs to protect their business model over our right to privacy. (Join our campaign to help us pile on the pressure!)

We also managed to improve the rules on transparency around advertising. Platforms should tell us what settings they use to target us with ads and how to change those settings.

And, of course, we should have a fair choice to say no to surveillance advertising. It should be easy to switch it off. It shouldn’t be possible to trick internet users by making it harder to deny consent than to give it. Browser settings to deny consent should be mandatory.
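
To make this concrete, here’s a minimal sketch of how a website’s backend could honour such a browser signal. The `DNT` (“Do Not Track”) and `Sec-GPC` (Global Privacy Control) request headers are real signals that browsers can send; the function names and ad logic around them are our own hypothetical illustration, not anything the law prescribes.

```python
# Minimal sketch: honouring a browser-level "deny consent" signal.
# DNT and Sec-GPC are real request headers; everything else here
# (function names, ad logic) is a hypothetical illustration.

def user_denied_consent(headers: dict[str, str]) -> bool:
    """True if the browser signals that the user refuses tracking."""
    return headers.get("DNT") == "1" or headers.get("Sec-GPC") == "1"

def choose_advertising(headers: dict[str, str]) -> str:
    if user_denied_consent(headers):
        # Respect the signal: no profiling, no consent pop-up nagging.
        # Ads can still be shown based on the page's context alone.
        return "contextual ads"
    return "ask for consent before any personalised ads"

# A browser sending Sec-GPC: 1 gets contextual ads and no pop-up.
print(choose_advertising({"Sec-GPC": "1"}))  # -> contextual ads
```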

Stopping online manipulation by banning ‘dark patterns’

What’s the issue?

Pop-up boxes asking “Are you sure you want to leave?”. Extra items appearing in our shopping basket. That opt-out button that we didn’t see. An eye-watering phone bill because we accidentally downloaded a dodgy app. Terms and conditions so long that we’d grow old and die trying to read them all. We’ve grown used to being annoyed on the internet, but these manipulative practices, known as ‘dark patterns’, should be a thing of the past.

How can the Digital Services Act fix dark patterns?

With a new set of rules designed to prevent ‘dark patterns’, based on Greens/EFA suggestions.

In the new Digital Services Act, online services should not be allowed to:

  • give more visual prominence to any of the consent options.
  • repeatedly ask a user for consent.
  • urge a user to change a setting or configuration.
  • make it difficult to cancel a service.
  • ask for consent even though the user has already objected via an automated tool (like a “Do Not Track” signal in the browser).

More control over what content we’re recommended

What’s the issue?

“Why am I watching this?” It’s a familiar thought to anyone who has fallen down a video-streaming rabbit hole in the middle of the night. You start off watching a funny video about cats, and two hours later you’re shown something that makes you sit up. Big social media networks want you to keep watching and sharing their content. Their algorithms (the software that decides what to show you next) are designed to display content that is shocking, extreme or attention-grabbing. This fuels the spread of fake news. It helps hate speech go viral.

Most of the big social media networks use automated systems to recommend content or products. Think YouTube’s “Up next” or Facebook’s “Groups you should join”. Online platforms currently have no legal requirement to be transparent about which recommender systems they are using or how they use them to target people.

How can the Digital Services Act fix recommender systems?

First of all, by making their use more transparent. All platforms, regardless of their size, must explain the main parameters their recommender systems use to recommend content. Big online platforms should also assess the risks of these systems and be held accountable for their use.

The Greens/EFA have been fighting for more transparency about the exact criteria that recommender systems use to target or to exclude users.

We also want to allow users to modify recommender systems, for example, to have content presented in a different order. Very large online platforms will have to provide at least one option that is not based on profiling.
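
To illustrate the difference, here is a minimal sketch. The data and function names are our own hypothetical example; the point is simply that a profiling-free option (such as a plain chronological feed) needs no personal data at all.

```python
# Minimal sketch: a profiling-based feed vs. a non-profiling option.
# All names and fields here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    timestamp: float             # when the post was published
    predicted_engagement: float  # the platform's guess at how long *you* will watch

def profiled_feed(posts: list[Post]) -> list[Post]:
    # Ranked by what the algorithm predicts will keep this specific
    # user watching, built from their behavioural profile.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # The non-profiling option: same posts, newest first, identical
    # for every user. No personal data required.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```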

Better reporting of hate speech and illegal content

What’s the issue?

There is no harmonised system allowing internet users to report illegal content – such as defamation and libel – when they come across it.

How can the Digital Services Act fix reporting systems?

By making reports conform to certain standards. Any report of illegal content should include key details, such as the content’s digital location (for example, its exact URL). Reports should be supported by evidence whenever possible.
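
As a rough illustration, a standardised report could look something like the sketch below. The field names are our own invention for the example; the law itself sets out the requirements in legal language, not code.

```python
# Sketch of what a standardised illegal-content report might carry.
# Field names are illustrative, not the DSA's legal wording.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IllegalContentReport:
    url: str                 # the content's digital location
    reason: str              # why the reporter believes it is illegal
    evidence: list[str] = field(default_factory=list)  # links, screenshots
    reporter_contact: Optional[str] = None  # optional, for follow-up questions

report = IllegalContentReport(
    url="https://example.com/post/123",
    reason="defamatory statement about a named person",
    evidence=["https://example.com/screenshot.png"],
)
print(report.url)
```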

Online platforms should have to send a detailed explanation to the person whose content they remove for being illegal or contrary to their terms and conditions. This will help people to complain in case of wrongful take-downs. The Greens/EFA successfully argued for content to be left online in cases of doubt while an assessment is done. Online platforms also have a duty to deal with reports quickly and in a fair, transparent and non-discriminatory way.

Holding platforms accountable for the way they moderate content

What’s the issue?

“I’m sure I posted a picture of Uncle Bob’s bald head yesterday and now I can’t find it”. You may have found that one of your social media posts was removed automatically, leaving you mystified as to what you did wrong.

Social media giants use automated tools to scan and moderate the content we post. An algorithm monitors each post for keywords and certain types of image or video, blocking or taking down content that does not pass through its filters. Users could also find themselves ‘shadowbanned’. They can still post and comment on a platform, but are unaware that the system is stopping their content from being visible to others.

This unfair system censors our free speech with no accountability. And what’s worse, those spreading hate speech will always find a way to sneak their posts past the algorithms and go viral anyway.

How can the Digital Services Act fix content moderation?

By putting it back in human hands. The Greens/EFA successfully pushed for “human oversight” to double-check that posts are being moderated accurately, transparently and fairly. Platforms can still use automated tools, but they are not allowed to scan and monitor every piece of content shared online.
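
Here’s a toy sketch of what that division of labour could look like: the automated tool may flag a post, but a person makes the final call. The keywords and function names are purely illustrative.

```python
# Toy sketch: automated tools flag, humans decide. Keywords and
# function names are purely illustrative.

SUSPECT_KEYWORDS = {"buy followers now", "miracle cure"}

def automated_flagging(post: str) -> bool:
    """The tool only flags a post for review; it never removes it."""
    return any(keyword in post.lower() for keyword in SUSPECT_KEYWORDS)

def moderate(posts: list[str]) -> None:
    for post in posts:
        if automated_flagging(post):
            # A human moderator reviews the flagged post and records a
            # reasoned decision that the user can see and appeal.
            print(f"Flagged for human review: {post!r}")
        else:
            print(f"Left online: {post!r}")

moderate(["A photo of Uncle Bob's bald head", "Miracle cure, click here!"])
```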

Making it safer to buy online

What’s the issue?

No-one could have foreseen just how big online shopping was going to get. Many of us are choosing to buy groceries, clothes, gifts, cleaning products, household goods and even medicines over the internet. Consumer organisations have found that online marketplaces are full of risky and dangerous goods. Say it with us: hair dryers shouldn’t start fires.

When sellers sign up to sell via an online marketplace, there is nothing to stop them from entering a fake company name, fake address and fake contact details. The online platforms don’t check, and there are no consequences. It makes it impossible to track them down and hold them responsible for selling unsafe or broken items.

How can the Digital Services Act stop scammers and fake traders?

By introducing a ‘Know Your Business Customer’ obligation for online marketplaces. This will help to identify genuine and trustworthy traders, while preserving the anonymity of private users.
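
In practice, a ‘Know Your Business Customer’ check at sign-up could look something like this sketch. The fields and the register lookup are our own simplified illustration of the idea, not the actual legal requirements.

```python
# Toy sketch of a "Know Your Business Customer" check at trader sign-up.
# Fields and verification logic are a simplified illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class TraderApplication:
    company_name: str
    address: str
    email: str
    trade_register_id: str  # e.g. a national business register number

def can_start_selling(
    application: TraderApplication,
    register_lookup: Callable[[str], str],  # queries an official register
) -> bool:
    # No blank fields allowed: missing details block the account.
    details = [application.company_name, application.address,
               application.email, application.trade_register_id]
    if not all(details):
        return False
    # The declared company name must match the official register entry.
    return register_lookup(application.trade_register_id) == application.company_name
```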

Understanding how social media algorithms are a threat to our democracy

What’s the issue?

In Europe, 80% of us use the internet on a daily basis. That’s a lot of clicks and a lot of Google searches. It’s a lot of data. And data is worth dollars. But, more scarily, this data can be used to influence us. What we see online can influence what we think, what we do and who we vote for. It’s powerful, and right now no-one is watching what happens to our precious collective data. We don’t know who has it or what they plan to do with it. Big picture: this is a huge threat to our society and our democracy.

How can the Digital Services Act fix the lack of access to platform data?

By giving researchers and civil society organisations (NGOs) access to the algorithms of major platforms like Google, Facebook and YouTube for the first time ever. They’ll be able to answer crucial questions for our digital future. How are profit-driven algorithms a danger to our democracy? How can we stop public discourse from being radicalised? We can use these answers to establish better rules to protect our democracy.

We won’t have to rely on whistleblowers because other organisations will be watching. The Greens/EFA fought hard to make sure that NGOs are also granted access to platform data.

Big online platforms will no longer be able to play Russian roulette with our fundamental rights. Before rolling out any new service, tech giants will have to carry out a risk assessment under the watchful eye of independent agencies. The risk assessment will need to look at how the platform’s algorithm could spread content that is harmful to children, human dignity, privacy, media freedom and public discourse.

This is a massive win for our rights, our freedoms and a safer internet for everyone.

  • The vote on the Digital Services Act in the European Parliament took place on 20th January 2022. After this, negotiations will begin with ministers from European governments to agree to the final text of the law. 
  • Join our campaign to help us defend our digital rights and fight for a fair and safer internet for everyone.